Compare commits

...

88 Commits

Author SHA1 Message Date
Spring Buildmaster
a96253f9a4 DATAMONGO-1247 - Release version 1.6.3.RELEASE (Evans SR3). 2015-06-30 16:04:49 -07:00
Oliver Gierke
a0884b8227 DATAMONGO-1247 - Prepare 1.6.3.RELEASE (Evans SR3). 2015-07-01 00:12:28 +02:00
Oliver Gierke
2e6efae925 DATAMONGO-1247 - Updated changelog. 2015-07-01 00:12:11 +02:00
Oliver Gierke
4e688574fc DATAMONGO-1248 - Updated changelog. 2015-06-30 14:00:01 +02:00
Oliver Gierke
3abd38953c DATAMONGO-1228 - Updated changelog. 2015-06-30 13:59:53 +02:00
Oliver Gierke
026f3fadd3 DATAMONGO-1189 - Updated changelog. 2015-06-30 13:59:34 +02:00
Oliver Gierke
14e352aeae DATAMONGO-1236 - Polishing.
Removed the creation of a BasicMongoPersistentEntity in favor of always handing ClassTypeInformation.OBJECT into the converter in case no entity can be found.

This makes sure type information is written for updates on properties of type Object (which essentially leads to no PersistentEntity being available).

Original pull request: #301.
2015-06-30 09:55:42 +02:00
Christoph Strobl
41aa498a1e DATAMONGO-1236 - Updates now include type hints correctly.
We now use property type information when mapping fields affected by an update in case we do not have proper entity information within the context. This allows more precise type resolution required for determining the need to write type hints for a given property.

Original pull request: #301.
2015-06-30 09:55:37 +02:00
Christoph Strobl
74cead91fc DATAMONGO-1232 - IgnoreCase in criteria now escapes query.
We now quote the original criteria value before wrapping it inside of a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like. In that case we quote the part between the leading and trailing wildcard if required.

Original pull request: #301.
2015-06-22 12:51:32 +02:00
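A minimal, illustrative sketch of the escaping idea described above (not the actual implementation; the class and method names are made up):

import java.util.regex.Pattern;

class IgnoreCaseEscapingSketch {

	// Quote the raw user input before wrapping it into the case-insensitive regex,
	// so regex metacharacters in the value are matched literally.
	static Pattern startingWithIgnoreCase(String userInput) {
		return Pattern.compile("^" + Pattern.quote(userInput), Pattern.CASE_INSENSITIVE);
	}

	public static void main(String[] args) {
		Pattern pattern = startingWithIgnoreCase("a.b");
		System.out.println(pattern.matcher("A.Bc").find()); // true - '.' matched literally
		System.out.println(pattern.matcher("aXbc").find()); // false
	}
}
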
Christoph Strobl
559f160b21 DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:47:22 +02:00
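A minimal sketch of how this plays out from a user's perspective, assuming an already configured MongoTemplate (the collection name, field names and result type are arbitrary):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.group;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

import com.mongodb.DBObject;
import com.mongodb.ReadPreference;

class ReadPreferenceAggregationSketch {

	AggregationResults<DBObject> totalsPerCustomer(MongoTemplate template) {
		// With this change the configured ReadPreference is also applied to the
		// commands issued for aggregate(…) and geoNear(…).
		template.setReadPreference(ReadPreference.secondaryPreferred());

		Aggregation aggregation = newAggregation(group("customerId").sum("amount").as("total"));
		return template.aggregate(aggregation, "orders", DBObject.class);
	}
}
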
Christoph Strobl
62fdd3e519 DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
We now eagerly check DBRef properties for invalid definitions such as final classes or arrays. In that case we throw a MappingException when verify is called.
2015-06-19 15:54:32 +02:00
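An illustrative sketch of the kind of definitions that are now rejected eagerly (the domain types are made up):

import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Invoice {

	@DBRef
	Item[] items;       // arrays are not supported as @DBRef targets

	@DBRef
	String customerKey; // neither are final types such as String
}

class Item {}

Verifying the Invoice entity now results in a MappingException instead of a failure later at runtime.
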
Thomas Darimont
1a414fa2b6 DATAMONGO-1242 - Update MongoDB Java driver to 3.0.2 in mongo3 profile.
Update mongo driver.

Original pull request: #304.
2015-06-19 15:38:01 +02:00
Oliver Gierke
29a712e298 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:56:55 +02:00
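A hypothetical repository illustrating the fix: the IgnoreCase flag is now applied to the leaf property (address.city) of the traversed path rather than to the root property:

import java.util.List;

import org.springframework.data.repository.CrudRepository;

class Address {
	String city;
}

class Customer {
	String id;
	Address address;
}

interface CustomerRepository extends CrudRepository<Customer, String> {

	// Derived query; the case-insensitive matching targets address.city.
	List<Customer> findByAddressCityIgnoreCase(String city);
}
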
Eddú Meléndez
65a164874a DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:38:45 +02:00
Oliver Gierke
d6ff5162b3 DATAMONGO-1210 - Polishing.
Moved getTypeHint(…) method to Field class.

Original pull request: #292.
2015-06-01 13:26:54 +02:00
Christoph Strobl
28a84cafc5 DATAMONGO-1210 - Fixed type hints for usage with findAndModify(…).
We now inspect the actual field type during update mapping and provide a type hint accordingly. Simple, non-interface and non-abstract types will no longer be decorated with the _class attribute. We now honor positional parameters when trying to map paths to properties. This allows more precise type mapping since we now have access to the meta model, which lets us check whether a type hint (aka _class) is required.

We now add a special type hint indicating nested types to the converter. This allows more fine grained removal of _class property without the need to break the contract of MongoWriter.convertToMongoType(…).

Original pull request: #292.
2015-06-01 13:26:40 +02:00
Stefan Ganzer
eb10b5c426 DATAMONGO-1210 - Add breaking test case for findAndModify/addToSet/each.
The problem stems from the inconsistent handling of type hints: MongoTemplate.save(…) does not add a type hint, but findAndModify(…) does. The same values are then treated differently by MongoDB, depending on whether they have a type hint or not. To verify this behavior, you can manually add the (superfluous) type hint to the saved object - findAndModify will then work as expected.

Additional tests demonstrate that findAndModify(…) removes type hints from complex documents in collections that are either nested in another collection or in a document, or doesn't add them in the first place.

Original pull requests: #290, #291.
Related pull request: #292.
CLA: 119820150506013701 (Stefan Ganzer)
2015-06-01 13:26:15 +02:00
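A short sketch of the scenario described above, assuming an existing MongoTemplate (the document type and field names are invented):

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

class AddToSetEachSketch {

	static class TagHolder {
		String id;
	}

	// Values added through findAndModify/addToSet/each should end up readable in
	// the same way as values written through save(…).
	TagHolder addTags(MongoTemplate template, String id, Object... tags) {
		Query query = new Query(Criteria.where("id").is(id));
		Update update = new Update().addToSet("tags").each(tags);
		return template.findAndModify(query, update, TagHolder.class);
	}
}
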
Thomas Darimont
b7b24215e3 DATAMONGO-1133 - Fixed broken tests.
AggregationTests.shouldHonorFieldAliasesForFieldReferences() now correctly sets up 3 different instances of MeterData and correctly calculates the aggregated counter values.

Original pull request: #279.
2015-05-25 13:55:59 +02:00
Oliver Gierke
2a27eb7404 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as NoOp.
2015-05-25 13:13:40 +02:00
Oliver Gierke
ad3ad1f65f DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:08:13 +02:00
Oliver Gierke
e7278888a7 DATAMONGO-1213 - Included section on dependency management in reference documentation.
Related ticket: DATACMNS-687.
2015-05-04 14:53:01 +02:00
Oliver Gierke
374fa5a1f2 DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values was handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:48:57 +02:00
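A minimal sketch of the fixed behavior, assuming an existing MongoTemplate (Person is a made-up document type):

import java.util.Arrays;
import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;

class InsertAllSketch {

	static class Person {
		String id;
	}

	void insertBatch(MongoTemplate template, Person first, Person second) {
		List<Person> batch = Arrays.asList(first, null, second);
		// The null element is now skipped instead of triggering a NullPointerException.
		template.insertAll(batch);
	}
}
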
Oliver Gierke
be92489c58 DATAMONGO-1196 - Upgraded build profiles after MongoDB 3.0 Java driver GA release. 2015-04-01 17:15:55 +02:00
Christoph Strobl
88172c8674 DATAMONGO-1124 - Switch log level for cyclic reference index warnings to INFO.
Reduce log level from warn to info to avoid noise during application startup.

Original pull request: #282.
2015-03-23 09:00:37 +01:00
Oliver Gierke
2dc2072fce DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:23:30 +01:00
Thomas Darimont
a7b70d68d7 DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:23:22 +01:00
Oliver Gierke
6dc2de03ad DATAMONGO-1173 - Updated changelog. 2015-03-06 08:02:08 +01:00
Thomas Darimont
dc0a03ee63 DATAMONGO-1133 - Assert that field aliasing is honored in aggregation operations.
Added some tests to show that field aliases are honored during object rendering in aggregation operations.

Original pull request: #279.
2015-03-05 12:25:05 +01:00
Thomas Darimont
504c44cdf3 DATAMONGO-1081 - Improve documentation on field mapping semantics.
Added table with examples to identifier field mapping section.

Original pull request: #276.
2015-03-02 18:15:27 +01:00
Oliver Gierke
9ea8b6f6d1 DATAMONGO-1144 - Fixed version numbers to point to the next bugfix release. 2015-02-03 11:54:58 +01:00
Oliver Gierke
37c9de6af9 DATAMONGO-1155 - Upgraded mongo-next build profile to driver version 2.13.0.
Related ticket: DATAMONGO-1154.
2015-01-29 21:16:48 +01:00
Oliver Gierke
9a1ee81f3f DATAMONGO-1153 - Fix documentation build.
Moved jconsole.png to the images folder. Extracted MongoDB-specific auditing documentation into a separate file for inclusion after the general auditing docs.
2015-01-29 14:03:46 +01:00
Oliver Gierke
c8331c3a1c DATAMONGO-1144 - After release cleanups. 2015-01-28 17:41:43 +01:00
Spring Buildmaster
b794d536c5 DATAMONGO-1144 - Prepare next development iteration. 2015-01-28 04:17:48 -08:00
Spring Buildmaster
afa05122d2 DATAMONGO-1144 - Release version 1.6.2.RELEASE (Evans SR2). 2015-01-28 04:17:39 -08:00
Oliver Gierke
cf06a6527c DATAMONGO-1144 - Prepare 1.6.2.RELEASE (Evans SR2). 2015-01-28 12:03:12 +01:00
Oliver Gierke
0760a8b425 DATAMONGO-1144 - Updated changelog. 2015-01-28 11:23:25 +01:00
Oliver Gierke
f65cd1d235 DATAMONGO-1143 - Updated changelog. 2015-01-28 10:00:30 +01:00
Oliver Gierke
6f86208111 DATAMONGO-1148 - Favor EclipseLink’s JPA over the Hibernate one. 2015-01-27 21:43:59 +01:00
Oliver Gierke
0c33f7e01a DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw write targets, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:39:54 +01:00
alex-on-java
047df58724 DATAMONGO-1147 - Remove manual array copy.
Replaced the manual array copying with Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 18:02:50 +01:00
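For reference, the replacement described above boils down to a single library call (sketch only):

import java.util.Arrays;

class ArrayCopySketch {

	// Equivalent to allocating a new array and copying the elements one by one.
	static Object[] copy(Object[] values) {
		return Arrays.copyOf(values, values.length);
	}
}
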
Christoph Strobl
9716a93576 DATAMONGO-1144 - Upgrade dependencies.
mongo-java-driver: 2.12.3 > 2.12.5
2015-01-22 08:50:19 +01:00
Thomas Darimont
03e5ebb741 DATAMONGO-1082 - Improved documentation of alias usage in aggregation framework.
Added missing JavaDoc and added short note to the reference documentation.

Original pull request: #268.
2015-01-22 08:48:03 +01:00
Thomas Darimont
6ac908b72e DATAMONGO-1127 - Add support for geoNear queries with distance information.
Made unit tests more robust to small differences in distance calculations between MongoDB versions.
2015-01-20 19:05:23 +01:00
Thomas Darimont
5f1049f2de DATAMONGO-1127 - Add support for geoNear queries with distance information.
We now support geoNear queries in Aggregations. Exposed GeoNearOperation factory method in Aggregation. Introduced new distanceField property to NearQuery since it is required for geoNear queries in Aggregations.

Original pull request: #261.
2015-01-20 18:19:52 +01:00
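A minimal usage sketch of the new factory method, assuming an existing MongoTemplate (coordinates, collection name and the "dist" field name are arbitrary):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.geoNear;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.query.NearQuery;

import com.mongodb.DBObject;

class GeoNearAggregationSketch {

	AggregationResults<DBObject> venuesNearby(MongoTemplate template) {
		NearQuery near = NearQuery.near(-73.99, 40.73).maxDistance(2.0);
		// "dist" names the output field that carries the calculated distance.
		Aggregation aggregation = newAggregation(geoNear(near, "dist"));
		return template.aggregate(aggregation, "venues", DBObject.class);
	}
}
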
Christoph Strobl
07c79e8ba9 DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:18:43 +01:00
Oliver Gierke
a94cca3494 DATAMONGO-1106 - Updated changelog. 2015-01-20 07:37:27 +01:00
Oliver Gierke
90842844d4 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSpherical if non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which will never result in true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:24:48 +01:00
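A hypothetical repository illustrating when the spherical variant kicks in: only a Distance carrying a non-neutral Metric (kilometers below) triggers it:

import java.util.List;

import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.repository.CrudRepository;

class Venue {
	String id;
	Point location;
}

interface VenueRepository extends CrudRepository<Venue, String> {
	List<Venue> findByLocationNear(Point location, Distance distance);
}

class VenueQueries {

	static List<Venue> nearTimesSquare(VenueRepository venues) {
		// Metrics.KILOMETERS is non-neutral, so the spherical near variant is used.
		return venues.findByLocationNear(new Point(-73.99, 40.73), new Distance(2, Metrics.KILOMETERS));
	}
}
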
Oliver Gierke
f4c27407ac DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to override that.
2015-01-07 15:02:41 +01:00
Oliver Gierke
00e1ebb880 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:49:46 +01:00
Thomas Darimont
dd32479ad4 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:48:44 +01:00
Christoph Strobl
ce5fe50550 DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:48:28 +01:00
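A sketch of a custom map key conversion as described above (CustomerId and its converter are made up; only converters targeting String are considered for keys):

import java.util.Arrays;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

class MapKeyConversionSketch {

	static class CustomerId {
		final String value;

		CustomerId(String value) {
			this.value = value;
		}
	}

	enum CustomerIdToStringConverter implements Converter<CustomerId, String> {
		INSTANCE;

		@Override
		public String convert(CustomerId source) {
			return source.value;
		}
	}

	// Register the converter so Map<CustomerId, ?> keys are written as plain Strings.
	CustomConversions customConversions() {
		return new CustomConversions(Arrays.asList(CustomerIdToStringConverter.INSTANCE));
	}
}
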
Christophe Fargette
3a04bcb5d3 DATACMNS-1132 - Fixed keyword translation table in the reference documentation.
Original pull request: #262.
2015-01-06 13:29:46 +01:00
Oliver Gierke
acfe6ccc2c DATAMONGO-1120 - Fix execution of query methods using pagination and field mapping customizations.
Repository queries that used pagination and referred to a field that was customized were failing as the count query executed was not mapped correctly in MongoOperations.

This resulted from the fix for DATAMONGO-1080, which removed the premature field name translation from AbstractMongoQuery and thus led to unmapped field names being used for the count query.

We now expose the previously existing but non-public count(…) method on MongoOperations that takes both an entity type and an explicit collection name, so a dedicated collection can be count-queried while still getting the query mapping applied for a certain type.

Related ticket: DATAMONGO-1080.
2014-12-18 15:54:29 +01:00
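A minimal sketch of the now-public overload, assuming an existing MongoTemplate (Person, its firstname property and the collection name are invented):

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class CountSketch {

	static class Person {
		String firstname;
	}

	long countDaves(MongoTemplate template) {
		Query query = new Query(Criteria.where("firstname").is("Dave"));
		// The entity type lets the template map property references onto document
		// fields while still counting against the explicitly given collection.
		return template.count(query, Person.class, "legacy_people");
	}
}
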
Oliver Gierke
3f3ec19364 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:38:17 +01:00
Thomas Darimont
24cbef9bc5 DATAMONGO-1085 - Fixed sorting with Querydsl in QueryDslMongoRepository.
We now translate QSort's OrderSpecifiers into appropriate sort criteria.
Previously the OrderSpecifiers were not correctly translated to appropriate property path expressions.

We now override findAll(Pageable) and findAll(Sort) in QueryDslMongoRepository to apply special QSort handling.

Original pull request: #236.
2014-12-01 12:08:43 +01:00
Oliver Gierke
b5f74444de DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:32 +01:00
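An illustrative sketch of a dynamically resolved collection name (the expression and the tenantProvider bean are assumptions; the point is that the expression is now re-evaluated per query execution rather than resolved once):

import org.springframework.data.mongodb.core.mapping.Document;

@Document(collection = "#{@tenantProvider.tenantId + '_orders'}")
class Order {
	String id;
}
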
Oliver Gierke
1b7e077296 DATAMONGO-1054 - Polishing.
Tweaked JavaDoc of the APIs to be less specific about implementation internals and rather point to the save(…) methods. Changed SimpleMongoRepository.save(…) methods to inspect the given entity/entities and use the optimized insert(All)-calls if all entities are considered new.

Original pull request: #253.
2014-11-28 18:38:06 +01:00
Thomas Darimont
7ebaf935d4 DATAMONGO-1054 - Add support for fast insertion via MongoRepository.insert(..).
Introduced new insert(..) method variants on MongoRepositories that delegate to MongoTemplate.insert(..). This bypasses ID-population, save event generation and version checking and allows for fast insertion of bulk data.

Original pull request: #253.
2014-11-28 18:37:55 +01:00
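A hypothetical repository illustrating the new insert(…) variants described above:

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

class Person {
	String id;
}

interface PersonRepository extends MongoRepository<Person, String> {
}

class BulkImportSketch {

	// insert(…) delegates to MongoTemplate.insert(…) and skips the additional
	// handling mentioned above, which makes bulk insertion of new entities faster.
	static List<Person> importAll(PersonRepository people, List<Person> batch) {
		return people.insert(batch);
	}
}
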
Oliver Gierke
295d7579cb DATAMONGO-1108 - Performance improvements in BasicMongoPersistentEntity.
BasicMongoPersistentEntity.getCollection() now avoids repeated SpEL parsing and evaluation in case no SpEL expression is used. Parsing happens at most once now. Evaluation is skipped entirely if the configured collection String is not, or does not contain, an expression.
2014-11-28 16:23:48 +01:00
Christoph Strobl
a99950d83e DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:55 +01:00
Christoph Strobl
0bf4ee2711 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword: if the target property is of type String we use a regular expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Original pull request: #241.
2014-11-27 17:14:23 +01:00
Thomas Darimont
5b8da8dd41 DATAMONGO-1093 - Added hashCode() and equals(…) in BasicQuery.
We now have equals(…) and hashCode(…) methods on BasicQuery. Previously we solely relied on Query.hashCode()/equals(…) which didn't consider the fields of BasicQuery.

Introduced equals verifier library to automatically test equals contracts.
Added some additional test cases to BasicQueryUnitTests.

Original pull request: #252.
2014-11-27 16:46:24 +01:00
Mikhail Mikhaylenko
b56ca97f68 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely(…) to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 09:42:51 +01:00
Thomas Darimont
f183b5c7e6 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 17:36:03 +01:00
Oliver Gierke
1fdde8f0c3 DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:24 +01:00
Christoph Strobl
b7c3e69653 DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:38:21 +01:00
Oliver Gierke
e64d69b8f5 DATAMONGO-1079 - After release cleanups. 2014-10-31 21:44:01 +01:00
Spring Buildmaster
d81ed203db DATAMONGO-1079 - Prepare next development iteration. 2014-10-30 14:36:37 +01:00
Spring Buildmaster
eae463622c DATAMONGO-1079 - Release version 1.6.1.RELEASE (Evans SR1). 2014-10-30 14:36:13 +01:00
Oliver Gierke
6ba8144bca DATAMONGO-1079 - Prepare 1.6.1.RELEASE (Evans SR1). 2014-10-30 12:33:08 +01:00
Oliver Gierke
9d1c1a9fc5 DATAMONGO-1079 - Updated changelog. 2014-10-30 11:57:35 +01:00
Oliver Gierke
f552ca8073 DATAMONGO-1080 - AbstractMongoQuery now refrains from eagerly post-processing the query execution results.
To properly support general post processing of query execution results (in QueryExecutorMethodInterceptor) we need to remove the eager post-processing of query execution results in AbstractMongoQuery.

Removed the usage of the local ConversionService altogether.
2014-10-30 11:36:02 +01:00
Thomas Darimont
d7bd82c643 DATAMONGO-1076 - Avoid resolving lazy-loading proxy for DBRefs during finalize.
We now handle intercepted finalize method invocations by not resolving the proxy. Previously the LazyLoadingProxy tried to resolve the proxy during finalization which could lead to unnecessary database accesses.

Original pull request: #234.
2014-10-29 10:16:34 +01:00
Christoph Strobl
078cca83e3 DATAMONGO-1077 - Fix Update removing $ operator for DBRef.
We now retain the positional parameter "$" when mapping field names for associations.

Original pull request: #235.
2014-10-28 14:30:08 +01:00
Christoph Strobl
93ae6815bd DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 13:56:59 +02:00
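A hypothetical example of the field-name placeholder case the fix addresses (the repository, domain type and query are made up):

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

class Settings {
	String id;
}

interface SettingsRepository extends MongoRepository<Settings, String> {

	// The first placeholder is used in field-name position and is now bound correctly.
	@Query("{ ?0 : ?1 }")
	List<Settings> findByDynamicField(String fieldName, Object value);
}
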
Christoph Strobl
13dcb8cda1 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 12:06:51 +02:00
Oliver Gierke
a4497bcf8a DATAMONGO-1070 - Fixed a few glitches in DBRef binding for repository query methods.
Fixed the query mapping for derived repository queries pointing to the identifier of the referenced document. We now reduce the query field's key from reference.id to reference so that the generated DBRef is applied correctly and also take care that the ids are potentially converted to ObjectIds. This is mainly achieved by using the AssociationConverter pulled up from UpdateMapper in ObjectMapper.getMappedKey().

MongoQueryCreator now refrains from translating the field keys as that will fail the QueryMapper to correctly detect id properties.

Fixed DBRef handling for StringBasedMongoQuery which previously didn't parse the DBRef instance created after JSON parsing for placeholders.
2014-10-15 10:14:12 +02:00
Christoph Strobl
6a82c47a4d DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an Exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:57 +02:00
Christoph Strobl
8f8f5b7ce4 DATAMONGO-1053 - Type check is now only performed on explicit language properties.
We now only perform a type check on language properties explicitly defined via @Language. Prior to this change, non-String properties named language caused errors on entity validation.

Original pull request: #228.
2014-10-10 11:31:39 +02:00
Oliver Gierke
c683813a7a DATAMONGO-1057 - Polishing.
Slightly tweaked the changes in SlicedExecution to simplify the implementation. We now apply the given pageable but tweak the limit the query uses to peek into the next page.

Original pull request: #226.
2014-10-08 07:06:35 +02:00
Christoph Strobl
dbe983c3cb DATAMONGO-1057 - Fix SliceExecution skipping elements.
We now directly set the offset to use instead of reading it from the used pageable. This ensures that every single element is read from the store.
Prior to this change the altered page size led to an unintended increase in the number of elements to skip.

Original pull request: #226.
2014-10-08 07:06:35 +02:00
Oliver Gierke
8e11fe84df DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 16:29:48 +02:00
Christoph Strobl
9eb2856840 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DBRefs. This ensures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:05:52 +02:00
Thomas Darimont
828b379f1f DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress failed because the supposedly non-existing hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable.

Original pull request: #229.
2014-10-01 09:44:48 +02:00
Christoph Strobl
161fd8c09d DATAMONGO-1049 - Check for explicitly declared language field.
We now check for an explicitly declared language field for setting language_override within a text index. Therefore the attribute (even if named with the reserved keyword language) has to be explicitly marked with @Language. Prior to this change having:

@Language String lang;
String language;

would have caused trouble when trying to resolve index structures, as one cannot set the language override to more than one property.

Original pull request: #224.
2014-09-25 12:43:46 +02:00
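A short sketch of the rule described above (Article and its properties are made up): only the property explicitly annotated with @Language drives language_override; a plain property that merely happens to be named language no longer does:

import org.springframework.data.mongodb.core.index.TextIndexed;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Language;

@Document
class Article {

	@TextIndexed String title;

	@Language String lang; // used as the language override of the text index

	String language;       // treated as a regular field
}
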
Oliver Gierke
dc037dfef6 DATAMONGO-1046 - After release cleanups. 2014-09-05 14:17:02 +02:00
Spring Buildmaster
c41653f9da DATAMONGO-1046 - Prepare next development iteration. 2014-09-05 14:09:48 +02:00
93 changed files with 3964 additions and 619 deletions

26
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.RELEASE</version>
<version>1.6.3.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,8 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.5.0.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
<version>1.5.3.RELEASE</version>
</parent>
<modules>
@@ -29,9 +28,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.9.0.RELEASE</springdata.commons>
<mongo>2.12.3</mongo>
<mongo.osgi>2.12.3</mongo.osgi>
<springdata.commons>1.9.3.RELEASE</springdata.commons>
<mongo>2.12.5</mongo>
<mongo.osgi>2.12.5</mongo.osgi>
</properties>
<developers>
@@ -108,7 +107,7 @@
<id>mongo-next</id>
<properties>
<mongo>2.12.5-SNAPSHOT</mongo>
<mongo>2.14.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -116,7 +115,16 @@
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.2</mongo>
</properties>
</profile>
@@ -149,7 +157,7 @@
<repositories>
<repository>
<id>spring-libs-release</id>
<url>http://repo.spring.io/libs-release</url>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.RELEASE</version>
<version>1.6.3.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -14,7 +14,7 @@
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>1.0.0.Final</jpa>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.RELEASE</version>
<version>1.6.3.RELEASE</version>
</dependency>
<dependency>
@@ -59,8 +59,8 @@
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.RELEASE</version>
<version>1.6.3.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.RELEASE</version>
<version>1.6.3.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,18 +11,19 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.RELEASE</version>
<version>1.6.3.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -50,7 +51,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -144,6 +145,12 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>

View File

@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/

View File

@@ -49,7 +49,7 @@ public class MongoAction {
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBOjbect from the Spring Query object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityType, DBObject document, DBObject query) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -415,7 +415,9 @@ public interface MongoOperations {
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is ran against.
* information to determine the collection the query is ran against. Note, that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -424,7 +426,9 @@ public interface MongoOperations {
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note, that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -652,14 +656,28 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. TO make sure the query gets mapped, use {@link #count(Query, Class, String)}.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>

View File

@@ -335,7 +335,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CommandResult result = execute(new DbCallback<CommandResult>() {
public CommandResult doInDB(DB db) throws MongoException, DataAccessException {
return db.command(command, options);
return readPreference != null ? db.command(command, readPreference) : db.command(command);
}
});
@@ -566,7 +566,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
BasicDBObject command = new BasicDBObject("geoNear", collection);
command.putAll(near.toDBObject());
CommandResult commandResult = executeCommand(command);
CommandResult commandResult = executeCommand(command, getDb().getOptions());
List<Object> results = (List<Object>) commandResult.get("results");
results = results == null ? Collections.emptyList() : results;
@@ -641,7 +641,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return count(query, null, collectionName);
}
private long count(Query query, Class<?> entityClass, String collectionName) {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#count(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
public long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
@@ -763,27 +767,33 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doInsertAll(Collection<? extends T> listToSave, MongoWriter<T> writer) {
Map<String, List<T>> objs = new HashMap<String, List<T>>();
for (T o : listToSave) {
Map<String, List<T>> elementsByCollection = new HashMap<String, List<T>>();
for (T element : listToSave) {
if (element == null) {
continue;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(element.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(o.getClass());
if (entity == null) {
throw new InvalidDataAccessApiUsageException("No Persitent Entity information found for the class "
+ o.getClass().getName());
throw new InvalidDataAccessApiUsageException("No PersistentEntity information found for " + element.getClass());
}
String collection = entity.getCollection();
List<T> collectionElements = elementsByCollection.get(collection);
List<T> objList = objs.get(collection);
if (null == objList) {
objList = new ArrayList<T>();
objs.put(collection, objList);
if (null == collectionElements) {
collectionElements = new ArrayList<T>();
elementsByCollection.put(collection, collectionElements);
}
objList.add(o);
collectionElements.add(element);
}
for (Map.Entry<String, List<T>> entry : objs.entrySet()) {
for (Map.Entry<String, List<T>> entry : elementsByCollection.entrySet()) {
doInsertBatch(entry.getKey(), entry.getValue(), this.mongoConverter);
}
}
@@ -1007,8 +1017,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Calling update using query: %s and update: %s in collection: %s",
serializeToJsonSafely(queryObj), serializeToJsonSafely(updateObj), collectionName));
}
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
@@ -1187,7 +1197,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { dboq, collection.getName() });
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { serializeToJsonSafely(dboq),
collection.getName() });
}
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
@@ -1415,7 +1426,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
CommandResult commandResult = executeCommand(command);
CommandResult commandResult = executeCommand(command, getDb().getOptions());
handleCommandError(commandResult, command);
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult), commandResult);
@@ -1622,7 +1633,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
@@ -1660,8 +1671,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Class<T> entityClass) {
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndRemove using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " in collection: " + collectionName);
LOGGER.debug(String.format("findAndRemove using query: %s fields: %s sort: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, sort, entityClass, collectionName));
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
@@ -1685,8 +1696,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + mappedUpdate + " in collection: " + collectionName);
LOGGER.debug(String.format("findAndModify using query: %s fields: %s sort: %s for class: %s and update: %s "
+ "in collection: %s", serializeToJsonSafely(mappedQuery), fields, sort, entityClass,
serializeToJsonSafely(mappedUpdate), collectionName));
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
@@ -1995,13 +2007,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: " + query + " in db.collection: " + collection.getFullName());
LOGGER.debug(String.format("findOne using query: %s in db.collection: %s", serializeToJsonSafely(query),
collection.getFullName()));
}
return collection.findOne(query);
} else {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: " + query + " fields: " + fields + " in db.collection: "
+ collection.getFullName());
LOGGER.debug(String.format("findOne using query: %s fields: %s in db.collection: %s",
serializeToJsonSafely(query), fields, collection.getFullName()));
}
return collection.findOne(query, fields);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -291,6 +292,19 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the{@code distanceField}. The
* {@code distanceField} defines output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
/**
* Returns a new {@link AggregationOptions.Builder}.
*

View File

@@ -88,7 +88,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either sythetic or non-synthetic way.
* Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
@@ -107,7 +107,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} with the given orignals and synthetics.
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
@@ -363,7 +363,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Returns the referenve value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -84,6 +84,15 @@ public final class Fields implements Iterable<Field> {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
@@ -187,15 +196,24 @@ public final class Fields implements Iterable<Field> {
private final String target;
/**
* Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
* as well.
* Creates an aggregation field with the given {@code name}.
*
* @param key
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
*/
public AggregationField(String key) {
this(key, null);
public AggregationField(String name) {
this(name, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
String nameToSet = cleanUp(name);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,17 +22,33 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
public GeoNearOperation(NearQuery nearQuery) {
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
Assert.notNull(nearQuery);
this.nearQuery = nearQuery;
this.distanceField = distanceField;
}
/*
@@ -41,6 +57,10 @@ public class GeoNearOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
command.put("distanceField", distanceField);
return new BasicDBObject("$geoNear", command);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,10 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $limit}-operation
* Encapsulates the {@code $limit}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,7 +22,11 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $match}-operation
* Encapsulates the {@code $match}-operation.
* <p>
* We recommend to use the static factory method
* {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,10 +28,13 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $project}-operation. Projection of field to be used in an
* {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful. It can
* generate new fields, change values of given field etc.
* Encapsulates the aggregation framework {@code $project}-operation.
* <p>
* Projection of field to be used in an {@link Aggregation}. A projection is similar to a {@link Field}
* inclusion/exclusion but more powerful. It can generate new fields, change values of given field etc.
* <p>
* We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $skip}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $sort}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
* instances of this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $unwind}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
@@ -46,10 +46,8 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,14 +17,15 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -71,10 +72,13 @@ public class CustomConversions {
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
private final SimpleTypeHolder simpleTypeHolder;
private final ConcurrentMap<ConvertiblePair, CacheValue> customReadTargetTypes;
private final List<Object> converters;
private final Map<ConvertiblePair, CacheValue> customReadTargetTypes;
private final Map<ConvertiblePair, CacheValue> customWriteTargetTypes;
private final Map<Class<?>, CacheValue> rawWriteTargetTypes;
/**
* Creates an empty {@link CustomConversions} object.
*/
@@ -94,7 +98,9 @@ public class CustomConversions {
this.readingPairs = new LinkedHashSet<ConvertiblePair>();
this.writingPairs = new LinkedHashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
this.customReadTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue>();
this.customWriteTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue>();
this.rawWriteTargetTypes = new ConcurrentHashMap<Class<?>, CacheValue>();
List<Object> toRegister = new ArrayList<Object>();
@@ -238,70 +244,103 @@ public class CustomConversions {
* @param sourceType must not be {@literal null}
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> sourceType) {
return getCustomWriteTarget(sourceType, null);
public Class<?> getCustomWriteTarget(final Class<?> sourceType) {
return getOrCreateAndCache(sourceType, rawWriteTargetTypes, new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, null, writingPairs);
}
});
}
/**
* Returns the target type we can write an onject of the given source type to. The returned type might be a subclass
* oth the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
* first target type matching or {@literal null} if no conversion can be found.
* Returns the target type we can readTargetWriteLocl an inject of the given source type to. The returned type might
* be a subclass of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply
* return the first target type matching or {@literal null} if no conversion can be found.
*
* @param sourceType must not be {@literal null}
* @param requestedTargetType
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
public Class<?> getCustomWriteTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
Assert.notNull(sourceType);
if (requestedTargetType == null) {
return getCustomWriteTarget(sourceType);
}
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customWriteTargetTypes,
new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
}
});
}
/**
* Returns whether we have a custom conversion registered to write into a Mongo native type. The returned type might
* be a subclass of the given expected type though.
* Returns whether we have a custom conversion registered to readTargetWriteLocl into a Mongo native type. The
* returned type might be a subclass of the given expected type though.
*
* @param sourceType must not be {@literal null}
* @return
*/
public boolean hasCustomWriteTarget(Class<?> sourceType) {
Assert.notNull(sourceType);
return hasCustomWriteTarget(sourceType, null);
}
/**
* Returns whether we have a custom conversion registered to write an object of the given source type into an object
* of the given Mongo native target type.
* Returns whether we have a custom conversion registered to readTargetWriteLocl an object of the given source type
* into an object of the given Mongo native target type.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType
* @return
*/
public boolean hasCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
return getCustomWriteTarget(sourceType, requestedTargetType) != null;
}
/**
* Returns whether we have a custom conversion registered to read the given source into the given target type.
* Returns whether we have a custom conversion registered to read the given source into the given target
* type.
*
* @param sourceType must not be {@literal null}
* @param requestedTargetType must not be {@literal null}
* @return
*/
public boolean hasCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
Assert.notNull(requestedTargetType);
return getCustomReadTarget(sourceType, requestedTargetType) != null;
}
/**
* Inspects the given {@link ConvertiblePair} for ones that have a source compatible type as source. Additionally
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
if (requestedTargetType == null) {
return null;
}
return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customReadTargetTypes,
new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, requestedTargetType, readingPairs);
}
});
}
/**
* Inspects the given {@link ConvertiblePair}s for ones that have a source compatible type as source. Additionally
* checks assignability of the target type if one is given.
*
* @param sourceType must not be {@literal null}.
@@ -310,11 +349,15 @@ public class CustomConversions {
* @return
*/
private static Class<?> getCustomTarget(Class<?> sourceType, Class<?> requestedTargetType,
Iterable<ConvertiblePair> pairs) {
Collection<ConvertiblePair> pairs) {
Assert.notNull(sourceType);
Assert.notNull(pairs);
if (requestedTargetType != null && pairs.contains(new ConvertiblePair(sourceType, requestedTargetType))) {
return requestedTargetType;
}
for (ConvertiblePair typePair : pairs) {
if (typePair.getSourceType().isAssignableFrom(sourceType)) {
Class<?> targetType = typePair.getTargetType();
@@ -328,32 +371,31 @@ public class CustomConversions {
}
/**
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
* Will try to find a value for the given key in the given cache or produce one using the given {@link Producer} and
* store it in the cache.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @param key the key to lookup a potentially existing value, must not be {@literal null}.
* @param cache the cache to find the value in, must not be {@literal null}.
* @param producer the {@link Producer} to create values to cache, must not be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
if (requestedTargetType == null) {
return null;
}
ConvertiblePair lookupKey = new ConvertiblePair(sourceType, requestedTargetType);
CacheValue readTargetTypeValue = customReadTargetTypes.get(lookupKey);
if (readTargetTypeValue != null) {
return readTargetTypeValue.getType();
}
readTargetTypeValue = CacheValue.of(getCustomTarget(sourceType, requestedTargetType, readingPairs));
CacheValue cacheValue = customReadTargetTypes.putIfAbsent(lookupKey, readTargetTypeValue);
return cacheValue != null ? cacheValue.getType() : readTargetTypeValue.getType();
}
private static <T> Class<?> getOrCreateAndCache(T key, Map<T, CacheValue> cache, Producer producer) {
CacheValue cacheValue = cache.get(key);
if (cacheValue != null) {
return cacheValue.getType();
}
Class<?> type = producer.get();
cache.put(key, CacheValue.of(type));
return type;
}
private interface Producer {
Class<?> get();
}
@WritingConverter

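Editor's note: the caching added in this file boils down to a get-or-compute idiom: look the key up, and if nothing is cached yet, produce the target type once via a callback and remember it, with a CacheValue wrapper so that "no custom target found" can be cached as well. The following is a minimal, self-contained sketch of that idiom only; the class and method names are illustrative and not the actual CustomConversions internals.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    class TargetTypeCache {

        // Wrapper so that "no custom target found" (null) can be cached as well.
        static final class CacheValue {
            private final Class<?> type;
            private CacheValue(Class<?> type) { this.type = type; }
            static CacheValue of(Class<?> type) { return new CacheValue(type); }
            Class<?> getType() { return type; }
        }

        interface Producer {
            Class<?> get();
        }

        private final Map<Object, CacheValue> cache = new ConcurrentHashMap<Object, CacheValue>();

        Class<?> getOrCreateAndCache(Object key, Producer producer) {
            CacheValue cached = cache.get(key);
            if (cached != null) {
                return cached.getType();
            }
            Class<?> type = producer.get();
            cache.put(key, CacheValue.of(type));
            return type;
        }
    }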
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,7 +34,7 @@ import com.mongodb.DBObject;
*/
class DBObjectAccessor {
private final DBObject dbObject;
private final BasicDBObject dbObject;
/**
* Creates a new {@link DBObjectAccessor} for the given {@link DBObject}.
@@ -46,7 +46,7 @@ class DBObjectAccessor {
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.isInstanceOf(BasicDBObject.class, dbObject, "Given DBObject must be a BasicDBObject!");
this.dbObject = dbObject;
this.dbObject = (BasicDBObject) dbObject;
}
/**
@@ -62,6 +62,11 @@ class DBObjectAccessor {
Assert.notNull(prop, "MongoPersistentProperty must not be null!");
String fieldName = prop.getFieldName();
if (!fieldName.contains(".")) {
dbObject.put(fieldName, value);
return;
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
DBObject dbObject = this.dbObject;
@@ -87,12 +92,16 @@ class DBObjectAccessor {
* @param property must not be {@literal null}.
* @return
*/
@SuppressWarnings("unchecked")
public Object get(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
if (!fieldName.contains(".")) {
return this.dbObject.get(fieldName);
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
Map<Object, Object> source = this.dbObject.toMap();
Map<String, Object> source = this.dbObject;
Object result = null;
while (source != null && parts.hasNext()) {
@@ -108,14 +117,14 @@ class DBObjectAccessor {
}
@SuppressWarnings("unchecked")
private Map<Object, Object> getAsMap(Object source) {
private Map<String, Object> getAsMap(Object source) {
if (source instanceof BasicDBObject) {
return ((DBObject) source).toMap();
return (BasicDBObject) source;
}
if (source instanceof Map) {
return (Map<Object, Object>) source;
return (Map<String, Object>) source;
}
return null;

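Editor's note: the accessor above resolves field names such as "address.city" by splitting on dots and descending one nested document per segment. A rough sketch of that traversal over plain java.util.Map values, just to illustrate the idea (this is not the actual DBObjectAccessor code):

    import java.util.Arrays;
    import java.util.Iterator;
    import java.util.Map;

    class DottedPathReader {

        // Walks a dot-separated path (e.g. "address.city") through nested Maps.
        @SuppressWarnings("unchecked")
        static Object get(Map<String, Object> source, String fieldName) {

            if (!fieldName.contains(".")) {
                return source.get(fieldName);
            }

            Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
            Object current = source;

            while (current instanceof Map && parts.hasNext()) {
                current = ((Map<String, Object>) current).get(parts.next());
            }

            // If segments remain but we ran out of nested documents, the value is absent.
            return parts.hasNext() ? null : current;
        }
    }

For a source of {"address": {"city": "Berlin"}} and the path "address.city" this yields "Berlin"; a missing intermediate segment yields null.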
View File

@@ -178,7 +178,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;
private final DbRefResolverCallback callback;
private final MongoPersistentProperty property;
@@ -192,6 +192,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("getTarget");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
FINALIZE_METHOD = Object.class.getDeclaredMethod("finalize");
} catch (Exception e) {
throw new RuntimeException(e);
}
@@ -255,6 +256,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
if (ReflectionUtils.isHashCodeMethod(method)) {
return proxyHashCode(proxy);
}
// DATAMONGO-1076 - finalize methods should not trigger proxy initialization
if (FINALIZE_METHOD.equals(method)) {
return null;
}
}
Object target = ensureResolved();

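Editor's note: the FINALIZE_METHOD guard above exists because the garbage collector may call finalize() on an unreferenced lazy-loading proxy, and resolving the DBRef target at that point would trigger an unwanted database round trip. The real interceptor is a cglib/aopalliance MethodInterceptor where finalize() really is routed through the proxy; the JDK-style invocation handler below is only a sketch of the guard itself.

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;

    class FinalizeSafeHandler implements InvocationHandler {

        private static final Method FINALIZE_METHOD;

        static {
            try {
                FINALIZE_METHOD = Object.class.getDeclaredMethod("finalize");
            } catch (NoSuchMethodException e) {
                throw new IllegalStateException(e);
            }
        }

        private final Object target;

        FinalizeSafeHandler(Object target) {
            this.target = target;
        }

        @Override
        public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {

            // Never trigger lazy resolution for finalize() calls issued by the GC.
            if (FINALIZE_METHOD.equals(method)) {
                return null;
            }

            return method.invoke(target, args);
        }
    }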
View File

@@ -135,8 +135,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param typeMapper the typeMapper to set
*/
public void setTypeMapper(MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
mappingContext) : typeMapper;
this.typeMapper = typeMapper == null
? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext) : typeMapper;
}
/*
@@ -237,7 +237,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, path.getCurrentObject());
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider, path);
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
path);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {
@@ -284,7 +285,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
final MongoPersistentProperty property = association.getInverse();
Object value = dbo.get(property.getName());
Object value = dbo.get(property.getFieldName());
if (value == null) {
return;
@@ -505,8 +506,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
: new BasicDBObject();
addCustomTypeKeyIfNecessary(type, obj, propDbObj);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass())
? mappingContext.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
writeInternal(obj, propDbObj, entity);
accessor.put(prop, propDbObj);
@@ -586,7 +587,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.isSimpleType(key.getClass())) {
String simpleKey = potentiallyEscapeMapKey(key.toString());
String simpleKey = prepareMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property) : null);
} else {
@@ -638,12 +639,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected DBObject writeMapInternal(Map<Object, Object> obj, DBObject dbo, TypeInformation<?> propertyType) {
for (Map.Entry<Object, Object> entry : obj.entrySet()) {
Object key = entry.getKey();
Object val = entry.getValue();
if (conversions.isSimpleType(key.getClass())) {
// Don't use conversion service here as removal of ObjectToString converter results in some primitive types not
// being convertable
String simpleKey = potentiallyEscapeMapKey(key.toString());
String simpleKey = prepareMapKey(key);
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection || val.getClass().isArray()) {
@@ -664,6 +666,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbo;
}
/**
* Prepares the given {@link Map} key to be converted into a {@link String}. Will invoke potentially registered custom
* conversions and escape dots from the result as they're not supported as {@link Map} key in MongoDB.
*
* @param key must not be {@literal null}.
* @return
*/
private String prepareMapKey(Object key) {
Assert.notNull(key, "Map key must not be null!");
String convertedKey = potentiallyConvertMapKey(key);
return potentiallyEscapeMapKey(convertedKey);
}
/**
* Potentially replaces dots in the given map key with the configured map key replacement if configured or aborts
* conversion if none is configured.
@@ -679,13 +696,31 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (mapKeyDotReplacement == null) {
throw new MappingException(String.format("Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!", source));
throw new MappingException(String.format(
"Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!",
source));
}
return source.replaceAll("\\.", mapKeyDotReplacement);
}
/**
* Returns a {@link String} representation of the given {@link Map} key
*
* @param key
* @return
*/
private String potentiallyConvertMapKey(Object key) {
if (key instanceof String) {
return (String) key;
}
return conversions.hasCustomWriteTarget(key.getClass(), String.class)
? (String) getPotentiallyConvertedSimpleWrite(key) : key.toString();
}
/**
* Translates the map key replacements in the given key just read with a dot in case a map key replacement has been
* configured.
@@ -766,7 +801,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) {
if (value == null || target == null) {
if (value == null || target == null || target.isAssignableFrom(value.getClass())) {
return value;
}
@@ -778,7 +813,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return Enum.valueOf((Class<Enum>) target, value.toString());
}
return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
return conversionService.convert(value, target);
}
protected DBRef createDBRef(Object target, MongoPersistentProperty property) {
@@ -852,16 +887,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, rawComponentType, sourceValue.size());
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
path));
items.add(
DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem), path));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, path));
} else {
@@ -944,7 +979,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;
TypeInformation<?> typeHint = typeInformation;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);
@@ -979,10 +1014,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.write(obj, newDbo);
if (typeInformation == null) {
return removeTypeInfoRecursively(newDbo);
return removeTypeInfo(newDbo, true);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
if (typeInformation.getType().equals(NestedDocument.class)) {
return removeTypeInfo(newDbo, false);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfo(newDbo, true);
}
public BasicDBList maybeConvertList(Iterable<?> source, TypeInformation<?> typeInformation) {
@@ -996,12 +1035,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
/**
* Removes the type information from the conversion result.
* Removes the type information from the entire conversion result.
*
* @param object
* @param recursively whether to apply the removal recursively
* @return
*/
private Object removeTypeInfoRecursively(Object object) {
private Object removeTypeInfo(Object object, boolean recursively) {
if (!(object instanceof DBObject)) {
return object;
@@ -1009,19 +1049,29 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element);
}
} else {
removeTypeInfoRecursively(value);
}
if (recursively) {
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfo(element, recursively);
}
} else {
removeTypeInfo(value, recursively);
}
}
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
if (!recursively) {
break;
}
}
}
@@ -1085,8 +1135,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @author Oliver Gierke
*/
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private class ConverterAwareSpELExpressionParameterValueProvider
extends SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final ObjectPath path;
@@ -1098,7 +1148,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, ObjectPath path) {
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate,
ObjectPath path) {
super(evaluator, conversionService, delegate);
this.path = path;
@@ -1157,4 +1208,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject readRef(DBRef ref) {
return ref.fetch();
}
/**
* Marker class used to indicate we have a non root document object here that might be used within an update - so we
* need to preserve type hints for potential nested elements but need to remove it on top level.
*
* @author Christoph Strobl
* @since 1.8
*/
static class NestedDocument {
}
}

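Editor's note: prepareMapKey(…) above combines two steps: convert the raw map key to a String (consulting a custom write converter when one is registered for String) and then escape dots, because dots are not allowed in MongoDB field names. A simplified sketch of that flow, with the conversion step reduced to String.valueOf(…) for brevity and a plain IllegalArgumentException in place of MappingException:

    class MapKeyPreparer {

        private final String mapKeyDotReplacement; // e.g. "~", or null when no replacement is configured

        MapKeyPreparer(String mapKeyDotReplacement) {
            this.mapKeyDotReplacement = mapKeyDotReplacement;
        }

        String prepareMapKey(Object key) {

            // The real code consults registered custom write converters before falling back to toString().
            String converted = String.valueOf(key);

            if (!converted.contains(".")) {
                return converted;
            }

            if (mapKeyDotReplacement == null) {
                throw new IllegalArgumentException(
                        "Map key '" + converted + "' contains dots but no replacement was configured!");
            }

            return converted.replaceAll("\\.", mapKeyDotReplacement);
        }
    }

With a replacement of "~", a key of "1.2.3" is stored as "1~2~3" and translated back on read by the matching translate-back step shown above.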
View File

@@ -34,10 +34,13 @@ import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter.NestedDocument;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
@@ -58,6 +61,7 @@ public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
@@ -250,8 +254,8 @@ public class QueryMapper {
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = keyword.getValue();
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property) : getMappedValue(
property.with(keyword.getKey()), value);
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property)
: getMappedValue(property.with(keyword.getKey()), value);
return new BasicDBObject(keyword.key, convertedValue);
}
@@ -334,7 +338,8 @@ public class QueryMapper {
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
return entity.hasIdProperty()
&& (type.equals(DBRef.class) || entity.getIdProperty().getActualType().isAssignableFrom(type));
}
/**
@@ -382,10 +387,16 @@ public class QueryMapper {
*/
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
if (property == null || source == null || source instanceof DBObject) {
return source;
}
if (source instanceof DBRef) {
DBRef ref = (DBRef) source;
return new DBRef(ref.getDB(), ref.getRef(), convertId(ref.getId()));
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
@@ -457,13 +468,20 @@ public class QueryMapper {
*/
public Object convertId(Object id) {
try {
return conversionService.convert(id, ObjectId.class);
} catch (ConversionException e) {
// Ignore
if (id == null) {
return null;
}
return delegateConvertToMongoType(id, null);
if (id instanceof String) {
return ObjectId.isValid(id.toString()) ? conversionService.convert(id, ObjectId.class) : id;
}
try {
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService.convert(id, ObjectId.class)
: delegateConvertToMongoType(id, null);
} catch (ConversionException o_O) {
return delegateConvertToMongoType(id, null);
}
}
/**
@@ -643,6 +661,10 @@ public class QueryMapper {
public Association<MongoPersistentProperty> getAssociation() {
return null;
}
public TypeInformation<?> getTypeHint() {
return ClassTypeInformation.OBJECT;
}
}
/**
@@ -785,7 +807,7 @@ public class QueryMapper {
*/
@Override
public String getMappedKey() {
return path == null ? name : path.toDotPath(getPropertyConverter());
return path == null ? name : path.toDotPath(isAssociation() ? getAssociationConverter() : getPropertyConverter());
}
protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
@@ -802,7 +824,7 @@ public class QueryMapper {
try {
PropertyPath path = PropertyPath.from(pathExpression, entity.getTypeInformation());
PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d", ""), entity.getTypeInformation());
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
@@ -837,5 +859,77 @@ public class QueryMapper {
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return PropertyToFieldNameConverter.INSTANCE;
}
/**
* Return the {@link Converter} to use for creating the mapped key of an association. Default implementation is
* {@link AssociationConverter}.
*
* @return
* @since 1.7
*/
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new AssociationConverter(getAssociation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTypeHint()
*/
@Override
public TypeInformation<?> getTypeHint() {
MongoPersistentProperty property = getProperty();
if (property == null) {
return super.getTypeHint();
}
if (property.getActualType().isInterface()
|| java.lang.reflect.Modifier.isAbstract(property.getActualType().getModifiers())) {
return ClassTypeInformation.OBJECT;
}
return NESTED_DOCUMENT;
}
}
/**
* Converter to skip all properties after an association property was rendered.
*
* @author Oliver Gierke
*/
protected static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
}

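Editor's note: the reworked convertId(…) only turns a String id into an ObjectId when the value is actually a valid ObjectId, so custom String ids (for example "user-42") survive query mapping untouched. A trimmed-down sketch without the ConversionService fallback:

    import org.bson.types.ObjectId;

    class IdConverter {

        // Valid 24-character hex ObjectId strings become ObjectIds; everything else is returned as-is.
        static Object convertId(Object id) {

            if (id == null) {
                return null;
            }

            if (id instanceof String) {
                String value = (String) id;
                return ObjectId.isValid(value) ? new ObjectId(value) : value;
            }

            // The real implementation additionally consults the ConversionService for other types.
            return id;
        }
    }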
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -65,8 +66,8 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
entity.getTypeInformation());
return converter.convertToMongoType(source,
entity == null ? ClassTypeInformation.OBJECT : getTypeHintForEntity(source, entity));
}
/*
@@ -97,14 +98,14 @@ public class UpdateMapper extends QueryMapper {
if (rawValue instanceof Modifier) {
value = getMappedValue((Modifier) rawValue);
value = getMappedValue(field, (Modifier) rawValue);
} else if (rawValue instanceof Modifiers) {
DBObject modificationOperations = new BasicDBObject();
for (Modifier modifier : ((Modifiers) rawValue).getModifiers()) {
modificationOperations.putAll(getMappedValue(modifier).toMap());
modificationOperations.putAll(getMappedValue(field, modifier).toMap());
}
value = modificationOperations;
@@ -132,12 +133,31 @@ public class UpdateMapper extends QueryMapper {
return value instanceof Query;
}
private DBObject getMappedValue(Modifier modifier) {
private DBObject getMappedValue(Field field, Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
TypeInformation<?> typeHint = field == null ? ClassTypeInformation.OBJECT : field.getTypeHint();
Object value = converter.convertToMongoType(modifier.getValue(), typeHint);
return new BasicDBObject(modifier.getKey(), value);
}
private TypeInformation<?> getTypeHintForEntity(Object source, MongoPersistentEntity<?> entity) {
return processTypeHintForNestedDocuments(source, entity.getTypeInformation());
}
private TypeInformation<?> processTypeHintForNestedDocuments(Object source, TypeInformation<?> info) {
Class<?> type = info.getActualType().getType();
if (type.isInterface() || java.lang.reflect.Modifier.isAbstract(type.getModifiers())) {
return info;
}
if (!type.equals(source.getClass())) {
return info;
}
return NESTED_DOCUMENT;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper#createPropertyField(org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.String, org.springframework.data.mapping.context.MappingContext)
@@ -146,8 +166,8 @@ public class UpdateMapper extends QueryMapper {
protected Field createPropertyField(MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
return entity == null ? super.createPropertyField(entity, key, mappingContext) : //
new MetadataBackedUpdateField(entity, key, mappingContext);
return entity == null ? super.createPropertyField(entity, key, mappingContext)
: new MetadataBackedUpdateField(entity, key, mappingContext);
}
/**
@@ -194,47 +214,76 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
return new UpdatePropertyConverter(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getAssociationConverter()
*/
@Override
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new UpdateAssociationConverter(getAssociation(), key);
}
/**
* Converter to skip all properties after an association property was rendered.
* Special mapper handling positional parameter {@literal $} within property names.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.7
*/
private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
private static class UpdateKeyMapper {
private final Iterator<String> iterator;
protected UpdateKeyMapper(String rawKey) {
Assert.hasText(rawKey, "Key must not be null or empty!");
this.iterator = Arrays.asList(rawKey.split("\\.")).iterator();
this.iterator.next();
}
/**
* Maps the property name while retaining potential positional operator {@literal $}.
*
* @param property
* @return
*/
protected String mapPropertyName(MongoPersistentProperty property) {
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
boolean inspect = iterator.hasNext();
while (inspect) {
String partial = iterator.next();
boolean isPositional = isPositionalParameter(partial);
if (isPositional) {
mappedName += "." + partial;
}
inspect = isPositional && iterator.hasNext();
}
return mappedName;
}
boolean isPositionalParameter(String partial) {
if (partial.equals("$")) {
return true;
}
try {
Long.valueOf(partial);
return true;
} catch (NumberFormatException e) {
return false;
}
}
}
/**
@@ -242,10 +291,11 @@ public class UpdateMapper extends QueryMapper {
* contained in the source update key.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
private final Iterator<String> iterator;
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link UpdatePropertyConverter} with the given update key.
@@ -256,8 +306,7 @@ public class UpdateMapper extends QueryMapper {
Assert.hasText(updateKey, "Update key must not be null or empty!");
this.iterator = Arrays.asList(updateKey.split("\\.")).iterator();
this.iterator.next();
this.mapper = new UpdateKeyMapper(updateKey);
}
/*
@@ -266,9 +315,37 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
public String convert(MongoPersistentProperty property) {
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
return mapper.mapPropertyName(property);
}
}
/**
* {@link Converter} retaining positional parameter {@literal $} for {@link Association}s.
*
* @author Christoph Strobl
*/
protected static class UpdateAssociationConverter extends AssociationConverter {
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public UpdateAssociationConverter(Association<MongoPersistentProperty> association, String key) {
super(association);
this.mapper = new UpdateKeyMapper(key);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
return super.convert(source) == null ? null : mapper.mapPropertyName(source);
}
}
}

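Editor's note: the UpdateKeyMapper introduced above keeps positional segments, i.e. the $ operator or a numeric index, attached to the mapped field name so that updates such as "values.$.name" or "values.0.name" still address the right array element after property-to-field mapping. Below is a simplified, String-only rendering of the same idea; the real class operates on MongoPersistentProperty instances.

    import java.util.Arrays;
    import java.util.Iterator;

    class PositionalKeyMapper {

        private final Iterator<String> iterator;

        PositionalKeyMapper(String rawKey) {
            this.iterator = Arrays.asList(rawKey.split("\\.")).iterator();
            this.iterator.next(); // the first segment is mapped separately
        }

        // Appends positional segments ("$" or a numeric index) that directly follow the mapped field name.
        String mapPropertyName(String mappedFieldName) {

            String mappedName = mappedFieldName;
            boolean inspect = iterator.hasNext();

            while (inspect) {
                String partial = iterator.next();
                boolean positional = isPositional(partial);
                if (positional) {
                    mappedName += "." + partial;
                }
                inspect = positional && iterator.hasNext();
            }

            return mappedName;
        }

        private static boolean isPositional(String partial) {
            if ("$".equals(partial)) {
                return true;
            }
            try {
                Long.valueOf(partial);
                return true;
            } catch (NumberFormatException e) {
                return false;
            }
        }
    }

For example, new PositionalKeyMapper("values.$.name").mapPropertyName("values") yields "values.$".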
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -60,4 +60,10 @@ public class MongoMappingEventPublisher implements ApplicationEventPublisher {
indexCreator.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>) event);
}
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationEventPublisher#publishEvent(java.lang.Object)
*/
public void publishEvent(Object event) {}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,6 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexRes
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
/**
@@ -43,8 +42,7 @@ import org.springframework.util.Assert;
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
public class MongoPersistentEntityIndexCreator implements ApplicationListener<MappingContextEvent<?, ?>> {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoPersistentEntityIndexCreator.class);
@@ -54,7 +52,7 @@ public class MongoPersistentEntityIndexCreator implements
private final IndexResolver indexResolver;
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -65,7 +63,7 @@ public class MongoPersistentEntityIndexCreator implements
}
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -92,7 +90,7 @@ public class MongoPersistentEntityIndexCreator implements
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event) {
public void onApplicationEvent(MappingContextEvent<?, ?> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
@@ -102,7 +100,7 @@ public class MongoPersistentEntityIndexCreator implements
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes(event.getPersistentEntity());
checkForIndexes((MongoPersistentEntity<?>) entity);
}
}
@@ -132,8 +130,8 @@ public class MongoPersistentEntityIndexCreator implements
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection()).createIndex(indexDefinition.getIndexKeys(),
indexDefinition.getIndexOptions());
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -117,7 +117,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexInformation.add(indexDefinitionHolder);
}
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
}
});
@@ -155,7 +155,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(), propertyDotPath,
collection, guard));
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
}
@@ -205,7 +205,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation("", indexDefinitionBuilder, root,
new TextIndexIncludeOptions(IncludeStrategy.DEFAULT), new CycleGuard());
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
TextIndexDefinition indexDefinition = indexDefinitionBuilder.build();
@@ -220,7 +220,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
private void appendTextIndexInformation(final String dotPath,
final TextIndexDefinitionBuilder indexDefinitionBuilder, MongoPersistentEntity<?> entity,
final TextIndexDefinitionBuilder indexDefinitionBuilder, final MongoPersistentEntity<?> entity,
final TextIndexIncludeOptions includeOptions, final CycleGuard guard) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@@ -230,7 +230,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
guard.protect(persistentProperty, dotPath);
if (persistentProperty.isLanguageProperty()) {
if (persistentProperty.isExplicitLanguageProperty() && !StringUtils.hasText(dotPath)) {
indexDefinitionBuilder.withLanguageOverride(persistentProperty.getFieldName());
}
@@ -256,7 +256,11 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation(propertyDotPath, indexDefinitionBuilder,
mappingContext.getPersistentEntity(persistentProperty.getActualType()), optionsForNestedType, guard);
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage(), e);
LOGGER.info(e.getMessage(), e);
} catch (InvalidDataAccessApiUsageException e) {
LOGGER.info(
String.format("Potentially invalid index structure discovered. Breaking operation for %s.",
entity.getName()), e);
}
} else if (includeOptions.isForce() || indexed != null) {
indexDefinitionBuilder.onField(propertyDotPath, weight);
@@ -462,8 +466,9 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
for (Path existingPath : paths) {
if (existingPath.cycles(property, path)) {
if (existingPath.cycles(property, path) && property.isEntity()) {
paths.add(new Path(property, path));
throw new CyclicPropertyReferenceException(property.getFieldName(), property.getOwner().getType(),
existingPath.getPath());
}
@@ -536,7 +541,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return false;
}
return path.contains(this.path);
return path.equals(this.path) || path.contains(this.path + ".") || path.contains("." + this.path);
}
}
}

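Editor's note: the tightened cycles(…) check above matches the existing path as a dot-separated segment rather than as a raw substring, so a property named "foobar" no longer registers as a cycle through "foo". A string-only sketch of the predicate with the two interesting cases:

    class PathCycleCheck {

        // Segment-aware containment check, replacing the old plain substring test.
        static boolean cycles(String existing, String candidate) {
            return candidate.equals(existing)
                    || candidate.contains(existing + ".")
                    || candidate.contains("." + existing);
        }

        public static void main(String[] args) {
            System.out.println(cycles("foo", "foo.bar"));    // true  - "foo" really is revisited
            System.out.println(cycles("foo", "foobar.baz")); // false - "foobar" is a different property
        }
    }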
View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.mapping;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
@@ -35,6 +36,7 @@ import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
@@ -53,32 +55,37 @@ import org.springframework.util.StringUtils;
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @DocumentField annotation!";
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private static final SpelExpressionParser PARSER = new SpelExpressionParser();
private final String collection;
private final String language;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
private final Expression expression;
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
* collection name to the entities simple type name.
*
* @param typeInformation
* @param typeInformation must not be {@literal null}.
*/
public BasicMongoPersistentEntity(TypeInformation<T> typeInformation) {
super(typeInformation, MongoPersistentPropertyComparator.INSTANCE);
this.parser = new SpelExpressionParser();
this.context = new StandardEvaluationContext();
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
if (rawType.isAnnotationPresent(Document.class)) {
Document d = rawType.getAnnotation(Document.class);
this.collection = StringUtils.hasText(d.collection()) ? d.collection() : fallback;
this.language = StringUtils.hasText(d.language()) ? d.language() : "";
Document document = rawType.getAnnotation(Document.class);
this.expression = detectExpression(document);
this.context = new StandardEvaluationContext();
if (document != null) {
this.collection = StringUtils.hasText(document.collection()) ? document.collection() : fallback;
this.language = StringUtils.hasText(document.language()) ? document.language() : "";
} else {
this.collection = fallback;
this.language = "";
@@ -101,8 +108,7 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
Expression expression = parser.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
return expression.getValue(context, String.class);
return expression == null ? collection : expression.getValue(context, String.class);
}
/*
@@ -236,6 +242,31 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return null;
}
/**
* Returns a SpEL {@link Expression} for the collection String expressed in the given {@link Document} annotation if
* present or {@literal null} otherwise. Will also return {@literal null} if the collection {@link String} evaluates
* to a {@link LiteralExpression} (indicating that no subsequent evaluation is necessary).
*
* @param document can be {@literal null}
* @return
*/
private static Expression detectExpression(Document document) {
if (document == null) {
return null;
}
String collection = document.collection();
if (!StringUtils.hasText(collection)) {
return null;
}
Expression expression = PARSER.parseExpression(document.collection(), ParserContext.TEMPLATE_EXPRESSION);
return expression instanceof LiteralExpression ? null : expression;
}
/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.
@@ -275,28 +306,44 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
*/
private static class PropertyTypeAssertionHandler implements PropertyHandler<MongoPersistentProperty> {
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.PropertyHandler#doWithPersistentProperty(org.springframework.data.mapping.PersistentProperty)
*/
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
potentiallyAssertTextScoreType(persistentProperty);
potentiallyAssertLanguageType(persistentProperty);
potentiallyAssertDBRefTargetType(persistentProperty);
}
private void potentiallyAssertLanguageType(MongoPersistentProperty persistentProperty) {
private static void potentiallyAssertLanguageType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isLanguageProperty()) {
if (persistentProperty.isExplicitLanguageProperty()) {
assertPropertyType(persistentProperty, String.class);
}
}
private void potentiallyAssertTextScoreType(MongoPersistentProperty persistentProperty) {
private static void potentiallyAssertTextScoreType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isTextScoreProperty()) {
assertPropertyType(persistentProperty, Float.class, Double.class);
}
}
private void assertPropertyType(MongoPersistentProperty persistentProperty, Class<?>... validMatches) {
private static void potentiallyAssertDBRefTargetType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isDbReference() && persistentProperty.getDBRef().lazy()) {
if (persistentProperty.isArray() || Modifier.isFinal(persistentProperty.getActualType().getModifiers())) {
throw new MappingException(String.format(
"Invalid lazy DBRef property for %s. Found %s which must not be an array nor a final class.",
persistentProperty.getField(), persistentProperty.getActualType()));
}
}
}
private static void assertPropertyType(MongoPersistentProperty persistentProperty, Class<?>... validMatches) {
for (Class<?> potentialMatch : validMatches) {
if (ClassUtils.isAssignable(potentialMatch, persistentProperty.getActualType())) {
@@ -304,10 +351,9 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
throw new MappingException(String.format("Missmatching types for %s. Found %s expected one of %s.",
persistentProperty.getField(), persistentProperty.getActualType(),
StringUtils.arrayToCommaDelimitedString(validMatches)));
throw new MappingException(
String.format("Missmatching types for %s. Found %s expected one of %s.", persistentProperty.getField(),
persistentProperty.getActualType(), StringUtils.arrayToCommaDelimitedString(validMatches)));
}
}
}

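Editor's note: detectExpression(…) above parses the collection attribute as a SpEL template and keeps the parsed Expression only when evaluation is actually needed; a plain literal comes back as a LiteralExpression and is dropped, so getCollection() can return the static name without re-evaluating on every call. A standalone sketch using the same Spring Expression classes; the "#{tenantId + '_users'}" sample value is made up for illustration.

    import org.springframework.expression.Expression;
    import org.springframework.expression.ParserContext;
    import org.springframework.expression.common.LiteralExpression;
    import org.springframework.expression.spel.standard.SpelExpressionParser;

    class CollectionExpressionDetector {

        private static final SpelExpressionParser PARSER = new SpelExpressionParser();

        // Returns a SpEL expression for the collection name, or null when the value is a plain literal.
        static Expression detectExpression(String collection) {

            if (collection == null || collection.trim().isEmpty()) {
                return null;
            }

            Expression expression = PARSER.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
            return expression instanceof LiteralExpression ? null : expression;
        }

        public static void main(String[] args) {
            System.out.println(detectExpression("users") != null);                  // false - static name
            System.out.println(detectExpression("#{tenantId + '_users'}") != null); // true  - evaluated per lookup
        }
    }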
View File

@@ -190,9 +190,18 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
*/
@Override
public boolean isLanguageProperty() {
return getFieldName().equals(LANGUAGE_FIELD_NAME) || isAnnotationPresent(Language.class);
return getFieldName().equals(LANGUAGE_FIELD_NAME) || isExplicitLanguageProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isExplicitLanguageProperty()
*/
@Override
public boolean isExplicitLanguageProperty() {
return isAnnotationPresent(Language.class);
};
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isTextScoreProperty()

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,8 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
private Boolean isIdProperty;
private Boolean isAssociation;
private String fieldName;
private Boolean usePropertyAccess;
private Boolean isTransient;
/**
* Creates a new {@link CachingMongoPersistentProperty}.
@@ -85,4 +87,32 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
return this.fieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AnnotationBasedPersistentProperty#usePropertyAccess()
*/
@Override
public boolean usePropertyAccess() {
if (this.usePropertyAccess == null) {
this.usePropertyAccess = super.usePropertyAccess();
}
return this.usePropertyAccess;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AnnotationBasedPersistentProperty#isTransient()
*/
@Override
public boolean isTransient() {
if (this.isTransient == null) {
this.isTransient = super.isTransient();
}
return this.isTransient;
}
}

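Editor's note: CachingMongoPersistentProperty memoizes the results of super calls in nullable Boolean fields because the underlying annotation lookups are comparatively expensive and the answers never change for a given property. The pattern in isolation, with the superclass lookup replaced by a placeholder method:

    class MemoizedFlag {

        private Boolean usePropertyAccess; // computed lazily, then reused

        boolean usePropertyAccess() {
            if (this.usePropertyAccess == null) {
                // Worst case under concurrent access the value is computed twice, which is harmless here.
                this.usePropertyAccess = computeUsePropertyAccess();
            }
            return this.usePropertyAccess;
        }

        // Stand-in for the annotation-based lookup performed by the superclass.
        private boolean computeUsePropertyAccess() {
            return false;
        }
    }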
View File

@@ -45,7 +45,7 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
int getFieldOrder();
/**
* Returns whether the propert is a {@link com.mongodb.DBRef}. If this returns {@literal true} you can expect
* Returns whether the property is a {@link com.mongodb.DBRef}. If this returns {@literal true} you can expect
* {@link #getDBRef()} to return an non-{@literal null} value.
*
* @return
@@ -61,14 +61,22 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
boolean isExplicitIdProperty();
/**
* Returns whether the property indicates the documents language either by having a {@link #getFieldName()} equal to
* {@literal language} or being annotated with {@link Language}.
* Returns true if the property indicates the documents language either by having a {@link #getFieldName()} equal
* to {@literal language} or being annotated with {@link Language}.
*
* @return
* @since 1.6
*/
boolean isLanguageProperty();
/**
* Returns true when the property is annotated with {@link Language}.
*
* @return
* @since 1.6.1
*/
boolean isExplicitLanguageProperty();
/**
* Returns whether the property holds the documents score calculated by text search. <br/>
* It's marked with {@link TextScore}.

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
@@ -25,6 +27,7 @@ import com.mongodb.util.JSON;
* @author Thomas Risberg
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class BasicQuery extends Query {
@@ -97,4 +100,42 @@ public class BasicQuery extends Query {
protected void setFieldsObject(DBObject fieldsObject) {
this.fieldsObject = fieldsObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#equals(java.lang.Object)
*/
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (!(o instanceof BasicQuery)) {
return false;
}
BasicQuery that = (BasicQuery) o;
return querySettingsEquals(that) && //
nullSafeEquals(fieldsObject, that.fieldsObject) && //
nullSafeEquals(queryObject, that.queryObject) && //
nullSafeEquals(sortObject, that.sortObject);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#hashCode()
*/
@Override
public int hashCode() {
int result = super.hashCode();
result = 31 * result + nullSafeHashCode(queryObject);
result = 31 * result + nullSafeHashCode(fieldsObject);
result = 31 * result + nullSafeHashCode(sortObject);
return result;
}
}

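Editor's note: the new equals(…)/hashCode() pair relies on ObjectUtils.nullSafeEquals and nullSafeHashCode so that null queryObject, fieldsObject or sortObject values compare cleanly. The same pattern on a small stand-in class; the field names here are illustrative only.

    import static org.springframework.util.ObjectUtils.nullSafeEquals;
    import static org.springframework.util.ObjectUtils.nullSafeHashCode;

    class ValueHolder {

        private final Object query;
        private final Object fields;

        ValueHolder(Object query, Object fields) {
            this.query = query;
            this.fields = fields;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) {
                return true;
            }
            if (!(o instanceof ValueHolder)) {
                return false;
            }
            ValueHolder that = (ValueHolder) o;
            return nullSafeEquals(query, that.query) && nullSafeEquals(fields, that.fields);
        }

        @Override
        public int hashCode() {
            int result = 17;
            result = 31 * result + nullSafeHashCode(query);
            result = 31 * result + nullSafeHashCode(fields);
            return result;
        }
    }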
View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.Collections;
import com.mongodb.BasicDBObject;
@@ -87,12 +88,8 @@ public class BasicUpdate extends Update {
@Override
public Update pullAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
DBObject keyValue = new BasicDBObject();
keyValue.put(key, convertedValues);
keyValue.put(key, Arrays.copyOf(values, values.length));
updateObject.put("$pullAll", keyValue);
return this;
}

View File

@@ -31,7 +31,9 @@ import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -515,8 +517,11 @@ public class Criteria implements CriteriaDefinition {
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
public DBObject getCriteriaObject() {
if (this.criteriaChain.size() == 1) {
return criteriaChain.get(0).getSingleCriteriaObject();
} else if (CollectionUtils.isEmpty(this.criteriaChain) && !CollectionUtils.isEmpty(this.criteria)) {
return getSingleCriteriaObject();
} else {
DBObject criteriaObject = new BasicDBObject();
for (Criteria c : this.criteriaChain) {
@@ -550,6 +555,13 @@ public class Criteria implements CriteriaDefinition {
}
}
if (!StringUtils.hasText(this.key)) {
if (not) {
return new BasicDBObject("$not", dbo);
}
return dbo;
}
DBObject queryCriteria = new BasicDBObject();
if (!NOT_SET.equals(isValue)) {

View File

@@ -176,7 +176,7 @@ public class Query {
for (Order order : sort) {
if (order.isIgnoreCase()) {
throw new IllegalArgumentException(String.format("Gven sort contained an Order for %s with ignore case! "
throw new IllegalArgumentException(String.format("Given sort contained an Order for %s with ignore case! "
+ "MongoDB does not support sorting ignoreing case currently!", order.getProperty()));
}
}
@@ -385,12 +385,21 @@ public class Query {
return false;
}
Query that = (Query) obj;
return querySettingsEquals((Query) obj);
}
/**
* Tests whether the settings of the given {@link Query} are equal to this query.
*
* @param that
* @return
*/
protected boolean querySettingsEquals(Query that) {
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean fieldsEqual = this.fieldSpec == null ? that.fieldSpec == null : this.fieldSpec.equals(that.fieldSpec);
boolean sortEqual = this.sort == null ? that.sort == null : this.sort.equals(that.sort);
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean fieldsEqual = nullSafeEquals(this.fieldSpec, that.fieldSpec);
boolean sortEqual = nullSafeEquals(this.sort, that.sort);
boolean hintEqual = nullSafeEquals(this.hint, that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
boolean metaEqual = nullSafeEquals(this.meta, that.meta);

View File

@@ -63,7 +63,7 @@ public class TextCriteria implements CriteriaDefinition {
}
/**
* For a full list of supported languages see the mongdodb reference manual for <a
* For a full list of supported languages see the mongodb reference manual for <a
* href="http://docs.mongodb.org/manual/reference/text-search-languages/">Text Search Languages</a>.
*
* @param language

View File

@@ -64,7 +64,7 @@ public class Update {
}
/**
* Creates an {@link Update} instance from the given {@link DBObject}. Allows to explicitly exlude fields from making
* Creates an {@link Update} instance from the given {@link DBObject}. Allows to explicitly exclude fields from making
* it into the created {@link Update} object. Note, that this will set attributes directly and <em>not</em> use
* {@literal $set}. This means fields not given in the {@link DBObject} will be nulled when executing the update. To
* create an only-updating {@link Update} instance of a {@link DBObject}, call {@link #set(String, Object)} for each
@@ -189,12 +189,7 @@ public class Update {
* @return
*/
public Update pushAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
addMultiFieldOperation("$pushAll", key, convertedValues);
addMultiFieldOperation("$pushAll", key, Arrays.copyOf(values, values.length));
return this;
}
@@ -258,12 +253,7 @@ public class Update {
* @return
*/
public Update pullAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
addFieldOperation("$pullAll", key, convertedValues);
addFieldOperation("$pullAll", key, Arrays.copyOf(values, values.length));
return this;
}
@@ -495,13 +485,7 @@ public class Update {
return ((Collection<?>) values[0]).toArray();
}
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
return convertedValues;
return Arrays.copyOf(values, values.length);
}
/*

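Editor's note: the pushAll/pullAll cleanups above replace hand-rolled copy loops with Arrays.copyOf, which produces the same defensive copy in one call. A tiny demonstration:

    import java.util.Arrays;

    class DefensiveCopy {

        public static void main(String[] args) {

            Object[] values = { "a", "b", "c" };

            // Equivalent to the manual element-by-element copy that was removed above.
            Object[] copy = Arrays.copyOf(values, values.length);

            values[0] = "changed";
            System.out.println(Arrays.toString(copy)); // [a, b, c] - the copy is unaffected
        }
    }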
View File

@@ -27,6 +27,7 @@ import org.springframework.data.repository.PagingAndSortingRepository;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
@NoRepositoryBean
public interface MongoRepository<T, ID extends Serializable> extends PagingAndSortingRepository<T, ID> {
@@ -48,4 +49,26 @@ public interface MongoRepository<T, ID extends Serializable> extends PagingAndSo
* @see org.springframework.data.repository.PagingAndSortingRepository#findAll(org.springframework.data.domain.Sort)
*/
List<T> findAll(Sort sort);
/**
* Inserts the given entity. Assumes the instance to be new to be able to apply insertion optimizations. Use
* the returned instance for further operations as the save operation might have changed the entity instance
* completely. Prefer using {@link #save(Object)} instead to avoid the usage of store-specific API.
*
* @param entity must not be {@literal null}.
* @return the saved entity
* @since 1.7
*/
<S extends T> S insert(S entity);
/**
* Inserts the given entities. Assumes the given entities to have not been persisted yet and thus will optimize the
* insert over a call to {@link #save(Iterable)}. Prefer using {@link #save(Iterable)} to avoid the usage of
* store-specific API.
*
* @param entities must not be {@literal null}.
* @return the saved entities
* @since 1.7
*/
<S extends T> List<S> insert(Iterable<S> entities);
}
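A hedged usage sketch of the two insert(…) methods added above; PersonRepository and Person are hypothetical types introduced purely for illustration:

import java.util.Arrays;
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;

class Person {

	@Id String id;
	String name;

	Person(String name) {
		this.name = name;
	}
}

interface PersonRepository extends MongoRepository<Person, String> {}

class MongoRepositoryInsertSketch {

	// the repository proxy would normally be created by the Spring Data infrastructure
	void insertNewPeople(PersonRepository repository) {

		// skips the is-new detection applied by save(...) and issues a plain insert
		Person dave = repository.insert(new Person("Dave"));

		// batch variant; all entities are assumed to be new
		List<Person> saved = repository.insert(Arrays.asList(new Person("Carter"), new Person("Oliver")));
	}
}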


@@ -38,7 +38,7 @@ import org.springframework.data.annotation.QueryAnnotation;
public @interface Query {
/**
* Takes a MongoDB JSON string to define the actual query to be executed. This one will take precendece over the
* Takes a MongoDB JSON string to define the actual query to be executed. This one will take precedence over the
* method name then.
*
* @return


@@ -21,7 +21,6 @@ import java.util.List;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.SliceImpl;
@@ -88,39 +87,25 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
applyQueryMetaAttributesWhenPresent(query);
Object result = null;
if (isDeleteQuery()) {
result = new DeleteExecution().execute(query);
return new DeleteExecution().execute(query);
} else if (method.isGeoNearQuery() && method.isPageQuery()) {
MongoParameterAccessor countAccessor = new MongoParametersParameterAccessor(method, parameters);
Query countQuery = createCountQuery(new ConvertingParameterAccessor(operations.getConverter(), countAccessor));
result = new GeoNearExecution(accessor).execute(query, countQuery);
return new GeoNearExecution(accessor).execute(query, countQuery);
} else if (method.isGeoNearQuery()) {
result = new GeoNearExecution(accessor).execute(query);
return new GeoNearExecution(accessor).execute(query);
} else if (method.isSliceQuery()) {
result = new SlicedExecution(accessor.getPageable()).execute(query);
return new SlicedExecution(accessor.getPageable()).execute(query);
} else if (method.isCollectionQuery()) {
result = new CollectionExecution(accessor.getPageable()).execute(query);
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
result = new PagedExecution(accessor.getPageable()).execute(query);
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
result = new SingleEntityExecution(isCountQuery()).execute(query);
return new SingleEntityExecution(isCountQuery()).execute(query);
}
if (result == null) {
return result;
}
Class<?> expectedReturnType = method.getReturnType().getType();
if (expectedReturnType.isAssignableFrom(result.getClass())) {
return result;
}
return CONVERSION_SERVICE.convert(result, expectedReturnType);
}
private Query applyQueryMetaAttributesWhenPresent(Query query) {
@@ -211,6 +196,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* {@link Execution} for {@link Slice} query methods.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.5
*/
@@ -232,9 +218,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
int pageSize = pageable.getPageSize();
Pageable slicePageable = new PageRequest(pageable.getPageNumber(), pageSize + 1, pageable.getSort());
List result = operations.find(query.with(slicePageable), metadata.getJavaType(), metadata.getCollectionName());
// Apply Pageable but tweak limit to peek into next page
Query modifiedQuery = query.with(pageable).limit(pageSize + 1);
List result = operations.find(modifiedQuery, metadata.getJavaType(), metadata.getCollectionName());
boolean hasNext = result.size() > pageSize;
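The sliced execution shown here relies on a common pattern: fetch pageSize + 1 elements and use the surplus element only to decide whether another slice exists. A standalone sketch of the idea (all names illustrative):

import java.util.ArrayList;
import java.util.List;

public class SlicePeekDemo {

	public static void main(String[] args) {

		int pageSize = 3;

		// stands in for the query limited to pageSize + 1 documents
		List<String> fetched = fetchAtMost(pageSize + 1);

		boolean hasNext = fetched.size() > pageSize;

		// only the first pageSize elements belong to the current slice
		List<String> content = hasNext ? fetched.subList(0, pageSize) : fetched;

		System.out.println("content=" + content + ", hasNext=" + hasNext);
	}

	private static List<String> fetchAtMost(int limit) {

		List<String> all = new ArrayList<String>();
		for (int i = 0; i < 5; i++) {
			all.add("doc-" + i);
		}
		return all.subList(0, Math.min(limit, all.size()));
	}
}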
@@ -271,9 +259,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
Class<?> type = metadata.getJavaType();
int overallLimit = query.getLimit();
long count = operations.count(query, metadata.getCollectionName());
long count = operations.count(query, type, collectionName);
count = overallLimit != 0 ? Math.min(count, query.getLimit()) : count;
boolean pageableOutOfScope = pageable.getOffset() > count;
@@ -290,7 +280,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
query.limit(overallLimit - pageable.getOffset());
}
List<?> result = operations.find(query, metadata.getJavaType(), metadata.getCollectionName());
List<?> result = operations.find(query, type, collectionName);
return new PageImpl(result, pageable, count);
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,13 +20,16 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.util.Arrays;
import java.util.Collection;
import java.util.Iterator;
import java.util.regex.Pattern;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -40,16 +43,19 @@ import org.springframework.data.repository.query.parser.Part.IgnoreCaseType;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Custom query creator to create Mongo criterias.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private static final Logger LOG = LoggerFactory.getLogger(MongoQueryCreator.class);
private static final Pattern PUNCTATION_PATTERN = Pattern.compile("\\p{Punct}");
private final MongoParameterAccessor accessor;
private final boolean isGeoNearQuery;
@@ -102,9 +108,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
MongoPersistentProperty property = path.getLeafProperty();
Criteria criteria = from(part, property,
where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
Criteria criteria = from(part, property, where(path.toDotPath()), (PotentiallyConvertingIterator) iterator);
return criteria;
}
@@ -123,9 +127,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
MongoPersistentProperty property = path.getLeafProperty();
return from(part, property,
base.and(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return from(part, property, base.and(path.toDotPath()), (PotentiallyConvertingIterator) iterator);
}
/*
@@ -194,7 +196,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case STARTING_WITH:
case ENDING_WITH:
case CONTAINING:
return addAppropriateLikeRegexTo(criteria, part, parameters.next().toString());
return createContainingCriteria(part, property, criteria, parameters);
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
@@ -212,7 +214,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
if (distance == null) {
return criteria.near(point);
} else {
if (distance.getMetric() != null) {
if (!Metrics.NEUTRAL.equals(distance.getMetric())) {
criteria.nearSphere(point);
} else {
criteria.near(point);
@@ -265,19 +267,23 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private Criteria createLikeRegexCriteriaOrThrow(Part part, MongoPersistentProperty property, Criteria criteria,
PotentiallyConvertingIterator parameters, boolean shouldNegateExpression) {
PropertyPath path = part.getProperty().getLeafProperty();
switch (part.shouldIgnoreCase()) {
case ALWAYS:
if (part.getProperty().getType() != String.class) {
throw new IllegalArgumentException(String.format("part %s must be of type String but was %s",
part.getProperty(), part.getType()));
if (path.getType() != String.class) {
throw new IllegalArgumentException(
String.format("Part %s must be of type String but was %s", path, path.getType()));
}
// fall-through
case WHEN_POSSIBLE:
if (shouldNegateExpression) {
criteria = criteria.not();
}
return addAppropriateLikeRegexTo(criteria, part, parameters.nextConverted(property).toString());
case NEVER:
@@ -288,6 +294,27 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
Arrays.asList(IgnoreCaseType.ALWAYS, IgnoreCaseType.WHEN_POSSIBLE), part.shouldIgnoreCase()));
}
/**
* If the target property of the comparison is of type String, then the operator checks for a match using a regular
* expression. If the target property of the comparison is a {@link Collection}, then the operator evaluates to true if
* it finds an exact match within any member of the {@link Collection}.
*
* @param part
* @param property
* @param criteria
* @param parameters
* @return
*/
private Criteria createContainingCriteria(Part part, MongoPersistentProperty property, Criteria criteria,
PotentiallyConvertingIterator parameters) {
if (property.isCollectionLike()) {
return criteria.in(nextAsArray(parameters, property));
}
return addAppropriateLikeRegexTo(criteria, part, parameters.next().toString());
}
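A hedged sketch of what that distinction means for derived query methods; the domain type and repository below are hypothetical and merely contrast a String property with a collection-like one:

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;

class BlogPost {

	@Id String id;
	String title;      // simple String property
	List<String> tags; // collection-like property
}

interface BlogPostRepository extends MongoRepository<BlogPost, String> {

	// String property: Containing is turned into a like-style regex on 'title'
	List<BlogPost> findByTitleContaining(String fragment);

	// collection property: Containing becomes an exact-match $in against 'tags'
	List<BlogPost> findByTagsContaining(String tag);
}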
/**
* Creates an appropriate like-regex and appends it to the given criteria.
*
@@ -333,8 +360,8 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return (T) parameter;
}
throw new IllegalArgumentException(String.format("Expected parameter type of %s but got %s!", type,
parameter.getClass()));
throw new IllegalArgumentException(
String.format("Expected parameter type of %s but got %s!", type, parameter.getClass()));
}
private Object[] nextAsArray(PotentiallyConvertingIterator iterator, MongoPersistentProperty property) {
@@ -352,23 +379,57 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private String toLikeRegex(String source, Part part) {
Type type = part.getType();
String regex = prepareAndEscapeStringBeforeApplyingLikeRegex(source, part);
switch (type) {
case STARTING_WITH:
source = "^" + source;
regex = "^" + regex;
break;
case ENDING_WITH:
source = source + "$";
regex = regex + "$";
break;
case CONTAINING:
source = "*" + source + "*";
regex = ".*" + regex + ".*";
break;
case SIMPLE_PROPERTY:
case NEGATING_SIMPLE_PROPERTY:
source = "^" + source + "$";
regex = "^" + regex + "$";
default:
}
return source.replaceAll("\\*", ".*");
return regex;
}
private String prepareAndEscapeStringBeforeApplyingLikeRegex(String source, Part qpart) {
if (!ObjectUtils.nullSafeEquals(Type.LIKE, qpart.getType())) {
return PUNCTATION_PATTERN.matcher(source).find() ? Pattern.quote(source) : source;
}
if (source.equals("*")) {
return ".*";
}
StringBuilder sb = new StringBuilder();
boolean leadingWildcard = source.startsWith("*");
boolean trailingWildcard = source.endsWith("*");
String valueToUse = source.substring(leadingWildcard ? 1 : 0,
trailingWildcard ? source.length() - 1 : source.length());
if (PUNCTATION_PATTERN.matcher(valueToUse).find()) {
valueToUse = Pattern.quote(valueToUse);
}
if (leadingWildcard) {
sb.append(".*");
}
sb.append(valueToUse);
if (trailingWildcard) {
sb.append(".*");
}
return sb.toString();
}
}
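To make the escaping rules concrete, the following standalone sketch mirrors the wildcard and quoting behaviour of the new helper (it re-implements the idea for illustration and is not the production class):

import java.util.regex.Pattern;

public class LikeRegexEscapingDemo {

	private static final Pattern PUNCTUATION = Pattern.compile("\\p{Punct}");

	public static void main(String[] args) {

		System.out.println(escapeLike("*firstname*")); // .*firstname.*
		System.out.println(escapeLike("*a.b*"));       // .*\Qa.b\E.* - punctuation gets quoted
		System.out.println(escapeLike("*"));           // .*
	}

	// mirrors prepareAndEscapeStringBeforeApplyingLikeRegex(...) for LIKE-style input
	static String escapeLike(String source) {

		if (source.equals("*")) {
			return ".*";
		}

		boolean leading = source.startsWith("*");
		boolean trailing = source.endsWith("*");

		String value = source.substring(leading ? 1 : 0, trailing ? source.length() - 1 : source.length());

		if (PUNCTUATION.matcher(value).find()) {
			value = Pattern.quote(value);
		}

		return (leading ? ".*" : "") + value + (trailing ? ".*" : "");
	}
}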


@@ -128,8 +128,7 @@ public class MongoQueryMethod extends QueryMethod {
MongoPersistentEntity<?> collectionEntity = domainClass.isAssignableFrom(returnedObjectType) ? returnedEntity
: managedEntity;
this.metadata = new SimpleMongoEntityMetadata<Object>((Class<Object>) returnedEntity.getType(),
collectionEntity.getCollection());
this.metadata = new SimpleMongoEntityMetadata<Object>((Class<Object>) returnedEntity.getType(), collectionEntity);
}
return this.metadata;


@@ -1,5 +1,5 @@
/*
* Copyright 2002-2014 the original author or authors.
* Copyright 2002-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -97,8 +97,8 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
return result;
} catch (JSONParseException o_O) {
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod(),
o_O));
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod()),
o_O);
}
}


@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
/**
@@ -25,21 +26,22 @@ import org.springframework.util.Assert;
class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
private final Class<T> type;
private final String collectionName;
private final MongoPersistentEntity<?> collectionEntity;
/**
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and collection name.
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and {@link MongoPersistentEntity} to use for
* collection lookups.
*
* @param type must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @param collectionEntity must not be {@literal null} or empty.
*/
public SimpleMongoEntityMetadata(Class<T> type, String collectionName) {
public SimpleMongoEntityMetadata(Class<T> type, MongoPersistentEntity<?> collectionEntity) {
Assert.notNull(type, "Type must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(collectionEntity, "Collection entity must not be null or empty!");
this.type = type;
this.collectionName = collectionName;
this.collectionEntity = collectionEntity;
}
/*
@@ -55,6 +57,6 @@ class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
* @see org.springframework.data.mongodb.repository.query.MongoEntityMetadata#getCollectionName()
*/
public String getCollectionName() {
return collectionName;
return collectionEntity.getCollection();
}
}


@@ -29,6 +29,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import com.mongodb.util.JSON;
/**
@@ -199,6 +200,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* {@link Collections#emptyList()}.
*
* @param input
* @param conversionService must not be {@literal null}.
* @return
*/
public List<ParameterBinding> parseParameterBindingsFrom(String input) {
@@ -229,14 +231,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
if (value instanceof String) {
String string = ((String) value).trim();
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(string);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
boolean quoted = (string.startsWith("'") && string.endsWith("'"))
|| (string.startsWith("\"") && string.endsWith("\""));
bindings.add(new ParameterBinding(paramIndex, quoted));
}
potentiallyAddBinding(string, bindings);
} else if (value instanceof Pattern) {
@@ -255,15 +250,37 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
bindings.add(new ParameterBinding(paramIndex, quoted));
}
} else if (value instanceof DBRef) {
DBRef dbref = (DBRef) value;
potentiallyAddBinding(dbref.getRef(), bindings);
potentiallyAddBinding(dbref.getId().toString(), bindings);
} else if (value instanceof DBObject) {
DBObject dbo = (DBObject) value;
for (String field : dbo.keySet()) {
collectParameterReferencesIntoBindings(bindings, field);
collectParameterReferencesIntoBindings(bindings, dbo.get(field));
}
}
}
private void potentiallyAddBinding(String source, List<ParameterBinding> bindings) {
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(source);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
boolean quoted = (source.startsWith("'") && source.endsWith("'"))
|| (source.startsWith("\"") && source.endsWith("\""));
bindings.add(new ParameterBinding(paramIndex, quoted));
}
}
}
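A hedged usage sketch of how the parameter bindings above typically appear in string-based queries; the repository and domain type are hypothetical, and the comments describe the quoting distinction only at a high level:

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

class Customer {

	@Id String id;
	String lastname;
	int age;
}

interface CustomerRepository extends MongoRepository<Customer, String> {

	// unquoted placeholder: the bound value is inserted as a plain JSON value
	@Query("{ 'age' : ?0 }")
	List<Customer> findByAge(int age);

	// quoted placeholder: the binding is detected as quoted and spliced into the string literal
	@Query("{ 'lastname' : '?0' }")
	List<Customer> findByLastname(String lastname);
}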
/**


@@ -35,7 +35,7 @@ import com.mysema.query.apt.Configuration;
import com.mysema.query.apt.DefaultConfiguration;
/**
* Annotation processor to create Querydsl query types for QueryDsl annoated classes.
* Annotation processor to create Querydsl query types for QueryDsl annotated classes.
*
* @author Oliver Gierke
*/


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,8 +27,10 @@ import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.querydsl.EntityPathResolver;
import org.springframework.data.querydsl.QSort;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityInformation;
import org.springframework.data.repository.core.EntityMetadata;
import org.springframework.util.Assert;
@@ -43,18 +45,21 @@ import com.mysema.query.types.path.PathBuilder;
* Special QueryDsl based repository implementation that allows execution {@link Predicate}s in various forms.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleMongoRepository<T, ID> implements
QueryDslPredicateExecutor<T> {
private final PathBuilder<T> builder;
private final EntityInformation<T, ID> entityInformation;
private final MongoOperations mongoOperations;
/**
* Creates a new {@link QueryDslMongoRepository} for the given {@link EntityMetadata} and {@link MongoTemplate}. Uses
* the {@link SimpleEntityPathResolver} to create an {@link EntityPath} for the given domain class.
*
* @param entityInformation
* @param template
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
@@ -64,17 +69,21 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* Creates a new {@link QueryDslMongoRepository} for the given {@link MongoEntityInformation}, {@link MongoTemplate}
* and {@link EntityPathResolver}.
*
* @param entityInformation
* @param mongoOperations
* @param resolver
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations,
EntityPathResolver resolver) {
super(entityInformation, mongoOperations);
Assert.notNull(resolver);
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.entityInformation = entityInformation;
this.mongoOperations = mongoOperations;
}
/*
@@ -98,7 +107,6 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, com.mysema.query.types.OrderSpecifier<?>[])
*/
public List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
return createQueryFor(predicate).orderBy(orders).list();
}
@@ -114,6 +122,28 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
return new PageImpl<T>(applyPagination(query, pageable).list(), pageable, countQuery.count());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.SimpleMongoRepository#findAll(org.springframework.data.domain.Pageable)
*/
@Override
public Page<T> findAll(Pageable pageable) {
MongodbQuery<T> countQuery = createQuery();
MongodbQuery<T> query = createQuery();
return new PageImpl<T>(applyPagination(query, pageable).list(), pageable, countQuery.count());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.SimpleMongoRepository#findAll(org.springframework.data.domain.Sort)
*/
@Override
public List<T> findAll(Sort sort) {
return applySorting(createQuery(), sort).list();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#count(com.mysema.query.types.Predicate)
@@ -129,11 +159,16 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @return
*/
private MongodbQuery<T> createQueryFor(Predicate predicate) {
return createQuery().where(predicate);
}
Class<T> domainType = getEntityInformation().getJavaType();
MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), domainType);
return query.where(predicate);
/**
* Creates a {@link MongodbQuery}.
*
* @return
*/
private MongodbQuery<T> createQuery() {
return new SpringDataMongodbQuery<T>(mongoOperations, entityInformation.getJavaType());
}
/**
@@ -166,6 +201,15 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
return query;
}
// TODO: find better solution than instanceof check
if (sort instanceof QSort) {
List<OrderSpecifier<?>> orderSpecifiers = ((QSort) sort).getOrderSpecifiers();
query.orderBy(orderSpecifiers.toArray(new OrderSpecifier<?>[orderSpecifiers.size()]));
return query;
}
for (Order order : sort) {
query.orderBy(toOrder(order));
}
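A hedged usage sketch of the newly overridden findAll(Pageable) and findAll(Sort) methods; Employee and the injected repository instance are assumptions for illustration, and the QSort branch is only described in a comment because it requires Querydsl-generated query types not shown here:

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository;

class Employee {

	@Id String id;
	String firstname;
	String lastname;
}

class QueryDslRepositorySortingSketch {

	// the repository instance would normally come from the Spring Data infrastructure
	void fetchSorted(QueryDslMongoRepository<Employee, String> repository) {

		// plain Sort orders are translated property by property via toOrder(...)
		Iterable<Employee> sorted = repository.findAll(new Sort(Sort.Direction.ASC, "lastname"));

		// paging runs through the Querydsl query plus a separate count query
		Page<Employee> page = repository.findAll(new PageRequest(0, 20, new Sort("firstname")));

		// passing a QSort instead would reuse its OrderSpecifiers directly (see the instanceof check above)
	}
}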


@@ -19,6 +19,7 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
@@ -41,6 +42,7 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class SimpleMongoRepository<T, ID extends Serializable> implements MongoRepository<T, ID> {
@@ -48,7 +50,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
private final MongoEntityInformation<T, ID> entityInformation;
/**
* Creates a ew {@link SimpleMongoRepository} for the given {@link MongoEntityInformation} and {@link MongoTemplate}.
* Creates a new {@link SimpleMongoRepository} for the given {@link MongoEntityInformation} and {@link MongoTemplate}.
*
* @param metadata must not be {@literal null}.
* @param template must not be {@literal null}.
@@ -70,7 +72,12 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
Assert.notNull(entity, "Entity must not be null!");
mongoOperations.save(entity, entityInformation.getCollectionName());
if (entityInformation.isNew(entity)) {
mongoOperations.insert(entity, entityInformation.getCollectionName());
} else {
mongoOperations.save(entity, entityInformation.getCollectionName());
}
return entity;
}
@@ -82,11 +89,22 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
Assert.notNull(entities, "The given Iterable of entities not be null!");
List<S> result = new ArrayList<S>();
List<S> result = convertIterableToList(entities);
boolean allNew = true;
for (S entity : entities) {
save(entity);
result.add(entity);
if (allNew && !entityInformation.isNew(entity)) {
allNew = false;
}
}
if (allNew) {
mongoOperations.insertAll(result);
} else {
for (S entity : result) {
save(entity);
}
}
return result;
@@ -181,7 +199,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
*/
public Iterable<T> findAll(Iterable<ID> ids) {
Set<ID> parameters = new HashSet<ID>();
Set<ID> parameters = new HashSet<ID>(tryDetermineRealSizeOrReturn(ids, 10));
for (ID id : ids) {
parameters.add(id);
}
@@ -209,6 +227,38 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
return findAll(new Query().with(sort));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#insert(java.lang.Object)
*/
@Override
public <S extends T> S insert(S entity) {
Assert.notNull(entity, "Entity must not be null!");
mongoOperations.insert(entity, entityInformation.getCollectionName());
return entity;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#insert(java.lang.Iterable)
*/
@Override
public <S extends T> List<S> insert(Iterable<S> entities) {
Assert.notNull(entities, "The given Iterable of entities not be null!");
List<S> list = convertIterableToList(entities);
if (list.isEmpty()) {
return list;
}
mongoOperations.insertAll(list);
return list;
}
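A hedged sketch of the resulting save/insert behaviour from a caller's perspective; Contact and the repository instance are illustrative assumptions:

import java.util.Arrays;
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.support.SimpleMongoRepository;

class Contact {

	@Id String id;
	String name;
}

class SimpleMongoRepositoryUsageSketch {

	// the repository would normally be created by the Spring Data infrastructure
	void persist(SimpleMongoRepository<Contact, String> repository, Contact fresh, Contact existing) {

		// save(...) now branches on isNew(...): new entities are inserted, existing ones are saved
		repository.save(fresh);
		repository.save(existing);

		// insert(...) skips the check and always issues an insert
		repository.insert(fresh);

		// the batch variant delegates to insertAll(...) for non-empty input
		List<Contact> inserted = repository.insert(Arrays.asList(new Contact(), new Contact()));
	}
}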
private List<T> findAll(Query query) {
if (query == null) {
@@ -218,20 +268,27 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/**
* Returns the underlying {@link MongoOperations} instance.
*
* @return
*/
protected MongoOperations getMongoOperations() {
return this.mongoOperations;
private static <T> List<T> convertIterableToList(Iterable<T> entities) {
if (entities instanceof List) {
return (List<T>) entities;
}
int capacity = tryDetermineRealSizeOrReturn(entities, 10);
if (capacity == 0 || entities == null) {
return Collections.<T> emptyList();
}
List<T> list = new ArrayList<T>(capacity);
for (T entity : entities) {
list.add(entity);
}
return list;
}
/**
* @return the entityInformation
*/
protected MongoEntityInformation<T, ID> getEntityInformation() {
return entityInformation;
private static int tryDetermineRealSizeOrReturn(Iterable<?> iterable, int defaultSize) {
return iterable == null ? 0 : (iterable instanceof Collection) ? ((Collection<?>) iterable).size() : defaultSize;
}
}


@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.repository.support;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.regex.Pattern;
import org.springframework.data.mapping.context.MappingContext;
@@ -41,7 +44,17 @@ import com.mysema.query.types.PathType;
*/
class SpringDataMongodbSerializer extends MongodbSerializer {
private final String ID_KEY = "_id";
private static final String ID_KEY = "_id";
private static final Set<PathType> PATH_TYPES;
static {
Set<PathType> pathTypes = new HashSet<PathType>();
pathTypes.add(PathType.VARIABLE);
pathTypes.add(PathType.PROPERTY);
PATH_TYPES = Collections.unmodifiableSet(pathTypes);
}
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -138,7 +151,7 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
Path<?> parent = path.getMetadata().getParent();
if (parent == null) {
if (parent == null || !PATH_TYPES.contains(path.getMetadata().getPathType())) {
return null;
}


@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,7 @@ import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
@@ -49,11 +50,17 @@ public class ServerAddressPropertyEditorUnitTests {
/**
* @see DATAMONGO-454
* @see DATAMONGO-1062
*/
@Test(expected = IllegalArgumentException.class)
public void rejectsAddressConfigWithoutASingleParsableServerAddress() {
public void rejectsAddressConfigWithoutASingleParsableAndResolvableServerAddress() {
editor.setAsText("foo, bar");
String unknownHost1 = "gugu.nonexistant.example.org";
String unknownHost2 = "gaga.nonexistant.example.org";
assertUnresolveableHostnames(unknownHost1, unknownHost2);
editor.setAsText(unknownHost1 + "," + unknownHost2);
}
/**
@@ -193,4 +200,16 @@ public class ServerAddressPropertyEditorUnitTests {
assertThat(addresses, hasItem(new ServerAddress(InetAddress.getByName(hostAddress), port)));
}
}
private void assertUnresolveableHostnames(String... hostnames) {
for (String hostname : hostnames) {
try {
InetAddress.getByName(hostname);
Assert.fail("Supposedly unresolveable hostname '" + hostname + "' can be resolved.");
} catch (UnknownHostException expected) {
// ok
}
}
}
}


@@ -74,6 +74,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
@@ -185,9 +186,14 @@ public class MongoTemplateTests {
template.dropCollection(DocumentWithCollection.class);
template.dropCollection(DocumentWithCollectionOfSimpleType.class);
template.dropCollection(DocumentWithMultipleCollections.class);
template.dropCollection(DocumentWithNestedCollection.class);
template.dropCollection(DocumentWithEmbeddedDocumentWithCollection.class);
template.dropCollection(DocumentWithNestedList.class);
template.dropCollection(DocumentWithDBRefCollection.class);
template.dropCollection(SomeContent.class);
template.dropCollection(SomeTemplate.class);
template.dropCollection(Address.class);
template.dropCollection(DocumentWithCollectionOfSamples.class);
}
@Test
@@ -2221,6 +2227,243 @@ public class MongoTemplateTests {
assertThat(retrieved.model.value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnDocumentWithNestedCollectionWhenWholeCollectionIsReplaced() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
entry.put("key2", new ModelA("value2"));
Query query = query(where("id").is(doc.id));
Update update = Update.update("models", Collections.singletonList(entry));
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(2));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnDocumentWithNestedCollectionWhenFirstElementIsReplaced() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
entry.put("key2", new ModelA("value2"));
Query query = query(where("id").is(doc.id));
Update update = Update.update("models.0", entry);
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(2));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationOnDocumentWithNestedCollectionObjectInsertedAtSecondIndex() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("models.1", Collections.singletonMap("key2", new ModelA("value2")));
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(1));
assertThat(retrieved.models.get(1).entrySet(), hasSize(1));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(1).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(1).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenUpdatingPositionedElement()
throws Exception {
List<Model> models = new ArrayList<Model>();
models.add(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument.models.0", new ModelA("value2"));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(1));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenUpdatingSecondElement()
throws Exception {
List<Model> models = new ArrayList<Model>();
models.add(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument.models.1", new ModelA("value2"));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(2));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value1"));
assertThat(retrieved.embeddedDocument.models.get(1).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenRewriting()
throws Exception {
List<Model> models = Arrays.<Model> asList(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument",
new DocumentWithCollection(Arrays.<Model> asList(new ModelA("value2"))));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(1));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnDocumentWithNestedLists() {
DocumentWithNestedList doc = new DocumentWithNestedList();
List<Model> entry = new ArrayList<Model>();
entry.add(new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
Query query = query(where("id").is(doc.id));
assertThat(template.findOne(query, DocumentWithNestedList.class), notNullValue());
Update update = Update.update("models.0.1", new ModelA("value2"));
template.findAndModify(query, update, DocumentWithNestedList.class);
DocumentWithNestedList retrieved = template.findOne(query, DocumentWithNestedList.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0), hasSize(2));
assertThat(retrieved.models.get(0).get(0), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get(0).value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get(1), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get(1).value(), equalTo("value2"));
}
/**
* @see DATAMONGO-407
*/
@@ -2626,6 +2869,33 @@ public class MongoTemplateTests {
assertThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(3));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyAddToSetWithEachShouldNotAddDuplicatesNorTypeHintForSimpleDocuments() {
DocumentWithCollectionOfSamples doc = new DocumentWithCollectionOfSamples();
doc.samples = Arrays.asList(new Sample(null, "sample1"));
template.save(doc);
Query query = query(where("id").is(doc.id));
assertThat(template.findOne(query, DocumentWithCollectionOfSamples.class), notNullValue());
Update update = new Update().addToSet("samples").each(new Sample(null, "sample2"), new Sample(null, "sample1"));
template.findAndModify(query, update, DocumentWithCollectionOfSamples.class);
DocumentWithCollectionOfSamples retrieved = template.findOne(query, DocumentWithCollectionOfSamples.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.samples, hasSize(2));
assertThat(retrieved.samples.get(0).field, is("sample1"));
assertThat(retrieved.samples.get(1).field, is("sample2"));
}
/**
* @see DATAMONGO-888
*/
@@ -2740,6 +3010,23 @@ public class MongoTemplateTests {
assertThat(template.findAll(DBObject.class, "collection"), hasSize(0));
}
/**
* @see DATAMONGO-1207
*/
@Test
public void ignoresNullElementsForInsertAll() {
Address newYork = new Address("NY", "New York");
Address washington = new Address("DC", "Washington");
template.insertAll(Arrays.asList(newYork, null, washington));
List<Address> result = template.findAll(Address.class);
assertThat(result, hasSize(2));
assertThat(result, hasItems(newYork, washington));
}
static class DoucmentWithNamedIdField {
@Id String someIdKey;
@@ -2791,6 +3078,7 @@ public class MongoTemplateTests {
@Id public String id;
@Field("db_ref_list")/** @see DATAMONGO-1058 */
@org.springframework.data.mongodb.core.mapping.DBRef//
public List<Sample> dbRefAnnotatedList;
@@ -2814,12 +3102,36 @@ public class MongoTemplateTests {
List<String> values;
}
static class DocumentWithCollectionOfSamples {
@Id String id;
List<Sample> samples;
}
static class DocumentWithMultipleCollections {
@Id String id;
List<String> string1;
List<String> string2;
}
static class DocumentWithNestedCollection {
@Id String id;
List<Map<String, Model>> models = new ArrayList<Map<String, Model>>();
}
static class DocumentWithNestedList {
@Id String id;
List<List<Model>> models = new ArrayList<List<Model>>();
}
static class DocumentWithEmbeddedDocumentWithCollection {
@Id String id;
DocumentWithCollection embeddedDocument;
DocumentWithEmbeddedDocumentWithCollection(DocumentWithCollection embeddedDocument) {
this.embeddedDocument = embeddedDocument;
}
}
static interface Model {
String value();
@@ -2925,6 +3237,41 @@ public class MongoTemplateTests {
String state;
String city;
Address() {}
Address(String state, String city) {
this.state = state;
this.city = city;
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof Address)) {
return false;
}
Address that = (Address) obj;
return ObjectUtils.nullSafeEquals(this.city, that.city) && //
ObjectUtils.nullSafeEquals(this.state, that.state);
}
@Override
public int hashCode() {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(this.city);
result += 31 * ObjectUtils.nullSafeHashCode(this.state);
return result;
}
}
static class VersionedPerson {


@@ -43,7 +43,9 @@ import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@@ -52,18 +54,21 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCre
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
/**
* Unit tests for {@link MongoTemplate}.
@@ -353,6 +358,74 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertThat(captor.getValue(), equalTo(new BasicDBObjectBuilder().add("foo", 1).get()));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void aggregateShouldHonorReadPreferenceWhenSet() {
CommandResult result = mock(CommandResult.class);
when(result.get("result")).thenReturn(Collections.emptySet());
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class))).thenReturn(result);
when(db.command(Mockito.any(DBObject.class))).thenReturn(result);
template.setReadPreference(ReadPreference.secondary());
template.aggregate(Aggregation.newAggregation(Aggregation.unwind("foo")), "collection-1", Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class), eq(ReadPreference.secondary()));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void aggregateShouldIgnoreReadPreferenceWhenNotSet() {
CommandResult result = mock(CommandResult.class);
when(result.get("result")).thenReturn(Collections.emptySet());
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class))).thenReturn(result);
when(db.command(Mockito.any(DBObject.class))).thenReturn(result);
template.aggregate(Aggregation.newAggregation(Aggregation.unwind("foo")), "collection-1", Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void geoNearShouldHonorReadPreferenceWhenSet() {
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class)))
.thenReturn(mock(CommandResult.class));
when(db.command(Mockito.any(DBObject.class))).thenReturn(mock(CommandResult.class));
template.setReadPreference(ReadPreference.secondary());
NearQuery query = NearQuery.near(new Point(1, 1));
template.geoNear(query, Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class), eq(ReadPreference.secondary()));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void geoNearShouldIgnoreReadPreferenceWhenNotSet() {
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class)))
.thenReturn(mock(CommandResult.class));
when(db.command(Mockito.any(DBObject.class))).thenReturn(mock(CommandResult.class));
NearQuery query = NearQuery.near(new Point(1, 1));
template.geoNear(query, Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class));
}
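A hedged usage sketch of the behaviour these tests verify: once a read preference is configured on the template, aggregate(…) and geoNear(…) forward it to the underlying command call. Place is a hypothetical mapped type used only for illustration:

import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.NearQuery;

import com.mongodb.ReadPreference;

class Place {

	String id;
	String name;
	double[] location;
}

class ReadPreferenceUsageSketch {

	void readFromSecondaries(MongoTemplate template) {

		// the preference set here is passed along with the aggregate and geoNear commands
		template.setReadPreference(ReadPreference.secondary());

		template.aggregate(Aggregation.newAggregation(Aggregation.unwind("foo")), "collection-1", Place.class);
		template.geoNear(NearQuery.near(new Point(1, 1), Metrics.KILOMETERS), Place.class);
	}
}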
class AutogenerateableId {
@Id BigInteger id;


@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -47,11 +47,14 @@ import org.springframework.core.io.ClassPathResource;
import org.springframework.dao.DataAccessException;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.geo.Metrics;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.Venue;
import org.springframework.data.mongodb.core.aggregation.AggregationTests.CarDescriptor.Entry;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.index.GeospatialIndex;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.data.util.Version;
@@ -120,6 +123,8 @@ public class AggregationTests {
mongoTemplate.dropCollection(User.class);
mongoTemplate.dropCollection(Person.class);
mongoTemplate.dropCollection(Reservation.class);
mongoTemplate.dropCollection(Venue.class);
mongoTemplate.dropCollection(MeterData.class);
}
/**
@@ -1018,6 +1023,53 @@ public class AggregationTests {
assertThat(dbo.get("dayOfYearPlus1DayManually"), is((Object) dateTime.plusDays(1).getDayOfYear()));
}
/**
* @see DATAMONGO-1127
*/
@Test
public void shouldSupportGeoNearQueriesForAggregationWithDistanceField() {
mongoTemplate.insert(new Venue("Penn Station", -73.99408, 40.75057));
mongoTemplate.insert(new Venue("10gen Office", -73.99171, 40.738868));
mongoTemplate.insert(new Venue("Flatiron Building", -73.988135, 40.741404));
mongoTemplate.indexOps(Venue.class).ensureIndex(new GeospatialIndex("location"));
NearQuery geoNear = NearQuery.near(-73, 40, Metrics.KILOMETERS).num(10).maxDistance(150);
Aggregation agg = newAggregation(Aggregation.geoNear(geoNear, "distance"));
AggregationResults<DBObject> result = mongoTemplate.aggregate(agg, Venue.class, DBObject.class);
assertThat(result.getMappedResults(), hasSize(3));
DBObject firstResult = result.getMappedResults().get(0);
assertThat(firstResult.containsField("distance"), is(true));
assertThat((Double) firstResult.get("distance"), closeTo(117.620092203928, 0.00001));
}
/**
* @see DATAMONGO-1133
*/
@Test
public void shouldHonorFieldAliasesForFieldReferences() {
mongoTemplate.insert(new MeterData("m1", "counter1", 42));
mongoTemplate.insert(new MeterData("m1", "counter1", 13));
mongoTemplate.insert(new MeterData("m1", "counter1", 45));
TypedAggregation<MeterData> agg = newAggregation(MeterData.class, //
match(where("resourceId").is("m1")), //
group("counterName").sum("counterVolume").as("totalValue"));
AggregationResults<DBObject> results = mongoTemplate.aggregate(agg, DBObject.class);
assertThat(results.getMappedResults(), hasSize(1));
DBObject result = results.getMappedResults().get(0);
assertThat(result.get("_id"), is(equalTo((Object) "counter1")));
assertThat(result.get("totalValue"), is(equalTo((Object) 100.0)));
}
private void assertLikeStats(LikeStats like, String id, long count) {
assertThat(like, is(notNullValue()));


@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,23 +22,30 @@ import org.junit.Test;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.query.NearQuery;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link GeoNearOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class GeoNearOperationUnitTests {
/**
* @see DATAMONGO-1127
*/
@Test
public void rendersNearQueryAsAggregationOperation() {
NearQuery query = NearQuery.near(10.0, 10.0);
GeoNearOperation operation = new GeoNearOperation(query);
GeoNearOperation operation = new GeoNearOperation(query, "distance");
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject nearClause = DBObjectTestUtils.getAsDBObject(dbObject, "$geoNear");
assertThat(nearClause, is(query.toDBObject()));
DBObject expected = (DBObject) new BasicDBObject(query.toDBObject().toMap()).append("distanceField", "distance");
assertThat(nearClause, is(expected));
}
}


@@ -0,0 +1,37 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Field;
/**
* @author Thomas Darimont
*/
public class MeterData {
@Id String id;
String resourceId;
@Field("counter_name") String counterName;
double counterVolume;
public MeterData(String resourceId, String counterName, double counterVolume) {
this.resourceId = resourceId;
this.counterName = counterName;
this.counterVolume = counterVolume;
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -169,6 +169,24 @@ public class TypeBasedAggregationOperationContextUnitTests {
assertThat(dbo.get("cursor"), is((Object) new BasicDBObject("foo", 1)));
}
/**
* @see DATAMONGO-1133
*/
@Test
public void shouldHonorAliasedFieldsInGroupExpressions() {
TypeBasedAggregationOperationContext context = getContext(MeterData.class);
TypedAggregation<MeterData> agg = newAggregation(MeterData.class,
group("counterName").sum("counterVolume").as("totalCounterVolume"));
DBObject dbo = agg.toDbObject("meterData", context);
DBObject group = getPipelineElementFromAggregationAt(dbo, 0);
DBObject definition = (DBObject) group.get("$group");
assertThat(definition.get("_id"), is(equalTo((Object) "$counter_name")));
}
@Document(collection = "person")
public static class FooPerson {


@@ -541,6 +541,26 @@ public class DbRefMappingMongoConverterUnitTests {
assertProxyIsResolved(proxy, false);
}
/**
* @see DATAMONGO-1076
*/
@Test
public void shouldNotTriggerResolvingOfLazyLoadedProxyWhenFinalizeMethodIsInvoked() throws Exception {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(WithObjectMethodOverrideLazyDbRefs.class);
MongoPersistentProperty property = entity.getPersistentProperty("dbRefToConcreteTypeWithPropertyAccess");
String idValue = new ObjectId().toString();
DBRef dbRef = converter.toDBRef(new LazyDbRefTargetPropertyAccess(idValue), property);
WithObjectMethodOverrideLazyDbRefs result = converter.read(WithObjectMethodOverrideLazyDbRefs.class,
new BasicDBObject("dbRefToPlainObject", dbRef));
ReflectionTestUtils.invokeMethod(result.dbRefToPlainObject, "finalize");
assertProxyIsResolved(result.dbRefToPlainObject, false);
}
private Object transport(Object result) {
return SerializationUtils.deserialize(SerializationUtils.serialize(result));
}


@@ -50,14 +50,18 @@ import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.ConversionNotSupportedException;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.geo.Box;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
@@ -70,6 +74,7 @@ import org.springframework.data.mapping.model.MappingInstantiationException;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.convert.DBObjectAccessorUnitTests.NestedType;
import org.springframework.data.mongodb.core.convert.DBObjectAccessorUnitTests.ProjectingType;
import org.springframework.data.mongodb.core.convert.MappingMongoConverterUnitTests.ClassWithMapUsingEnumAsKey.FooBarEnum;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
@@ -1853,6 +1858,84 @@ public class MappingMongoConverterUnitTests {
converter.read(Item.class, source);
}
/**
* @see DATAMONGO-1058
*/
@Test
public void readShouldRespectExplicitFieldNameForDbRef() {
BasicDBObject source = new BasicDBObject();
source.append("explict-name-for-db-ref", new DBRef(mock(DB.class), "foo", "1"));
converter.read(ClassWithExplicitlyNamedDBRefProperty.class, source);
verify(resolver, times(1)).resolveDbRef(Mockito.any(MongoPersistentProperty.class), Mockito.any(DBRef.class),
Mockito.any(DbRefResolverCallback.class), Mockito.any(DbRefProxyHandler.class));
}
/**
* @see DATAMONGO-1118
*/
@Test
@SuppressWarnings("unchecked")
public void convertsMapKeyUsingCustomConverterForAndBackwards() {
MappingMongoConverter converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(new CustomConversions(Arrays.asList(new FooBarEnumToStringConverter(),
new StringToFooNumConverter())));
converter.afterPropertiesSet();
ClassWithMapUsingEnumAsKey source = new ClassWithMapUsingEnumAsKey();
source.map = new HashMap<FooBarEnum, String>();
source.map.put(FooBarEnum.FOO, "wohoo");
DBObject target = new BasicDBObject();
converter.write(source, target);
assertThat(converter.read(ClassWithMapUsingEnumAsKey.class, target).map, is(source.map));
}
/**
* @see DATAMONGO-1118
*/
@Test
public void writesMapKeyUsingCustomConverter() {
MappingMongoConverter converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(new CustomConversions(Arrays.asList(new FooBarEnumToStringConverter())));
converter.afterPropertiesSet();
ClassWithMapUsingEnumAsKey source = new ClassWithMapUsingEnumAsKey();
source.map = new HashMap<FooBarEnum, String>();
source.map.put(FooBarEnum.FOO, "spring");
source.map.put(FooBarEnum.BAR, "data");
DBObject target = new BasicDBObject();
converter.write(source, target);
DBObject map = DBObjectTestUtils.getAsDBObject(target, "map");
assertThat(map.containsField("foo-enum-value"), is(true));
assertThat(map.containsField("bar-enum-value"), is(true));
}
/**
* @see DATAMONGO-1118
*/
@Test
public void readsMapKeyUsingCustomConverter() {
MappingMongoConverter converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(new CustomConversions(Arrays.asList(new StringToFooNumConverter())));
converter.afterPropertiesSet();
DBObject source = new BasicDBObject("map", new BasicDBObject("foo-enum-value", "spring"));
ClassWithMapUsingEnumAsKey target = converter.read(ClassWithMapUsingEnumAsKey.class, source);
assertThat(target.map.get(FooBarEnum.FOO), is("spring"));
}
static class GenericType<T> {
T content;
}
@@ -2102,4 +2185,60 @@ public class MappingMongoConverterUnitTests {
@TextScore Float score;
}
class ClassWithExplicitlyNamedDBRefProperty {
@Field("explict-name-for-db-ref")//
@org.springframework.data.mongodb.core.mapping.DBRef//
ClassWithIntId dbRefProperty;
public ClassWithIntId getDbRefProperty() {
return dbRefProperty;
}
}
static class ClassWithMapUsingEnumAsKey {
static enum FooBarEnum {
FOO, BAR;
}
Map<FooBarEnum, String> map;
}
@WritingConverter
static class FooBarEnumToStringConverter implements Converter<FooBarEnum, String> {
@Override
public String convert(FooBarEnum source) {
if (source == null) {
return null;
}
return FooBarEnum.FOO.equals(source) ? "foo-enum-value" : "bar-enum-value";
}
}
@ReadingConverter
static class StringToFooNumConverter implements Converter<String, FooBarEnum> {
@Override
public FooBarEnum convert(String source) {
if (source == null) {
return null;
}
if (source.equals("foo-enum-value")) {
return FooBarEnum.FOO;
}
if (source.equals("bar-enum-value")) {
return FooBarEnum.BAR;
}
throw new ConversionNotSupportedException(source, String.class, null);
}
}
}
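For the map-key conversion covered by the DATAMONGO-1118 tests, a user-facing setup could look like the sketch below. It assumes the customConversions() hook on AbstractMongoConfiguration and a hypothetical Color enum; both a writing and a reading converter are registered so keys survive the round trip.

import java.util.Arrays;

import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.convert.CustomConversions;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

@Configuration
class EnumKeyConversionConfig extends AbstractMongoConfiguration {

    enum Color {
        RED, BLUE
    }

    @Override
    protected String getDatabaseName() {
        return "example";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient();
    }

    @Override
    public CustomConversions customConversions() {
        // both directions are needed so map keys are written and read symmetrically
        return new CustomConversions(Arrays.asList(new ColorToStringConverter(), new StringToColorConverter()));
    }

    @WritingConverter
    static class ColorToStringConverter implements Converter<Color, String> {
        public String convert(Color source) {
            return source == null ? null : source.name().toLowerCase();
        }
    }

    @ReadingConverter
    static class StringToColorConverter implements Converter<String, Color> {
        public Color convert(String source) {
            return source == null ? null : Color.valueOf(source.toUpperCase());
        }
    }
}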

View File

@@ -658,6 +658,22 @@ public class QueryMapperUnitTests {
assertThat(dbo, equalTo(new BasicDBObjectBuilder().add("_id", 1).get()));
}
/**
* @see DATAMONGO-1070
*/
@Test
public void mapsIdReferenceToDBRefCorrectly() {
ObjectId id = new ObjectId();
DBObject query = new BasicDBObject("reference.id", new com.mongodb.DBRef(null, "reference", id.toString()));
DBObject result = mapper.getMappedObject(query, context.getPersistentEntity(WithDBRef.class));
assertThat(result.containsField("reference"), is(true));
com.mongodb.DBRef reference = getTypedValue(result, "reference", com.mongodb.DBRef.class);
assertThat(reference.getId(), is(instanceOf(ObjectId.class)));
}
@Document
public class Foo {
@Id private ObjectId id;

View File

@@ -20,9 +20,12 @@ import static org.hamcrest.collection.IsMapContaining.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.DBObjectTestUtils.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.hamcrest.Matcher;
import org.hamcrest.collection.IsIterableContainingInOrder;
@@ -358,8 +361,8 @@ public class UpdateMapperUnitTests {
public void rendersNestedDbRefCorrectly() {
Update update = new Update().pull("nested.dbRefAnnotatedList.id", "2");
DBObject mappedObject = mapper
.getMappedObject(update.getUpdateObject(), context.getPersistentEntity(Wrapper.class));
DBObject mappedObject = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(Wrapper.class));
DBObject pullClause = getAsDBObject(mappedObject, "$pull");
assertThat(pullClause.containsField("mapped.dbRefAnnotatedList"), is(true));
@@ -454,7 +457,6 @@ public class UpdateMapperUnitTests {
assertThat(((DBObject) updateValue).get("_class").toString(),
equalTo("org.springframework.data.mongodb.core.convert.UpdateMapperUnitTests$ModelImpl"));
}
}
/**
@@ -508,6 +510,200 @@ public class UpdateMapperUnitTests {
assertThat(list, equalTo(new BasicDBObjectBuilder().add("_id", "1").get()));
}
/**
* @see DATAMONGO-1077
*/
@Test
public void shouldNotRemovePositionalParameter() {
Update update = new Update();
update.unset("dbRefAnnotatedList.$");
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithDBRefCollection.class));
DBObject $unset = DBObjectTestUtils.getAsDBObject(mappedUpdate, "$unset");
assertThat($unset, equalTo(new BasicDBObjectBuilder().add("dbRefAnnotatedList.$", 1).get()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingEachOperatorShouldNotAddTypeInfoForNonInterfaceNonAbstractTypes() {
Update update = new Update().addToSet("nestedDocs").each(new NestedDocument("nested-1"),
new NestedDocument("nested-2"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithNestedCollection.class));
assertThat(mappedUpdate, isBsonObject().notContaining("$addToSet.nestedDocs.$each.[0]._class"));
assertThat(mappedUpdate, isBsonObject().notContaining("$addToSet.nestedDocs.$each.[1]._class"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingEachOperatorShouldAddTypeHintForInterfaceTypes() {
Update update = new Update().addToSet("models").each(new ModelImpl(1), new ModelImpl(2));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ListModelWrapper.class));
assertThat(mappedUpdate, isBsonObject().containing("$addToSet.models.$each.[0]._class", ModelImpl.class.getName()));
assertThat(mappedUpdate, isBsonObject().containing("$addToSet.models.$each.[1]._class", ModelImpl.class.getName()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingEachOperatorShouldAddTypeHintForAbstractTypes() {
Update update = new Update().addToSet("list").each(new ConcreteChildClass("foo", "one"),
new ConcreteChildClass("bar", "two"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ParentClass.class));
assertThat(mappedUpdate,
isBsonObject().containing("$addToSet.aliased.$each.[0]._class", ConcreteChildClass.class.getName()));
assertThat(mappedUpdate,
isBsonObject().containing("$addToSet.aliased.$each.[1]._class", ConcreteChildClass.class.getName()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingShouldOnlyRemoveTypeHintFromTopLevelTypeInCaseOfNestedDocument() {
WrapperAroundInterfaceType wait = new WrapperAroundInterfaceType();
wait.interfaceType = new ModelImpl(1);
Update update = new Update().addToSet("listHoldingConcretyTypeWithInterfaceTypeAttribute").each(wait);
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DomainTypeWithListOfConcreteTypesHavingSingleInterfaceTypeAttribute.class));
assertThat(mappedUpdate,
isBsonObject().notContaining("$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0]._class"));
assertThat(mappedUpdate,
isBsonObject().containing(
"$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0].interfaceType._class",
ModelImpl.class.getName()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingShouldRetainTypeInformationOfNestedListWhenUpdatingConcreteyParentType() {
ListModelWrapper lmw = new ListModelWrapper();
lmw.models = Collections.<Model> singletonList(new ModelImpl(1));
Update update = new Update().set("concreteTypeWithListAttributeOfInterfaceType", lmw);
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes.class));
assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteTypeWithListAttributeOfInterfaceType._class"));
assertThat(
mappedUpdate,
isBsonObject().containing("$set.concreteTypeWithListAttributeOfInterfaceType.models.[0]._class",
ModelImpl.class.getName()));
}
/**
* @see DATAMONGO-1236
*/
@Test
public void mappingShouldRetainTypeInformationForObjectValues() {
Update update = new Update().set("value", new NestedDocument("kaladin"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObject.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.value.name", "kaladin"));
assertThat(mappedUpdate, isBsonObject().containing("$set.value._class", NestedDocument.class.getName()));
}
/**
* @see DATAMONGO-1236
*/
@Test
public void mappingShouldNotRetainTypeInformationForConcreteValues() {
Update update = new Update().set("concreteValue", new NestedDocument("shallan"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObject.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.concreteValue.name", "shallan"));
assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteValue._class"));
}
/**
* @see DATAMONGO-1236
*/
@Test
public void mappingShouldRetainTypeInformationForObjectValuesWithAlias() {
Update update = new Update().set("value", new NestedDocument("adolin"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithAliasedObject.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.renamed-value.name", "adolin"));
assertThat(mappedUpdate, isBsonObject().containing("$set.renamed-value._class", NestedDocument.class.getName()));
}
/**
* @see DATAMONGO-1236
*/
@Test
public void mappingShouldRetrainTypeInformationWhenValueTypeOfMapDoesNotMatchItsDeclaration() {
Map<Object, Object> map = Collections.<Object, Object> singletonMap("szeth", new NestedDocument("son-son-vallano"));
Update update = new Update().set("map", map);
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.map.szeth.name", "son-son-vallano"));
assertThat(mappedUpdate, isBsonObject().containing("$set.map.szeth._class", NestedDocument.class.getName()));
}
/**
* @see DATAMONGO-1236
*/
@Test
public void mappingShouldNotContainTypeInformationWhenValueTypeOfMapMatchesDeclaration() {
Map<Object, NestedDocument> map = Collections.<Object, NestedDocument> singletonMap("jasnah", new NestedDocument(
"kholin"));
Update update = new Update().set("concreteMap", map);
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.concreteMap.jasnah.name", "kholin"));
assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteMap.jasnah._class"));
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
static class DomainTypeWithListOfConcreteTypesHavingSingleInterfaceTypeAttribute {
List<WrapperAroundInterfaceType> listHoldingConcretyTypeWithInterfaceTypeAttribute;
}
static class WrapperAroundInterfaceType {
Model interfaceType;
}
@org.springframework.data.mongodb.core.mapping.Document(collection = "DocumentWithReferenceToInterface")
static interface DocumentWithReferenceToInterface {
@@ -544,7 +740,7 @@ public class UpdateMapperUnitTests {
private @Id String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
@org.springframework.data.mongodb.core.mapping.DBRef //
private InterfaceDocumentDefinitionWithoutId referencedDocument;
public String getId() {
@@ -605,10 +801,10 @@ public class UpdateMapperUnitTests {
String id;
@Field("aliased")//
@Field("aliased") //
List<? extends AbstractChildClass> list;
@Field//
@Field //
List<Model> listOfInterface;
public ParentClass(String id, List<? extends AbstractChildClass> list) {
@@ -641,6 +837,10 @@ public class UpdateMapperUnitTests {
static class DomainEntity {
List<NestedEntity> collectionOfNestedEntities;
public List<NestedEntity> getCollectionOfNestedEntities() {
return collectionOfNestedEntities;
}
}
static class NestedEntity {
@@ -666,10 +866,10 @@ public class UpdateMapperUnitTests {
@Id public String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
@org.springframework.data.mongodb.core.mapping.DBRef //
public List<Entity> dbRefAnnotatedList;
@org.springframework.data.mongodb.core.mapping.DBRef//
@org.springframework.data.mongodb.core.mapping.DBRef //
public Entity dbRefProperty;
}
@@ -683,4 +883,35 @@ public class UpdateMapperUnitTests {
@Field("mapped") DocumentWithDBRefCollection nested;
}
static class DocumentWithNestedCollection {
List<NestedDocument> nestedDocs;
}
static class NestedDocument {
String name;
public NestedDocument(String name) {
super();
this.name = name;
}
}
static class EntityWithObject {
Object value;
NestedDocument concreteValue;
}
static class EntityWithAliasedObject {
@Field("renamed-value") Object value;
}
static class EntityWithObjectMap {
Map<Object, Object> map;
Map<Object, NestedDocument> concreteMap;
}
}
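The $each type-hint behaviour asserted above translates to template usage roughly as follows. This is a sketch using hypothetical Model/ModelImpl/ModelWrapper types that mirror the test fixtures and an existing MongoOperations instance.

import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class AddToSetTypeHintSketch {

    interface Model {}

    static class ModelImpl implements Model {
        int value;
        ModelImpl(int value) { this.value = value; }
    }

    static class ModelWrapper {
        String id;
        List<Model> models;       // interface-typed elements are written with a _class hint
        List<ModelImpl> concrete; // concrete element type, no _class hint required
    }

    static void appendModels(MongoOperations operations, String id) {
        // each ModelImpl lands in $addToSet.models.$each.[n] together with its _class attribute,
        // because the declared element type (Model) alone is not enough to read it back
        Update update = new Update().addToSet("models").each(new ModelImpl(1), new ModelImpl(2));
        operations.updateFirst(query(where("_id").is(id)), update, ModelWrapper.class);
    }
}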

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,10 +15,7 @@
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.collection.IsCollectionWithSize.*;
import static org.hamcrest.collection.IsEmptyCollection.*;
import static org.hamcrest.core.IsEqual.*;
import static org.hamcrest.core.IsInstanceOf.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
@@ -26,8 +23,6 @@ import java.lang.annotation.Annotation;
import java.util.Collections;
import java.util.List;
import org.hamcrest.collection.IsEmptyCollection;
import org.hamcrest.core.IsEqual;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
@@ -461,7 +456,7 @@ public class MongoPersistentEntityIndexResolverUnitTests {
indexDefinitions.get(0));
DBObject weights = DBObjectTestUtils.getAsDBObject(indexDefinitions.get(0).getIndexOptions(), "weights");
assertThat(weights.get("nested.foo"), IsEqual.<Object> equalTo(5F));
assertThat(weights.get("nested.foo"), is((Object) 5F));
}
/**
@@ -476,8 +471,8 @@ public class MongoPersistentEntityIndexResolverUnitTests {
"textIndexOnNestedWithMostSpecificValueRoot", indexDefinitions.get(0));
DBObject weights = DBObjectTestUtils.getAsDBObject(indexDefinitions.get(0).getIndexOptions(), "weights");
assertThat(weights.get("nested.foo"), IsEqual.<Object> equalTo(5F));
assertThat(weights.get("nested.bar"), IsEqual.<Object> equalTo(10F));
assertThat(weights.get("nested.foo"), is((Object) 5F));
assertThat(weights.get("nested.bar"), is((Object) 10F));
}
/**
@@ -487,17 +482,57 @@ public class MongoPersistentEntityIndexResolverUnitTests {
public void shouldSetDefaultLanguageCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithDefaultLanguage.class);
assertThat(indexDefinitions.get(0).getIndexOptions().get("default_language"), IsEqual.<Object> equalTo("spanish"));
assertThat(indexDefinitions.get(0).getIndexOptions().get("default_language"), is((Object) "spanish"));
}
/**
* @see DATAMONGO-937
* @see DATAMONGO-937, DATAMONGO-1049
*/
@Test
public void shouldResolveTextIndexLanguageOverrideCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithLanguageOverrideOnNestedElementRoot.class);
assertThat(indexDefinitions.get(0).getIndexOptions().get("language_override"), IsEqual.<Object> equalTo("lang"));
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithLanguageOverride.class);
assertThat(indexDefinitions.get(0).getIndexOptions().get("language_override"), is((Object) "lang"));
}
/**
* @see DATAMONGO-1049
*/
@Test
public void shouldIgnoreTextIndexLanguageOverrideOnNestedElements() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithLanguageOverrideOnNestedElement.class);
assertThat(indexDefinitions.get(0).getIndexOptions().get("language_override"), is(nullValue()));
}
/**
* @see DATAMONGO-1049
*/
@Test
public void shouldNotCreateIndexDefinitionWhenOnlyLanguageButNoTextIndexPresent() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithNoTextIndexPropertyButReservedFieldLanguage.class);
assertThat(indexDefinitions, is(empty()));
}
/**
* @see DATAMONGO-1049
*/
@Test
public void shouldNotCreateIndexDefinitionWhenOnlyAnnotatedLanguageButNoTextIndexPresent() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithNoTextIndexPropertyButReservedFieldLanguageAnnotated.class);
assertThat(indexDefinitions, is(empty()));
}
/**
* @see DATAMONGO-1049
*/
@Test
public void shouldPreferExplicitlyAnnotatedLanguageProperty() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(DocumentWithOverlappingLanguageProps.class);
assertThat(indexDefinitions.get(0).getIndexOptions().get("language_override"), is((Object) "lang"));
}
@Document
@@ -527,14 +562,12 @@ public class MongoPersistentEntityIndexResolverUnitTests {
static class TextIndexOnNested {
String foo;
}
@Document
static class TextIndexOnNestedWithWeightRoot {
@TextIndexed(weight = 5) TextIndexOnNested nested;
}
@Document
@@ -554,18 +587,39 @@ public class MongoPersistentEntityIndexResolverUnitTests {
}
@Document
static class DocumentWithLanguageOverrideOnNestedElementRoot {
static class DocumentWithLanguageOverrideOnNestedElement {
DocumentWithLanguageOverrideOnNestedElement nested;
DocumentWithLanguageOverride nested;
}
static class DocumentWithLanguageOverrideOnNestedElement {
@Document
static class DocumentWithLanguageOverride {
@TextIndexed String foo;
@Language String lang;
}
@Document
static class DocumentWithNoTextIndexPropertyButReservedFieldLanguage {
String language;
}
@Document
static class DocumentWithNoTextIndexPropertyButReservedFieldLanguageAnnotated {
@Field("language") String lang;
}
@Document
static class DocumentWithOverlappingLanguageProps {
@TextIndexed String foo;
String language;
@Language String lang;
}
}
public static class MixedIndexResolutionTests {
@@ -670,7 +724,7 @@ public class MongoPersistentEntityIndexResolverUnitTests {
public void shouldDetectSelfCycleViaCollectionTypeCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(SelfCyclingViaCollectionType.class);
assertThat(indexDefinitions, IsEmptyCollection.empty());
assertThat(indexDefinitions, empty());
}
/**
@@ -680,7 +734,7 @@ public class MongoPersistentEntityIndexResolverUnitTests {
public void shouldNotDetectCycleWhenTypeIsUsedMoreThanOnce() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(MultipleObjectsOfSameType.class);
assertThat(indexDefinitions, IsEmptyCollection.empty());
assertThat(indexDefinitions, empty());
}
/**
@@ -776,6 +830,47 @@ public class MongoPersistentEntityIndexResolverUnitTests {
assertThat((String) indexDefinitions.get(0).getIndexOptions().get("name"), equalTo("property_index"));
}
/**
* @see DATAMONGO-1087
*/
@Test
public void shouldAllowMultiplePropertiesOfSameTypeWithMatchingStartLettersOnRoot() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(MultiplePropertiesOfSameTypeWithMatchingStartLetters.class);
assertThat(indexDefinitions, hasSize(2));
assertThat((String) indexDefinitions.get(0).getIndexOptions().get("name"), equalTo("name.component"));
assertThat((String) indexDefinitions.get(1).getIndexOptions().get("name"), equalTo("nameLast.component"));
}
/**
* @see DATAMONGO-1087
*/
@Test
public void shouldAllowMultiplePropertiesOfSameTypeWithMatchingStartLettersOnNestedProperty() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(MultiplePropertiesOfSameTypeWithMatchingStartLettersOnNestedProperty.class);
assertThat(indexDefinitions, hasSize(2));
assertThat((String) indexDefinitions.get(0).getIndexOptions().get("name"), equalTo("component.nameLast"));
assertThat((String) indexDefinitions.get(1).getIndexOptions().get("name"), equalTo("component.name"));
}
/**
* @see DATAMONGO-1121
*/
@Test
public void shouldOnlyConsiderEntitiesAsPotentialCycleCandidates() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(OuterDocumentReferingToIndexedPropertyViaDifferentNonCyclingPaths.class);
assertThat(indexDefinitions, hasSize(2));
assertThat((String) indexDefinitions.get(0).getIndexOptions().get("name"), equalTo("path1.foo"));
assertThat((String) indexDefinitions.get(1).getIndexOptions().get("name"),
equalTo("path2.propertyWithIndexedStructure.foo"));
}
@Document
static class MixedIndexRoot {
@@ -916,6 +1011,41 @@ public class MongoPersistentEntityIndexResolverUnitTests {
TypeWithNamedIndex propertyOfTypeHavingNamedIndex;
}
@Document
public class MultiplePropertiesOfSameTypeWithMatchingStartLetters {
public class NameComponent {
@Indexed String component;
}
NameComponent name;
NameComponent nameLast;
}
@Document
public class MultiplePropertiesOfSameTypeWithMatchingStartLettersOnNestedProperty {
public class NameComponent {
@Indexed String nameLast;
@Indexed String name;
}
NameComponent component;
}
@Document
public static class OuterDocumentReferingToIndexedPropertyViaDifferentNonCyclingPaths {
NoCycleButIndenticallNamedPropertiesDeeplyNested path1;
AlternatePathToNoCycleButIndenticallNamedPropertiesDeeplyNestedDocument path2;
}
public static class AlternatePathToNoCycleButIndenticallNamedPropertiesDeeplyNestedDocument {
NoCycleButIndenticallNamedPropertiesDeeplyNested propertyWithIndexedStructure;
}
}
private static List<IndexDefinitionHolder> prepareMappingContextAndResolveIndexForType(Class<?> type) {

View File

@@ -24,6 +24,7 @@ import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.util.ClassTypeInformation;
/**
@@ -36,6 +37,7 @@ import org.springframework.data.util.ClassTypeInformation;
public class BasicMongoPersistentEntityUnitTests {
@Mock ApplicationContext context;
@Mock MongoPersistentProperty propertyMock;
@Test
public void subclassInheritsAtDocumentAnnotation() {
@@ -53,6 +55,9 @@ public class BasicMongoPersistentEntityUnitTests {
assertThat(entity.getCollection(), is("35"));
}
/**
* @see DATAMONGO-65, DATAMONGO-1108
*/
@Test
public void collectionAllowsReferencingSpringBean() {
@@ -67,6 +72,9 @@ public class BasicMongoPersistentEntityUnitTests {
entity.setApplicationContext(context);
assertThat(entity.getCollection(), is("reference"));
provider.collectionName = "otherReference";
assertThat(entity.getCollection(), is("otherReference"));
}
/**
@@ -80,6 +88,144 @@ public class BasicMongoPersistentEntityUnitTests {
assertThat(entity.getLanguage(), is("spanish"));
}
/**
* @see DATAMONGO-1053
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
@Test(expected = MappingException.class)
public void verifyShouldThrowExceptionForInvalidTypeOfExplicitLanguageProperty() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
when(propertyMock.isExplicitLanguageProperty()).thenReturn(true);
when(propertyMock.getActualType()).thenReturn((Class) Number.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
}
/**
* @see DATAMONGO-1053
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
@Test
public void verifyShouldPassForStringAsExplicitLanguageProperty() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
when(propertyMock.isExplicitLanguageProperty()).thenReturn(true);
when(propertyMock.getActualType()).thenReturn((Class) String.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
verify(propertyMock, times(1)).isExplicitLanguageProperty();
verify(propertyMock, times(1)).getActualType();
}
/**
* @see DATAMONGO-1053
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
@Test
public void verifyShouldIgnoreNonExplicitLanguageProperty() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
when(propertyMock.isExplicitLanguageProperty()).thenReturn(false);
when(propertyMock.getActualType()).thenReturn((Class) Number.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
verify(propertyMock, times(1)).isExplicitLanguageProperty();
verify(propertyMock, never()).getActualType();
}
/**
* @see DATAMONGO-1157
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
@Test(expected = MappingException.class)
public void verifyShouldThrowErrorForLazyDBRefOnFinalClass() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(true);
when(propertyMock.getActualType()).thenReturn((Class) Class.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
}
/**
* @see DATAMONGO-1157
*/
@Test(expected = MappingException.class)
public void verifyShouldThrowErrorForLazyDBRefArray() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(true);
when(propertyMock.isArray()).thenReturn(true);
entity.addPersistentProperty(propertyMock);
entity.verify();
}
/**
* @see DATAMONGO-1157
*/
@Test
@SuppressWarnings({ "unchecked", "rawtypes" })
public void verifyShouldPassForLazyDBRefOnNonArrayNonFinalClass() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(true);
when(propertyMock.getActualType()).thenReturn((Class) Object.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
verify(propertyMock, times(1)).isDbReference();
}
/**
* @see DATAMONGO-1157
*/
@Test
@SuppressWarnings({ "unchecked", "rawtypes" })
public void verifyShouldPassForNonLazyDBRefOnFinalClass() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(false);
when(propertyMock.getActualType()).thenReturn((Class) Class.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
verify(dbRefMock, times(1)).lazy();
}
@Document(collection = "contacts")
class Contact {
@@ -111,4 +257,8 @@ public class BasicMongoPersistentEntityUnitTests {
static class DocumentWithLanguage {
}
static class AnyDocument {
}
}
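In application code, the DATAMONGO-1157 verification above rejects mappings like the hypothetical one below during entity verification, because a lazy-loading proxy cannot be created for a final target class (or for an array property).

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class Invoice {

    @Id String id;

    // rejected during entity verification: LegacyCustomer is final, so no lazy-loading
    // proxy can be created for it and a MappingException is thrown
    @DBRef(lazy = true) //
    LegacyCustomer customer;

    // array-typed lazy references are rejected for the same reason, e.g.
    // @DBRef(lazy = true) LegacyCustomer[] previousCustomers;
}

@Document
final class LegacyCustomer {

    @Id String id;
    String name;
}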

View File

@@ -106,6 +106,7 @@ public class MongoMappingContextUnitTests {
exception.expectMessage("firstname");
exception.expectMessage("lastname");
exception.expectMessage("foo");
exception.expectMessage("@Field");
MongoMappingContext context = new MongoMappingContext();
context.setApplicationContext(applicationContext);

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,17 +16,23 @@
package org.springframework.data.mongodb.core.mapping.event;
import java.util.ArrayList;
import java.util.List;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mongodb.core.mapping.PersonPojoStringId;
public class PersonBeforeSaveListener implements ApplicationListener<BeforeSaveEvent<PersonPojoStringId>> {
public final ArrayList<ApplicationEvent> seenEvents = new ArrayList<ApplicationEvent>();
public void onApplicationEvent(BeforeSaveEvent<PersonPojoStringId> event) {
this.seenEvents.add(event);
import com.mongodb.DBObject;
public class PersonBeforeSaveListener extends AbstractMongoEventListener<PersonPojoStringId> {
public final List<ApplicationEvent> seenEvents = new ArrayList<ApplicationEvent>();
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onBeforeSave(java.lang.Object, com.mongodb.DBObject)
*/
@Override
public void onBeforeSave(PersonPojoStringId source, DBObject dbo) {
seenEvents.add(new BeforeSaveEvent<PersonPojoStringId>(source, dbo));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.core.query;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import nl.jqno.equalsverifier.EqualsVerifier;
import nl.jqno.equalsverifier.Warning;
import org.junit.Test;
import org.springframework.data.domain.Sort.Direction;
@@ -29,6 +31,7 @@ import com.mongodb.DBObject;
* Unit tests for {@link BasicQuery}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class BasicQueryUnitTests {
@@ -58,4 +61,80 @@ public class BasicQueryUnitTests {
sortReference.put("lastname", 1);
assertThat(query.getSortObject(), is(sortReference));
}
/**
* @see DATAMONGO-1093
*/
@Test
public void equalsContract() {
BasicQuery query1 = new BasicQuery("{ \"name\" : \"Thomas\"}", "{\"name\":1, \"age\":1}");
query1.setSortObject(new BasicDBObject("name", -1));
BasicQuery query2 = new BasicQuery("{ \"name\" : \"Oliver\"}", "{\"name\":1, \"address\":1}");
query2.setSortObject(new BasicDBObject("name", 1));
EqualsVerifier.forExamples(query1, query2) //
.withRedefinedSuperclass() //
.suppress(Warning.NONFINAL_FIELDS, Warning.NULL_FIELDS, Warning.STRICT_INHERITANCE) //
.verify();
}
/**
* @see DATAMONGO-1093
*/
@Test
public void handlesEqualsAndHashCodeCorrectlyForExactCopies() {
String qry = "{ \"name\" : \"Thomas\"}";
String fields = "{\"name\":1, \"age\":1}";
BasicQuery query1 = new BasicQuery(qry, fields);
query1.setSortObject(new BasicDBObject("name", -1));
BasicQuery query2 = new BasicQuery(qry, fields);
query2.setSortObject(new BasicDBObject("name", -1));
assertThat(query1, is(equalTo(query1)));
assertThat(query1, is(equalTo(query2)));
assertThat(query1.hashCode(), is(query2.hashCode()));
}
/**
* @see DATAMONGO-1093
*/
@Test
public void handlesEqualsAndHashCodeCorrectlyWhenBasicQuerySettingsDiffer() {
String qry = "{ \"name\" : \"Thomas\"}";
String fields = "{\"name\":1, \"age\":1}";
BasicQuery query1 = new BasicQuery(qry, fields);
query1.setSortObject(new BasicDBObject("name", -1));
BasicQuery query2 = new BasicQuery(qry, fields);
query2.setSortObject(new BasicDBObject("name", 1));
assertThat(query1, is(not(equalTo(query2))));
assertThat(query1.hashCode(), is(not(query2.hashCode())));
}
/**
* @see DATAMONGO-1093
*/
@Test
public void handlesEqualsAndHashCodeCorrectlyWhenQuerySettingsDiffer() {
String qry = "{ \"name\" : \"Thomas\"}";
String fields = "{\"name\":1, \"age\":1}";
BasicQuery query1 = new BasicQuery(qry, fields);
query1.getMeta().setComment("foo");
BasicQuery query2 = new BasicQuery(qry, fields);
query2.getMeta().setComment("bar");
assertThat(query1, is(not(equalTo(query2))));
assertThat(query1.hashCode(), is(not(query2.hashCode())));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,12 +22,14 @@ import org.junit.Test;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Thomas Darimont
*/
/**
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class CriteriaTests {
@Test
@@ -72,50 +74,94 @@ public class CriteriaTests {
assertThat(left, is(not(right)));
assertThat(right, is(not(left)));
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateAndOperation() {
new Criteria() //
.not() //
.andOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateOrOperation() {
new Criteria() //
.not() //
.orOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateNorOperation() {
new Criteria() //
.not() //
.norOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test
public void shouldNegateFollowingSimpleExpression() {
Criteria c = Criteria.where("age").not().gt(18).and("status").is("student");
DBObject co = c.getCriteriaObject();
assertThat(co, is(notNullValue()));
assertThat(co.toString(), is("{ \"age\" : { \"$not\" : { \"$gt\" : 18}} , \"status\" : \"student\"}"));
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateAndOperation() {
new Criteria() //
.not() //
.andOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateOrOperation() {
new Criteria() //
.not() //
.orOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test(expected = IllegalArgumentException.class)
public void shouldThrowExceptionWhenTryingToNegateNorOperation() {
new Criteria() //
.not() //
.norOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
/**
* @see DATAMONGO-507
*/
@Test
public void shouldNegateFollowingSimpleExpression() {
Criteria c = Criteria.where("age").not().gt(18).and("status").is("student");
DBObject co = c.getCriteriaObject();
assertThat(co, is(notNullValue()));
assertThat(co.toString(), is("{ \"age\" : { \"$not\" : { \"$gt\" : 18}} , \"status\" : \"student\"}"));
}
/**
* @see DATAMONGO-1068
*/
@Test
public void getCriteriaObjectShouldReturnEmptyDBOWhenNoCriteriaSpecified() {
DBObject dbo = new Criteria().getCriteriaObject();
assertThat(dbo, equalTo(new BasicDBObjectBuilder().get()));
}
/**
* @see DATAMONGO-1068
*/
@Test
public void getCriteriaObjectShouldUseCritieraValuesWhenNoKeyIsPresent() {
DBObject dbo = new Criteria().lt("foo").getCriteriaObject();
assertThat(dbo, equalTo(new BasicDBObjectBuilder().add("$lt", "foo").get()));
}
/**
* @see DATAMONGO-1068
*/
@Test
public void getCriteriaObjectShouldUseCritieraValuesWhenNoKeyIsPresentButMultipleCriteriasPresent() {
DBObject dbo = new Criteria().lt("foo").gt("bar").getCriteriaObject();
assertThat(dbo, equalTo(new BasicDBObjectBuilder().add("$lt", "foo").add("$gt", "bar").get()));
}
/**
* @see DATAMONGO-1068
*/
@Test
public void getCriteriaObjectShouldRespectNotWhenNoKeyPresent() {
DBObject dbo = new Criteria().lt("foo").not().getCriteriaObject();
assertThat(dbo, equalTo(new BasicDBObjectBuilder().add("$not", new BasicDBObject("$lt", "foo")).get()));
}
}
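As a usage sketch of the negation behaviour verified above, assuming a hypothetical Student document and an existing MongoOperations instance:

import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;

class CriteriaNegationSketch {

    static class Student {
        String id;
        String status;
        int age;
    }

    static List<Student> findNonAdultStudents(MongoOperations operations) {
        // renders { "age" : { "$not" : { "$gt" : 18 } }, "status" : "student" };
        // not() may only negate the directly following simple expression, while negating
        // andOperator/orOperator/norOperator throws an IllegalArgumentException
        return operations.find(query(where("age").not().gt(18).and("status").is("student")), Student.class);
    }
}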

View File

@@ -19,10 +19,12 @@ import static java.util.Arrays.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import org.hamcrest.Matchers;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
@@ -46,6 +48,7 @@ import org.springframework.data.geo.Polygon;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.repository.Person.Sex;
import org.springframework.data.querydsl.QSort;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
@@ -1023,6 +1026,115 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(result, is(notNullValue()));
assertThat(result.firstname, is("Carter"));
assertThat(result.lastname, is("Beauford"));
}
/**
* @see DATAMONGO-1057
*/
@Test
public void sliceShouldTraverseElementsWithoutSkippingOnes() {
repository.deleteAll();
List<Person> persons = new ArrayList<Person>(100);
for (int i = 0; i < 100; i++) {
// format firstname to assert sorting retains proper order
persons.add(new Person(String.format("%03d", i), "ln" + 1, 100));
}
repository.save(persons);
Slice<Person> slice = repository.findByAgeGreaterThan(50, new PageRequest(0, 20, Direction.ASC, "firstname"));
assertThat(slice, contains(persons.subList(0, 20).toArray()));
slice = repository.findByAgeGreaterThan(50, slice.nextPageable());
assertThat(slice, contains(persons.subList(20, 40).toArray()));
}
/**
* @see DATAMONGO-1072
*/
@Test
public void shouldBindPlaceholdersUsedAsKeysCorrectly() {
List<Person> persons = repository.findByKeyValue("firstname", alicia.getFirstname());
assertThat(persons, hasSize(1));
assertThat(persons, hasItem(alicia));
}
/**
* @see DATAMONGO-1085
*/
@Test
public void shouldSupportSortingByQueryDslOrderSpecifier() {
repository.deleteAll();
List<Person> persons = new ArrayList<Person>();
for (int i = 0; i < 3; i++) {
Person person = new Person(String.format("Siggi %s", i), "Bar", 30);
person.setAddress(new Address(String.format("Street %s", i), "12345", "SinCity"));
persons.add(person);
}
repository.save(persons);
QPerson person = QPerson.person;
Iterable<Person> result = repository.findAll(person.firstname.isNotNull(), person.address.street.desc());
assertThat(result, is(Matchers.<Person> iterableWithSize(persons.size())));
assertThat(result.iterator().next().getFirstname(), is(persons.get(2).getFirstname()));
}
/**
* @see DATAMONGO-1085
*/
@Test
public void shouldSupportSortingWithQSortByQueryDslOrderSpecifier() throws Exception {
repository.deleteAll();
List<Person> persons = new ArrayList<Person>();
for (int i = 0; i < 3; i++) {
Person person = new Person(String.format("Siggi %s", i), "Bar", 30);
person.setAddress(new Address(String.format("Street %s", i), "12345", "SinCity"));
persons.add(person);
}
repository.save(persons);
PageRequest pageRequest = new PageRequest(0, 2, new QSort(person.address.street.desc()));
Iterable<Person> result = repository.findAll(pageRequest);
assertThat(result, is(Matchers.<Person> iterableWithSize(2)));
assertThat(result.iterator().next().getFirstname(), is("Siggi 2"));
}
/**
* @see DATAMONGO-1085
*/
@Test
public void shouldSupportSortingWithQSort() throws Exception {
repository.deleteAll();
List<Person> persons = new ArrayList<Person>();
for (int i = 0; i < 3; i++) {
Person person = new Person(String.format("Siggi %s", i), "Bar", 30);
person.setAddress(new Address(String.format("Street %s", i), "12345", "SinCity"));
persons.add(person);
}
repository.save(persons);
Iterable<Person> result = repository.findAll(new QSort(person.address.street.desc()));
assertThat(result, is(Matchers.<Person> iterableWithSize(persons.size())));
assertThat(result.iterator().next().getFirstname(), is("Siggi 2"));
}
}
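The slice test above boils down to the following repository usage. The sketch assumes a hypothetical Person document and repository interface; a Slice fetches page size plus one elements to detect whether more data exists and therefore avoids the count query a Page would require.

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.repository.MongoRepository;

class SlicePaginationSketch {

    static class Person {
        String id;
        String firstname;
        int age;
    }

    interface PersonSliceRepository extends MongoRepository<Person, String> {

        // derived query returning a Slice instead of a Page
        Slice<Person> findByAgeGreaterThan(int age, Pageable pageable);
    }

    static void readAllMatches(PersonSliceRepository repository) {
        Slice<Person> slice = repository.findByAgeGreaterThan(50, new PageRequest(0, 20, Direction.ASC, "firstname"));
        while (slice.hasNext()) {
            // the limit is sent as page size + 1 so the presence of a next slice is known
            // without a count query, and nextPageable() keeps skip and sort intact
            slice = repository.findByAgeGreaterThan(50, slice.nextPageable());
        }
    }
}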

View File

@@ -0,0 +1,132 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import java.util.List;
import org.hamcrest.Matchers;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* @author Christoph Strobl
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class ComplexIdRepositoryIntegrationTests {
@Configuration
@EnableMongoRepositories
static class Config extends AbstractMongoConfiguration {
@Override
protected String getDatabaseName() {
return "complexIdTest";
}
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
}
@Autowired UserWithComplexIdRepository repo;
@Autowired MongoTemplate template;
MyId id;
UserWithComplexId userWithId;
@Before
public void setUp() {
repo.deleteAll();
id = new MyId();
id.val1 = "v1";
id.val2 = "v2";
userWithId = new UserWithComplexId();
userWithId.firstname = "foo";
userWithId.id = id;
}
/**
* @see DATAMONGO-1078
*/
@Test
public void annotatedFindQueryShouldWorkWhenUsingComplexId() {
repo.save(userWithId);
assertThat(repo.getUserByComplexId(id), is(userWithId));
}
/**
* @see DATAMONGO-1078
*/
@Test
public void annotatedFindQueryShouldWorkWhenUsingComplexIdWithinCollection() {
repo.save(userWithId);
List<UserWithComplexId> loaded = repo.findByUserIds(Collections.singleton(id));
assertThat(loaded, hasSize(1));
assertThat(loaded, contains(userWithId));
}
/**
* @see DATAMONGO-1078
*/
@Test
public void findOneShouldWorkWhenUsingComplexId() {
repo.save(userWithId);
assertThat(repo.findOne(id), is(userWithId));
}
/**
* @see DATAMONGO-1078
*/
@Test
public void findAllShouldWorkWhenUsingComplexId() {
repo.save(userWithId);
Iterable<UserWithComplexId> loaded = repo.findAll(Collections.singleton(id));
assertThat(loaded, is(Matchers.<UserWithComplexId> iterableWithSize(1)));
assertThat(loaded, contains(userWithId));
}
}

View File

@@ -20,7 +20,7 @@ import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
/**
* Sample contactt domain class.
* Sample contact domain class.
*
* @author Oliver Gierke
*/

View File

@@ -0,0 +1,59 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import java.io.Serializable;
import org.springframework.util.ObjectUtils;
/**
* @author Christoph Strobl
* @author Oliver Gierke
*/
public class MyId implements Serializable {
private static final long serialVersionUID = -7129201311241750831L;
String val1;
String val2;
@Override
public int hashCode() {
int result = 31;
result += 17 * ObjectUtils.nullSafeHashCode(val1);
result += 17 * ObjectUtils.nullSafeHashCode(val2);
return result;
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof MyId)) {
return false;
}
MyId that = (MyId) obj;
return ObjectUtils.nullSafeEquals(this.val1, that.val1) && ObjectUtils.nullSafeEquals(this.val2, that.val2);
}
}

View File

@@ -25,6 +25,7 @@ import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
/**
* Sample domain class.
@@ -46,9 +47,11 @@ public class Person extends Contact {
@SuppressWarnings("unused") private Sex sex;
Date createdAt;
List<String> skills;
@GeoSpatialIndexed private Point location;
private Address address;
private @Field("add") Address address;
private Set<Address> shippingAddresses;
@DBRef User creator;
@@ -271,6 +274,14 @@ public class Person extends Contact {
this.creator = creator;
}
public void setSkills(List<String> skills) {
this.skills = skills;
}
public List<String> getSkills() {
return skills;
}
/*
* (non-Javadoc)
*

View File

@@ -317,4 +317,7 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
* @see DATAMONGO-1030
*/
PersonSummary findSummaryByLastname(String lastname);
@Query("{ ?0 : ?1 }")
List<Person> findByKeyValue(String key, String value);
}
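A self-contained variant of the key-position placeholder binding declared above, using hypothetical Book types:

import java.util.List;

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

interface KeyPlaceholderSketch {

    @Document
    class Book {
        String id;
        String title;
        String author;
    }

    interface BookRepository extends MongoRepository<Book, String> {

        // ?0 is bound in key position, ?1 as the value, so
        // findByAttribute("title", "Dune") issues { "title" : "Dune" }
        @Query("{ ?0 : ?1 }")
        List<Book> findByAttribute(String key, String value);
    }
}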

View File

@@ -0,0 +1,57 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.util.ObjectUtils;
/**
* @author Christoph Strobl
* @author Oliver Gierke
*/
@Document
public class UserWithComplexId {
@Id MyId id;
String firstname;
@Override
public int hashCode() {
int result = 31;
result += 17 * ObjectUtils.nullSafeHashCode(id);
return result;
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof UserWithComplexId)) {
return false;
}
UserWithComplexId that = (UserWithComplexId) obj;
return ObjectUtils.nullSafeEquals(this.id, that.id);
}
}

View File

@@ -0,0 +1,33 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import java.util.Collection;
import java.util.List;
import org.springframework.data.repository.CrudRepository;
/**
* @author Christoph Strobl
*/
public interface UserWithComplexIdRepository extends CrudRepository<UserWithComplexId, MyId> {
@Query("{'_id': {$in: ?0}}")
List<UserWithComplexId> findByUserIds(Collection<MyId> ids);
@Query("{'_id': ?0}")
UserWithComplexId getUserByComplexId(MyId id);
}

View File

@@ -15,14 +15,16 @@
*/
package org.springframework.data.mongodb.repository.query;
import static org.hamcrest.CoreMatchers.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import java.util.Optional;
import org.bson.types.ObjectId;
import org.hamcrest.core.Is;
@@ -32,10 +34,13 @@ import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Matchers;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.Person;
@@ -50,6 +55,8 @@ import org.springframework.data.mongodb.repository.Meta;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.repository.core.RepositoryMetadata;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
import com.mongodb.WriteResult;
/**
@@ -93,8 +100,7 @@ public class AbstractMongoQueryUnitTests {
createQueryForMethod("deletePersonByLastname", String.class).setDeleteQuery(true).execute(new Object[] { "booh" });
verify(this.mongoOperationsMock, times(1)).remove(Matchers.any(Query.class), Matchers.eq(Person.class),
Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).remove(Matchers.any(Query.class), eq(Person.class), eq("persons"));
verify(this.mongoOperationsMock, times(0)).find(Matchers.any(Query.class), Matchers.any(Class.class),
Matchers.anyString());
}
@@ -112,8 +118,8 @@ public class AbstractMongoQueryUnitTests {
createQueryForMethod("deleteByLastname", String.class).setDeleteQuery(true).execute(new Object[] { "booh" });
verify(this.mongoOperationsMock, times(1)).findAllAndRemove(Matchers.any(Query.class), Matchers.eq(Person.class),
Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).findAllAndRemove(Matchers.any(Query.class), eq(Person.class),
eq("persons"));
}
/**
@@ -136,15 +142,14 @@ public class AbstractMongoQueryUnitTests {
public void testDeleteExecutionReturnsNrDocumentsDeletedFromWriteResult() {
when(writeResultMock.getN()).thenReturn(100);
when(this.mongoOperationsMock.remove(Matchers.any(Query.class), Matchers.eq(Person.class), Matchers.eq("persons")))
.thenReturn(writeResultMock);
when(this.mongoOperationsMock.remove(Matchers.any(Query.class), eq(Person.class), eq("persons"))).thenReturn(
writeResultMock);
MongoQueryFake query = createQueryForMethod("deletePersonByLastname", String.class);
query.setDeleteQuery(true);
assertThat(query.execute(new Object[] { "fake" }), is((Object) 100L));
verify(this.mongoOperationsMock, times(1)).remove(Matchers.any(Query.class), Matchers.eq(Person.class),
Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).remove(Matchers.any(Query.class), eq(Person.class), eq("persons"));
}
/**
@@ -158,8 +163,7 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(1))
.find(captor.capture(), Matchers.eq(Person.class), Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).find(captor.capture(), eq(Person.class), eq("persons"));
assertThat(captor.getValue().getMeta().getComment(), nullValue());
}
@@ -175,8 +179,7 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(1))
.find(captor.capture(), Matchers.eq(Person.class), Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).find(captor.capture(), eq(Person.class), eq("persons"));
assertThat(captor.getValue().getMeta().getComment(), is("comment"));
}
@@ -191,7 +194,7 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(1)).count(captor.capture(), Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).count(captor.capture(), eq(Person.class), eq("persons"));
assertThat(captor.getValue().getMeta().getComment(), is("comment"));
}
@@ -206,11 +209,89 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(1))
.find(captor.capture(), Matchers.eq(Person.class), Matchers.eq("persons"));
verify(this.mongoOperationsMock, times(1)).find(captor.capture(), eq(Person.class), eq("persons"));
assertThat(captor.getValue().getMeta().getComment(), is("comment"));
}
/**
* @see DATAMONGO-1057
*/
@Test
public void slicedExecutionShouldRetainNrOfElementsToSkip() {
MongoQueryFake query = createQueryForMethod("findByLastname", String.class, Pageable.class);
Pageable page1 = new PageRequest(0, 10);
Pageable page2 = page1.next();
query.execute(new Object[] { "fake", page1 });
query.execute(new Object[] { "fake", page2 });
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(2)).find(captor.capture(), eq(Person.class), eq("persons"));
assertThat(captor.getAllValues().get(0).getSkip(), is(0));
assertThat(captor.getAllValues().get(1).getSkip(), is(10));
}
/**
* @see DATAMONGO-1057
*/
@Test
public void slicedExecutionShouldIncrementLimitByOne() {
MongoQueryFake query = createQueryForMethod("findByLastname", String.class, Pageable.class);
Pageable page1 = new PageRequest(0, 10);
Pageable page2 = page1.next();
query.execute(new Object[] { "fake", page1 });
query.execute(new Object[] { "fake", page2 });
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(2)).find(captor.capture(), eq(Person.class), eq("persons"));
assertThat(captor.getAllValues().get(0).getLimit(), is(11));
assertThat(captor.getAllValues().get(1).getLimit(), is(11));
}
/**
* @see DATAMONGO-1057
*/
@Test
public void slicedExecutionShouldRetainSort() {
MongoQueryFake query = createQueryForMethod("findByLastname", String.class, Pageable.class);
Pageable page1 = new PageRequest(0, 10, Sort.Direction.DESC, "bar");
Pageable page2 = page1.next();
query.execute(new Object[] { "fake", page1 });
query.execute(new Object[] { "fake", page2 });
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(this.mongoOperationsMock, times(2)).find(captor.capture(), eq(Person.class), eq("persons"));
DBObject expectedSortObject = new BasicDBObjectBuilder().add("bar", -1).get();
assertThat(captor.getAllValues().get(0).getSortObject(), is(expectedSortObject));
assertThat(captor.getAllValues().get(1).getSortObject(), is(expectedSortObject));
}
/**
* @see DATAMONGO-1080
*/
@Test
public void doesNotTryToPostProcessQueryResultIntoWrapperType() {
Person reference = new Person();
when(mongoOperationsMock.findOne(Mockito.any(Query.class), eq(Person.class), eq("persons"))).//
thenReturn(reference);
AbstractMongoQuery query = createQueryForMethod("findByLastname", String.class);
assertThat(query.execute(new Object[] { "lastname" }), is((Object) reference));
}
private MongoQueryFake createQueryForMethod(String methodName, Class<?>... paramTypes) {
try {
@@ -272,5 +353,9 @@ public class AbstractMongoQueryUnitTests {
@org.springframework.data.mongodb.repository.Query("{}")
Page<Person> findByAnnotatedQuery(String firstnanme, Pageable pageable);
/** @see DATAMONGO-1057 */
Slice<Person> findByLastname(String lastname, Pageable page);
Optional<Person> findByLastname(String lastname);
}
}

View File

@@ -41,6 +41,7 @@ import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.Venue;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Field;
@@ -167,19 +168,6 @@ public class MongoQueryCreatorUnitTests {
assertThat(creator.createQuery(), is(reference));
}
/**
* @see DATAMONGO-291
*/
@Test
public void honoursMappingInformationForPropertyPaths() {
PartTree partTree = new PartTree("findByUsername", User.class);
MongoQueryCreator creator = new MongoQueryCreator(partTree, getAccessor(converter, "Oliver"), context);
Query reference = query(where("foo").is("Oliver"));
assertThat(creator.createQuery(), is(reference));
}
/**
* @see DATAMONGO-338
*/
@@ -268,7 +256,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Matt"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("foo").regex("^Matt"))));
assertThat(query, is(query(where("username").regex("^Matt"))));
}
/**
@@ -281,7 +269,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "ews"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("foo").regex("ews$"))));
assertThat(query, is(query(where("username").regex("ews$"))));
}
/**
@@ -294,7 +282,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "thew"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("foo").regex(".*thew.*"))));
assertThat(query, is(query(where("username").regex(".*thew.*"))));
}
private void assertBindsDistanceToQuery(Point point, Distance distance, Query reference) throws Exception {
@@ -438,6 +426,144 @@ public class MongoQueryCreatorUnitTests {
assertThat(query, is(query(where("firstName").regex("^dave$", "i").and("age").is(42))));
}
/**
* @see DATAMONGO-1075
*/
@Test
public void shouldCreateInClauseWhenUsingContainsOnCollectionLikeProperty() {
PartTree tree = new PartTree("findByEmailAddressesContaining", User.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "dave"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("emailAddresses").in("dave"))));
}
/**
* @see DATAMONGO-1139
*/
@Test
public void createsNonShericalNearForDistanceWithDefaultMetric() {
Point point = new Point(1.0, 1.0);
Distance distance = new Distance(1.0);
PartTree tree = new PartTree("findByLocationNear", Venue.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, point, distance), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("location").near(point).maxDistance(1.0))));
}
/**
* @see DATAMONGO-1229
*/
@Test
public void appliesIgnoreCaseToLeafProperty() {
PartTree tree = new PartTree("findByAddressStreetIgnoreCase", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "Street");
assertThat(new MongoQueryCreator(tree, accessor, context).createQuery(), is(notNullValue()));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void ignoreCaseShouldEscapeSource() {
PartTree tree = new PartTree("findByUsernameIgnoreCase", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "con.flux+");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex("^\\Qcon.flux+\\E$", "i"))));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void ignoreCaseShouldEscapeSourceWhenUsedForStartingWith() {
PartTree tree = new PartTree("findByUsernameStartingWithIgnoreCase", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "dawns.light+");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex("^\\Qdawns.light+\\E", "i"))));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void ignoreCaseShouldEscapeSourceWhenUsedForEndingWith() {
PartTree tree = new PartTree("findByUsernameEndingWithIgnoreCase", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "new.ton+");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex("\\Qnew.ton+\\E$", "i"))));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void likeShouldEscapeSourceWhenUsedWithLeadingAndTrailingWildcard() {
PartTree tree = new PartTree("findByUsernameLike", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "*fire.fight+*");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex(".*\\Qfire.fight+\\E.*"))));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void likeShouldEscapeSourceWhenUsedWithLeadingWildcard() {
PartTree tree = new PartTree("findByUsernameLike", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "*steel.heart+");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex(".*\\Qsteel.heart+\\E"))));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void likeShouldEscapeSourceWhenUsedWithTrailingWildcard() {
PartTree tree = new PartTree("findByUsernameLike", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "cala.mity+*");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex("\\Qcala.mity+\\E.*"))));
}
/**
* @see DATAMONGO-1232
*/
@Test
public void likeShouldBeTreatedCorrectlyWhenUsedWithWildcardOnly() {
PartTree tree = new PartTree("findByUsernameLike", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, "*");
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("username").regex(".*"))));
}
interface PersonRepository extends Repository<Person, Long> {
List<Person> findByLocationNearAndFirstname(Point location, Distance maxDistance, String firstname);
@@ -448,5 +574,14 @@ public class MongoQueryCreatorUnitTests {
@Field("foo") String username;
@DBRef User creator;
List<String> emailAddresses;
Address address;
}
class Address {
String street;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -45,6 +45,7 @@ import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.util.JSONParseException;
/**
* Unit tests for {@link PartTreeMongoQuery}.
@@ -135,6 +136,18 @@ public class PartTreeMongoQueryUnitTests {
assertThat(query, isTextQuery().searchingFor("search").where(new Criteria("firstname").is("text")));
}
/**
* @see DATAMONGO-1180
*/
@Test
public void propagatesRootExceptionForInvalidQuery() {
exception.expect(IllegalStateException.class);
exception.expectCause(is(org.hamcrest.Matchers.<Throwable> instanceOf(JSONParseException.class)));
deriveQueryFromMethod("findByAge", new Object[] { 1 });
}
private org.springframework.data.mongodb.core.query.Query deriveQueryFromMethod(String method, Object[] args) {
Class<?>[] types = new Class<?>[args.length];
@@ -179,5 +192,8 @@ public class PartTreeMongoQueryUnitTests {
Person findPersonByFirstnameAndLastname(String firstname, String lastname);
Person findPersonByFirstname(String firstname, TextCriteria fullText);
@Query(fields = "{ 'firstname }")
Person findByAge(Integer age);
}
}

View File

@@ -29,6 +29,7 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
@@ -42,7 +43,9 @@ import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* Unit tests for {@link StringBasedMongoQuery}.
@@ -255,6 +258,37 @@ public class StringBasedMongoQueryUnitTests {
assertThat(query.getQueryObject(), is(reference.getQueryObject()));
}
/**
* @see DATAMONGO-1070
*/
@Test
public void parsesDbRefDeclarationsCorrectly() throws Exception {
StringBasedMongoQuery mongoQuery = createQueryForMethod("methodWithManuallyDefinedDbRef", String.class);
ConvertingParameterAccessor parameterAccessor = StubParameterAccessor.getAccessor(converter, "myid");
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(parameterAccessor);
DBRef dbRef = DBObjectTestUtils.getTypedValue(query.getQueryObject(), "reference", DBRef.class);
assertThat(dbRef.getId(), is((Object) "myid"));
assertThat(dbRef.getRef(), is("reference"));
}
/**
* @see DATAMONGO-1072
*/
@Test
public void shouldParseJsonKeyReplacementCorrectly() throws Exception {
StringBasedMongoQuery mongoQuery = createQueryForMethod("methodWithPlaceholderInKeyOfJsonStructure", String.class,
String.class);
ConvertingParameterAccessor parameterAccessor = StubParameterAccessor.getAccessor(converter, "key", "value");
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(parameterAccessor);
assertThat(query.getQueryObject(), is(new BasicDBObjectBuilder().add("key", "value").get()));
}
private StringBasedMongoQuery createQueryForMethod(String name, Class<?>... parameters) throws Exception {
Method method = SampleRepository.class.getMethod(name, parameters);
@@ -291,7 +325,14 @@ public class StringBasedMongoQueryUnitTests {
@Query("{'title': { $regex : '^?0', $options : 'i'}}")
List<DBObject> findByTitleBeginsWithExplicitQuoting(String title);
@Query(value = "{$where: 'return this.date.getUTCMonth() == ?2 && this.date.getUTCDay() == ?3;'}")
@Query("{$where: 'return this.date.getUTCMonth() == ?2 && this.date.getUTCDay() == ?3;'}")
List<DBObject> findByQueryWithParametersInExpression(int param1, int param2, int param3, int param4);
@Query("{ 'reference' : { $ref : 'reference', $id : ?0 }}")
Object methodWithManuallyDefinedDbRef(String id);
@Query("{ ?0 : ?1}")
Object methodWithPlaceholderInKeyOfJsonStructure(String keyReplacement, String valueReplacement);
}
}

View File

@@ -34,6 +34,7 @@ import org.springframework.data.repository.query.ParameterAccessor;
class StubParameterAccessor implements MongoParameterAccessor {
private final Object[] values;
private Distance distance;
/**
* Creates a new {@link ConvertingParameterAccessor} backed by a {@link StubParameterAccessor} simply returning the
@@ -48,7 +49,14 @@ class StubParameterAccessor implements MongoParameterAccessor {
}
public StubParameterAccessor(Object... values) {
this.values = values;
for (Object value : values) {
if (value instanceof Distance) {
this.distance = (Distance) value;
}
}
}
/*
@@ -88,7 +96,7 @@ class StubParameterAccessor implements MongoParameterAccessor {
* @see org.springframework.data.mongodb.repository.MongoParameterAccessor#getMaxDistance()
*/
public Distance getMaxDistance() {
return null;
return distance;
}
/*

View File

@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.repository.support;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Arrays;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -40,8 +42,7 @@ import com.mysema.query.mongodb.MongodbQuery;
@ContextConfiguration("classpath:infrastructure.xml")
public class QuerydslRepositorySupportUnitTests {
@Autowired
MongoOperations operations;
@Autowired MongoOperations operations;
Person person;
@Before
@@ -54,9 +55,26 @@ public class QuerydslRepositorySupportUnitTests {
@Test
public void providesMongoQuery() {
QPerson p = QPerson.person;
QuerydslRepositorySupport support = new QuerydslRepositorySupport(operations) {
};
QuerydslRepositorySupport support = new QuerydslRepositorySupport(operations) {};
MongodbQuery<Person> query = support.from(p).where(p.lastname.eq("Matthews"));
assertThat(query.uniqueResult(), is(person));
}
/**
* @see DATAMONGO-1063
*/
@Test
public void shouldAllowAny() {
person.setSkills(Arrays.asList("vocalist", "songwriter", "guitarist"));
operations.save(person);
QPerson p = QPerson.person;
QuerydslRepositorySupport support = new QuerydslRepositorySupport(operations) {};
MongodbQuery<Person> query = support.from(p).where(p.skills.any().in("guitarist"));
assertThat(query.uniqueResult(), is(person));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,10 +16,16 @@
package org.springframework.data.mongodb.repository.support;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.assertThat;
import static org.junit.Assert.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import org.junit.Before;
import org.junit.Test;
@@ -34,13 +40,13 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
* @author <a href="mailto:kowsercse@gmail.com">A. B. M. Kowser</a>
* @author Thomas Darimont
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class SimpleMongoRepositoryTests {
@Autowired
private MongoTemplate template;
@Autowired private MongoTemplate template;
private Person oliver, dave, carter, boyd, stefan, leroi, alicia;
private List<Person> all;
@@ -69,7 +75,7 @@ public class SimpleMongoRepositoryTests {
List<Person> result = repository.findAll();
assertThat(result, hasSize(all.size()));
}
@Test
public void findOneFromCustomCollectionName() {
Person result = repository.findOne(dave.getId());
@@ -94,6 +100,74 @@ public class SimpleMongoRepositoryTests {
assertThat(result, not(hasItem(dave)));
}
/**
* @see DATAMONGO-1054
*/
@Test
public void shouldInsertSingle() {
String randomId = UUID.randomUUID().toString();
Person person1 = new Person("First1" + randomId, "Last2" + randomId, 42);
person1 = repository.insert(person1);
Person saved = repository.findOne(person1.getId());
assertThat(saved, is(equalTo(person1)));
}
/**
* @see DATAMONGO-1054
*/
@Test
public void shouldInsertMutlipleFromList() {
String randomId = UUID.randomUUID().toString();
Map<String, Person> idToPerson = new HashMap<String, Person>();
List<Person> persons = new ArrayList<Person>();
for (int i = 0; i < 10; i++) {
Person person = new Person("First" + i + randomId, "Last" + randomId + i, 42 + i);
idToPerson.put(person.getId(), person);
persons.add(person);
}
List<Person> saved = repository.insert(persons);
assertThat(saved, hasSize(persons.size()));
assertThatAllReferencePersonsWereStoredCorrectly(idToPerson, saved);
}
/**
* @see DATAMONGO-1054
*/
@Test
public void shouldInsertMutlipleFromSet() {
String randomId = UUID.randomUUID().toString();
Map<String, Person> idToPerson = new HashMap<String, Person>();
Set<Person> persons = new HashSet<Person>();
for (int i = 0; i < 10; i++) {
Person person = new Person("First" + i + randomId, "Last" + i + randomId, 42 + i);
idToPerson.put(person.getId(), person);
persons.add(person);
}
List<Person> saved = repository.insert(persons);
assertThat(saved, hasSize(persons.size()));
assertThatAllReferencePersonsWereStoredCorrectly(idToPerson, saved);
}
private void assertThatAllReferencePersonsWereStoredCorrectly(Map<String, Person> references, List<Person> saved) {
for (Person person : saved) {
Person reference = references.get(person.getId());
assertThat(person, is(equalTo(reference)));
}
}
private static class CustomizedPersonInformation implements MongoEntityInformation<Person, String> {
@Override

View File

@@ -0,0 +1,193 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.test.util;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import org.bson.BSONObject;
import org.hamcrest.Description;
import org.hamcrest.TypeSafeMatcher;
import org.hamcrest.core.IsEqual;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.ClassUtils;
import com.mongodb.DBObject;
/**
* @author Christoph Strobl
* @param <T>
*/
public class IsBsonObject<T extends BSONObject> extends TypeSafeMatcher<T> {
private List<ExpectedBsonContent> expectations = new ArrayList<ExpectedBsonContent>();
public static <T extends BSONObject> IsBsonObject<T> isBsonObject() {
return new IsBsonObject<T>();
}
@Override
protected void describeMismatchSafely(T item, Description mismatchDescription) {
mismatchDescription.appendText("was ").appendValue(SerializationUtils.serializeToJsonSafely(item));
}
@Override
public void describeTo(Description description) {
for (ExpectedBsonContent expectation : expectations) {
if (expectation.not) {
description.appendText(String.format("Path %s should not be present.", expectation.path));
} else if (expectation.value == null) {
description.appendText(String.format("Expected to find path %s.", expectation.path));
} else {
description.appendText(String.format("Expected to find %s for path %s.", expectation.value, expectation.path));
}
}
}
@Override
protected boolean matchesSafely(T item) {
if (expectations.isEmpty()) {
return true;
}
for (ExpectedBsonContent expectation : expectations) {
Object o = getValue(item, expectation.path);
if (o == null && expectation.not) {
return true;
}
if (o == null) {
return false;
}
if (expectation.type != null && !ClassUtils.isAssignable(expectation.type, o.getClass())) {
return false;
}
if (expectation.value != null && !new IsEqual<Object>(expectation.value).matches(o)) {
return false;
}
if (o != null && expectation.not) {
return false;
}
}
return true;
}
public IsBsonObject<T> containing(String key) {
ExpectedBsonContent expected = new ExpectedBsonContent();
expected.path = key;
this.expectations.add(expected);
return this;
}
public IsBsonObject<T> containing(String key, Class<?> type) {
ExpectedBsonContent expected = new ExpectedBsonContent();
expected.path = key;
expected.type = type;
this.expectations.add(expected);
return this;
}
public IsBsonObject<T> containing(String key, Object value) {
if (value == null) {
return notContaining(key);
}
ExpectedBsonContent expected = new ExpectedBsonContent();
expected.path = key;
expected.type = ClassUtils.getUserClass(value);
expected.value = value;
this.expectations.add(expected);
return this;
}
public IsBsonObject<T> notContaining(String key) {
ExpectedBsonContent expected = new ExpectedBsonContent();
expected.path = key;
expected.not = true;
this.expectations.add(expected);
return this;
}
static class ExpectedBsonContent {
String path;
Class<?> type;
Object value;
boolean not = false;
}
Object getValue(BSONObject source, String path) {
String[] fragments = path.split("\\.");
if (fragments.length == 1) {
return source.get(path);
}
Iterator<String> it = Arrays.asList(fragments).iterator();
Object current = source;
while (it.hasNext()) {
String key = it.next();
if (!(current instanceof BSONObject) && !key.startsWith("[")) {
return null;
}
if (key.startsWith("[")) {
String indexNumber = key.substring(1, key.indexOf("]"));
if (current instanceof List) {
current = ((List) current).get(Integer.valueOf(indexNumber));
}
if (!it.hasNext()) {
return current;
}
} else {
if (current instanceof DBObject) {
current = ((DBObject) current).get(key);
}
if (!it.hasNext()) {
return current;
}
}
}
throw new NoSuchElementException(String.format("Unable to find '%s' in %s.", path, source));
}
}
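A hedged usage sketch of this matcher in a test (document contents are made up for illustration; assumes static imports of `IsBsonObject.isBsonObject` and Hamcrest's `assertThat`):
DBObject source = new BasicDBObjectBuilder().add("firstname", "Dave")
		.add("address", new BasicDBObject("city", "New York")).get();
// asserts a concrete value, a nested path and the absence of a type hint
assertThat(source, isBsonObject().containing("firstname", "Dave")
		.containing("address.city", String.class)
		.notContaining("_class"));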

View File

(Binary image diff: 48 KiB before, 48 KiB after.)

View File

@@ -4,9 +4,9 @@ Mark Pollack; Thomas Risberg; Oliver Gierke; Costin Leau; Jon Brisbin; Thomas Da
:revdate: {localdate}
:toc:
:toc-placement!:
:spring-data-commons-docs: https://raw.githubusercontent.com/spring-projects/spring-data-commons/master/src/main/asciidoc
:spring-data-commons-docs: ../../../../spring-data-commons/src/main/asciidoc
(C) 2008-2014 The original authors.
(C) 2008-2015 The original authors.
NOTE: _Copies of this document may be made for your own use and for distribution to others, provided that you do not charge any fee for such copies and further provided that each copy contains this Copyright Notice, whether distributed in print or electronically._
@@ -15,6 +15,7 @@ toc::[]
include::preface.adoc[]
:leveloffset: +1
include::{spring-data-commons-docs}/dependencies.adoc[]
include::{spring-data-commons-docs}/repositories.adoc[]
:leveloffset: -1
@@ -26,6 +27,7 @@ include::reference/introduction.adoc[]
include::reference/mongodb.adoc[]
include::reference/mongo-repositories.adoc[]
include::{spring-data-commons-docs}/auditing.adoc[]
include::reference/mongo-auditing.adoc[]
include::reference/mapping.adoc[]
include::reference/cross-store.adoc[]
include::reference/logging.adoc[]

View File

@@ -26,7 +26,30 @@ MongoDB requires that you have an '_id' field for all documents. If you don't pr
The following outlines what field will be mapped to the '_id' document field:
* A field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the '_id' field.
* A field without an annotation but named `id` will be mapped to the '_id' field.
* A field without an annotation but named 'id' will be mapped to the '_id' field.
* The default field name for identifiers is '_id' and can be customized via the `@Field` annotation.
[cols="1,2", options="header"]
.Examples for the translation of '_id'-field definitions
|===
| Field definition
| Resulting Id-Fieldname in MongoDB
| `String` id
| `_id`
| `@Field` `String` id
| `_id`
| `@Field('x')` `String` id
| `x`
| `@Id` `String` x
| `_id`
| `@Field('x')` `@Id` `String` x
| `_id`
|===
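For illustration, two hypothetical domain classes covering the most common cases from the table above (class and property names are made up):
[source,java]
----
class Customer {

	String id;                          // no annotation, named 'id' -> stored as '_id'
	@Field("display_name") String name; // regular property stored under a custom field name
}

class Order {

	@Id String orderNumber;             // annotated with @Id -> stored as '_id'
}
----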
The following outlines what type conversion, if any, will be done on the property mapped to the _id document field.

View File

@@ -0,0 +1,33 @@
[[mongo.auditing]]
== General auditing configuration
Activating auditing functionality is just a matter of adding the Spring Data Mongo `auditing` namespace element to your configuration:
.Activating auditing using XML configuration
====
[source,xml]
----
<mongo:auditing mapping-context-ref="customMappingContext" auditor-aware-ref="yourAuditorAwareImpl"/>
----
====
Since Spring Data MongoDB 1.4 auditing can be enabled by annotating a configuration class with the `@EnableMongoAuditing` annotation.
.Activating auditing using JavaConfig
====
[source,java]
----
@Configuration
@EnableMongoAuditing
class Config {
@Bean
public AuditorAware<AuditableUser> myAuditorProvider() {
return new AuditorAwareImpl();
}
}
----
====
If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure will pick it up automatically and use it to determine the current user to be set on domain types. If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableMongoAuditing`.
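A minimal `AuditorAware` implementation sketch matching the JavaConfig example above (how the current `AuditableUser` is looked up is an assumption, e.g. your security context):
[source,java]
----
class AuditorAwareImpl implements AuditorAware<AuditableUser> {

	@Override
	public AuditableUser getCurrentAuditor() {
		// resolve the currently authenticated user, e.g. from Spring Security (hypothetical helper)
		return SecurityContextHelper.currentAuditableUser();
	}
}
----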

View File

@@ -147,87 +147,87 @@ The first method shows a query for all people with the given lastname. The query
NOTE: Note that for version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class.
[cols="1,2,3", options="header"]
[cols="1,2,3", options="header"]
.Supported keywords for query methods
|===
| Keyword
| Sample
| Sample
| Logical result
| `GreaterThan`
| `findByAgeGreaterThan(int age)`
| `GreaterThan`
| `findByAgeGreaterThan(int age)`
| `{"age" : {"$gt" : age}}`
| `GreaterThanEqual`
| `findByAgeGreaterThanEqual(int age)`
| `GreaterThanEqual`
| `findByAgeGreaterThanEqual(int age)`
| `{"age" : {"$gte" : age}}`
| `LessThan`
| `findByAgeLessThan(int age)`
| `LessThan`
| `findByAgeLessThan(int age)`
| `{"age" : {"$lt" : age}}`
| `LessThanEqual`
| `findByAgeLessThanEqual(int age)`
| `LessThanEqual`
| `findByAgeLessThanEqual(int age)`
| `{"age" : {"$lte" : age}}`
| `Between`
| `findByAgeBetween(int from, int to)`
| `Between`
| `findByAgeBetween(int from, int to)`
| `{"age" : {"$gt" : from, "$lt" : to}}`
| `In`
| `In`
| `findByAgeIn(Collection ages)`
| `{"age" : {"$in" : [ages...]}}`
| `NotIn`
| `findByAgeNotIn(Collection ages)`
| `NotIn`
| `findByAgeNotIn(Collection ages)`
| `{"age" : {"$nin" : [ages...]}}`
| `IsNotNull, NotNull`
| `findByFirstnameNotNull()`
| `{"age" : {"$ne" : null}}`
| `IsNotNull, NotNull`
| `findByFirstnameNotNull()`
| `{"firstname" : {"$ne" : null}}`
| `IsNull, Null`
| `findByFirstnameNull()`
| `{"age" : null}`
| `IsNull, Null`
| `findByFirstnameNull()`
| `{"firstname" : null}`
| `Like`
| `Like`
| `findByFirstnameLike(String name)`
| `{"age" : age} ( age as regex)`
| `{"firstname" : name} ( name as regex)`
| `Regex`
| `findByFirstnameRegex(String firstname)`
| `Regex`
| `findByFirstnameRegex(String firstname)`
| `{"firstname" : {"$regex" : firstname }}`
| `(No keyword)`
| `(No keyword)`
| `findByFirstname(String name)`
| `{"age" : name}`
| `{"firstname" : name}`
| `Not`
| `findByFirstnameNot(String name)`
| `{"age" : {"$ne" : name}}`
| `Not`
| `findByFirstnameNot(String name)`
| `{"firstname" : {"$ne" : name}}`
| `Near`
| `findByLocationNear(Point point)`
| `Near`
| `findByLocationNear(Point point)`
| `{"location" : {"$near" : [x,y]}}`
| `Within`
| `findByLocationWithin(Circle circle)`
| `Within`
| `findByLocationWithin(Circle circle)`
| `{"location" : {"$within" : {"$center" : [ [x, y], distance]}}}`
| `Within`
| `findByLocationWithin(Box box)`
| `Within`
| `findByLocationWithin(Box box)`
| `{"location" : {"$within" : {"$box" : [ [x1, y1], x2, y2]}}}True`
| `IsTrue, True`
| `findByActiveIsTrue()`
| `IsTrue, True`
| `findByActiveIsTrue()`
| `{"active" : true}`
| `IsFalse, False`
| `findByActiveIsFalse()`
| `IsFalse, False`
| `findByActiveIsFalse()`
| `{"active" : false}`
| `Exists`
| `findByLocationExists(boolean exists)`
| `Exists`
| `findByLocationExists(boolean exists)`
| `{"location" : {"$exists" : exists }}`
|===
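To put a few of the keywords above into context, a hypothetical repository interface (the `Person` properties are assumed) could declare the following methods; the resulting queries match the table entries:
[source,java]
----
interface PersonRepository extends Repository<Person, String> {

	List<Person> findByAgeBetween(int from, int to);   // {"age" : {"$gt" : from, "$lt" : to}}
	List<Person> findByFirstnameNotNull();             // {"firstname" : {"$ne" : null}}
	List<Person> findByLocationNear(Point location);   // {"location" : {"$near" : [x,y]}}
}
----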
@@ -463,4 +463,4 @@ class RepositoryClient {
List<Person> people = repository.findAll();
}
}
----
----

View File

@@ -415,39 +415,6 @@ If you need to configure additional options on the `com.mongodb.Mongo` instance
</bean>
----
[[mongo.auditing]]
== General auditing configuration
Activating auditing functionality is just a matter of adding the Spring Data Mongo `auditing` namespace element to your configuration:
.Activating auditing using XML configuration
====
[source,xml]
----
<mongo:auditing mapping-context-ref="customMappingContext" auditor-aware-ref="yourAuditorAwareImpl"/>
----
====
Since Spring Data MongoDB 1.4 auditing can be enabled by annotating a configuration class with the `@EnableMongoAuditing` annotation.
.Activating auditing using JavaConfig
====
[source,java]
----
@Configuration
@EnableMongoAuditing
class Config {
@Bean
public AuditorAware<AuditableUser> myAuditorProvider() {
return new AuditorAwareImpl();
}
}
----
====
If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure will pick it up automatically and use it to determine the current user to be set on domain types. If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableJpaAuditing`.
[[mongo-template]]
== Introduction to MongoTemplate
@@ -1538,7 +1505,8 @@ Note that the aggregation operations not listed here are currently not supported
[[mongo.aggregation.projection]]
=== Projection Expressions
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregate` class.
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregation` class either by passing a list of `String`s or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API via the `and(String)` method and aliased via the `as(String)` method.
Note that one can also define fields with aliases via the static factory method `Fields.field` of the aggregation framework that can then be used to construct a new `Fields` instance.
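As a quick orientation before the documented examples that follow, a minimal sketch of the fluent API just described (assuming static imports from `Aggregation` and `Fields`; field names are made up):
[source,java]
----
// include 'firstname' as-is and expose 'age' under the alias 'years'
project("firstname").and("age").as("years");

// the same alias expressed via the static Fields.field(…) factory
project(Fields.from(field("years", "age")));
----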
.Projection expression examples
====

View File

@@ -1,6 +1,218 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.6.3.RELEASE (2015-07-01)
---------------------------------------------
* DATAMONGO-1247 - Release 1.6.3 (Evans).
* DATAMONGO-1242 - Update mongo-java-driver to 3.0.2 in mongo3 profile.
* DATAMONGO-1234 - Fix typos in JavaDoc.
* DATAMONGO-1232 - IgnoreCase should escape queries.
* DATAMONGO-1229 - MongoQueryCreator incorrectly rejects ignoreCase on nested String path.
* DATAMONGO-1224 - Assert Spring Framework 4.2 compatibility.
* DATAMONGO-1221 - Remove relative reference to parent POM to make sure the right Spring version is picked up.
* DATAMONGO-1213 - Include new section on Spring Data and Spring Framework dependencies in reference documentation.
* DATAMONGO-1210 - Inconsistent property order of _class type hint breaks document equality.
* DATAMONGO-1207 - MongoTemplate#doInsertAll throws NullPointerException when passed Collection contains a null item.
* DATAMONGO-1196 - Upgrade build profiles after MongoDB 3.0 Java driver release.
* DATAMONGO-1180 - Incorrect exception message creation in PartTreeMongoQuery.
* DATAMONGO-1166 - ReadPreference not used for Aggregations.
* DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
* DATAMONGO-1155 - Upgrade mongo-next build profiles to Java driver version 2.13.0.
* DATAMONGO-1153 - Fix documentation build.
* DATAMONGO-1133 - Field aliasing is not honored in Aggregation operations.
* DATAMONGO-1124 - Switch log level for cyclic reference index warnings from WARN to INFO.
* DATAMONGO-1081 - Improve documentation on field mapping semantics.
Changes in version 1.7.1.RELEASE (2015-06-30)
---------------------------------------------
* DATAMONGO-1248 - Release 1.7.1 (Fowler).
* DATAMONGO-1242 - Update mongo-java-driver to 3.0.2 in mongo3 profile.
* DATAMONGO-1234 - Fix typos in JavaDoc.
* DATAMONGO-1232 - IgnoreCase should escape queries.
* DATAMONGO-1229 - MongoQueryCreator incorrectly rejects ignoreCase on nested String path.
* DATAMONGO-1224 - Assert Spring Framework 4.2 compatibility.
* DATAMONGO-1221 - Remove relative reference to parent POM to make sure the right Spring version is picked up.
* DATAMONGO-1216 - Authentication mechanism PLAIN changes to SCRAM-SHA-1.
* DATAMONGO-1213 - Include new section on Spring Data and Spring Framework dependencies in reference documentation.
* DATAMONGO-1210 - Inconsistent property order of _class type hint breaks document equality.
* DATAMONGO-1208 - MongoTemplate.stream(…) does not consider limit, order, sort etc.
* DATAMONGO-1207 - MongoTemplate#doInsertAll throws NullPointerException when passed Collection contains a null item.
* DATAMONGO-1202 - Indexed annotation problems under generics.
* DATAMONGO-1196 - Upgrade build profiles after MongoDB 3.0 Java driver release.
* DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
* DATAMONGO-1166 - ReadPreference not used for Aggregations.
* DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
Changes in version 1.8.0.M1 (2015-06-02)
----------------------------------------
* DATAMONGO-1228 - Release 1.8 M1 (Gosling).
* DATAMONGO-1224 - Assert Spring Framework 4.2 compatibility.
* DATAMONGO-1221 - Remove relative reference to parent POM to make sure the right Spring version is picked up.
* DATAMONGO-1218 - Deprecate non-MongoClient related configuration options in XML namespace.
* DATAMONGO-1216 - Authentication mechanism PLAIN changes to SCRAM-SHA-1.
* DATAMONGO-1213 - Include new section on Spring Data and Spring Framework dependencies in reference documentation.
* DATAMONGO-1211 - Adapt API changes in Spring Data Commons to simplify custom repository base class registration.
* DATAMONGO-1210 - Inconsistent property order of _class type hint breaks document equality.
* DATAMONGO-1208 - MongoTemplate.stream(…) does not consider limit, order, sort etc.
* DATAMONGO-1207 - MongoTemplate#doInsertAll throws NullPointerException when passed Collection contains a null item.
* DATAMONGO-1202 - Indexed annotation problems under generics.
* DATAMONGO-1196 - Upgrade build profiles after MongoDB 3.0 Java driver release.
* DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
* DATAMONGO-1192 - Switch back to Spring 4.1's CollectionFactory.
* DATAMONGO-1134 - Add support for $geoIntersects.
* DATAMONGO-990 - Add support for SpEL expressions in @Query.
Changes in version 1.7.0.RELEASE (2015-03-23)
---------------------------------------------
* DATAMONGO-1189 - Release 1.7 GA.
* DATAMONGO-1181 - Add Jackson Module for GeoJSON types.
* DATAMONGO-1180 - Incorrect exception message creation in PartTreeMongoQuery.
* DATAMONGO-1179 - Update reference documentation.
* DATAMONGO-1124 - Switch log level for cyclic reference index warnings from WARN to INFO.
* DATAMONGO-979 - Add support for $size expression in project and group aggregation pipeline.
Changes in version 1.7.0.RC1 (2015-03-05)
-----------------------------------------
* DATAMONGO-1173 - Release 1.7 RC1.
* DATAMONGO-1167 - Add 'findAll' method to QueryDslMongoRepository which accepts a querydsl Predicate and a Sort.
* DATAMONGO-1165 - Add support for Java 8 Stream as return type in repositories.
* DATAMONGO-1162 - Adapt test cases to semantic changes in Spring Data Commons AuditingHandler API.
* DATAMONGO-1158 - Assert compatibility with MongoDB 3.0.
* DATAMONGO-1154 - Upgrade to MongoDB Java driver 2.13.0.
* DATAMONGO-1153 - Fix documentation build.
* DATAMONGO-1148 - Use EclipseLink provided JPA API JAR.
* DATAMONGO-1147 - Remove manual array copy.
* DATAMONGO-1146 - Add 'exists' method to QueryDslMongoRepository which accepts a querydsl Predicate.
* DATAMONGO-1145 - Upgrade MongoDB Java driver to 2.12.5.
* DATAMONGO-1139 - MongoQueryCreator must not create $nearSphere query for neutral Distance.
* DATAMONGO-1136 - Use $geoWithin instead of $within for geo queries.
* DATAMONGO-1135 - Add support for $geometry to support GeoJSON queries.
* DATAMONGO-1132 - The sample does not match the logical result in the MongoDB repositories section of the documentation.
* DATAMONGO-1131 - Register converters for ThreeTen back port by default.
* DATAMONGO-1129 - Upgrade to latest MongoDB Java driver.
* DATAMONGO-1127 - Add support for geoNear queries with distance information.
* DATAMONGO-1126 - Repository keyword query findByInId with pageable not returning correctly.
* DATAMONGO-1123 - geoNear, does not return all matching elements, it returns only a max of 100 documents.
* DATAMONGO-1121 - "Cycle found" false positive.
* DATAMONGO-1120 - Pageable queries timeout or return incorrect counts.
* DATAMONGO-1118 - Custom converters not used for map keys.
* DATAMONGO-1110 - Add support for $minDistance to NearQuery.
* DATAMONGO-1082 - Improve JavaDoc and reference documentation on alias usage in aggregation framework support.
* DATAMONGO-1081 - Improve documentation on field mapping semantics.
* DATAMONGO-712 - Another round of potential performance improvements.
* DATAMONGO-479 - Support calling of MongoDB stored javascripts.
Changes in version 1.6.2.RELEASE (2015-01-28)
---------------------------------------------
* DATAMONGO-1148 - Use EclipseLink provided JPA API JAR.
* DATAMONGO-1147 - Remove manual array copy.
* DATAMONGO-1145 - Upgrade MongoDB Java driver to 2.12.5.
* DATAMONGO-1144 - Release 1.6.2.
* DATAMONGO-1139 - MongoQueryCreator must not create $nearSphere query for neutral Distance.
* DATAMONGO-1132 - The sample does not match the logical result in the MongoDB repositories section of the documentation.
* DATAMONGO-1127 - Add support for geoNear queries with distance information.
* DATAMONGO-1126 - Repository keyword query findByInId with pageable not returning correctly.
* DATAMONGO-1123 - geoNear, does not return all matching elements, it returns only a max of 100 documents.
* DATAMONGO-1121 - "Cycle found" false positive.
* DATAMONGO-1120 - Pageable queries timeout or return incorrect counts.
* DATAMONGO-1118 - Custom converters not used for map keys.
* DATAMONGO-1108 - BasicMongoPersistentEntity doesn't need to parse expression on every invocation.
* DATAMONGO-1096 - RuntimeExceptions during debug query printing in MongoTemplate.
* DATAMONGO-1094 - Wrong reference to @DocumentField in error message.
* DATAMONGO-1093 - BasicQuery missing hashCode() and equals(…) methods.
* DATAMONGO-1087 - Incorrect warning for MongoPersistentEntityIndexResolver$CyclicPropertyReferenceException: Found cycle for field...
* DATAMONGO-1085 - Sort can not use the metamodel classes generated by QueryDSL.
* DATAMONGO-1082 - Improve JavaDoc and reference documentation on alias usage in aggregation framework support.
* DATAMONGO-1078 - @Query annotated repository query fails to map complex Id structure.
* DATAMONGO-1075 - Correctly evaluate CONTAINS keyword on collection properties.
* DATAMONGO-1054 - Improve performance of saving entities by using insert(…) if possible.
* DATAMONGO-1043 - SpEL Expressions in @Document annotations are not re-evaluated for query executions.
* DATAMONGO-712 - Another round of potential performance improvements.
Changes in version 1.5.5.RELEASE (2015-01-27)
---------------------------------------------
* DATAMONGO-1148 - Use EclipseLink provided JPA API JAR.
* DATAMONGO-1147 - Remove manual array copy.
* DATAMONGO-1143 - Release 1.5.5.
* DATAMONGO-1139 - MongoQueryCreator must not create $nearSphere query for neutral Distance.
* DATAMONGO-1123 - geoNear, does not return all matching elements, it returns only a max of 100 documents.
* DATAMONGO-1121 - "Cycle found" false positive.
* DATAMONGO-1118 - Custom converters not used for map keys.
* DATAMONGO-1096 - RuntimeExceptions during debug query printing in MongoTemplate.
* DATAMONGO-1094 - Wrong reference to @DocumentField in error message.
* DATAMONGO-1087 - Incorrect warning for MongoPersistentEntityIndexResolver$CyclicPropertyReferenceException: Found cycle for field...
* DATAMONGO-1078 - @Query annotated repository query fails to map complex Id structure.
* DATAMONGO-1075 - Correctly evaluate CONTAINS keyword on collection properties.
* DATAMONGO-1072 - Query placeholders in keys no longer correctly substituted.
* DATAMONGO-1068 - elemMatch of Class Criteria fails to build special criteria.
* DATAMONGO-1063 - IllegalStateException using any().in().
* DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
* DATAMONGO-1058 - Using @Field("foo") with @Dbref breaking behavior.
* DATAMONGO-1045 - Make sure Spring Data MongoDB can build against Spring 4.1.
* DATAMONGO-1043 - SpEL Expressions in @Document annotations are not re-evaluated for query executions.
* DATAMONGO-1040 - deleteAll repository query don't use EntityMetadata collection name.
* DATAMONGO-1039 - Polish implementation for cleaning up after tests.
* DATAMONGO-712 - Another round of potential performance improvements.
Changes in version 1.7.0.M1 (2014-12-01)
----------------------------------------
* DATAMONGO-1108 - BasicMongoPersistentEntity doesn't need to parse expression on every invocation.
* DATAMONGO-1106 - Release 1.7 M1.
* DATAMONGO-1105 - Add implementation for new QueryDslPredicateExecutor.findAll(OrderSpecifier<?>... orders).
* DATAMONGO-1102 - Auto-register JSR-310 converters to support JDK 8 date/time types.
* DATAMONGO-1101 - Add support for $bit to Update.
* DATAMONGO-1100 - Adapt to new PersistentPropertyAccessor API.
* DATAMONGO-1097 - Add support for $mul to Update.
* DATAMONGO-1096 - RuntimeExceptions during debug query printing in MongoTemplate.
* DATAMONGO-1094 - Wrong reference to @DocumentField in error message.
* DATAMONGO-1093 - BasicQuery missing hashCode() and equals(…) methods.
* DATAMONGO-1092 - Ensure compatibility with MongoDB 2.8.0.rc0 and java driver 2.13.0-rc0.
* DATAMONGO-1087 - Incorrect warning for MongoPersistentEntityIndexResolver$CyclicPropertyReferenceException: Found cycle for field…
* DATAMONGO-1085 - Sort can not use the metamodel classes generated by QueryDSL.
* DATAMONGO-1080 - AbstractMongoQuery must not eagerly post-process results.
* DATAMONGO-1078 - @Query annotated repository query fails to map complex Id structure.
* DATAMONGO-1077 - Update removes positional operator $ in key when used on DBRef property.
* DATAMONGO-1076 - Finalizer hit db on lazy dbrefs.
* DATAMONGO-1075 - Correctly evaluate CONTAINS keyword on collection properties.
* DATAMONGO-1072 - Query placeholders in keys no longer correctly substituted.
* DATAMONGO-1070 - Query annotation with $oid leads to a parse error.
* DATAMONGO-1068 - elemMatch of Class Criteria fails to build special criteria.
* DATAMONGO-1063 - IllegalStateException using any().in().
* DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
* DATAMONGO-1058 - Using @Field("foo") with @Dbref breaking behavior.
* DATAMONGO-1057 - AbstractMongoQuery.SlicedExecution#execute() skips every nth element.
* DATAMONGO-1054 - Improve performance of saving entities by using insert(…) if possible.
* DATAMONGO-1053 - In 1.6, any field in a mapped object named "language" will fail to map if it is a type other than String.
* DATAMONGO-1050 - SimpleMongoRepository.findById(id, class) don't return ids for nested documents.
* DATAMONGO-1049 - Reserved field name 'language' causes trouble.
* DATAMONGO-1043 - SpEL Expressions in @Document annotations are not re-evaluated for query executions.
* DATAMONGO-943 - Add support for $position to Update $push $each.
Changes in version 1.6.1.RELEASE (2014-10-30)
---------------------------------------------
* DATAMONGO-1080 - AbstractMongoQuery must not eagerly post-process results.
* DATAMONGO-1079 - Release 1.6.1.
* DATAMONGO-1077 - Update removes positional operator $ in key when used on DBRef property.
* DATAMONGO-1076 - Finalizer hit db on lazy dbrefs.
* DATAMONGO-1072 - Query placeholders in keys no longer correctly substituted.
* DATAMONGO-1070 - Query annotation with $oid leads to a parse error.
* DATAMONGO-1068 - elemMatch of Class Criteria fails to build special criteria.
* DATAMONGO-1063 - IllegalStateException using any().in().
* DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
* DATAMONGO-1058 - Using @Field("foo") with @Dbref breaking behavior.
* DATAMONGO-1057 - AbstractMongoQuery.SlicedExecution#execute() skips every nth element.
* DATAMONGO-1053 - In 1.6, any field in a mapped object named "language" will fail to map if it is a type other than String.
* DATAMONGO-1049 - Reserved field name 'language' causes trouble.
Changes in version 1.6.0.RELEASE (2014-09-05)
---------------------------------------------
* DATAMONGO-1046 - Release 1.6 GA.

View File

@@ -1,8 +1,8 @@
Spring Data MongoDB 1.6 GA
Copyright (c) [2010-2014] Pivotal Software, Inc.
Spring Data MongoDB 1.6.3
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
You may not use this product except in compliance with the License.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
You may not use this product except in compliance with the License.
This product may include a number of subcomponents with
separate copyright notices and license terms. Your use of the source