Compare commits

..

150 Commits

Author SHA1 Message Date
Oliver Gierke
4cdd59ffd4 DATAMONGO-1304 - After release cleanups. 2015-10-14 14:18:24 +02:00
Spring Buildmaster
5160cdadb6 DATAMONGO-1304 - Prepare next development iteration. 2015-10-14 03:37:30 -07:00
Spring Buildmaster
296ca28efa DATAMONGO-1304 - Release version 1.6.4.RELEASE (Evans SR4). 2015-10-14 03:37:13 -07:00
Oliver Gierke
86fa1d902a DATAMONGO-1304 - Prepare 1.6.4.RELEASE (Evans SR4). 2015-10-14 11:25:06 +02:00
Oliver Gierke
80834d8321 DATAMONGO-1304 - Updated changelog. 2015-10-14 11:24:57 +02:00
Oliver Gierke
30b0ec8f57 DATAMONGO-1282 - Updated changelog. 2015-10-14 08:52:52 +02:00
Oliver Gierke
b157981507 DATAMONGO-1268 - Updated changelog. 2015-10-14 08:52:40 +02:00
Oliver Gierke
bfd96d529c DATAMONGO-1261 - Updated changelog. 2015-10-14 08:52:26 +02:00
Oliver Gierke
bf51bcd7ea DATAMONGO-1246 - Updated changelog. 2015-10-14 08:49:35 +02:00
Oliver Gierke
f9317b9f09 DATAMONGO-1250 - Fixed inline code formatting in reference docs. 2015-07-04 19:07:24 +02:00
Oliver Gierke
99bb92db4e DATAMONGO-1250 - Fixed accidental duplicate invocation of value conversion in UpdateMapper.
UpdateMapper.getMappedObjectForField(…) invoked the very same method of the super class but handed in an already mapped value, so that value conversion was invoked twice.

This was especially problematic in cases where a dedicated converter had been registered for an object that is already a Mongo-storable one (e.g. an enum-to-string converter and back) without indicating which of the two converters is the reading and which the writing one. This basically caused the source value to be converted back and forth during the update mapping, creating the impression the value wasn't converted at all.

This is now fixed by removing the superfluous mapping.
2015-07-04 14:16:48 +02:00
Oliver Gierke
87d609e1c4 DATAMONGO-1247 - After release cleanups. 2015-07-01 01:41:45 +02:00
Spring Buildmaster
8f223bf12f DATAMONGO-1247 - Prepare next development iteration. 2015-06-30 16:04:51 -07:00
Spring Buildmaster
a96253f9a4 DATAMONGO-1247 - Release version 1.6.3.RELEASE (Evans SR3). 2015-06-30 16:04:49 -07:00
Oliver Gierke
a0884b8227 DATAMONGO-1247 - Prepare 1.6.3.RELEASE (Evans SR3). 2015-07-01 00:12:28 +02:00
Oliver Gierke
2e6efae925 DATAMONGO-1247 - Updated changelog. 2015-07-01 00:12:11 +02:00
Oliver Gierke
4e688574fc DATAMONGO-1248 - Updated changelog. 2015-06-30 14:00:01 +02:00
Oliver Gierke
3abd38953c DATAMONGO-1228 - Updated changelog. 2015-06-30 13:59:53 +02:00
Oliver Gierke
026f3fadd3 DATAMONGO-1189 - Updated changelog. 2015-06-30 13:59:34 +02:00
Oliver Gierke
14e352aeae DATAMONGO-1236 - Polishing.
Removed the creation of a BasicMongoPersistentEntity in favor of always handing ClassTypeInformation.OBJECT into the converter in case no entity can be found.

This makes sure type information is written for updates on properties of type Object (which essentially leads to no PersistentEntity being available).

Original pull request: #301.
2015-06-30 09:55:42 +02:00
Christoph Strobl
41aa498a1e DATAMONGO-1236 - Updates now include type hints correctly.
We now use property type information when mapping fields affected by an update in case we do not have proper entity information within the context. This allows more precise type resolution, which is required for determining the need to write type hints for a given property.

Original pull request: #301.
2015-06-30 09:55:37 +02:00
Christoph Strobl
74cead91fc DATAMONGO-1232 - IgnoreCase in criteria now escapes query.
We now quote the original criteria before actually wrapping it inside a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like. In the latter case we quote the part between the leading and trailing wildcard if required.

Original pull request: #301.
2015-06-22 12:51:32 +02:00
Christoph Strobl
559f160b21 DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:47:22 +02:00
Christoph Strobl
62fdd3e519 DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
We now eagerly check DBRef properties for invalid definitions such as final classes or arrays. In that case we throw a MappingException when verify is called.
2015-06-19 15:54:32 +02:00
Thomas Darimont
1a414fa2b6 DATAMONGO-1242 - Update MongoDB Java driver to 3.0.2 in mongo3 profile.
Update mongo driver.

Original pull request: #304.
2015-06-19 15:38:01 +02:00
Oliver Gierke
29a712e298 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:56:55 +02:00
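For illustration, a hedged sketch of the kind of derived query this affects - Person and Address are hypothetical types and the repository is assumed to extend MongoRepository:

interface PersonRepository extends MongoRepository<Person, String> {

    // The IgnoreCase flag now applies to the leaf of the property path
    // (address.city), not to its root.
    List<Person> findByAddressCityIgnoreCase(String city);
}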
Eddú Meléndez
65a164874a DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:38:45 +02:00
Oliver Gierke
d6ff5162b3 DATAMONGO-1210 - Polishing.
Moved getTypeHint(…) method to Field class.

Original pull request: #292.
2015-06-01 13:26:54 +02:00
Christoph Strobl
28a84cafc5 DATAMONGO-1210 - Fixed type hints for usage with findAndModify(…).
We now inspect the actual field type during update mapping and provide a type hint accordingly. Simple, non-interface and non-abstract types will no longer be decorated with the _class attribute. We now honor positional parameters when trying to map paths to properties. This allows more precise type mapping since we now have access to the meta model, which allows us to check whether the presence of a type hint (aka _class) is required.

We now add a special type hint indicating nested types to the converter. This allows more fine-grained removal of the _class property without the need to break the contract of MongoWriter.convertToMongoType(…).

Original pull request: #292.
2015-06-01 13:26:40 +02:00
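For illustration, a hedged sketch of a findAndModify(…) call in the area this change touches; Person and the field names are made up, and template is assumed to be a MongoTemplate:

Query query = Query.query(Criteria.where("lastname").is("Matthews"));
Update update = new Update().addToSet("skills").each("java", "mongodb");

// The update mapping now consults the declared field type and only writes
// a _class type hint where it is actually needed.
Person person = template.findAndModify(query, update, Person.class);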
Stefan Ganzer
eb10b5c426 DATAMONGO-1210 - Add breaking test case for findAndModify/addToSet/each.
The problem stems from the inconsistent handling of type hints: MongoTemplate.save(…) does not add a type hint, but findAndModify(…) does. The same values are then treated differently by MongoDB, depending on whether they have a type hint or not. To verify this behavior, you can manually add the (superfluous) type hint to the saved object - findAndModify will then work as expected.

Additional tests demonstrate that findAndModify(…) removes type hints from complex documents in collections that are either nested in another collection or in a document, or doesn't add them in the first place.

Original pull requests: #290, #291.
Related pull request: #292.
CLA: 119820150506013701 (Stefan Ganzer)
2015-06-01 13:26:15 +02:00
Thomas Darimont
b7b24215e3 DATAMONGO-1133 - Fixed broken tests.
AggregationTests.shouldHonorFieldAliasesForFieldReferences() now correctly sets up 3 different instances of MeterData and correctly calculates the aggregated counter values.

Original pull request: #279.
2015-05-25 13:55:59 +02:00
Oliver Gierke
2a27eb7404 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as NoOp.
2015-05-25 13:13:40 +02:00
Oliver Gierke
ad3ad1f65f DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:08:13 +02:00
Oliver Gierke
e7278888a7 DATAMONGO-1213 - Included section on dependency management in reference documentation.
Related ticket: DATACMNS-687.
2015-05-04 14:53:01 +02:00
Oliver Gierke
374fa5a1f2 DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values is handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:48:57 +02:00
Oliver Gierke
be92489c58 DATAMONGO-1196 - Upgraded build profiles after MongoDB 3.0 Java driver GA release. 2015-04-01 17:15:55 +02:00
Christoph Strobl
88172c8674 DATAMONGO-1124 - Switch log level for cyclic reference index warnings to INFO.
Reduce log level from warn to info to avoid noise during application startup.

Original pull request: #282.
2015-03-23 09:00:37 +01:00
Oliver Gierke
2dc2072fce DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:23:30 +01:00
Thomas Darimont
a7b70d68d7 DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:23:22 +01:00
Oliver Gierke
6dc2de03ad DATAMONGO-1173 - Updated changelog. 2015-03-06 08:02:08 +01:00
Thomas Darimont
dc0a03ee63 DATAMONGO-1133 - Assert that field aliasing is honored in aggregation operations.
Added some test to show that field aliases are honored during object rendering in aggregation operations.

Original pull request: #279.
2015-03-05 12:25:05 +01:00
Thomas Darimont
504c44cdf3 DATAMONGO-1081 - Improve documentation on field mapping semantics.
Added table with examples to identifier field mapping section.

Original pull request: #276.
2015-03-02 18:15:27 +01:00
Oliver Gierke
9ea8b6f6d1 DATAMONGO-1144 - Fixed version numbers to point to the next bugfix release. 2015-02-03 11:54:58 +01:00
Oliver Gierke
37c9de6af9 DATAMONGO-1155 - Upgraded mongo-next build profile to driver version 2.13.0.
Related ticket: DATAMONGO-1154.
2015-01-29 21:16:48 +01:00
Oliver Gierke
9a1ee81f3f DATAMONGO-1153 - Fix documentation build.
Moved jconsole.png to the images folder. Extracted MongoDB-specific auditing documentation into a separate file for inclusion after the general auditing docs.
2015-01-29 14:03:46 +01:00
Oliver Gierke
c8331c3a1c DATAMONGO-1144 - After release cleanups. 2015-01-28 17:41:43 +01:00
Spring Buildmaster
b794d536c5 DATAMONGO-1144 - Prepare next development iteration. 2015-01-28 04:17:48 -08:00
Spring Buildmaster
afa05122d2 DATAMONGO-1144 - Release version 1.6.2.RELEASE (Evans SR2). 2015-01-28 04:17:39 -08:00
Oliver Gierke
cf06a6527c DATAMONGO-1144 - Prepare 1.6.2.RELEASE (Evans SR2). 2015-01-28 12:03:12 +01:00
Oliver Gierke
0760a8b425 DATAMONGO-1144 - Updated changelog. 2015-01-28 11:23:25 +01:00
Oliver Gierke
f65cd1d235 DATAMONGO-1143 - Updated changelog. 2015-01-28 10:00:30 +01:00
Oliver Gierke
6f86208111 DATAMONGO-1148 - Favor EclipseLink’s JPA over the Hibernate one. 2015-01-27 21:43:59 +01:00
Oliver Gierke
0c33f7e01a DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw write targets, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:39:54 +01:00
alex-on-java
047df58724 DATAMONGO-1147 - Remove manual array copy.
Removed manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 18:02:50 +01:00
Christoph Strobl
9716a93576 DATAMONGO-1144 - Upgrade dependencies.
mongo-java-driver: 2.12.3 > 2.12.5
2015-01-22 08:50:19 +01:00
Thomas Darimont
03e5ebb741 DATAMONGO-1082 - Improved documentation of alias usage in aggregation framework.
Added missing JavaDoc and added short note to the reference documentation.

Original pull request: #268.
2015-01-22 08:48:03 +01:00
Thomas Darimont
6ac908b72e DATAMONGO-1127 - Add support for geoNear queries with distance information.
Made unit tests more robust to small differences in distance calculations between MongoDB versions.
2015-01-20 19:05:23 +01:00
Thomas Darimont
5f1049f2de DATAMONGO-1127 - Add support for geoNear queries with distance information.
We now support geoNear queries in Aggregations. Exposed GeoNearOperation factory method in Aggregation. Introduced new distanceField property to NearQuery since it is required for geoNear queries in Aggregations.

Original pull request: #261.
2015-01-20 18:19:52 +01:00
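For illustration, a hedged sketch of such an aggregation; Venue and the collection name are made up, and template is assumed to be a MongoTemplate:

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
    .maxDistance(new Distance(5, Metrics.KILOMETERS));

Aggregation aggregation = Aggregation.newAggregation(
    // geoNear(…) writes the calculated distance into the given field
    Aggregation.geoNear(nearQuery, "distance"),
    Aggregation.limit(10));

AggregationResults<Venue> results = template.aggregate(aggregation, "venues", Venue.class);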
Christoph Strobl
07c79e8ba9 DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:18:43 +01:00
Oliver Gierke
a94cca3494 DATAMONGO-1106 - Updated changelog. 2015-01-20 07:37:27 +01:00
Oliver Gierke
90842844d4 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSpherical if non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which will never result in true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:24:48 +01:00
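For illustration, a hedged sketch of a near query method with a non-neutral Metric; Venue and the repository are hypothetical:

interface VenueRepository extends MongoRepository<Venue, String> {

    // A Distance carrying a non-neutral Metric (e.g. kilometers) now correctly
    // triggers the spherical variant of the near query.
    List<Venue> findByLocationNear(Point location, Distance distance);
}

List<Venue> venues = repository.findByLocationNear(
    new Point(-73.99, 40.73), new Distance(5, Metrics.KILOMETERS));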
Oliver Gierke
f4c27407ac DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to change that.
2015-01-07 15:02:41 +01:00
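For illustration, a hedged sketch of raising that limit explicitly; Venue is a made-up type and template is assumed to be a MongoTemplate:

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
    .maxDistance(new Distance(10, Metrics.KILOMETERS))
    .num(500); // MongoDB caps the number of geoNear results by default

GeoResults<Venue> results = template.geoNear(nearQuery, Venue.class);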
Oliver Gierke
00e1ebb880 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:49:46 +01:00
Thomas Darimont
dd32479ad4 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:48:44 +01:00
Christoph Strobl
ce5fe50550 DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:48:28 +01:00
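For illustration, a hedged sketch of such a map-key converter; the Currency-keyed map is a made-up scenario, and the conversion only applies because the target type is String:

// Turns a Currency map key into a plain String before it is used as a field name.
enum CurrencyToStringConverter implements Converter<Currency, String> {

    INSTANCE;

    @Override
    public String convert(Currency source) {
        return source.getCurrencyCode();
    }
}

// Registered through CustomConversions when setting up the MappingMongoConverter.
CustomConversions conversions =
    new CustomConversions(Arrays.asList(CurrencyToStringConverter.INSTANCE));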
Christophe Fargette
3a04bcb5d3 DATACMNS-1132 - Fixed keyword translation table in the reference documentation.
Original pull request: #262.
2015-01-06 13:29:46 +01:00
Oliver Gierke
acfe6ccc2c DATAMONGO-1120 - Fix execution of query methods using pagination and field mapping customizations.
Repository queries that used pagination and referred to a field that was customized were failing as the count query executed was not mapped correctly in MongoOperations.

This resulted from the fix for DATAMONGO-1080, which removed the premature field name translation from AbstractMongoQuery and thus led to unmapped field names being used for the count query.

We now expose the previously existing but non-public count(…) method on MongoOperations that takes both an entity type and an explicit collection name, to be able to count-query a dedicated collection but still get the query mapping applied for a certain type.

Related ticket: DATAMONGO-1080.
2014-12-18 15:54:29 +01:00
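For illustration, a hedged sketch of the newly exposed overload; Person and the collection name are made up, and template is assumed to be a MongoTemplate:

Query query = Query.query(Criteria.where("lastname").is("Matthews"));

// Passing the entity type along with the explicit collection name makes sure
// customized field names in the query get mapped before counting.
long count = template.count(query, Person.class, "people");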
Oliver Gierke
3f3ec19364 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:38:17 +01:00
Thomas Darimont
24cbef9bc5 DATAMONGO-1085 - Fixed sorting with Querydsl in QueryDslMongoRepository.
We now translate QSort's OrderSpecifiers into appropriate sort criteria.
Previously the OrderSpecifiers were not correctly translated to appropriate property path expressions.

We now override findAll(Pageable) and findAll(Sort) in QueryDslMongoRepository to apply special QSort handling.

Original pull request: #236.
2014-12-01 12:08:43 +01:00
Oliver Gierke
b5f74444de DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:32 +01:00
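For illustration, a hedged sketch of a SpEL-based collection name; Order and the tenantProvider bean are made up - the expression is now re-evaluated for every query execution:

@Document(collection = "#{@tenantProvider.getCollectionName()}")
class Order {

    @Id String id;
    String customer;
}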
Oliver Gierke
1b7e077296 DATAMONGO-1054 - Polishing.
Tweaked JavaDoc of the APIs to be less specific about implementation internals and rather point to the save(…) methods. Changed SimpleMongoRepository.save(…) methods to inspect the given entity/entities and use the optimized insert(All)-calls if all entities are considered new.

Original pull request: #253.
2014-11-28 18:38:06 +01:00
Thomas Darimont
7ebaf935d4 DATAMONGO-1054 - Add support for fast insertion via MongoRepository.insert(..).
Introduced new insert(..) method variants on MongoRepository that delegate to MongoTemplate.insert(..). This bypasses ID-population, save event generation and version checking and allows for fast insertion of bulk data.

Original pull request: #253.
2014-11-28 18:37:55 +01:00
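For illustration, a hedged usage sketch; Person and the repository are hypothetical:

List<Person> people = Arrays.asList(new Person("Dave"), new Person("Oliver"));

// insert(…) delegates to MongoTemplate.insert(…) and skips the extra work
// save(…) performs, which makes bulk insertion noticeably faster.
List<Person> saved = repository.insert(people);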
Oliver Gierke
295d7579cb DATAMONGO-1108 - Performance improvements in BasicMongoPersistentEntity.
BasicMongoPersistentEntity.getCollection() now avoids repeated SpEL-parsing and evaluating in case no SpEL expression is used. Parsing is happening at most once now. Evaluation is skipped entirely if the configured collection String is not or does not contain an expression.
2014-11-28 16:23:48 +01:00
Christoph Strobl
a99950d83e DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:55 +01:00
Christoph Strobl
0bf4ee2711 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword so that, if the target property is of type String, we use an expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Original pull request: #241.
2014-11-27 17:14:23 +01:00
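For illustration, a hedged sketch of the two cases; a Person with a String lastname and a collection-like skills property is made up:

interface PersonRepository extends MongoRepository<Person, String> {

    // String property: translated into a containment (regex-style) expression.
    List<Person> findByLastnameContaining(String part);

    // Collection-like property: translated into an exact $in match.
    List<Person> findBySkillsContaining(String skill);
}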
Thomas Darimont
5b8da8dd41 DATAMONGO-1093 - Added hashCode() and equals(…) in BasicQuery.
We now have equals(…) and hashCode(…) methods on BasicQuery. Previously we solely relied on Query.hashCode()/equals(…) which didn't consider the fields of BasicQuery.

Introduced equals verifier library to automatically test equals contracts.
Added some additional test cases to BasicQueryUnitTests.

Original pull request: #252.
2014-11-27 16:46:24 +01:00
Mikhail Mikhaylenko
b56ca97f68 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely(…) to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 09:42:51 +01:00
Thomas Darimont
f183b5c7e6 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 17:36:03 +01:00
Oliver Gierke
1fdde8f0c3 DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:24 +01:00
Christoph Strobl
b7c3e69653 DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:38:21 +01:00
Oliver Gierke
e64d69b8f5 DATAMONGO-1079 - After release cleanups. 2014-10-31 21:44:01 +01:00
Spring Buildmaster
d81ed203db DATAMONGO-1079 - Prepare next development iteration. 2014-10-30 14:36:37 +01:00
Spring Buildmaster
eae463622c DATAMONGO-1079 - Release version 1.6.1.RELEASE (Evans SR1). 2014-10-30 14:36:13 +01:00
Oliver Gierke
6ba8144bca DATAMONGO-1079 - Prepare 1.6.1.RELEASE (Evans SR1). 2014-10-30 12:33:08 +01:00
Oliver Gierke
9d1c1a9fc5 DATAMONGO-1079 - Updated changelog. 2014-10-30 11:57:35 +01:00
Oliver Gierke
f552ca8073 DATAMONGO-1080 - AbstractMongoQuery now refrains from eagerly post-processing the query execution results.
To properly support general post processing of query execution results (in QueryExecutorMethodInterceptor) we need to remove the eager post-processing of query execution results in AbstractMongoQuery.

Removed the usage of the local ConversionService all together.
2014-10-30 11:36:02 +01:00
Thomas Darimont
d7bd82c643 DATAMONGO-1076 - Avoid resolving lazy-loading proxy for DBRefs during finalize.
We now handle intercepted finalize method invocations by not resolving the proxy. Previously the LazyLoadingProxy tried to resolve the proxy during finalization which could lead to unnecessary database accesses.

Original pull request: #234.
2014-10-29 10:16:34 +01:00
Christoph Strobl
078cca83e3 DATAMONGO-1077 - Fix Update removing $ operator for DBRef.
We now retain the positional parameter "$" when mapping field names for associations.

Original pull request: #235.
2014-10-28 14:30:08 +01:00
Christoph Strobl
93ae6815bd DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 13:56:59 +02:00
Christoph Strobl
13dcb8cda1 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 12:06:51 +02:00
Oliver Gierke
a4497bcf8a DATAMONGO-1070 - Fixed a few glitches in DBRef binding for repository query methods.
Fixed the query mapping for derived repository queries pointing to the identifier of the referenced document. We now reduce the query field's key from reference.id to reference so that the generated DBRef is applied correctly and also take care that the ids are potentially converted to ObjectIds. This is mainly achieved by using the AssociationConverter pulled up from UpdateMapper in ObjectMapper.getMappedKey().

MongoQueryCreator now refrains from translating the field keys as that would prevent the QueryMapper from correctly detecting id properties.

Fixed DBRef handling for StringBasedMongoQuery which previously didn't parse the DBRef instance created after JSON parsing for placeholders.
2014-10-15 10:14:12 +02:00
Christoph Strobl
6a82c47a4d DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:57 +02:00
Christoph Strobl
8f8f5b7ce4 DATAMONGO-1053 - Type check is now only performed on explicit language properties.
We now only perform a type check on language properties explicitly defined via @Language. Prior to this change, non-String properties named language caused errors on entity validation.

Original pull request: #228.
2014-10-10 11:31:39 +02:00
Oliver Gierke
c683813a7a DATAMONGO-1057 - Polishing.
Slightly tweaked the changes in SlicedExecution to simplify the implementation. We now apply the given pageable but tweak the limit the query uses to peek into the next page.

Original pull request: #226.
2014-10-08 07:06:35 +02:00
Christoph Strobl
dbe983c3cb DATAMONGO-1057 - Fix SliceExecution skipping elements.
We now directly set the offset to use instead of reading it from the used pageable. This ensures that every single element is read from the store.
Prior to this change the altered page size led to an unintended increase of the number of elements to skip.

Original pull request: #226.
2014-10-08 07:06:35 +02:00
Oliver Gierke
8e11fe84df DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 16:29:48 +02:00
Christoph Strobl
9eb2856840 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DbRefs. This assures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:05:52 +02:00
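For illustration, a hedged sketch of a DBRef with an explicit field name; Account and Person are made up:

class Account {

    @Id String id;

    // The reference is now stored under the explicitly configured field name
    // "owner_ref" instead of the Java property name.
    @Field("owner_ref")
    @DBRef
    Person owner;
}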
Thomas Darimont
828b379f1f DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress failed because the supposedly non-existing hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable.

Original pull request: #229.
2014-10-01 09:44:48 +02:00
Christoph Strobl
161fd8c09d DATAMONGO-1049 - Check for explicitly declared language field.
We now check for an explicitly declared language field for setting language_override within a text index. Therefore the attribute (even if named with the reserved keyword language) has to be explicitly marked with @Language. Prior to this change having:

@Language String lang;
String language;

would have caused trouble when trying to resolve index structures, as one cannot set the language override on more than one property.

Original pull request: #224.
2014-09-25 12:43:46 +02:00
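For illustration, a hedged sketch of a mapping that is valid after this change; BlogPost is made up - only the @Language annotated property drives language_override:

@Document
class BlogPost {

    @Id String id;

    @TextIndexed String title;
    @TextIndexed String body;

    // Drives the text index's language_override setting.
    @Language String lang;

    // Plain data property, no longer considered for language_override.
    String language;
}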
Oliver Gierke
dc037dfef6 DATAMONGO-1046 - After release cleanups. 2014-09-05 14:17:02 +02:00
Spring Buildmaster
c41653f9da DATAMONGO-1046 - Prepare next development iteration. 2014-09-05 14:09:48 +02:00
Spring Buildmaster
51607c5ed8 DATAMONGO-1046 - Release version 1.6.0.RELEASE (Evans GA). 2014-09-05 03:12:02 -07:00
Oliver Gierke
e2cbd3ee28 DATAMONGO-1046 - Prepare 1.6.0.RELEASE (Evans GA). 2014-09-05 11:43:58 +02:00
Oliver Gierke
5944e6b57e DATAMONGO-1046 - Updated changelog. 2014-09-05 09:23:31 +02:00
Oliver Gierke
efd46498ef DATAMONGO-1033 - Updated changelog. 2014-09-05 07:31:54 +02:00
Christoph Strobl
3d705a737f DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows fine-grained removal of entities from specific collections using derived delete queries.

Original pull request: #223.
2014-09-04 15:47:13 +02:00
Christoph Strobl
996c57bccf DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests

Original pull request: #222.
2014-09-04 11:21:55 +02:00
Oliver Gierke
a31e72ff06 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:51:31 +02:00
Mark Paluch
f07d8fca8c DATAMONGO-1036 - Improved detection of custom implementations for CDI repositories.
Adapted to API changes in CDI extension.

Related ticket: DATACMNS-565.
2014-09-01 13:51:20 +02:00
Christoph Strobl
69dbdee01f DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Add JUnit rule and RunListener taking care of clean up task.

Original pull request: #221.
2014-08-27 11:12:39 +02:00
Oliver Gierke
dedb9f3dc0 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at using a custom Converter to handle the conversion manually.

Improved toString() method on ObjectPath to create more helpful output.
2014-08-26 20:07:46 +02:00
Oliver Gierke
7d69b840fe DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the domain type collection to the query executing a query for a projection type.
2014-08-26 15:16:18 +02:00
Christoph Strobl
4eaef300cb DATAMONGO-1025 - Fix creation of nested named index.
Deprecated collection attribute for @Indexed, @CompoundIndex, @GeoSpatialIndexed. Removed deprecated attribute `expireAfterSeconds` from @CompoundIndex.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Christoph Strobl
ec1a6b5edd DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors caused by equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions, where only the property's field name was taken into account instead of the full property path.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Oliver Gierke
adc5485c09 DATAMONGO-1032 - Polished Asciidoctor documentation. 2014-08-26 14:24:51 +02:00
Oliver Gierke
f622b2916d DATAMONGO-1021 - After release cleanups. 2014-08-13 16:32:42 +02:00
Spring Buildmaster
26be0cf948 DATAMONGO-1021 - Prepare next development iteration. 2014-08-13 07:02:43 -07:00
Spring Buildmaster
e27c01fe5b DATAMONGO-1021 - Release version 1.6.0.RC1 (Evans RC1). 2014-08-13 07:02:41 -07:00
Oliver Gierke
d639e58fb9 DATAMONGO-1021 - Prepare 1.6.0.RC1 (Evans RC1). 2014-08-13 15:37:48 +02:00
Oliver Gierke
0195c2cb48 DATAMONGO-1021 - Updated changelog. 2014-08-13 15:37:44 +02:00
Oliver Gierke
068e2ec49b DATAMONGO-1024 - Upgraded to MongoDB Java driver 2.12.3. 2014-08-13 15:36:08 +02:00
Christoph Strobl
a9306b99ec DATAMONGO-957 - Add support for query modifiers.
Using Meta allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a query. When executed, we add the meta information to the cursor in use.

We've introduced the @Meta annotation that allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a repository finder method.
Added tests to verify proper invocation of template methods.
Use DBCursor.copy() for CursorPreparer.

Original pull request: #216.
2014-08-13 14:56:10 +02:00
Thomas Darimont
3597194742 DATAMONGO-1012 - Improved identifier initialization on DBRef proxies.
Identifier initialization is now only triggered if field access is used. Before that, the id initialization would've resolved the proxy eagerly, as the getter access performed by the BeanWrapper would've been intercepted by the proxy and is indistinguishable from a normal method call. This would've rendered the entire use case of creating proxies pointless.

Added test case to check for non-initialization in the property access scenario.
2014-08-13 14:34:38 +02:00
Oliver Gierke
6f06ccec8e DATAMONGO-1012 - Identifier initialization for lazy DBRef proxies with field access.
We now initialize the ID property for proxies created for lazily initialized DBRefs. This will allow the lookup of ID properties for types that use field access without initializing the entire proxy.
2014-08-13 14:34:15 +02:00
Oliver Gierke
6fe7f220f9 DATAMONGO-1007 - Updated changelog. 2014-08-13 10:56:02 +02:00
Christoph Strobl
45e70d493d DATAMONGO-1016 - Remove deprecations in geospatial area.
Removed:
 - Box
 - Circle
 - CustomMetric
 - Distance
 - GeoPage
 - GeoResult
 - GeoResults
 - Metric
 - Metrics
 - Point
 - Polygon
 - Shape

Updated api doc.
Removed deprecation warnings.
2014-08-13 09:52:02 +02:00
Thomas Darimont
ce71ab83f2 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:30:41 +02:00
Oliver Gierke
bf85d8facd DATAMONGO-1005 - Polishing introduction of ObjectPath.
Simplified implementation of ObjectPath to use a static root instance and hand the path further down until final resolution in MappingMongoConverter.readValue(…). This removes a bit of boxing and unboxing code both in ObjectPath and the converter.

Introduced ObjectPath.getPathItem(…) to internalize the iteration to find a potentially already resolved object.

Renamed parameters and fields of type ObjectPath to path consistently. Removed obsolete method in MappingMongoConverter.

Original pull request: #209.
2014-08-12 08:12:31 +02:00
Thomas Darimont
c5ff7cdb2b DATAMONGO-1005 - Improve cycle detection for DBRefs.
Introduced ObjectPath that collects the target objects while converting a DBObject to a domain object. If we detect that a potentially nested DBRef points to an object that is already under construction, we simply return a reference to that object in order to avoid StackOverflowErrors.

Original pull request: #209.
2014-08-12 08:10:47 +02:00
Mark Paluch
f9ccf4f532 DATAMONGO-1017 - Add support for custom implementations in CDI repositories.
Original pull request: #215.
2014-08-11 07:47:57 +02:00
Greg Turnquist
ab731f40a7 DATAMONGO-1019 - Corrected examples in reference documentation.
Examples were not properly converted. One table got dropped, so I added it back. Fixed IMPORTANT notes.

Original pull requests: #214.
2014-08-10 16:04:45 +02:00
Oliver Gierke
d8434fffa8 DATAMONGO-1015 - Fixed link to Spring Data Commons reference docs. 2014-08-10 15:55:13 +02:00
Christoph Strobl
151b1d4510 DATAMONGO-973 - Add support for deriving full-text queries.
Added support to execute full-text queries on repositories. Query methods can now have a parameter of type TextCriteria which triggers a text search clause; the document score can be exposed via a property annotated with @TextScore.

Retrieving the document score and sorting by score is only possible if the entity holds a property annotated with @TextScore. If present, any find execution will be enriched so that it ensures loading of the according { $meta : textScore } field. The sort object will only be mapped in case the referenced sort property actually exists - in that case we replace the existing expression for the property with its $meta representation.

This allows for example the following:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("term");

repository.findAllBy(criteria, new Sort("score"));
repository.findAllBy(criteria, new PageRequest(0, 10, Direction.DESC, "score"));
repository.findByFooOrderByScoreDesc("foo", criteria);

For more details and examples see the "Full text search queries" section in the reference manual.
2014-08-06 22:25:38 +02:00
Greg Turnquist
168cf3e1f6 DATAMONGO-1015 - Migrate reference documentation from Docbook to Asciidoctor. 2014-08-06 21:38:46 +02:00
Oliver Gierke
52dab0fa20 DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check. Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Christoph Strobl
9257bab06e DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Oliver Gierke
27f0a6f27a DATAMONGO-1008 - Added repository type based checks to strict matching algorithm.
Repositories extending MongoRepository are now considered strict matches as well.

Related ticket: DATACMNS-526.
2014-07-31 16:20:26 +02:00
Oliver Gierke
5bedbef2f2 DATAMONGO-1009 - Adapt to new multi-store configuration detection.
We now consider repositories managing domain types annotated with @Document MongoDB specific ones.

Related ticket: DATACMNS-526.
2014-07-28 20:15:40 +02:00
Christoph Strobl
51e7be8aa0 DATAMONGO-1001 - Renamed LazyLoadingProxy.initialize() to getTarget().
Original pull request: #208.
2014-07-24 13:29:27 +02:00
Christoph Strobl
6c85bb39a3 DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazily loaded ones before saving. As it turns out, CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:28:36 +02:00
Oliver Gierke
07f7247707 DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:36:15 +02:00
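For illustration, a hedged sketch of the safe serialization; the BigDecimal field is a made-up example of a non-native type:

Update update = new Update().set("price", new BigDecimal("1.99"));

// A plain toString() on the update's DBObject could throw because BigDecimal
// needs conversion before persistence; serializeToJsonSafely(…) does not.
String json = SerializationUtils.serializeToJsonSafely(update.getUpdateObject());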
Thomas Darimont
f669711670 DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:15:18 +02:00
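For illustration, a hedged sketch of the two binding styles the parser distinguishes; the repository and field names are made up:

interface PersonRepository extends MongoRepository<Person, String> {

    // Unquoted parameter reference: the bound value gets quoted as needed.
    @Query("{ 'lastname' : ?0 }")
    List<Person> findByLastname(String lastname);

    // Parameter reference that is already inside quotes: no additional quoting is added.
    @Query("{ 'lastname' : { '$regex' : '^?0' } }")
    List<Person> findByLastnameStartingWith(String prefix);
}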
Oliver Gierke
5f3671f349 DATAMONGO-996 - Fixed boundary detection in pagination.
The fix for DATAMONGO-950 introduced a tiny glitch so that retrieving pages after the first one was broken in the repository query execution. We now correctly use the previously detected number of elements to detect whether the Pageable given is out of scope.

Related ticket: DATAMONGO-950.
2014-07-18 19:01:44 +02:00
Thomas Darimont
1335cb699b DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:46 +02:00
Oliver Gierke
84414b87c0 DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:21 +02:00
Thomas Darimont
a1ecd4a501 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:21 +02:00
Thomas Darimont
d7e6f2ee41 DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
We replaced the constructor that accepted a Criteria with one that accepts a CriteriaDefinition to not force clients to extend Criteria.
Original pull request: #206.
2014-07-17 09:21:29 +02:00
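For illustration, a hedged sketch of a match stage built from a Criteria (which implements CriteriaDefinition); the field and collection names are made up, and template is assumed to be a MongoTemplate:

Aggregation aggregation = Aggregation.newAggregation(
    // match(…) now accepts any CriteriaDefinition, not only Criteria itself
    Aggregation.match(Criteria.where("status").is("ACTIVE")),
    Aggregation.group("customerId").count().as("orders"));

AggregationResults<DBObject> results =
    template.aggregate(aggregation, "orders", DBObject.class);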
Oliver Gierke
04870fb8b3 DATAMONGO-991 - Adapted to deprecation removals in Spring Data Commons.
Related ticket: DATACMNS-469.
2014-07-16 12:04:10 +02:00
Oliver Gierke
9d196b78f7 DATAMONGO-981 - After release cleanups. 2014-07-10 20:44:20 +02:00
Spring Buildmaster
4229525928 DATAMONGO-981 - Prepare next development iteration. 2014-07-10 10:38:58 -07:00
189 changed files with 11783 additions and 7898 deletions

pom.xml

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version>
<version>1.6.5.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,8 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.5.0.M1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
<version>1.5.5.BUILD-SNAPSHOT</version>
</parent>
<modules>
@@ -29,9 +28,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.9.0.M1</springdata.commons>
<mongo>2.12.1</mongo>
<mongo.osgi>2.12.1</mongo.osgi>
<springdata.commons>1.9.5.BUILD-SNAPSHOT</springdata.commons>
<mongo>2.12.5</mongo>
<mongo.osgi>2.12.5</mongo.osgi>
</properties>
<developers>
@@ -108,7 +107,7 @@
<id>mongo-next</id>
<properties>
<mongo>2.12.3-SNAPSHOT</mongo>
<mongo>2.14.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -120,6 +119,15 @@
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.2</mongo>
</properties>
</profile>
<profile>
<id>mongo-3-next</id>
@@ -148,8 +156,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone</url>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
</repositories>


@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<aspects>
<aspect name="org.springframework.beans.factory.aspectj.AnnotationBeanConfigurerAspect" />
<aspect name="org.springframework.data.mongodb.crossstore.MongoDocumentBacking" />
</aspects>
</aspectj>


@@ -2,22 +2,22 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version>
<version>1.6.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>1.0.0.Final</jpa>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
<dependencies>
<!-- Spring -->
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.M1</version>
<version>1.6.5.BUILD-SNAPSHOT</version>
</dependency>
<dependency>
@@ -59,8 +59,8 @@
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
@@ -126,10 +126,11 @@
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
<xmlConfigured>aop.xml</xmlConfigured>
</configuration>
</plugin>
</plugins>


@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version>
<version>1.6.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
@@ -32,6 +32,10 @@
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>


@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version>
<version>1.6.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.9.205">
<context version="7.1.10.209">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,6 +32,7 @@
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>


@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
@@ -11,18 +11,19 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version>
<version>1.6.5.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -50,7 +51,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -77,7 +78,7 @@
<version>1.0</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<dependency>
<groupId>javax.enterprise</groupId>
@@ -86,21 +87,21 @@
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
@@ -115,7 +116,7 @@
<version>${validation}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.objenesis</groupId>
<artifactId>objenesis</artifactId>
@@ -129,23 +130,29 @@
<version>4.2.0.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
@@ -189,9 +196,14 @@
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
<properties>
<property>
<name>listener</name>
<value>org.springframework.data.mongodb.test.util.CleanMongoDBJunitRunListener</value>
</property>
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/


@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
@@ -42,6 +44,7 @@ public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -141,7 +144,7 @@ public class DefaultIndexOperations implements IndexOperations {
Object value = keyDbObject.get(key);
if ("2d".equals(value)) {
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {


@@ -49,7 +49,7 @@ public class MongoAction {
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBOjbect from the Spring Query object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityType, DBObject document, DBObject query) {


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,11 +19,11 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -52,7 +52,6 @@ import com.mongodb.WriteResult;
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**
@@ -416,7 +415,9 @@ public interface MongoOperations {
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is ran against.
* information to determine the collection the query is ran against. Note, that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -425,7 +426,9 @@ public interface MongoOperations {
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note, that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -653,14 +656,28 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. TO make sure the query gets mapped, use {@link #count(Query, Class, String)}.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>


@@ -53,6 +53,7 @@ import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
@@ -71,7 +72,6 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -335,7 +335,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CommandResult result = execute(new DbCallback<CommandResult>() {
public CommandResult doInDB(DB db) throws MongoException, DataAccessException {
return db.command(command, options);
return readPreference != null ? db.command(command, readPreference) : db.command(command);
}
});
@@ -566,7 +566,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
BasicDBObject command = new BasicDBObject("geoNear", collection);
command.putAll(near.toDBObject());
CommandResult commandResult = executeCommand(command);
CommandResult commandResult = executeCommand(command, getDb().getOptions());
List<Object> results = (List<Object>) commandResult.get("results");
results = results == null ? Collections.emptyList() : results;
@@ -641,7 +641,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return count(query, null, collectionName);
}
private long count(Query query, Class<?> entityClass, String collectionName) {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#count(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
public long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
@@ -763,27 +767,33 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doInsertAll(Collection<? extends T> listToSave, MongoWriter<T> writer) {
Map<String, List<T>> objs = new HashMap<String, List<T>>();
for (T o : listToSave) {
Map<String, List<T>> elementsByCollection = new HashMap<String, List<T>>();
for (T element : listToSave) {
if (element == null) {
continue;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(element.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(o.getClass());
if (entity == null) {
throw new InvalidDataAccessApiUsageException("No Persitent Entity information found for the class "
+ o.getClass().getName());
throw new InvalidDataAccessApiUsageException("No PersistentEntity information found for " + element.getClass());
}
String collection = entity.getCollection();
List<T> collectionElements = elementsByCollection.get(collection);
List<T> objList = objs.get(collection);
if (null == objList) {
objList = new ArrayList<T>();
objs.put(collection, objList);
if (null == collectionElements) {
collectionElements = new ArrayList<T>();
elementsByCollection.put(collection, collectionElements);
}
objList.add(o);
collectionElements.add(element);
}
for (Map.Entry<String, List<T>> entry : objs.entrySet()) {
for (Map.Entry<String, List<T>> entry : elementsByCollection.entrySet()) {
doInsertBatch(entry.getKey(), entry.getValue(), this.mongoConverter);
}
}
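The reworked doInsertAll(…) first buckets the entities by their target collection and then delegates one doInsertBatch(…) call per bucket. A minimal, self-contained sketch of that grouping step; the collectionNameOf(…) helper is hypothetical and stands in for MongoPersistentEntity#getCollection():

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class BatchGroupingSketch {

    // Hypothetical stand-in for looking up the collection via the mapping metadata.
    static String collectionNameOf(Object element) {
        return element.getClass().getSimpleName().toLowerCase();
    }

    static <T> Map<String, List<T>> groupByCollection(Iterable<? extends T> elements) {
        Map<String, List<T>> elementsByCollection = new HashMap<String, List<T>>();
        for (T element : elements) {
            if (element == null) { // null elements are skipped, mirroring the guard added above
                continue;
            }
            String collection = collectionNameOf(element);
            List<T> bucket = elementsByCollection.get(collection);
            if (bucket == null) {
                bucket = new ArrayList<T>();
                elementsByCollection.put(collection, bucket);
            }
            bucket.add(element);
        }
        return elementsByCollection;
    }
}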
@@ -1007,8 +1017,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Calling update using query: %s and update: %s in collection: %s",
serializeToJsonSafely(queryObj), serializeToJsonSafely(updateObj), collectionName));
}
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
@@ -1187,7 +1197,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { dboq, collection.getName() });
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { serializeToJsonSafely(dboq),
collection.getName() });
}
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
@@ -1415,7 +1426,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
CommandResult commandResult = executeCommand(command);
CommandResult commandResult = executeCommand(command, getDb().getOptions());
handleCommandError(commandResult, command);
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult), commandResult);
@@ -1616,12 +1627,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
DBObject mappedFields = queryMapper.getMappedFields(fields, entity);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
@@ -1659,8 +1671,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Class<T> entityClass) {
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndRemove using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " in collection: " + collectionName);
LOGGER.debug(String.format("findAndRemove using query: %s fields: %s sort: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, sort, entityClass, collectionName));
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
@@ -1684,8 +1696,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + mappedUpdate + " in collection: " + collectionName);
LOGGER.debug(String.format("findAndModify using query: %s fields: %s sort: %s for class: %s and update: %s "
+ "in collection: %s", serializeToJsonSafely(mappedQuery), fields, sort, entityClass,
serializeToJsonSafely(mappedUpdate), collectionName));
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
@@ -1969,8 +1982,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return null;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
return queryMapper.getMappedObject(query.getSortObject(), entity);
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
}
// Callback implementations
@@ -1995,13 +2007,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: " + query + " in db.collection: " + collection.getFullName());
LOGGER.debug(String.format("findOne using query: %s in db.collection: %s", serializeToJsonSafely(query),
collection.getFullName()));
}
return collection.findOne(query);
} else {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: " + query + " fields: " + fields + " in db.collection: "
+ collection.getFullName());
LOGGER.debug(String.format("findOne using query: %s fields: %s in db.collection: %s",
serializeToJsonSafely(query), fields, collection.getFullName()));
}
return collection.findOne(query, fields);
}
@@ -2030,7 +2043,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public DBCursor doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (fields == null || fields.toMap().isEmpty()) {
return collection.find(query);
} else {
return collection.find(query, fields);
@@ -2185,11 +2199,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (query.getSkip() <= 0 && query.getLimit() <= 0 && query.getSortObject() == null
&& !StringUtils.hasText(query.getHint())) {
&& !StringUtils.hasText(query.getHint()) && !query.getMeta().hasValues()) {
return cursor;
}
DBCursor cursorToUse = cursor;
DBCursor cursorToUse = cursor.copy();
try {
if (query.getSkip() > 0) {
@@ -2205,6 +2219,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
if (query.getMeta().hasValues()) {
for (Entry<String, Object> entry : query.getMeta().values()) {
cursorToUse = cursorToUse.addSpecial(entry.getKey(), entry.getValue());
}
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
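With the meta support added above, the preparer works on a copy of the cursor and applies each meta entry via DBCursor#addSpecial(…). A minimal sketch against the plain 2.x driver API; the metaValues map stands in for Query#getMeta(), and keys such as "$maxTimeMS" or "$comment" are just examples:

import java.util.Map;

import com.mongodb.DBCursor;

class MetaCursorSketch {

    // Returns the original cursor untouched if there is nothing to apply,
    // otherwise applies every meta option to a defensive copy.
    static DBCursor applyMeta(DBCursor cursor, Map<String, Object> metaValues) {
        if (metaValues.isEmpty()) {
            return cursor;
        }
        DBCursor cursorToUse = cursor.copy();
        for (Map.Entry<String, Object> entry : metaValues.entrySet()) {
            cursorToUse = cursorToUse.addSpecial(entry.getKey(), entry.getValue());
        }
        return cursorToUse;
    }
}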

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -291,6 +292,19 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
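A short usage sketch of the new factory method; the coordinates, the maximum distance and the "distance" field name are arbitrary examples, and $geoNear is typically placed first in the pipeline:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.geoNear;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.limit;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.NearQuery;

class GeoNearPipelineSketch {

    // $geoNear writes the calculated distance into the "distance" field of every
    // result; limit(10) keeps the ten nearest documents.
    static Aggregation tenNearest() {
        NearQuery query = NearQuery.near(-73.99171, 40.738868).maxDistance(2.0);
        return newAggregation(geoNear(query, "distance"), limit(10));
    }
}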
/**
* Returns a new {@link AggregationOptions.Builder}.
*

View File

@@ -88,7 +88,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either sythetic or non-synthetic way.
* Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
@@ -107,7 +107,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} with the given orignals and synthetics.
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
@@ -363,7 +363,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Returns the referenve value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -84,6 +84,15 @@ public final class Fields implements Iterable<Field> {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
@@ -187,15 +196,24 @@ public final class Fields implements Iterable<Field> {
private final String target;
/**
* Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
* as well.
* Creates an aggregation field with the given {@code name}.
*
* @param key
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
*/
public AggregationField(String key) {
this(key, null);
public AggregationField(String name) {
this(name, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
String nameToSet = cleanUp(name);
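A brief usage sketch of the aliasing described above; the names "customerId" and "cust_id" are made up for illustration, with the former exposed in the pipeline and the latter being the backing document field:

import static org.springframework.data.mongodb.core.aggregation.Fields.field;
import static org.springframework.data.mongodb.core.aggregation.Fields.from;

import org.springframework.data.mongodb.core.aggregation.Fields;

class FieldAliasSketch {

    // "customerId" aliases the document field "cust_id"; "total" keeps its name as target.
    static Fields aliasedFields() {
        return from(field("customerId", "cust_id"), field("total"));
    }
}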

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,17 +22,33 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
public GeoNearOperation(NearQuery nearQuery) {
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
Assert.notNull(nearQuery);
this.nearQuery = nearQuery;
this.distanceField = distanceField;
}
/*
@@ -41,6 +57,10 @@ public class GeoNearOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
command.put("distanceField", distanceField);
return new BasicDBObject("$geoNear", command);
}
}
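A sketch of what the revised toDBObject(…) produces, assuming the default operation context is available as Aggregation.DEFAULT_CONTEXT; the result is roughly { "$geoNear" : { "near" : [ 0.5, 0.5 ], ..., "distanceField" : "dist" } }:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.GeoNearOperation;
import org.springframework.data.mongodb.core.query.NearQuery;

import com.mongodb.DBObject;

class GeoNearRenderingSketch {

    // The mapped NearQuery is enriched with the configured distance field before
    // being nested under the "$geoNear" key.
    static DBObject render() {
        GeoNearOperation geoNear = new GeoNearOperation(NearQuery.near(0.5, 0.5), "dist");
        return geoNear.toDBObject(Aggregation.DEFAULT_CONTEXT);
    }
}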

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,14 +21,17 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $limit}-operation
* Encapsulates the {@code $limit}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
class LimitOperation implements AggregationOperation {
public class LimitOperation implements AggregationOperation {
private final long maxElements;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,18 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $match}-operation
* Encapsulates the {@code $match}-operation.
* <p>
* We recommend to use the static factory method
* {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold
@@ -32,17 +36,17 @@ import com.mongodb.DBObject;
*/
public class MatchOperation implements AggregationOperation {
private final Criteria criteria;
private final CriteriaDefinition criteriaDefinition;
/**
* Creates a new {@link MatchOperation} for the given {@link Criteria}.
* Creates a new {@link MatchOperation} for the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
*/
public MatchOperation(Criteria criteria) {
public MatchOperation(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteria, "Criteria must not be null!");
this.criteria = criteria;
Assert.notNull(criteriaDefinition, "Criteria must not be null!");
this.criteriaDefinition = criteriaDefinition;
}
/*
@@ -51,6 +55,6 @@ public class MatchOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
return new BasicDBObject("$match", context.getMappedObject(criteriaDefinition.getCriteriaObject()));
}
}
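Since Criteria implements CriteriaDefinition, existing callers are unaffected; a minimal sketch in which the field name and value are arbitrary examples:

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.aggregation.MatchOperation;
import org.springframework.data.mongodb.core.query.Criteria;

class MatchSketch {

    // A Criteria is a CriteriaDefinition, so it can be handed to the revised constructor as before.
    static MatchOperation activeOnly() {
        Criteria criteria = where("status").is("ACTIVE");
        return new MatchOperation(criteria);
    }
}

In application code the stage is usually obtained through Aggregation.match(…) rather than instantiated directly.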

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,10 +28,13 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $project}-operation. Projection of field to be used in an
* {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful. It can
* generate new fields, change values of given field etc.
* Encapsulates the aggregation framework {@code $project}-operation.
* <p>
* Projection of field to be used in an {@link Aggregation}. A projection is similar to a {@link Field}
* inclusion/exclusion but more powerful. It can generate new fields, change values of given field etc.
* <p>
* We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $skip}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $sort}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
* instances of this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $unwind}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
@@ -46,10 +46,8 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
}
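The deprecated ConversionServiceFactory.createDefaultConversionService() lookup is replaced by instantiating DefaultConversionService directly; a one-method sketch of the equivalent call:

import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;

class ConversionServiceSketch {

    // DefaultConversionService extends GenericConversionService, so the fallback keeps its type.
    static GenericConversionService defaultConversionService() {
        return new DefaultConversionService();
    }
}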
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,14 +17,15 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -45,6 +46,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToS
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.TermToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
@@ -58,6 +60,7 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class CustomConversions {
@@ -69,10 +72,13 @@ public class CustomConversions {
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
private final SimpleTypeHolder simpleTypeHolder;
private final ConcurrentMap<ConvertiblePair, CacheValue> customReadTargetTypes;
private final List<Object> converters;
private final Map<ConvertiblePair, CacheValue> customReadTargetTypes;
private final Map<ConvertiblePair, CacheValue> customWriteTargetTypes;
private final Map<Class<?>, CacheValue> rawWriteTargetTypes;
/**
* Creates an empty {@link CustomConversions} object.
*/
@@ -92,7 +98,9 @@ public class CustomConversions {
this.readingPairs = new LinkedHashSet<ConvertiblePair>();
this.writingPairs = new LinkedHashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
this.customReadTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue>();
this.customWriteTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue>();
this.rawWriteTargetTypes = new ConcurrentHashMap<Class<?>, CacheValue>();
List<Object> toRegister = new ArrayList<Object>();
@@ -106,7 +114,8 @@ public class CustomConversions {
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.add(TermToStringConverter.INSTANCE);
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
toRegister.addAll(GeoConverters.getConvertersToRegister());
@@ -235,70 +244,103 @@ public class CustomConversions {
* @param sourceType must not be {@literal null}
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> sourceType) {
return getCustomWriteTarget(sourceType, null);
public Class<?> getCustomWriteTarget(final Class<?> sourceType) {
return getOrCreateAndCache(sourceType, rawWriteTargetTypes, new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, null, writingPairs);
}
});
}
/**
* Returns the target type we can write an onject of the given source type to. The returned type might be a subclass
* oth the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
* first target type matching or {@literal null} if no conversion can be found.
* Returns the target type we can write an object of the given source type to. The returned type might
* be a subclass of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply
* return the first target type matching or {@literal null} if no conversion can be found.
*
* @param sourceType must not be {@literal null}
* @param requestedTargetType
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
public Class<?> getCustomWriteTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
Assert.notNull(sourceType);
if (requestedTargetType == null) {
return getCustomWriteTarget(sourceType);
}
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customWriteTargetTypes,
new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
}
});
}
/**
* Returns whether we have a custom conversion registered to write into a Mongo native type. The returned type might
* be a subclass of the given expected type though.
* Returns whether we have a custom conversion registered to write into a Mongo native type. The
* returned type might be a subclass of the given expected type though.
*
* @param sourceType must not be {@literal null}
* @return
*/
public boolean hasCustomWriteTarget(Class<?> sourceType) {
Assert.notNull(sourceType);
return hasCustomWriteTarget(sourceType, null);
}
/**
* Returns whether we have a custom conversion registered to write an object of the given source type into an object
* of the given Mongo native target type.
* Returns whether we have a custom conversion registered to write an object of the given source type
* into an object of the given Mongo native target type.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType
* @return
*/
public boolean hasCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
return getCustomWriteTarget(sourceType, requestedTargetType) != null;
}
/**
* Returns whether we have a custom conversion registered to read the given source into the given target type.
* Returns whether we have a custom conversion registered to read the given source into the given target
* type.
*
* @param sourceType must not be {@literal null}
* @param requestedTargetType must not be {@literal null}
* @return
*/
public boolean hasCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
Assert.notNull(requestedTargetType);
return getCustomReadTarget(sourceType, requestedTargetType) != null;
}
/**
* Inspects the given {@link ConvertiblePair} for ones that have a source compatible type as source. Additionally
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
if (requestedTargetType == null) {
return null;
}
return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customReadTargetTypes,
new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, requestedTargetType, readingPairs);
}
});
}
/**
* Inspects the given {@link ConvertiblePair}s for ones that have a source compatible type as source. Additionally
* checks assignability of the target type if one is given.
*
* @param sourceType must not be {@literal null}.
@@ -307,11 +349,15 @@ public class CustomConversions {
* @return
*/
private static Class<?> getCustomTarget(Class<?> sourceType, Class<?> requestedTargetType,
Iterable<ConvertiblePair> pairs) {
Collection<ConvertiblePair> pairs) {
Assert.notNull(sourceType);
Assert.notNull(pairs);
if (requestedTargetType != null && pairs.contains(new ConvertiblePair(sourceType, requestedTargetType))) {
return requestedTargetType;
}
for (ConvertiblePair typePair : pairs) {
if (typePair.getSourceType().isAssignableFrom(sourceType)) {
Class<?> targetType = typePair.getTargetType();
@@ -325,32 +371,31 @@ public class CustomConversions {
}
/**
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
* Will try to find a value for the given key in the given cache or produce one using the given {@link Producer} and
* store it in the cache.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @param key the key to lookup a potentially existing value, must not be {@literal null}.
* @param cache the cache to find the value in, must not be {@literal null}.
* @param producer the {@link Producer} to create values to cache, must not be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
private static <T> Class<?> getOrCreateAndCache(T key, Map<T, CacheValue> cache, Producer producer) {
Assert.notNull(sourceType);
CacheValue cacheValue = cache.get(key);
if (requestedTargetType == null) {
return null;
if (cacheValue != null) {
return cacheValue.getType();
}
ConvertiblePair lookupKey = new ConvertiblePair(sourceType, requestedTargetType);
CacheValue readTargetTypeValue = customReadTargetTypes.get(lookupKey);
Class<?> type = producer.get();
cache.put(key, CacheValue.of(type));
if (readTargetTypeValue != null) {
return readTargetTypeValue.getType();
}
return type;
}
readTargetTypeValue = CacheValue.of(getCustomTarget(sourceType, requestedTargetType, readingPairs));
CacheValue cacheValue = customReadTargetTypes.putIfAbsent(lookupKey, readTargetTypeValue);
private interface Producer {
return cacheValue != null ? cacheValue.getType() : readTargetTypeValue.getType();
Class<?> get();
}
@WritingConverter
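The lookup methods above now share a small cache-or-compute helper keyed by either a ConvertiblePair or the raw source type. A simplified, self-contained analog of that pattern; unlike the original it does not wrap results in a CacheValue, so a "no conversion found" null result is not cached here:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class TargetTypeCacheSketch {

    interface Producer<V> {
        V get();
    }

    // Look the value up, compute it on a miss and cache it; the occasional duplicate
    // computation under concurrency is tolerated (last write wins).
    static <K, V> V getOrCreateAndCache(K key, Map<K, V> cache, Producer<V> producer) {
        V cached = cache.get(key);
        if (cached != null) {
            return cached;
        }
        V value = producer.get();
        cache.put(key, value);
        return value;
    }

    public static void main(String[] args) {
        Map<String, Class<?>> cache = new ConcurrentHashMap<String, Class<?>>();
        Class<?> target = getOrCreateAndCache("date", cache, new Producer<Class<?>>() {
            public Class<?> get() {
                return java.util.Date.class;
            }
        });
        System.out.println(target); // class java.util.Date (a second lookup is served from the cache)
    }
}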

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,7 +34,7 @@ import com.mongodb.DBObject;
*/
class DBObjectAccessor {
private final DBObject dbObject;
private final BasicDBObject dbObject;
/**
* Creates a new {@link DBObjectAccessor} for the given {@link DBObject}.
@@ -46,7 +46,7 @@ class DBObjectAccessor {
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.isInstanceOf(BasicDBObject.class, dbObject, "Given DBObject must be a BasicDBObject!");
this.dbObject = dbObject;
this.dbObject = (BasicDBObject) dbObject;
}
/**
@@ -62,6 +62,11 @@ class DBObjectAccessor {
Assert.notNull(prop, "MongoPersistentProperty must not be null!");
String fieldName = prop.getFieldName();
if (!fieldName.contains(".")) {
dbObject.put(fieldName, value);
return;
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
DBObject dbObject = this.dbObject;
@@ -87,12 +92,16 @@ class DBObjectAccessor {
* @param property must not be {@literal null}.
* @return
*/
@SuppressWarnings("unchecked")
public Object get(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
if (!fieldName.contains(".")) {
return this.dbObject.get(fieldName);
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
Map<Object, Object> source = this.dbObject.toMap();
Map<String, Object> source = this.dbObject;
Object result = null;
while (source != null && parts.hasNext()) {
@@ -108,14 +117,14 @@ class DBObjectAccessor {
}
@SuppressWarnings("unchecked")
private Map<Object, Object> getAsMap(Object source) {
private Map<String, Object> getAsMap(Object source) {
if (source instanceof BasicDBObject) {
return ((DBObject) source).toMap();
return (BasicDBObject) source;
}
if (source instanceof Map) {
return (Map<Object, Object>) source;
return (Map<String, Object>) source;
}
return null;
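Since a BasicDBObject is itself a Map, the accessor can now walk dot-separated field names without converting intermediate values via toMap(). A simplified, self-contained analog of that read path:

import java.util.Map;

import com.mongodb.BasicDBObject;

class NestedFieldSketch {

    // Walks "a.b.c"-style field names through nested documents, returning null as
    // soon as a path segment is missing or not a document.
    static Object read(BasicDBObject source, String dottedPath) {
        Object current = source;
        for (String part : dottedPath.split("\\.")) {
            if (!(current instanceof Map)) {
                return null;
            }
            current = ((Map<?, ?>) current).get(part);
        }
        return current;
    }

    public static void main(String[] args) {
        BasicDBObject person = new BasicDBObject("address", new BasicDBObject("city", "Dresden"));
        System.out.println(read(person, "address.city")); // Dresden
    }
}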

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,16 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBRef;
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}
public interface DbRefProxyHandler {
Object populateId(MongoPersistentProperty property, DBRef source, Object proxy);
}

View File

@@ -39,7 +39,8 @@ public interface DbRefResolver {
* @param callback will never be {@literal null}.
* @return
*/
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler proxyHandler);
/**
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}

View File

@@ -0,0 +1,79 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
*/
class DefaultDbRefProxyHandler implements DbRefProxyHandler {
private final SpELContext spELContext;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ValueResolver resolver;
/**
* @param spELContext must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
public DefaultDbRefProxyHandler(SpELContext spELContext,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, ValueResolver resolver) {
this.spELContext = spELContext;
this.mappingContext = mappingContext;
this.resolver = resolver;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefProxyHandler#populateId(com.mongodb.DBRef, java.lang.Object)
*/
@Override
public Object populateId(MongoPersistentProperty property, DBRef source, Object proxy) {
if (source == null) {
return proxy;
}
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(property);
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
if(idProperty.usePropertyAccess()) {
return proxy;
}
SpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(proxy, spELContext);
BeanWrapper<Object> proxyWrapper = BeanWrapper.create(proxy, null);
DBObject object = new BasicDBObject(idProperty.getFieldName(), source.getId());
ObjectPath objectPath = ObjectPath.ROOT.push(proxy, persistentEntity, null);
proxyWrapper.setProperty(idProperty, resolver.getValueInternal(idProperty, object, evaluator, objectPath));
return proxy;
}
}

View File

@@ -77,13 +77,14 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
*/
@Override
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, dbref, callback);
return createLazyLoadingProxy(property, dbref, callback, handler);
}
return callback.resolve(property);
@@ -112,7 +113,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param callback must not be {@literal null}.
* @return
*/
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Class<?> propertyType = property.getType();
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
@@ -122,7 +124,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
Factory factory = (Factory) objenesis.newInstance(getEnhancedTypeFor(propertyType));
factory.setCallbacks(new Callback[] { interceptor });
return factory;
return handler.populateId(property, dbref, factory);
}
ProxyFactory proxyFactory = new ProxyFactory();
@@ -135,7 +137,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
return handler.populateId(property, dbref, proxyFactory.getProxy());
}
/**
@@ -171,11 +173,12 @@ public class DefaultDbRefResolver implements DbRefResolver {
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;
private final DbRefResolverCallback callback;
private final MongoPersistentProperty property;
@@ -187,8 +190,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
static {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("getTarget");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
FINALIZE_METHOD = Object.class.getDeclaredMethod("finalize");
} catch (Exception e) {
throw new RuntimeException(e);
}
@@ -252,6 +256,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
if (ReflectionUtils.isHashCodeMethod(method)) {
return proxyHashCode(proxy);
}
// DATAMONGO-1076 - finalize methods should not trigger proxy initialization
if (FINALIZE_METHOD.equals(method)) {
return null;
}
}
Object target = ensureResolved();
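The effect of the FINALIZE_METHOD guard in terms of a hypothetical domain model: garbage collection of an unresolved proxy no longer forces a database round trip.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Account {

    @Id String id;

    // Resolved through a lazy-loading proxy on first real access; finalize() calls on an
    // unresolved proxy are now answered without triggering resolution (DATAMONGO-1076).
    @DBRef(lazy = true) Customer customer;
}

@Document
class Customer {

    @Id String id;
    String name;
}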

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Default implementation of {@link DbRefResolverCallback}.
*
* @author Oliver Gierke
*/
class DefaultDbRefResolverCallback implements DbRefResolverCallback {
private final DBObject surroundingObject;
private final ObjectPath path;
private final ValueResolver resolver;
private final SpELExpressionEvaluator evaluator;
/**
* Creates a new {@link DefaultDbRefResolverCallback} using the given {@link DBObject}, {@link ObjectPath},
* {@link ValueResolver} and {@link SpELExpressionEvaluator}.
*
* @param surroundingObject must not be {@literal null}.
* @param path must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public DefaultDbRefResolverCallback(DBObject surroundingObject, ObjectPath path, SpELExpressionEvaluator evaluator,
ValueResolver resolver) {
this.surroundingObject = surroundingObject;
this.path = path;
this.resolver = resolver;
this.evaluator = evaluator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolverCallback#resolve(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
@Override
public Object resolve(MongoPersistentProperty property) {
return resolver.getValueInternal(property, surroundingObject, evaluator, path);
}
}

View File

@@ -57,18 +57,15 @@ abstract class GeoConverters {
*
* @return
*/
@SuppressWarnings("unchecked")
public static Collection<? extends Object> getConvertersToRegister() {
return Arrays.asList( //
BoxToDbObjectConverter.INSTANCE //
, PolygonToDbObjectConverter.INSTANCE //
, CircleToDbObjectConverter.INSTANCE //
, LegacyCircleToDbObjectConverter.INSTANCE //
, SphereToDbObjectConverter.INSTANCE //
, DbObjectToBoxConverter.INSTANCE //
, DbObjectToPolygonConverter.INSTANCE //
, DbObjectToCircleConverter.INSTANCE //
, DbObjectToLegacyCircleConverter.INSTANCE //
, DbObjectToSphereConverter.INSTANCE //
, DbObjectToPointConverter.INSTANCE //
, PointToDbObjectConverter.INSTANCE //
@@ -91,13 +88,11 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Point convert(DBObject source) {
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"),
(Double) source.get("y"));
return source == null ? null : new Point((Double) source.get("x"), (Double) source.get("y"));
}
}
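A sketch of the document shape these converters exchange; the helper methods merely mirror the conversion and are not part of the change set:

import org.springframework.data.geo.Point;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class PointDocumentSketch {

    // A Point is stored as { "x" : <double>, "y" : <double> }.
    static DBObject toDocument(Point point) {
        return new BasicDBObject("x", point.getX()).append("y", point.getY());
    }

    static Point fromDocument(DBObject source) {
        return new Point((Double) source.get("x"), (Double) source.get("y"));
    }
}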
@@ -151,7 +146,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}.
* Converts a {@link BasicDBList} into a {@link Box}.
*
* @author Thomas Darimont
* @since 1.5
@@ -166,7 +161,6 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Box convert(DBObject source) {
if (source == null) {
@@ -176,7 +170,7 @@ abstract class GeoConverters {
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
return new org.springframework.data.mongodb.core.geo.Box(first, second);
return new Box(first, second);
}
}
@@ -210,7 +204,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
* Converts a {@link DBObject} into a {@link Circle}.
*
* @author Thomas Darimont
* @since 1.5
@@ -251,71 +245,6 @@ abstract class GeoConverters {
}
}
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public static enum LegacyCircleToDbObjectConverter implements
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius());
return result;
}
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
@SuppressWarnings("deprecation")
public static enum DbObjectToLegacyCircleConverter implements
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject centerSource = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(centerSource, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
}
}
/**
* Converts a {@link Sphere} into a {@link BasicDBList}.
*
@@ -422,7 +351,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Polygon}.
* Converts a {@link BasicDBList} into a {@link Polygon}.
*
* @author Thomas Darimont
* @since 1.5
@@ -437,7 +366,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings({ "deprecation", "unchecked" })
@SuppressWarnings({ "unchecked" })
public Polygon convert(DBObject source) {
if (source == null) {
@@ -453,7 +382,7 @@ abstract class GeoConverters {
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
}
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints);
return new Polygon(newPoints);
}
}
@@ -472,7 +401,6 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public DBObject convert(GeoCommand source) {
if (source == null) {
@@ -493,10 +421,10 @@ abstract class GeoConverters {
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius().getNormalizedValue());
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
} else if (shape instanceof Circle) {
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter()));
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius());
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius());
} else if (shape instanceof Polygon) {

View File

@@ -23,6 +23,7 @@ import com.mongodb.DBRef;
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.5
*/
public interface LazyLoadingProxy {
@@ -33,7 +34,7 @@ public interface LazyLoadingProxy {
* @return
* @since 1.5
*/
Object initialize();
Object getTarget();
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.
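A minimal sketch of unwrapping a lazily resolved value through the renamed method; the helper itself is hypothetical:

import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;

class ProxyUnwrapSketch {

    // getTarget() replaces the former initialize() and resolves the proxy on first use.
    static Object unwrap(Object candidate) {
        return candidate instanceof LazyLoadingProxy ? ((LazyLoadingProxy) candidate).getTarget() : candidate;
    }
}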

View File

@@ -31,7 +31,7 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
@@ -56,6 +56,7 @@ import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import com.mongodb.BasicDBList;
@@ -73,7 +74,9 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
private static final String INCOMPATIBLE_TYPES = "Cannot convert %1$s of type %2$s into an instance of %3$s! Implement a custom Converter<%2$s, %3$s> and register it with the CustomConversions. Parent object was: %4$s";
protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class);
@@ -81,6 +84,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;
@@ -93,11 +97,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(ConversionServiceFactory.createDefaultConversionService());
super(new DefaultConversionService());
Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
@@ -132,8 +135,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param typeMapper the typeMapper to set
*/
public void setTypeMapper(MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
mappingContext) : typeMapper;
this.typeMapper = typeMapper == null
? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext) : typeMapper;
}
/*
@@ -184,11 +187,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo) {
return read(type, dbo, null);
return read(type, dbo, ObjectPath.ROOT);
}
@SuppressWarnings("unchecked")
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo, Object parent) {
private <S extends Object> S read(TypeInformation<S> type, DBObject dbo, ObjectPath path) {
if (null == dbo) {
return null;
@@ -206,11 +209,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, parent);
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, path);
}
if (typeToUse.isMap()) {
return (S) readMap(typeToUse, dbo, parent);
return (S) readMap(typeToUse, dbo, path);
}
if (dbo instanceof BasicDBList) {
throw new MappingException(String.format(INCOMPATIBLE_TYPES, dbo, BasicDBList.class, typeToUse.getType(), path));
}
// Retrieve persistent entity info
@@ -220,41 +227,56 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for " + rawType.getName());
}
return read(persistentEntity, dbo, parent);
return read(persistentEntity, dbo, path);
}
private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity,
DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
DBObject source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, parent);
entity, provider, path.getCurrentObject());
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
parent);
path);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {
final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, parent);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, path);
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider);
final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();
final S result = wrapper.getBean();
// make sure id property is set before all other properties
Object idValue = null;
if (idProperty != null) {
idValue = getValueInternal(idProperty, dbo, evaluator, path);
wrapper.setProperty(idProperty, idValue);
}
final ObjectPath currentPath = path.push(result, entity, idValue);
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
// we skip the id property since it was already set
if (idProperty != null && idProperty.equals(prop)) {
return;
}
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
return;
}
Object obj = getValueInternal(prop, dbo, evaluator, result);
wrapper.setProperty(prop, obj);
wrapper.setProperty(prop, getValueInternal(prop, dbo, evaluator, currentPath));
}
});
@@ -262,19 +284,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty property = association.getInverse();
final MongoPersistentProperty property = association.getInverse();
Object value = dbo.get(property.getFieldName());
if (value == null) {
return;
}
Object value = dbo.get(property.getName());
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
@Override
public Object resolve(MongoPersistentProperty property) {
return getValueInternal(property, dbo, evaluator, parent);
}
});
DbRefProxyHandler handler = new DefaultDbRefProxyHandler(spELContext, mappingContext,
MappingMongoConverter.this);
DbRefResolverCallback callback = new DefaultDbRefResolverCallback(dbo, currentPath, evaluator,
MappingMongoConverter.this);
wrapper.setProperty(property, obj);
wrapper.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
}
});
@@ -314,14 +338,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
boolean handledByCustomConverter = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class) != null;
TypeInformation<? extends Object> type = ClassTypeInformation.from(obj.getClass());
Class<?> entityType = obj.getClass();
boolean handledByCustomConverter = conversions.getCustomWriteTarget(entityType, DBObject.class) != null;
TypeInformation<? extends Object> type = ClassTypeInformation.from(entityType);
if (!handledByCustomConverter && !(dbo instanceof BasicDBList)) {
typeMapper.writeType(type, dbo);
}
writeInternal(obj, dbo, type);
Object target = obj instanceof LazyLoadingProxy ? ((LazyLoadingProxy) obj).getTarget() : obj;
writeInternal(target, dbo, type);
}
/**
@@ -337,7 +364,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
Class<?> customTarget = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class);
Class<?> entityType = obj.getClass();
Class<?> customTarget = conversions.getCustomWriteTarget(entityType, DBObject.class);
if (customTarget != null) {
DBObject result = conversionService.convert(obj, DBObject.class);
@@ -345,17 +373,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
if (Map.class.isAssignableFrom(obj.getClass())) {
if (Map.class.isAssignableFrom(entityType)) {
writeMapInternal((Map<Object, Object>) obj, dbo, ClassTypeInformation.MAP);
return;
}
if (Collection.class.isAssignableFrom(obj.getClass())) {
if (Collection.class.isAssignableFrom(entityType)) {
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) dbo);
return;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(obj.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityType);
writeInternal(obj, dbo, entity);
addCustomTypeKeyIfNecessary(typeHint, obj, dbo);
}
@@ -462,7 +490,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).initialize();
obj = ((LazyLoadingProxy) obj).getTarget();
}
// Lookup potential custom target type
@@ -478,8 +506,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
: new BasicDBObject();
addCustomTypeKeyIfNecessary(type, obj, propDbObj);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass())
? mappingContext.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
writeInternal(obj, propDbObj, entity);
accessor.put(prop, propDbObj);
@@ -559,7 +587,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.isSimpleType(key.getClass())) {
String simpleKey = potentiallyEscapeMapKey(key.toString());
String simpleKey = prepareMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property) : null);
} else {
@@ -611,12 +639,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected DBObject writeMapInternal(Map<Object, Object> obj, DBObject dbo, TypeInformation<?> propertyType) {
for (Map.Entry<Object, Object> entry : obj.entrySet()) {
Object key = entry.getKey();
Object val = entry.getValue();
if (conversions.isSimpleType(key.getClass())) {
// Don't use conversion service here as removal of ObjectToString converter results in some primitive types not
// being convertable
String simpleKey = potentiallyEscapeMapKey(key.toString());
String simpleKey = prepareMapKey(key);
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection || val.getClass().isArray()) {
@@ -637,6 +666,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbo;
}
/**
* Prepares the given {@link Map} key to be used as a {@link String} key. Will invoke potentially registered custom
* conversions and escape dots in the result, as they are not supported in {@link Map} keys in MongoDB.
*
* @param key must not be {@literal null}.
* @return
*/
private String prepareMapKey(Object key) {
Assert.notNull(key, "Map key must not be null!");
String convertedKey = potentiallyConvertMapKey(key);
return potentiallyEscapeMapKey(convertedKey);
}
/**
* Potentially replaces dots in the given map key with the configured map key replacement, or aborts the conversion
* if no replacement is configured.
@@ -652,13 +696,31 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (mapKeyDotReplacement == null) {
throw new MappingException(String.format("Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!", source));
throw new MappingException(String.format(
"Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!",
source));
}
return source.replaceAll("\\.", mapKeyDotReplacement);
}
/**
* Returns a {@link String} representation of the given {@link Map} key.
*
* @param key
* @return
*/
private String potentiallyConvertMapKey(Object key) {
if (key instanceof String) {
return (String) key;
}
return conversions.hasCustomWriteTarget(key.getClass(), String.class)
? (String) getPotentiallyConvertedSimpleWrite(key) : key.toString();
}
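The two helpers above boil down to a two-step key preparation: convert the key to a String (consulting a custom converter if one is registered) and then escape dots. A minimal sketch of that contract follows; the replacement value "~" is a made-up example and key.toString() stands in for a potentially registered key converter, so this is illustrative only and does not call the converter itself.

```java
class MapKeyPreparationSketch {

	static String prepareMapKey(Object key, String mapKeyDotReplacement) {

		// step 1: turn the key into a String (custom conversions would be consulted first)
		String converted = key instanceof String ? (String) key : key.toString();

		// step 2: escape dots, since MongoDB does not allow them in field names
		if (!converted.contains(".")) {
			return converted;
		}

		if (mapKeyDotReplacement == null) {
			throw new IllegalArgumentException(
					"Map key " + converted + " contains dots but no replacement was configured!");
		}

		return converted.replaceAll("\\.", mapKeyDotReplacement);
	}

	public static void main(String[] args) {
		System.out.println(prepareMapKey("1.5", "~")); // prints "1~5"
	}
}
```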
/**
* Translates the configured map key replacement in the given, just-read key back into a dot, in case a map key
* replacement has been configured.
@@ -682,10 +744,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
TypeInformation<?> actualType = type != null ? type.getActualType() : null;
Class<?> reference = actualType == null ? Object.class : actualType.getType();
Class<?> valueType = ClassUtils.getUserClass(value.getClass());
boolean notTheSameClass = !value.getClass().equals(reference);
boolean notTheSameClass = !valueType.equals(reference);
if (notTheSameClass) {
typeMapper.writeType(value.getClass(), dbObject);
typeMapper.writeType(valueType, dbObject);
}
}
@@ -738,7 +801,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) {
if (value == null || target == null) {
if (value == null || target == null || target.isAssignableFrom(value.getClass())) {
return value;
}
@@ -750,7 +813,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return Enum.valueOf((Class<Enum>) target, value.toString());
}
return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
return conversionService.convert(value, target);
}
protected DBRef createDBRef(Object target, MongoPersistentProperty property) {
@@ -791,11 +854,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
idMapper.convertId(id));
}
protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
Object parent) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(dbo, spELContext, parent);
return provider.getPropertyValue(prop);
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.ValueResolver#getValueInternal(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, com.mongodb.DBObject, org.springframework.data.mapping.model.SpELExpressionEvaluator, java.lang.Object)
*/
@Override
public Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath path) {
return new MongoDbPropertyValueProvider(dbo, evaluator, path).getPropertyValue(prop);
}
/**
@@ -803,11 +869,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, ObjectPath path) {
Assert.notNull(targetType);
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.getType();
@@ -819,18 +887,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, rawComponentType, sourceValue.size());
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
parent));
items.add(
DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem), path));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
items.add(read(componentType, (DBObject) dbObjItem, path));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
}
@@ -843,13 +911,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Reads the given {@link DBObject} into a {@link Map}. Will recursively resolve nested {@link Map}s as well.
*
* @param type the {@link Map} {@link TypeInformation} to be used to unmarshall this {@link DBObject}.
* @param dbObject
* @param dbObject must not be {@literal null}
* @param path must not be {@literal null}
* @return
*/
@SuppressWarnings("unchecked")
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, Object parent) {
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, ObjectPath path) {
Assert.notNull(dbObject);
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
@@ -876,7 +946,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object value = entry.getValue();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
map.put(key, read(valueType, (DBObject) value, path));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
} else {
@@ -888,21 +958,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return map;
}
protected <T> List<?> unwrapList(BasicDBList dbList, TypeInformation<T> targetType) {
List<Object> rootList = new ArrayList<Object>();
for (int i = 0; i < dbList.size(); i++) {
Object obj = dbList.get(i);
if (obj instanceof BasicDBList) {
rootList.add(unwrapList((BasicDBList) obj, targetType.getComponentType()));
} else if (obj instanceof DBObject) {
rootList.add(read(targetType, (DBObject) obj));
} else {
rootList.add(obj);
}
}
return rootList;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
@@ -924,7 +979,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;
TypeInformation<?> typeHint = typeInformation;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);
@@ -959,10 +1014,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.write(obj, newDbo);
if (typeInformation == null) {
return removeTypeInfoRecursively(newDbo);
return removeTypeInfo(newDbo, true);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
if (typeInformation.getType().equals(NestedDocument.class)) {
return removeTypeInfo(newDbo, false);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfo(newDbo, true);
}
public BasicDBList maybeConvertList(Iterable<?> source, TypeInformation<?> typeInformation) {
@@ -976,12 +1035,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
/**
* Removes the type information from the conversion result.
* Removes the type information from the entire conversion result.
*
* @param object
* @param recursively whether to apply the removal recursively
* @return
*/
private Object removeTypeInfoRecursively(Object object) {
private Object removeTypeInfo(Object object, boolean recursively) {
if (!(object instanceof DBObject)) {
return object;
@@ -989,19 +1049,29 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
if (recursively) {
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfo(element, recursively);
}
} else {
removeTypeInfo(value, recursively);
}
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element);
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
if (!recursively) {
break;
}
} else {
removeTypeInfoRecursively(value);
}
}
@@ -1012,24 +1082,34 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbObject;
}
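For illustration, a standalone sketch of the two removal modes, assuming the default "_class" type key (the real code asks the configured type mapper instead of hard-coding the key, and also walks BasicDBList elements):

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class RemoveTypeInfoSketch {

	static void removeTypeInfo(Object object, boolean recursively) {

		if (!(object instanceof DBObject)) {
			return;
		}

		DBObject dbObject = (DBObject) object;

		if (recursively) {
			for (String key : dbObject.keySet()) {
				removeTypeInfo(dbObject.get(key), true);
			}
		}

		dbObject.removeField("_class"); // assumed default type key
	}

	public static void main(String[] args) {

		DBObject nested = new BasicDBObject("_class", "com.example.Address").append("city", "Berlin");
		DBObject dbo = new BasicDBObject("_class", "com.example.Person").append("address", nested);

		// top-level-only removal: the nested "_class" hint survives, as needed for update mapping
		removeTypeInfo(dbo, false);
		System.out.println(dbo);
	}
}
```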
/**
* {@link PropertyValueProvider} to evaluate a SpEL expression if present on the property or to simply access the
* field of the configured source {@link DBObject}.
*
* @author Oliver Gierke
*/
private class MongoDbPropertyValueProvider implements PropertyValueProvider<MongoPersistentProperty> {
private final DBObjectAccessor source;
private final SpELExpressionEvaluator evaluator;
private final Object parent;
private final ObjectPath path;
public MongoDbPropertyValueProvider(DBObject source, SpELContext factory, Object parent) {
this(source, new DefaultSpELExpressionEvaluator(source, factory), parent);
}
public MongoDbPropertyValueProvider(DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
/**
* Creates a new {@link MongoDbPropertyValueProvider} for the given source, {@link SpELExpressionEvaluator} and
* {@link ObjectPath}.
*
* @param source must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param path can be {@literal null}.
*/
public MongoDbPropertyValueProvider(DBObject source, SpELExpressionEvaluator evaluator, ObjectPath path) {
Assert.notNull(source);
Assert.notNull(evaluator);
this.source = new DBObjectAccessor(source);
this.evaluator = evaluator;
this.parent = parent;
this.path = path;
}
/*
@@ -1045,7 +1125,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
return readValue(value, property.getTypeInformation(), parent);
return readValue(value, property.getTypeInformation(), path);
}
}
@@ -1055,10 +1135,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @author Oliver Gierke
*/
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private class ConverterAwareSpELExpressionParameterValueProvider
extends SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final Object parent;
private final ObjectPath path;
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
@@ -1068,10 +1148,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, Object parent) {
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate,
ObjectPath path) {
super(evaluator, conversionService, delegate);
this.parent = parent;
this.path = path;
}
/*
@@ -1080,28 +1161,44 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
@Override
protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
return readValue(object, parameter.getType(), parent);
return readValue(object, parameter.getType(), path);
}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
private <T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {
Class<?> rawType = type.getType();
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, readRef((DBRef) value), parent));
return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
return (T) readCollectionOrArray(type, (BasicDBList) value, path);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
return (T) read(type, (DBObject) value, path);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}
@SuppressWarnings("unchecked")
private <T> T potentiallyReadOrResolveDbRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) {
if (rawType.equals(DBRef.class)) {
return (T) dbref;
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getRef());
if (object != null) {
return (T) object;
}
return (T) read(type, readRef(dbref), path);
}
/**
* Performs the fetch operation for the given {@link DBRef}.
*
@@ -1111,4 +1208,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject readRef(DBRef ref) {
return ref.fetch();
}
/**
* Marker class used to indicate that we have a non-root document object here that might be used within an update, so
* we need to preserve type hints for potentially nested elements but remove them at the top level.
*
* @author Christoph Strobl
* @since 1.8
*/
static class NestedDocument {
}
}

View File

@@ -25,6 +25,8 @@ import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
@@ -34,6 +36,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
abstract class MongoConverters {
@@ -160,4 +163,19 @@ abstract class MongoConverters {
return source == null ? null : source.toString();
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
@WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> {
INSTANCE;
@Override
public String convert(Term source) {
return source == null ? null : source.getFormatted();
}
}
}
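For reference, a hedged sketch of what the new writing converter produces. It only compiles alongside MongoConverters in the same package, since that class is package-private, and the Term value is a made-up example; in practice the converter is registered internally and user code never calls it directly.

```java
import org.springframework.data.mongodb.core.query.Term;

class TermToStringSketch {

	public static void main(String[] args) {

		Term term = new Term("coffee");

		// the converter simply delegates to Term.getFormatted()
		String formatted = MongoConverters.TermToStringConverter.INSTANCE.convert(term);
		System.out.println(formatted);
	}
}
```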

View File

@@ -0,0 +1,182 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
/**
* A path of objects nested into each other. The type allows access to all parent objects currently being created,
* even when resolving more deeply nested objects. This avoids re-resolving object instances that are logically
* equivalent to already resolved ones.
* <p>
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths start
* at {@link #ROOT} and are extended via {@link #push(Object, MongoPersistentEntity, Object)}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.6
*/
class ObjectPath {
public static final ObjectPath ROOT = new ObjectPath();
private final List<ObjectPathItem> items;
private ObjectPath() {
this.items = Collections.emptyList();
}
/**
* Creates a new {@link ObjectPath} from the given parent {@link ObjectPath} by adding the provided
* {@link ObjectPathItem} to it.
*
* @param parent must not be {@literal null}.
* @param item
*/
private ObjectPath(ObjectPath parent, ObjectPath.ObjectPathItem item) {
List<ObjectPath.ObjectPathItem> items = new ArrayList<ObjectPath.ObjectPathItem>(parent.items);
items.add(item);
this.items = Collections.unmodifiableList(items);
}
/**
* Returns a copy of the {@link ObjectPath} with the given {@link Object} as current object.
*
* @param object must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param id must not be {@literal null}.
* @return
*/
public ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
ObjectPathItem item = new ObjectPathItem(object, id, entity.getCollection());
return new ObjectPath(this, item);
}
/**
* Returns the object with the given id that is stored in the given collection, if it is contained in the
* {@link ObjectPath}.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
public Object getPathItem(Object id, String collection) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null or empty!");
for (ObjectPathItem item : items) {
Object object = item.getObject();
if (object == null) {
continue;
}
if (item.getIdValue() == null) {
continue;
}
if (collection.equals(item.getCollection()) && id.equals(item.getIdValue())) {
return object;
}
}
return null;
}
/**
* Returns the current object of the {@link ObjectPath} or {@literal null} if the path is empty.
*
* @return
*/
public Object getCurrentObject() {
return items.isEmpty() ? null : items.get(items.size() - 1).getObject();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
if (items.isEmpty()) {
return "[empty]";
}
List<String> strings = new ArrayList<String>(items.size());
for (ObjectPathItem item : items) {
strings.add(item.object.toString());
}
return StringUtils.collectionToDelimitedString(strings, " -> ");
}
/**
* An item in an {@link ObjectPath}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class ObjectPathItem {
private final Object object;
private final Object idValue;
private final String collection;
/**
* Creates a new {@link ObjectPathItem}.
*
* @param object
* @param idValue
* @param collection
*/
ObjectPathItem(Object object, Object idValue, String collection) {
this.object = object;
this.idValue = idValue;
this.collection = collection;
}
public Object getObject() {
return object;
}
public Object getIdValue() {
return idValue;
}
public String getCollection() {
return collection;
}
}
}
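A minimal usage sketch of the new type, assuming it sits in the same package (ObjectPath is package-private); the Customer entity and the id value are invented for illustration. The point is that a DBRef pointing back to an object already on the path can be answered from the path instead of being fetched and converted again.

```java
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;

class ObjectPathSketch {

	static class Customer {
		String id;
	}

	public static void main(String[] args) {

		MongoMappingContext context = new MongoMappingContext();
		MongoPersistentEntity<?> entity = context.getPersistentEntity(Customer.class);

		Customer customer = new Customer();

		// every entity being read is pushed onto the path together with its id and collection
		ObjectPath path = ObjectPath.ROOT.push(customer, entity, "customer-42");

		// a reference back into the path is answered from it instead of being re-resolved
		Object cached = path.getPathItem("customer-42", entity.getCollection());
		System.out.println(cached == customer); // true
	}
}
```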

View File

@@ -34,10 +34,13 @@ import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter.NestedDocument;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
@@ -57,6 +60,12 @@ import com.mongodb.DBRef;
public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
}
private final ConversionService conversionService;
private final MongoConverter converter;
@@ -119,6 +128,61 @@ public class QueryMapper {
return result;
}
/**
* Maps fields used for sorting to the {@link MongoPersistentEntity}'s properties. <br />
* Also converts properties to their {@code $meta} representation if present.
*
* @param sortObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedSort(DBObject sortObject, MongoPersistentEntity<?> entity) {
if (sortObject == null) {
return null;
}
DBObject mappedSort = getMappedObject(sortObject, entity);
mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
return mappedSort;
}
/**
* Maps fields to retrieve to the {@link MongoPersistentEntity}'s properties. <br />
* Also converts and potentially adds a missing {@code $meta} representation for properties.
*
* @param fieldsObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedFields(DBObject fieldsObject, MongoPersistentEntity<?> entity) {
DBObject mappedFields = fieldsObject != null ? getMappedObject(fieldsObject, entity) : new BasicDBObject();
mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
return mappedFields.keySet().isEmpty() ? null : mappedFields;
}
private void mapMetaAttributes(DBObject source, MongoPersistentEntity<?> entity, MetaMapping metaMapping) {
if (entity == null || source == null) {
return;
}
if (entity.hasTextScoreProperty() && !MetaMapping.IGNORE.equals(metaMapping)) {
MongoPersistentProperty textScoreProperty = entity.getTextScoreProperty();
if (MetaMapping.FORCE.equals(metaMapping)
|| (MetaMapping.WHEN_PRESENT.equals(metaMapping) && source.containsField(textScoreProperty.getFieldName()))) {
source.putAll(getMappedTextScoreField(textScoreProperty));
}
}
}
private DBObject getMappedTextScoreField(MongoPersistentProperty property) {
return new BasicDBObject(property.getFieldName(), META_TEXT_SCORE);
}
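The forced meta mapping effectively appends a text-score `$meta` entry to the mapped fields or sort document. A sketch of the resulting DBObject, assuming a hypothetical text-score property mapped to the field name "score" alongside a user-specified projection:

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class TextScoreMappingSketch {

	public static void main(String[] args) {

		// what getMappedFields(...) amounts to for the assumed "score" property
		DBObject mappedFields = new BasicDBObject("firstname", 1)
				.append("score", new BasicDBObject("$meta", "textScore"));

		System.out.println(mappedFields); // { "firstname" : 1 , "score" : { "$meta" : "textScore" } }
	}
}
```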
/**
* Extracts the mapped object value for the given field out of rawValue, taking nested {@link Keyword}s into account.
*
@@ -190,8 +254,8 @@ public class QueryMapper {
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = keyword.getValue();
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property) : getMappedValue(
property.with(keyword.getKey()), value);
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property)
: getMappedValue(property.with(keyword.getKey()), value);
return new BasicDBObject(keyword.key, convertedValue);
}
@@ -274,7 +338,8 @@ public class QueryMapper {
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
return entity.hasIdProperty()
&& (type.equals(DBRef.class) || entity.getIdProperty().getActualType().isAssignableFrom(type));
}
/**
@@ -322,10 +387,16 @@ public class QueryMapper {
*/
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
if (property == null || source == null || source instanceof DBObject) {
return source;
}
if (source instanceof DBRef) {
DBRef ref = (DBRef) source;
return new DBRef(ref.getDB(), ref.getRef(), convertId(ref.getId()));
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
@@ -397,13 +468,20 @@ public class QueryMapper {
*/
public Object convertId(Object id) {
try {
return conversionService.convert(id, ObjectId.class);
} catch (ConversionException e) {
// Ignore
if (id == null) {
return null;
}
return delegateConvertToMongoType(id, null);
if (id instanceof String) {
return ObjectId.isValid(id.toString()) ? conversionService.convert(id, ObjectId.class) : id;
}
try {
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService.convert(id, ObjectId.class)
: delegateConvertToMongoType(id, null);
} catch (ConversionException o_O) {
return delegateConvertToMongoType(id, null);
}
}
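A sketch of the observable id-conversion behavior described above (not the QueryMapper internals themselves); the sample id values are made up. String ids are only turned into ObjectIds when they are valid ObjectId hex values, everything else is left to the regular conversion path.

```java
import org.bson.types.ObjectId;

class ConvertIdSketch {

	static Object convertId(Object id) {

		if (id == null) {
			return null;
		}

		if (id instanceof String) {
			// only valid 24-character hex strings become ObjectIds; others stay as-is
			return ObjectId.isValid(id.toString()) ? new ObjectId(id.toString()) : id;
		}

		return id; // non-String ids go through the ConversionService / Mongo type conversion
	}

	public static void main(String[] args) {
		System.out.println(convertId("4711"));                     // stays "4711"
		System.out.println(convertId("507f1f77bcf86cd799439011")); // becomes an ObjectId
	}
}
```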
/**
@@ -583,6 +661,10 @@ public class QueryMapper {
public Association<MongoPersistentProperty> getAssociation() {
return null;
}
public TypeInformation<?> getTypeHint() {
return ClassTypeInformation.OBJECT;
}
}
/**
@@ -725,7 +807,7 @@ public class QueryMapper {
*/
@Override
public String getMappedKey() {
return path == null ? name : path.toDotPath(getPropertyConverter());
return path == null ? name : path.toDotPath(isAssociation() ? getAssociationConverter() : getPropertyConverter());
}
protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
@@ -742,7 +824,7 @@ public class QueryMapper {
try {
PropertyPath path = PropertyPath.from(pathExpression, entity.getTypeInformation());
PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d", ""), entity.getTypeInformation());
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
@@ -777,5 +859,77 @@ public class QueryMapper {
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return PropertyToFieldNameConverter.INSTANCE;
}
/**
* Return the {@link Converter} to use for creating the mapped key of an association. Default implementation is
* {@link AssociationConverter}.
*
* @return
* @since 1.7
*/
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new AssociationConverter(getAssociation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTypeHint()
*/
@Override
public TypeInformation<?> getTypeHint() {
MongoPersistentProperty property = getProperty();
if (property == null) {
return super.getTypeHint();
}
if (property.getActualType().isInterface()
|| java.lang.reflect.Modifier.isAbstract(property.getActualType().getModifiers())) {
return ClassTypeInformation.OBJECT;
}
return NESTED_DOCUMENT;
}
}
/**
* Converter to skip all properties after an association property has been rendered.
*
* @author Oliver Gierke
*/
protected static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -65,8 +66,8 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
entity.getTypeInformation());
return converter.convertToMongoType(source,
entity == null ? ClassTypeInformation.OBJECT : getTypeHintForEntity(source, entity));
}
/*
@@ -89,7 +90,7 @@ public class UpdateMapper extends QueryMapper {
return getMappedUpdateModifier(field, rawValue);
}
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
return super.getMappedObjectForField(field, rawValue);
}
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
@@ -97,14 +98,14 @@ public class UpdateMapper extends QueryMapper {
if (rawValue instanceof Modifier) {
value = getMappedValue((Modifier) rawValue);
value = getMappedValue(field, (Modifier) rawValue);
} else if (rawValue instanceof Modifiers) {
DBObject modificationOperations = new BasicDBObject();
for (Modifier modifier : ((Modifiers) rawValue).getModifiers()) {
modificationOperations.putAll(getMappedValue(modifier).toMap());
modificationOperations.putAll(getMappedValue(field, modifier).toMap());
}
value = modificationOperations;
@@ -132,12 +133,31 @@ public class UpdateMapper extends QueryMapper {
return value instanceof Query;
}
private DBObject getMappedValue(Modifier modifier) {
private DBObject getMappedValue(Field field, Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
TypeInformation<?> typeHint = field == null ? ClassTypeInformation.OBJECT : field.getTypeHint();
Object value = converter.convertToMongoType(modifier.getValue(), typeHint);
return new BasicDBObject(modifier.getKey(), value);
}
private TypeInformation<?> getTypeHintForEntity(Object source, MongoPersistentEntity<?> entity) {
return processTypeHintForNestedDocuments(source, entity.getTypeInformation());
}
private TypeInformation<?> processTypeHintForNestedDocuments(Object source, TypeInformation<?> info) {
Class<?> type = info.getActualType().getType();
if (type.isInterface() || java.lang.reflect.Modifier.isAbstract(type.getModifiers())) {
return info;
}
if (!type.equals(source.getClass())) {
return info;
}
return NESTED_DOCUMENT;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper#createPropertyField(org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.String, org.springframework.data.mapping.context.MappingContext)
@@ -146,8 +166,8 @@ public class UpdateMapper extends QueryMapper {
protected Field createPropertyField(MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
return entity == null ? super.createPropertyField(entity, key, mappingContext) : //
new MetadataBackedUpdateField(entity, key, mappingContext);
return entity == null ? super.createPropertyField(entity, key, mappingContext)
: new MetadataBackedUpdateField(entity, key, mappingContext);
}
/**
@@ -194,47 +214,76 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
return new UpdatePropertyConverter(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getAssociationConverter()
*/
@Override
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new UpdateAssociationConverter(getAssociation(), key);
}
/**
* Converter to skip all properties after an association property was rendered.
* Special mapper handling positional parameter {@literal $} within property names.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.7
*/
private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private static class UpdateKeyMapper {
private final MongoPersistentProperty property;
private boolean associationFound;
private final Iterator<String> iterator;
protected UpdateKeyMapper(String rawKey) {
Assert.hasText(rawKey, "Key must not be null or empty!");
this.iterator = Arrays.asList(rawKey.split("\\.")).iterator();
this.iterator.next();
}
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
* Maps the property name while retaining potential positional operator {@literal $}.
*
* @param association must not be {@literal null}.
* @param property
* @return
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
protected String mapPropertyName(MongoPersistentProperty property) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
boolean inspect = iterator.hasNext();
while (inspect) {
if (associationFound) {
return null;
String partial = iterator.next();
boolean isPositional = isPositionalParameter(partial);
if (isPositional) {
mappedName += "." + partial;
}
inspect = isPositional && iterator.hasNext();
}
if (property.equals(source)) {
associationFound = true;
return mappedName;
}
boolean isPositionalParameter(String partial) {
if (partial.equals("$")) {
return true;
}
return source.getFieldName();
try {
Long.valueOf(partial);
return true;
} catch (NumberFormatException e) {
return false;
}
}
}
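A standalone sketch of the key mapping the new UpdateKeyMapper performs for a made-up update key; it mirrors, rather than calls, the package-private class. Positional operators and numeric indexes following the property segment are retained in the mapped key.

```java
import java.util.Arrays;
import java.util.Iterator;

class UpdateKeyMapperSketch {

	static String mapSegment(String rawKey, String mappedFieldName) {

		Iterator<String> iterator = Arrays.asList(rawKey.split("\\.")).iterator();
		iterator.next(); // skip the segment that corresponds to the property itself

		String mappedName = mappedFieldName;

		while (iterator.hasNext()) {

			String partial = iterator.next();
			boolean positional = partial.equals("$") || partial.matches("\\d+");

			if (!positional) {
				break;
			}

			mappedName += "." + partial; // keep positional operator / numeric index
		}

		return mappedName;
	}

	public static void main(String[] args) {
		// for an update key like "items.$.price" the mapped field name keeps the "$"
		System.out.println(mapSegment("items.$.price", "items")); // items.$
	}
}
```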
/**
@@ -242,10 +291,11 @@ public class UpdateMapper extends QueryMapper {
* contained in the source update key.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
private final Iterator<String> iterator;
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link UpdatePropertyConverter} with the given update key.
@@ -256,8 +306,7 @@ public class UpdateMapper extends QueryMapper {
Assert.hasText(updateKey, "Update key must not be null or empty!");
this.iterator = Arrays.asList(updateKey.split("\\.")).iterator();
this.iterator.next();
this.mapper = new UpdateKeyMapper(updateKey);
}
/*
@@ -266,9 +315,37 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
public String convert(MongoPersistentProperty property) {
return mapper.mapPropertyName(property);
}
}
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
/**
* {@link Converter} retaining positional parameter {@literal $} for {@link Association}s.
*
* @author Christoph Strobl
*/
protected static class UpdateAssociationConverter extends AssociationConverter {
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public UpdateAssociationConverter(Association<MongoPersistentProperty> association, String key) {
super(association);
this.mapper = new UpdateKeyMapper(key);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
return super.convert(source) == null ? null : mapper.mapPropertyName(source);
}
}
}

View File

@@ -0,0 +1,42 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Internal API to trigger the resolution of properties.
*
* @author Oliver Gierke
*/
interface ValueResolver {
/**
* Resolves the value for the given {@link MongoPersistentProperty} within the given {@link DBObject} using the given
* {@link SpELExpressionEvaluator} and {@link ObjectPath}.
*
* @param prop
* @param dbo
* @param evaluator
* @param parent
* @return
*/
Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath parent);
}

View File

@@ -1,75 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Represents a geospatial box value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Box extends org.springframework.data.geo.Box implements Shape {
public static final String COMMAND = "$box";
public Box(Point lowerLeft, Point upperRight) {
super(lowerLeft, upperRight);
}
public Box(double[] lowerLeft, double[] upperRight) {
super(lowerLeft, upperRight);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
return list;
}
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
}
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
}

View File

@@ -1,154 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value.
* <p>
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing its distance concept
* would break the clients that use the old Circle API.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
* removed in the next major release.
*/
@Deprecated
public class Circle implements Shape {
public static final String COMMAND = "$center";
private final Point center;
private final double radius;
/**
* Creates a new {@link Circle} from the given {@link Point} and radius.
*
* @param center must not be {@literal null}.
* @param radius must be greater or equal to zero.
*/
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center);
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
* {@link Metrics#NEUTRAL}.
*
* @param centerX
* @param centerY
* @param radius must be greater or equal to zero.
*/
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return center;
}
/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public double getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
result.add(getRadius());
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%f]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Circle that = (Circle) obj;
return this.center.equals(that.center) && this.radius == that.radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius;
return result;
}
}

View File

@@ -1,37 +0,0 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
*/
public CustomMetric(double multiplier) {
super(multiplier);
}
}

View File

@@ -1,44 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* Value object to represent distances in a given metric.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Distance extends org.springframework.data.geo.Distance {
/**
* Creates a new {@link Distance}.
*
* @param value
*/
public Distance(double value) {
this(value, Metrics.NEUTRAL);
}
public Distance(double value, Metric metric) {
super(value, metric);
}
}

View File

@@ -1,54 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
/**
* Custom {@link Page} to carry the average distance retrieved from the {@link GeoResults} the {@link GeoPage} is set up
* from.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoPage}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
private static final long serialVersionUID = 23421312312412L;
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}.
*
* @param content must not be {@literal null}.
*/
public GeoPage(GeoResults<T> results) {
super(results);
}
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}, {@link Pageable} and total.
*
* @param results must not be {@literal null}.
* @param pageable must not be {@literal null}.
* @param total
*/
public GeoPage(GeoResults<T> results, Pageable pageable, long total) {
super(results, pageable, total);
}
}

View File

@@ -1,38 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object capturing some arbitrary object plus a distance.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResult}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
/**
* Creates a new {@link GeoResult} for the given content and distance.
*
* @param content must not be {@literal null}.
* @param distance must not be {@literal null}.
*/
public GeoResult(T content, Distance distance) {
super(content, distance);
}
}

View File

@@ -1,60 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
/**
* Value object to capture {@link GeoResult}s as well as the average distance they have.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResults}. This class is scheduled
* to be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
/**
* Creates a new {@link GeoResults} instance manually calculating the average distance from the distance values of the
* given {@link GeoResult}s.
*
* @param results must not be {@literal null}.
*/
public GeoResults(List<? extends GeoResult<T>> results) {
super(results);
}
public GeoResults(List<? extends GeoResult<T>> results, Metric metric) {
super(results, metric);
}
/**
* Creates a new {@link GeoResults} instance from the given {@link GeoResult}s and average distance.
*
* @param results must not be {@literal null}.
* @param averageDistance
*/
@PersistenceConstructor
public GeoResults(List<? extends GeoResult<T>> results, Distance averageDistance) {
super(results, averageDistance);
}
}
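The wrappers removed here only delegated to their Spring Data Commons counterparts, so callers can construct the org.springframework.data.geo types directly. A minimal sketch (the Venue type is made up for illustration):

import java.util.Arrays;

import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metrics;

class GeoResultsSketch {

    static class Venue {
        final String name;
        Venue(String name) { this.name = name; }
    }

    public static void main(String[] args) {

        GeoResult<Venue> first = new GeoResult<Venue>(new Venue("Berlin"), new Distance(2.5, Metrics.KILOMETERS));
        GeoResult<Venue> second = new GeoResult<Venue>(new Venue("Potsdam"), new Distance(25, Metrics.KILOMETERS));

        // The replacement type derives the average distance from the given results.
        GeoResults<Venue> results = new GeoResults<Venue>(Arrays.asList(first, second));
        System.out.println(results.getAverageDistance());
    }
}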

View File

@@ -1,48 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.mongodb.core.query.NearQuery;
/**
* Commonly used {@link Metrics} for {@link NearQuery}s.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metrics}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public enum Metrics implements Metric {
KILOMETERS(org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier()), //
MILES(org.springframework.data.geo.Metrics.MILES.getMultiplier()), //
NEUTRAL(org.springframework.data.geo.Metrics.NEUTRAL.getMultiplier()); //
private final double multiplier;
private Metrics(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
}
}

View File

@@ -1,55 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
/**
* Represents a geospatial point value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Point extends org.springframework.data.geo.Point {
@PersistenceConstructor
public Point(double x, double y) {
super(x, y);
}
public Point(org.springframework.data.geo.Point point) {
super(point);
}
public double[] asArray() {
return new double[] { getX(), getY() };
}
public List<Double> asList() {
return asList(this);
}
public static List<Double> asList(org.springframework.data.geo.Point point) {
return Arrays.asList(point.getX(), point.getY());
}
}

View File

@@ -1,92 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Simple value object to represent a {@link Polygon}.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Polygon}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Polygon extends org.springframework.data.geo.Polygon implements Shape {
public static final String COMMAND = "$polygon";
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param x
* @param y
* @param z
* @param others
*/
public <P extends Point> Polygon(P x, P y, P z, P... others) {
super(x, y, z, others);
}
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param points
*/
public <P extends Point> Polygon(List<P> points) {
super(points);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return asList(this);
}
/**
* Returns a {@link List} of x,y-coordinate tuples of {@link Point}s from the given {@link Polygon}.
*
* @param polygon
* @return
*/
public static List<? extends Object> asList(org.springframework.data.geo.Polygon polygon) {
List<Point> points = polygon.getPoints();
List<List<Double>> tuples = new ArrayList<List<Double>>(points.size());
for (Point point : points) {
tuples.add(Arrays.asList(point.getX(), point.getY()));
}
return tuples;
}
}

View File

@@ -1,45 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Shape}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Shape extends org.springframework.data.geo.Shape {
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
*/
List<? extends Object> asList();
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
String getCommand();
}

View File

@@ -22,6 +22,7 @@ import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.util.Assert;
/**
@@ -30,7 +31,6 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class Sphere implements Shape {
public static final String COMMAND = "$centerSphere";
@@ -73,23 +73,13 @@ public class Sphere implements Shape {
this(circle.getCenter(), circle.getRadius());
}
/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
@Deprecated
public Sphere(org.springframework.data.mongodb.core.geo.Circle circle) {
this(circle.getCenter(), circle.getRadius());
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public org.springframework.data.mongodb.core.geo.Point getCenter() {
return new org.springframework.data.mongodb.core.geo.Point(this.center);
public Point getCenter() {
return new Point(this.center);
}
/**
@@ -141,20 +131,21 @@ public class Sphere implements Shape {
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
*/
@Override
public List<? extends Object> asList() {
return Arrays.asList(Arrays.asList(center.getX(), center.getY()), this.radius.getValue());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
@Override
public String getCommand() {
return COMMAND;
}
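Sphere now works exclusively against the Spring Data Commons geo types; a short usage sketch of the retained API (coordinates are arbitrary):

import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.Sphere;

class SphereSketch {

    public static void main(String[] args) {

        Sphere sphere = new Sphere(new Circle(new Point(-73.99171, 40.738868), 0.01));

        System.out.println(sphere.getCenter());  // an org.springframework.data.geo.Point
        System.out.println(sphere.getCommand()); // $centerSphere
        System.out.println(sphere.asList());     // [[x, y], radius]
    }
}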

View File

@@ -73,7 +73,43 @@ public @interface CompoundIndex {
boolean dropDups() default false;
/**
* The name of the index to be created.
* The name of the index to be created. <br />
* <br />
* The name will only be applied as is when defined on root level. For usage on nested or embedded structures the
* provided name will be prefixed with the path leading to the entity. <br />
* <br />
* The structure below
*
* <pre>
* <code>
* &#64;Document
* class Root {
* Hybrid hybrid;
* Nested nested;
* }
*
* &#64;Document
* &#64;CompoundIndex(name = "compound_index", def = "{'h1': 1, 'h2': 1}")
* class Hybrid {
* String h1, h2;
* }
*
* &#64;CompoundIndex(name = "compound_index", def = "{'n1': 1, 'n2': 1}")
* class Nested {
* String n1, n2;
* }
* </code>
* </pre>
*
* resolves in the following index structures
*
* <pre>
* <code>
* db.root.ensureIndex( { hybrid.h1: 1, hybrid.h2: 1 } , { name: "hybrid.compound_index" } )
* db.root.ensureIndex( { nested.n1: 1, nested.n2: 1 } , { name: "nested.compound_index" } )
* db.hybrid.ensureIndex( { h1: 1, h2: 1 } , { name: "compound_index" } )
* </code>
* </pre>
*
* @return
*/
@@ -93,7 +129,10 @@ public @interface CompoundIndex {
* stored in.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**
@@ -104,14 +143,4 @@ public @interface CompoundIndex {
*/
boolean background() default false;
/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @deprecated TTL cannot be defined for {@link CompoundIndex} having more than one field as key. Will be removed in
* 1.6.
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
@Deprecated
int expireAfterSeconds() default -1;
}
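With the collection() attribute deprecated, the target collection is derived from the domain type, typically via @Document. A brief sketch (class and field names are illustrative):

import java.util.Date;

import org.springframework.data.mongodb.core.index.CompoundIndex;
import org.springframework.data.mongodb.core.mapping.Document;

// The index ends up in the collection mapped by the domain type, here "orders".
@Document(collection = "orders")
@CompoundIndex(name = "customer_date_idx", def = "{'customerId': 1, 'orderDate': -1}")
class Order {
    String customerId;
    Date orderDate;
}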

View File

@@ -32,7 +32,41 @@ import java.lang.annotation.Target;
public @interface GeoSpatialIndexed {
/**
* Name of the property in the document that contains the [x, y] or radial coordinates to index.
* Index name. <br />
* <br />
* The name will only be applied as is when defined on root level. For usage on nested or embedded structures the
* provided name will be prefixed with the path leading to the entity. <br />
* <br />
* The structure below
*
* <pre>
* <code>
* &#64;Document
* class Root {
* Hybrid hybrid;
* Nested nested;
* }
*
* &#64;Document
* class Hybrid {
* &#64;GeoSpatialIndexed(name="index") Point h1;
* }
*
* class Nested {
* &#64;GeoSpatialIndexed(name="index") Point n1;
* }
* </code>
* </pre>
*
* resolves in the following index structures
*
* <pre>
* <code>
* db.root.ensureIndex( { hybrid.h1: "2d" } , { name: "hybrid.index" } )
* db.root.ensureIndex( { nested.n1: "2d" } , { name: "nested.index" } )
* db.hybrid.ensureIndex( { h1: "2d" } , { name: "index" } )
* </code>
* </pre>
*
* @return
*/
@@ -51,7 +85,10 @@ public @interface GeoSpatialIndexed {
* Name of the collection in which to create the index.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**

View File

@@ -58,7 +58,41 @@ public @interface Indexed {
boolean dropDups() default false;
/**
* Index name.
* Index name. <br />
* <br />
* The name will only be applied as is when defined on root level. For usage on nested or embedded structures the
* provided name will be prefixed with the path leading to the entity. <br />
* <br />
* The structure below
*
* <pre>
* <code>
* &#64;Document
* class Root {
* Hybrid hybrid;
* Nested nested;
* }
*
* &#64;Document
* class Hybrid {
* &#64;Indexed(name="index") String h1;
* }
*
* class Nested {
* &#64;Indexed(name="index") String n1;
* }
* </code>
* </pre>
*
* resolves in the following index structures
*
* <pre>
* <code>
* db.root.ensureIndex( { hybrid.h1: 1 } , { name: "hybrid.index" } )
* db.root.ensureIndex( { nested.n1: 1 } , { name: "nested.index" } )
* db.hybrid.ensureIndex( { h1: 1} , { name: "index" } )
* </code>
* </pre>
*
* @return
*/
@@ -74,10 +108,13 @@ public @interface Indexed {
boolean useGeneratedName() default false;
/**
* Colleciton name for index to be created on.
* Collection name for index to be created on.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -60,4 +60,10 @@ public class MongoMappingEventPublisher implements ApplicationEventPublisher {
indexCreator.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>) event);
}
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationEventPublisher#publishEvent(java.lang.Object)
*/
public void publishEvent(Object event) {}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,6 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexRes
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
/**
@@ -43,8 +42,7 @@ import org.springframework.util.Assert;
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
public class MongoPersistentEntityIndexCreator implements ApplicationListener<MappingContextEvent<?, ?>> {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoPersistentEntityIndexCreator.class);
@@ -54,7 +52,7 @@ public class MongoPersistentEntityIndexCreator implements
private final IndexResolver indexResolver;
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -65,7 +63,7 @@ public class MongoPersistentEntityIndexCreator implements
}
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -92,7 +90,7 @@ public class MongoPersistentEntityIndexCreator implements
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event) {
public void onApplicationEvent(MappingContextEvent<?, ?> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
@@ -102,7 +100,7 @@ public class MongoPersistentEntityIndexCreator implements
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes(event.getPersistentEntity());
checkForIndexes((MongoPersistentEntity<?>) entity);
}
}
@@ -132,8 +130,8 @@ public class MongoPersistentEntityIndexCreator implements
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection()).createIndex(indexDefinition.getIndexKeys(),
indexDefinition.getIndexOptions());
}
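The listener now accepts the raw MappingContextEvent and re-checks the entity type itself, because Spring's event infrastructure does not match nested generics. The pattern in isolation, as a sketch:

import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;

class RawEventListenerSketch implements ApplicationListener<MappingContextEvent<?, ?>> {

    @Override
    public void onApplicationEvent(MappingContextEvent<?, ?> event) {

        Object entity = event.getPersistentEntity();

        // Double-check the concrete type before acting on the event.
        if (entity instanceof MongoPersistentEntity) {
            MongoPersistentEntity<?> mongoEntity = (MongoPersistentEntity<?>) entity;
            // e.g. resolve and create indexes for mongoEntity
            System.out.println("Would create indexes for " + mongoEntity.getCollection());
        }
    }
}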
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,6 @@ import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.PropertyHandler;
@@ -96,7 +95,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
Assert.notNull(document, "Given entity is not collection root.");
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", root.getCollection(), root.getType()));
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", root.getCollection(), root));
indexInformation.addAll(potentiallyCreateTextIndexDefinition(root));
final CycleGuard guard = new CycleGuard();
@@ -118,7 +117,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexInformation.add(indexDefinitionHolder);
}
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
}
});
@@ -138,10 +137,11 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
private List<IndexDefinitionHolder> resolveIndexForClass(final Class<?> type, final String path,
final String collection, final CycleGuard guard) {
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions(path, collection, type));
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions(path, collection, entity));
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
@@ -155,7 +155,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(), propertyDotPath,
collection, guard));
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
}
@@ -183,14 +183,13 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
private List<IndexDefinitionHolder> potentiallyCreateCompoundIndexDefinitions(String dotPath, String collection,
Class<?> type) {
MongoPersistentEntity<?> entity) {
if (AnnotationUtils.findAnnotation(type, CompoundIndexes.class) == null
&& AnnotationUtils.findAnnotation(type, CompoundIndex.class) == null) {
if (entity.findAnnotation(CompoundIndexes.class) == null && entity.findAnnotation(CompoundIndex.class) == null) {
return Collections.emptyList();
}
return createCompoundIndexDefinitions(dotPath, collection, type);
return createCompoundIndexDefinitions(dotPath, collection, entity);
}
private Collection<? extends IndexDefinitionHolder> potentiallyCreateTextIndexDefinition(MongoPersistentEntity<?> root) {
@@ -206,7 +205,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation("", indexDefinitionBuilder, root,
new TextIndexIncludeOptions(IncludeStrategy.DEFAULT), new CycleGuard());
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
TextIndexDefinition indexDefinition = indexDefinitionBuilder.build();
@@ -221,7 +220,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
private void appendTextIndexInformation(final String dotPath,
final TextIndexDefinitionBuilder indexDefinitionBuilder, MongoPersistentEntity<?> entity,
final TextIndexDefinitionBuilder indexDefinitionBuilder, final MongoPersistentEntity<?> entity,
final TextIndexIncludeOptions includeOptions, final CycleGuard guard) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@@ -231,7 +230,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
guard.protect(persistentProperty, dotPath);
if (persistentProperty.isLanguageProperty()) {
if (persistentProperty.isExplicitLanguageProperty() && !StringUtils.hasText(dotPath)) {
indexDefinitionBuilder.withLanguageOverride(persistentProperty.getFieldName());
}
@@ -257,7 +256,11 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation(propertyDotPath, indexDefinitionBuilder,
mappingContext.getPersistentEntity(persistentProperty.getActualType()), optionsForNestedType, guard);
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage(), e);
LOGGER.info(e.getMessage(), e);
} catch (InvalidDataAccessApiUsageException e) {
LOGGER.info(
String.format("Potentially invalid index structure discovered. Breaking operation for %s.",
entity.getName()), e);
}
} else if (includeOptions.isForce() || indexed != null) {
indexDefinitionBuilder.onField(propertyDotPath, weight);
@@ -278,21 +281,21 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
* @return
*/
protected List<IndexDefinitionHolder> createCompoundIndexDefinitions(String dotPath, String fallbackCollection,
Class<?> type) {
MongoPersistentEntity<?> entity) {
List<IndexDefinitionHolder> indexDefinitions = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
CompoundIndexes indexes = AnnotationUtils.findAnnotation(type, CompoundIndexes.class);
CompoundIndexes indexes = entity.findAnnotation(CompoundIndexes.class);
if (indexes != null) {
for (CompoundIndex index : indexes.value()) {
indexDefinitions.add(createCompoundIndexDefinition(dotPath, fallbackCollection, index));
indexDefinitions.add(createCompoundIndexDefinition(dotPath, fallbackCollection, index, entity));
}
}
CompoundIndex index = AnnotationUtils.findAnnotation(type, CompoundIndex.class);
CompoundIndex index = entity.findAnnotation(CompoundIndex.class);
if (index != null) {
indexDefinitions.add(createCompoundIndexDefinition(dotPath, fallbackCollection, index));
indexDefinitions.add(createCompoundIndexDefinition(dotPath, fallbackCollection, index, entity));
}
return indexDefinitions;
@@ -300,13 +303,13 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
@SuppressWarnings("deprecation")
protected IndexDefinitionHolder createCompoundIndexDefinition(String dotPath, String fallbackCollection,
CompoundIndex index) {
CompoundIndex index, MongoPersistentEntity<?> entity) {
CompoundIndexDefinition indexDefinition = new CompoundIndexDefinition(resolveCompoundIndexKeyFromStringDefinition(
dotPath, index.def()));
if (!index.useGeneratedName()) {
indexDefinition.named(index.name());
indexDefinition.named(pathAwareIndexName(index.name(), dotPath, null));
}
if (index.unique()) {
@@ -321,16 +324,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexDefinition.background();
}
int ttl = index.expireAfterSeconds();
if (ttl >= 0) {
if (indexDefinition.getIndexKeys().keySet().size() > 1) {
LOGGER.warn("TTL is not supported for compound index with more than one key. TTL={} will be ignored.", ttl);
} else {
indexDefinition.expire(ttl, TimeUnit.SECONDS);
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
@@ -377,7 +370,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
IndexDirection.ASCENDING.equals(index.direction()) ? Sort.Direction.ASC : Sort.Direction.DESC);
if (!index.useGeneratedName()) {
indexDefinition.named(StringUtils.hasText(index.name()) ? index.name() : dotPath);
indexDefinition.named(pathAwareIndexName(index.name(), dotPath, persitentProperty));
}
if (index.unique()) {
@@ -419,7 +412,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexDefinition.withMin(index.min()).withMax(index.max());
if (!index.useGeneratedName()) {
indexDefinition.named(StringUtils.hasText(index.name()) ? index.name() : persistentProperty.getName());
indexDefinition.named(pathAwareIndexName(index.name(), dotPath, persistentProperty));
}
indexDefinition.typed(index.type()).withBucketSize(index.bucketSize()).withAdditionalField(index.additionalField());
@@ -427,6 +420,23 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
private String pathAwareIndexName(String indexName, String dotPath, MongoPersistentProperty property) {
String nameToUse = StringUtils.hasText(indexName) ? indexName : "";
if (!StringUtils.hasText(dotPath) || (property != null && dotPath.equals(property.getFieldName()))) {
return StringUtils.hasText(nameToUse) ? nameToUse : dotPath;
}
if (StringUtils.hasText(dotPath)) {
nameToUse = StringUtils.hasText(nameToUse) ? (property != null ? dotPath.replace("." + property.getFieldName(),
"") : dotPath) + "." + nameToUse : dotPath;
}
return nameToUse;
}
/**
* {@link CycleGuard} holds information about properties and the paths for accessing those. This information is used
* to detect potential cycles within the references.
@@ -456,8 +466,9 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
for (Path existingPath : paths) {
if (existingPath.cycles(property, path)) {
if (existingPath.cycles(property, path) && property.isEntity()) {
paths.add(new Path(property, path));
throw new CyclicPropertyReferenceException(property.getFieldName(), property.getOwner().getType(),
existingPath.getPath());
}
@@ -530,7 +541,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return false;
}
return path.contains(this.path);
return path.equals(this.path) || path.contains(this.path + ".") || path.contains("." + this.path);
}
}
}

View File

@@ -276,7 +276,7 @@ public class TextIndexDefinition implements IndexDefinition {
* @return
*/
public TextIndexDefinitionBuilder onField(String fieldname) {
return onField(fieldname, Float.NaN);
return onField(fieldname, 1F);
}
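With this change onField(String) registers the field with a default weight of 1 instead of NaN, while an explicit weight can still be passed. A usage sketch of the builder (field names are illustrative):

import org.springframework.data.mongodb.core.index.TextIndexDefinition;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexDefinitionBuilder;

class TextIndexSketch {

    public static void main(String[] args) {

        TextIndexDefinition definition = new TextIndexDefinitionBuilder() //
                .onField("title")        // default weight 1
                .onField("abstract", 5F) // explicit weight
                .build();

        System.out.println(definition.getIndexOptions());
    }
}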
/**

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.mapping;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
@@ -35,6 +36,7 @@ import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
@@ -53,32 +55,37 @@ import org.springframework.util.StringUtils;
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @DocumentField annotation!";
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private static final SpelExpressionParser PARSER = new SpelExpressionParser();
private final String collection;
private final String language;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
private final Expression expression;
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
* collection name to the entities simple type name.
*
* @param typeInformation
* @param typeInformation must not be {@literal null}.
*/
public BasicMongoPersistentEntity(TypeInformation<T> typeInformation) {
super(typeInformation, MongoPersistentPropertyComparator.INSTANCE);
this.parser = new SpelExpressionParser();
this.context = new StandardEvaluationContext();
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
if (rawType.isAnnotationPresent(Document.class)) {
Document d = rawType.getAnnotation(Document.class);
this.collection = StringUtils.hasText(d.collection()) ? d.collection() : fallback;
this.language = StringUtils.hasText(d.language()) ? d.language() : "";
Document document = rawType.getAnnotation(Document.class);
this.expression = detectExpression(document);
this.context = new StandardEvaluationContext();
if (document != null) {
this.collection = StringUtils.hasText(document.collection()) ? document.collection() : fallback;
this.language = StringUtils.hasText(document.language()) ? document.language() : "";
} else {
this.collection = fallback;
this.language = "";
@@ -101,8 +108,7 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
Expression expression = parser.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
return expression.getValue(context, String.class);
return expression == null ? collection : expression.getValue(context, String.class);
}
/*
@@ -114,6 +120,24 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return this.language;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getTextScoreProperty()
*/
@Override
public MongoPersistentProperty getTextScoreProperty() {
return getPersistentProperty(TextScore.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasTextScoreProperty()
*/
@Override
public boolean hasTextScoreProperty() {
return getTextScoreProperty() != null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.BasicPersistentEntity#verify()
@@ -218,6 +242,31 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return null;
}
/**
* Returns a SpEL {@link Expression} for the collection String expressed in the given {@link Document} annotation if
* present or {@literal null} otherwise. Will also return {@literal null} if the collection {@link String} evaluates
* to a {@link LiteralExpression} (indicating that no subsequent evaluation is necessary).
*
* @param document can be {@literal null}
* @return
*/
private static Expression detectExpression(Document document) {
if (document == null) {
return null;
}
String collection = document.collection();
if (!StringUtils.hasText(collection)) {
return null;
}
Expression expression = PARSER.parseExpression(document.collection(), ParserContext.TEMPLATE_EXPRESSION);
return expression instanceof LiteralExpression ? null : expression;
}
/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.
@@ -257,28 +306,44 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
*/
private static class PropertyTypeAssertionHandler implements PropertyHandler<MongoPersistentProperty> {
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.PropertyHandler#doWithPersistentProperty(org.springframework.data.mapping.PersistentProperty)
*/
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
potentiallyAssertTextScoreType(persistentProperty);
potentiallyAssertLanguageType(persistentProperty);
potentiallyAssertDBRefTargetType(persistentProperty);
}
private void potentiallyAssertLanguageType(MongoPersistentProperty persistentProperty) {
private static void potentiallyAssertLanguageType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isLanguageProperty()) {
if (persistentProperty.isExplicitLanguageProperty()) {
assertPropertyType(persistentProperty, String.class);
}
}
private void potentiallyAssertTextScoreType(MongoPersistentProperty persistentProperty) {
private static void potentiallyAssertTextScoreType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isTextScoreProperty()) {
assertPropertyType(persistentProperty, Float.class, Double.class);
}
}
private void assertPropertyType(MongoPersistentProperty persistentProperty, Class<?>... validMatches) {
private static void potentiallyAssertDBRefTargetType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isDbReference() && persistentProperty.getDBRef().lazy()) {
if (persistentProperty.isArray() || Modifier.isFinal(persistentProperty.getActualType().getModifiers())) {
throw new MappingException(String.format(
"Invalid lazy DBRef property for %s. Found %s which must not be an array nor a final class.",
persistentProperty.getField(), persistentProperty.getActualType()));
}
}
}
private static void assertPropertyType(MongoPersistentProperty persistentProperty, Class<?>... validMatches) {
for (Class<?> potentialMatch : validMatches) {
if (ClassUtils.isAssignable(potentialMatch, persistentProperty.getActualType())) {
@@ -286,10 +351,9 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
throw new MappingException(String.format("Missmatching types for %s. Found %s expected one of %s.",
persistentProperty.getField(), persistentProperty.getActualType(),
StringUtils.arrayToCommaDelimitedString(validMatches)));
throw new MappingException(
String.format("Missmatching types for %s. Found %s expected one of %s.", persistentProperty.getField(),
persistentProperty.getActualType(), StringUtils.arrayToCommaDelimitedString(validMatches)));
}
}
}
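detectExpression(…) means a plain collection name is used as-is, while a template expression is parsed once and evaluated on every getCollection() call. A sketch of what that enables (the tenantProvider bean is hypothetical and assumes the evaluation context can resolve beans):

import org.springframework.data.mongodb.core.mapping.Document;

// A literal name parses to a LiteralExpression and is therefore used directly ...
@Document(collection = "orders")
class Order {}

// ... while a template expression is evaluated at runtime, e.g. to prefix the
// collection per tenant (assuming a 'tenantProvider' bean exists).
@Document(collection = "#{@tenantProvider.getTenantId()}_orders")
class TenantOrder {}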

View File

@@ -190,9 +190,18 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
*/
@Override
public boolean isLanguageProperty() {
return getFieldName().equals(LANGUAGE_FIELD_NAME) || isAnnotationPresent(Language.class);
return getFieldName().equals(LANGUAGE_FIELD_NAME) || isExplicitLanguageProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isExplicitLanguageProperty()
*/
@Override
public boolean isExplicitLanguageProperty() {
return isAnnotationPresent(Language.class);
};
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isTextScoreProperty()

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,8 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
private Boolean isIdProperty;
private Boolean isAssociation;
private String fieldName;
private Boolean usePropertyAccess;
private Boolean isTransient;
/**
* Creates a new {@link CachingMongoPersistentProperty}.
@@ -85,4 +87,32 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
return this.fieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AnnotationBasedPersistentProperty#usePropertyAccess()
*/
@Override
public boolean usePropertyAccess() {
if (this.usePropertyAccess == null) {
this.usePropertyAccess = super.usePropertyAccess();
}
return this.usePropertyAccess;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AnnotationBasedPersistentProperty#isTransient()
*/
@Override
public boolean isTransient() {
if (this.isTransient == null) {
this.isTransient = super.isTransient();
}
return this.isTransient;
}
}

View File

@@ -40,4 +40,21 @@ public interface MongoPersistentEntity<T> extends PersistentEntity<T, MongoPersi
*/
String getLanguage();
/**
* Returns the property holding text score value.
*
* @since 1.6
* @see #hasTextScoreProperty()
* @return {@literal null} if not present.
*/
MongoPersistentProperty getTextScoreProperty();
/**
* Returns whether the entity has a {@link TextScore} property.
*
* @since 1.6
* @return true if property annotated with {@link TextScore} is present.
*/
boolean hasTextScoreProperty();
}
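A domain type the new accessors would report on, as a sketch (names made up):

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Language;
import org.springframework.data.mongodb.core.mapping.TextScore;

@Document
class BlogPost {

    String id;
    String title;

    @Language String language; // per-document override of the text search language
    @TextScore Float score;    // populated with the $meta textScore on text queries

    // For this type hasTextScoreProperty() returns true and
    // getTextScoreProperty() resolves the 'score' property.
}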

View File

@@ -45,7 +45,7 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
int getFieldOrder();
/**
* Returns whether the propert is a {@link com.mongodb.DBRef}. If this returns {@literal true} you can expect
* Returns whether the property is a {@link com.mongodb.DBRef}. If this returns {@literal true} you can expect
* {@link #getDBRef()} to return an non-{@literal null} value.
*
* @return
@@ -61,14 +61,22 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
boolean isExplicitIdProperty();
/**
* Returns whether the property indicates the documents language either by having a {@link #getFieldName()} equal to
* {@literal language} or being annotated with {@link Language}.
* Returns true whether the property indicates the documents language either by having a {@link #getFieldName()} equal
* to {@literal language} or being annotated with {@link Language}.
*
* @return
* @since 1.6
*/
boolean isLanguageProperty();
/**
* Returns true when the property is annotated with {@link Language}.
*
* @return
* @since 1.6.1
*/
boolean isExplicitLanguageProperty();
/**
* Returns whether the property holds the documents score calculated by text search. <br/>
* It's marked with {@link TextScore}.

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
@@ -25,6 +27,7 @@ import com.mongodb.util.JSON;
* @author Thomas Risberg
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class BasicQuery extends Query {
@@ -50,8 +53,12 @@ public class BasicQuery extends Query {
this.fieldsObject = fieldsObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#addCriteria(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public Query addCriteria(Criteria criteria) {
public Query addCriteria(CriteriaDefinition criteria) {
this.queryObject.putAll(criteria.getCriteriaObject());
return this;
}
@@ -93,4 +100,42 @@ public class BasicQuery extends Query {
protected void setFieldsObject(DBObject fieldsObject) {
this.fieldsObject = fieldsObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#equals(java.lang.Object)
*/
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (!(o instanceof BasicQuery)) {
return false;
}
BasicQuery that = (BasicQuery) o;
return querySettingsEquals(that) && //
nullSafeEquals(fieldsObject, that.fieldsObject) && //
nullSafeEquals(queryObject, that.queryObject) && //
nullSafeEquals(sortObject, that.sortObject);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#hashCode()
*/
@Override
public int hashCode() {
int result = super.hashCode();
result = 31 * result + nullSafeHashCode(queryObject);
result = 31 * result + nullSafeHashCode(fieldsObject);
result = 31 * result + nullSafeHashCode(sortObject);
return result;
}
}
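Since BasicQuery now implements equals(…) and hashCode() on top of the shared query settings, two queries built the same way compare equal. A small sketch:

import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;

class BasicQueryEqualsSketch {

    public static void main(String[] args) {

        BasicQuery first = new BasicQuery("{ 'status' : 'ACTIVE' }");
        first.addCriteria(Criteria.where("age").gte(21));

        BasicQuery second = new BasicQuery("{ 'status' : 'ACTIVE' }");
        second.addCriteria(Criteria.where("age").gte(21));

        System.out.println(first.equals(second)); // true
    }
}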

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.Collections;
import com.mongodb.BasicDBObject;
@@ -87,12 +88,8 @@ public class BasicUpdate extends Update {
@Override
public Update pullAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
DBObject keyValue = new BasicDBObject();
keyValue.put(key, convertedValues);
keyValue.put(key, Arrays.copyOf(values, values.length));
updateObject.put("$pullAll", keyValue);
return this;
}
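The values array is now copied defensively before being stored in the update document. Usage is unchanged; a sketch:

import org.springframework.data.mongodb.core.query.BasicUpdate;
import org.springframework.data.mongodb.core.query.Update;

class PullAllSketch {

    public static void main(String[] args) {

        Update update = new BasicUpdate("{ }").pullAll("tags", new Object[] { "obsolete", "spam" });

        // yields an update of the form { "$pullAll" : { "tags" : [ "obsolete" , "spam" ] } }
        System.out.println(update.getUpdateObject());
    }
}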

View File

@@ -31,7 +31,9 @@ import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -388,20 +390,6 @@ public class Criteria implements CriteriaDefinition {
return this;
}
/**
* @see Criteria#withinSphere(Circle)
* @param circle
* @return
* @deprecated As of 1.5, Use {@link #withinSphere(Circle)}. This method is scheduled to be removed in the next major
* release.
*/
@Deprecated
public Criteria withinSphere(org.springframework.data.mongodb.core.geo.Circle circle) {
Assert.notNull(circle);
criteria.put("$within", new GeoCommand(new Sphere(circle)));
return this;
}
/**
* Creates a geospatial criterion using a {@literal $within} operation.
*
@@ -529,8 +517,11 @@ public class Criteria implements CriteriaDefinition {
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
public DBObject getCriteriaObject() {
if (this.criteriaChain.size() == 1) {
return criteriaChain.get(0).getSingleCriteriaObject();
} else if (CollectionUtils.isEmpty(this.criteriaChain) && !CollectionUtils.isEmpty(this.criteria)) {
return getSingleCriteriaObject();
} else {
DBObject criteriaObject = new BasicDBObject();
for (Criteria c : this.criteriaChain) {
@@ -564,6 +555,13 @@ public class Criteria implements CriteriaDefinition {
}
}
if (!StringUtils.hasText(this.key)) {
if (not) {
return new BasicDBObject("$not", dbo);
}
return dbo;
}
DBObject queryCriteria = new BasicDBObject();
if (!NOT_SET.equals(isValue)) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,8 +17,25 @@ package org.springframework.data.mongodb.core.query;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface CriteriaDefinition {
/**
* Get {@link DBObject} representation.
*
* @return
*/
DBObject getCriteriaObject();
}
/**
* Get the identifying {@literal key}.
*
* @return
* @since 1.6
*/
String getKey();
}
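Query.addCriteria(…) keys criteria by getKey(), so a custom CriteriaDefinition only has to expose the field it targets plus its DBObject form. A sketch of such an implementation (the class is made up):

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

import org.springframework.data.mongodb.core.query.CriteriaDefinition;

class RegexCriteria implements CriteriaDefinition {

    private final String key;
    private final String pattern;

    RegexCriteria(String key, String pattern) {
        this.key = key;
        this.pattern = pattern;
    }

    @Override
    public DBObject getCriteriaObject() {
        return new BasicDBObject(key, new BasicDBObject("$regex", pattern));
    }

    @Override
    public String getKey() {
        return key;
    }
}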

View File

@@ -66,17 +66,16 @@ public class GeoCommand {
* @param shape must not be {@literal null}.
* @return
*/
@SuppressWarnings("deprecation")
private String getCommand(Shape shape) {
Assert.notNull(shape, "Shape must not be null!");
if (shape instanceof Box) {
return org.springframework.data.mongodb.core.geo.Box.COMMAND;
} else if (shape instanceof Circle || shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
return org.springframework.data.mongodb.core.geo.Circle.COMMAND;
return "$box";
} else if (shape instanceof Circle) {
return "$center";
} else if (shape instanceof Polygon) {
return org.springframework.data.mongodb.core.geo.Polygon.COMMAND;
return "$polygon";
} else if (shape instanceof Sphere) {
return org.springframework.data.mongodb.core.geo.Sphere.COMMAND;
}

View File

@@ -0,0 +1,193 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.concurrent.TimeUnit;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* Meta-data for {@link Query} instances.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.6
*/
public class Meta {
private enum MetaKey {
MAX_TIME_MS("$maxTimeMS"), MAX_SCAN("$maxScan"), COMMENT("$comment"), SNAPSHOT("$snapshot");
private String key;
private MetaKey(String key) {
this.key = key;
}
}
private final Map<String, Object> values = new LinkedHashMap<String, Object>(2);
/**
* @return {@literal null} if not set.
*/
public Long getMaxTimeMsec() {
return getValue(MetaKey.MAX_TIME_MS.key);
}
/**
* Set the maximum time limit in milliseconds for processing operations.
*
* @param maxTimeMsec
*/
public void setMaxTimeMsec(long maxTimeMsec) {
setMaxTime(maxTimeMsec, TimeUnit.MILLISECONDS);
}
/**
* Set the maximum time limit for processing operations.
*
* @param timeout
* @param timeUnit
*/
public void setMaxTime(long timeout, TimeUnit timeUnit) {
setValue(MetaKey.MAX_TIME_MS.key, (timeUnit != null ? timeUnit : TimeUnit.MILLISECONDS).toMillis(timeout));
}
/**
* @return {@literal null} if not set.
*/
public Long getMaxScan() {
return getValue(MetaKey.MAX_SCAN.key);
}
/**
* Only scan the specified number of documents.
*
* @param maxScan
*/
public void setMaxScan(long maxScan) {
setValue(MetaKey.MAX_SCAN.key, maxScan);
}
/**
* Add a comment to the query.
*
* @param comment
*/
public void setComment(String comment) {
setValue(MetaKey.COMMENT.key, comment);
}
/**
* @return {@literal null} if not set.
*/
public String getComment() {
return getValue(MetaKey.COMMENT.key);
}
/**
* Using snapshot prevents the cursor from returning a document more than once.
*
* @param useSnapshot
*/
public void setSnapshot(boolean useSnapshot) {
setValue(MetaKey.SNAPSHOT.key, useSnapshot);
}
/**
* @return {@literal null} if not set.
*/
public boolean getSnapshot() {
return getValue(MetaKey.SNAPSHOT.key, false);
}
/**
* @return
*/
public boolean hasValues() {
return !this.values.isEmpty();
}
/**
* Get {@link Iterable} of set meta values.
*
* @return
*/
public Iterable<Entry<String, Object>> values() {
return Collections.unmodifiableSet(this.values.entrySet());
}
/**
* Sets or removes the value in case of {@literal null} or empty {@link String}.
*
* @param key must not be {@literal null} or empty.
* @param value
*/
private void setValue(String key, Object value) {
Assert.hasText(key, "Meta key must not be 'null' or blank.");
if (value == null || (value instanceof String && !StringUtils.hasText((String) value))) {
this.values.remove(key);
}
this.values.put(key, value);
}
@SuppressWarnings("unchecked")
private <T> T getValue(String key) {
return (T) this.values.get(key);
}
private <T> T getValue(String key, T defaultValue) {
T value = getValue(key);
return value != null ? value : defaultValue;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.values);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof Meta)) {
return false;
}
Meta other = (Meta) obj;
return ObjectUtils.nullSafeEquals(this.values, other.values);
}
}

View File

@@ -343,7 +343,6 @@ public final class NearQuery {
*
* @return
*/
@SuppressWarnings("deprecation")
public DBObject toDBObject() {
BasicDBObject dbObject = new BasicDBObject();

View File

@@ -25,6 +25,7 @@ import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -39,56 +40,65 @@ import com.mongodb.DBObject;
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class Query {
private static final String RESTRICTED_TYPES_KEY = "_$RESTRICTED_TYPES";
private final Set<Class<?>> restrictedTypes = new HashSet<Class<?>>();
private final Map<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private final Map<String, CriteriaDefinition> criteria = new LinkedHashMap<String, CriteriaDefinition>();
private Field fieldSpec;
private Sort sort;
private int skip;
private int limit;
private String hint;
private Meta meta = new Meta();
/**
* Static factory method to create a {@link Query} using the provided {@link Criteria}.
* Static factory method to create a {@link Query} using the provided {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
* @return
* @since 1.6
*/
public static Query query(Criteria criteria) {
return new Query(criteria);
public static Query query(CriteriaDefinition criteriaDefinition) {
return new Query(criteriaDefinition);
}
public Query() {}
/**
* Creates a new {@link Query} using the given {@link Criteria}.
* Creates a new {@link Query} using the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
* @since 1.6
*/
public Query(Criteria criteria) {
addCriteria(criteria);
public Query(CriteriaDefinition criteriaDefinition) {
addCriteria(criteriaDefinition);
}
/**
* Adds the given {@link Criteria} to the current {@link Query}.
* Adds the given {@link CriteriaDefinition} to the current {@link Query}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
* @return
* @since 1.6
*/
public Query addCriteria(Criteria criteria) {
CriteriaDefinition existing = this.criteria.get(criteria.getKey());
String key = criteria.getKey();
public Query addCriteria(CriteriaDefinition criteriaDefinition) {
CriteriaDefinition existing = this.criteria.get(criteriaDefinition.getKey());
String key = criteriaDefinition.getKey();
if (existing == null) {
this.criteria.put(key, criteria);
this.criteria.put(key, criteriaDefinition);
} else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, "
+ "you can't add a second '" + key + "' criteria. " + "Query already contains '"
+ existing.getCriteriaObject() + "'.");
}
return this;
}
@@ -166,7 +176,7 @@ public class Query {
for (Order order : sort) {
if (order.isIgnoreCase()) {
throw new IllegalArgumentException(String.format("Gven sort contained an Order for %s with ignore case! "
throw new IllegalArgumentException(String.format("Given sort contained an Order for %s with ignore case! "
+ "MongoDB does not support sorting ignoreing case currently!", order.getProperty()));
}
}
@@ -268,8 +278,86 @@ public class Query {
return hint;
}
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
/**
* @param maxTimeMsec
* @return
* @see Meta#setMaxTimeMsec(long)
* @since 1.6
*/
public Query maxTimeMsec(long maxTimeMsec) {
meta.setMaxTimeMsec(maxTimeMsec);
return this;
}
/**
* @param timeout
* @param timeUnit
* @return
* @see Meta#setMaxTime(long, TimeUnit)
* @since 1.6
*/
public Query maxTime(long timeout, TimeUnit timeUnit) {
meta.setMaxTime(timeout, timeUnit);
return this;
}
/**
* @param maxScan
* @return
* @see Meta#setMaxScan(long)
* @since 1.6
*/
public Query maxScan(long maxScan) {
meta.setMaxScan(maxScan);
return this;
}
/**
* @param comment
* @return
* @see Meta#setComment(String)
* @since 1.6
*/
public Query comment(String comment) {
meta.setComment(comment);
return this;
}
/**
* @return
* @see Meta#setSnapshot(boolean)
* @since 1.6
*/
public Query useSnapshot() {
meta.setSnapshot(true);
return this;
}
/**
* @return never {@literal null}.
* @since 1.6
*/
public Meta getMeta() {
return meta;
}
/**
* @param meta must not be {@literal null}.
* @since 1.6
*/
public void setMeta(Meta meta) {
Assert.notNull(meta, "Query meta might be empty but must not be null.");
this.meta = meta;
}
protected List<CriteriaDefinition> getCriteria() {
return new ArrayList<CriteriaDefinition>(this.criteria.values());
}
/*
@@ -297,16 +385,26 @@ public class Query {
return false;
}
Query that = (Query) obj;
return querySettingsEquals((Query) obj);
}
/**
* Tests whether the settings of the given {@link Query} are equal to this query.
*
* @param that
* @return
*/
protected boolean querySettingsEquals(Query that) {
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean fieldsEqual = this.fieldSpec == null ? that.fieldSpec == null : this.fieldSpec.equals(that.fieldSpec);
boolean sortEqual = this.sort == null ? that.sort == null : this.sort.equals(that.sort);
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean fieldsEqual = nullSafeEquals(this.fieldSpec, that.fieldSpec);
boolean sortEqual = nullSafeEquals(this.sort, that.sort);
boolean hintEqual = nullSafeEquals(this.hint, that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
boolean metaEqual = nullSafeEquals(this.meta, that.meta);
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual;
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual && metaEqual;
}
/*
@@ -324,6 +422,7 @@ public class Query {
result += 31 * nullSafeHashCode(hint);
result += 31 * skip;
result += 31 * limit;
result += 31 * nullSafeHashCode(meta);
return result;
}
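The new methods simply delegate to the Meta instance carried by the query; a fluent usage sketch (field names are made up):

import java.util.concurrent.TimeUnit;

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class QueryMetaSketch {

    public static void main(String[] args) {

        Query query = Query.query(Criteria.where("status").is("ACTIVE")) //
                .maxTime(2, TimeUnit.SECONDS) // $maxTimeMS
                .maxScan(5000)                // $maxScan
                .comment("active-users")      // $comment
                .useSnapshot();               // $snapshot

        System.out.println(query.getMeta().getComment());
    }
}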

View File

@@ -13,7 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query.text;
package org.springframework.data.mongodb.core.query;
/**
* A {@link Term} defines one or multiple words {@link Type#WORD} or phrases {@link Type#PHRASE} to be used in the
@@ -24,7 +24,7 @@ package org.springframework.data.mongodb.core.query.text;
*/
public class Term {
enum Type {
public enum Type {
WORD, PHRASE;
}
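Term and TextCriteria now live in org.springframework.data.mongodb.core.query, and TextCriteria (see the following diff) implements CriteriaDefinition directly, so it plugs into Query like any other criteria. A usage sketch:

import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.TextCriteria;

class TextCriteriaSketch {

    public static void main(String[] args) {

        TextCriteria criteria = TextCriteria.forDefaultLanguage() //
                .matchingAny("coffee", "cake") //
                .notMatching("decaf");

        // TextCriteria is a CriteriaDefinition, so it can seed a Query directly.
        Query query = Query.query(criteria);
        System.out.println(query.getQueryObject());
    }
}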

View File

@@ -13,16 +13,12 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query.text;
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
@@ -30,42 +26,59 @@ import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
* Implementation of {@link CriteriaDefinition} to be used for full text search .
* Implementation of {@link CriteriaDefinition} to be used for full text search.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.6
*/
public class TextCriteria extends Criteria {
public class TextCriteria implements CriteriaDefinition {
private final List<Term> terms;
private String language;
private List<Term> terms;
/**
* Creates a new {@link TextCriteria}.
*
* @see #forDefaultLanguage()
* @see #forLanguage(String)
*/
public TextCriteria() {
this(null);
}
private TextCriteria(String language) {
this.language = language;
this.terms = new ArrayList<Term>();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
/**
* Returns a new {@link TextCriteria} for the default language.
*
* @return
*/
@Override
public DBObject getCriteriaObject() {
BasicDBObjectBuilder builder = new BasicDBObjectBuilder();
if (StringUtils.hasText(language)) {
builder.add("$language", language);
}
if (!CollectionUtils.isEmpty(terms)) {
builder.add("$search", join(terms.iterator()));
}
return new BasicDBObject("$text", builder.get());
public static TextCriteria forDefaultLanguage() {
return new TextCriteria();
}
/**
* @param words
* For a full list of supported languages see the mongodb reference manual for <a
* href="http://docs.mongodb.org/manual/reference/text-search-languages/">Text Search Languages</a>.
*
* @param language
* @return
*/
public static TextCriteria forLanguage(String language) {
Assert.hasText(language, "Language must not be null or empty!");
return new TextCriteria(language);
}
/**
* Configures the {@link TextCriteria} to match any of the given words.
*
* @param words the words to match.
* @return
*/
public TextCriteria matchingAny(String... words) {
@@ -73,22 +86,21 @@ public class TextCriteria extends Criteria {
for (String word : words) {
matching(word);
}
return this;
}
/**
* Add given {@link Term} to criteria.
* Adds given {@link Term} to criteria.
*
* @param term must not be null.
* @param term must not be {@literal null}.
*/
public void matching(Term term) {
public TextCriteria matching(Term term) {
Assert.notNull(term, "Term to add must not be null.");
this.terms.add(term);
}
private void notMatching(Term term) {
matching(term.negate());
this.terms.add(term);
return this;
}
/**
@@ -110,7 +122,7 @@ public class TextCriteria extends Criteria {
public TextCriteria notMatching(String term) {
if (StringUtils.hasText(term)) {
notMatching(new Term(term, Term.Type.WORD));
matching(new Term(term, Term.Type.WORD).negate());
}
return this;
}
@@ -136,7 +148,7 @@ public class TextCriteria extends Criteria {
public TextCriteria notMatchingPhrase(String phrase) {
if (StringUtils.hasText(phrase)) {
notMatching(new Term(phrase, Term.Type.PHRASE));
matching(new Term(phrase, Term.Type.PHRASE).negate());
}
return this;
}
@@ -155,69 +167,45 @@ public class TextCriteria extends Criteria {
return this;
}
/**
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getKey()
*/
public static TextCriteria forDefaultLanguage() {
return new TextCriteriaBuilder().build();
}
/**
* For a full list of supported languages see the mongdodb reference manual for <a
* href="http://docs.mongodb.org/manual/reference/text-search-languages/">Text Search Languages</a>.
*
* @param language
* @return
*/
public static TextCriteria forLanguage(String language) {
return new TextCriteriaBuilder().withLanguage(language).build();
}
private static String join(Iterator<Term> iterator) {
Term first = iterator.next();
if (!iterator.hasNext()) {
return first.getFormatted();
}
StringBuilder buf = new StringBuilder(256);
if (first != null) {
buf.append(first);
}
while (iterator.hasNext()) {
buf.append(' ');
Term obj = iterator.next();
if (obj != null) {
buf.append(obj.getFormatted());
}
}
return buf.toString();
}
public static class TextCriteriaBuilder {
private TextCriteria instance;
public TextCriteriaBuilder() {
this.instance = new TextCriteria();
}
public TextCriteriaBuilder withLanguage(String language) {
this.instance.language = language;
return this;
}
public TextCriteria build() {
return this.instance;
}
}
@Override
public String getKey() {
return "$text";
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
@Override
public DBObject getCriteriaObject() {
BasicDBObjectBuilder builder = new BasicDBObjectBuilder();
if (StringUtils.hasText(language)) {
builder.add("$language", language);
}
if (!terms.isEmpty()) {
builder.add("$search", join(terms));
}
return new BasicDBObject("$text", builder.get());
}
private String join(Iterable<Term> terms) {
List<String> result = new ArrayList<String>();
for (Term term : terms) {
if (term != null) {
result.add(term.getFormatted());
}
}
return StringUtils.collectionToDelimitedString(result, " ");
}
}
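
A usage sketch for the reworked TextCriteria, assuming matching(String) keeps returning the criteria for chaining like the other methods shown here; the DBObject indicated in the comment is approximate, since the exact $search string depends on Term.getFormatted():

import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.TextCriteria;

import com.mongodb.DBObject;

class TextCriteriaExample {

    static Query coffeeButNotCake() {
        TextCriteria criteria = TextCriteria.forLanguage("english") // contributes $language
                .matching("coffee")                                 // positive WORD term
                .notMatching("cake");                               // negated WORD term

        // Roughly { "$text" : { "$language" : "english" , "$search" : "coffee -cake" } }
        DBObject criteriaObject = criteria.getCriteriaObject();

        return new Query().addCriteria(criteria);
    }
}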

View File

@@ -13,12 +13,10 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query.text;
package org.springframework.data.mongodb.core.query;
import java.util.Locale;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -147,6 +145,10 @@ public class TextQuery extends Query {
return scoreFieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#getFieldsObject()
*/
@Override
public DBObject getFieldsObject() {
@@ -155,24 +157,32 @@ public class TextQuery extends Query {
}
DBObject fields = super.getFieldsObject();
if (fields == null) {
fields = new BasicDBObject();
}
fields.put(getScoreFieldName(), META_TEXT_SCORE);
return fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#getSortObject()
*/
@Override
public DBObject getSortObject() {
DBObject sort = new BasicDBObject();
if (this.sortByScore) {
sort.put(getScoreFieldName(), META_TEXT_SCORE);
}
if (super.getSortObject() != null) {
sort.putAll(super.getSortObject());
}
return sort;
}
}
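
A sketch of how the overridden field and sort handling is typically driven, assuming the TextQuery constructor and the includeScore()/sortByScore() setters that are not part of this excerpt; Article is a hypothetical document type:

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.mongodb.core.query.TextQuery;

class TextQueryExample {

    static List<Article> search(MongoOperations operations) {
        TextQuery query = new TextQuery(TextCriteria.forDefaultLanguage().matching("mongodb"))
                .includeScore()  // getFieldsObject() then projects the text score meta field
                .sortByScore();  // getSortObject() then prepends the score field to the sort
        return operations.find(query, Article.class);
    }

    static class Article {}
}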

View File

@@ -64,7 +64,7 @@ public class Update {
}
/**
* Creates an {@link Update} instance from the given {@link DBObject}. Allows to explicitly exlude fields from making
* Creates an {@link Update} instance from the given {@link DBObject}. Allows to explicitly exclude fields from making
* it into the created {@link Update} object. Note, that this will set attributes directly and <em>not</em> use
* {@literal $set}. This means fields not given in the {@link DBObject} will be nulled when executing the update. To
* create an only-updating {@link Update} instance of a {@link DBObject}, call {@link #set(String, Object)} for each
@@ -189,12 +189,7 @@ public class Update {
* @return
*/
public Update pushAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
addMultiFieldOperation("$pushAll", key, convertedValues);
addMultiFieldOperation("$pushAll", key, Arrays.copyOf(values, values.length));
return this;
}
@@ -258,12 +253,7 @@ public class Update {
* @return
*/
public Update pullAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
addFieldOperation("$pullAll", key, convertedValues);
addFieldOperation("$pullAll", key, Arrays.copyOf(values, values.length));
return this;
}
@@ -400,7 +390,7 @@ public class Update {
*/
@Override
public String toString() {
return getUpdateObject().toString();
return SerializationUtils.serializeToJsonSafely(getUpdateObject());
}
/**
@@ -495,13 +485,7 @@ public class Update {
return ((Collection<?>) values[0]).toArray();
}
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
return convertedValues;
return Arrays.copyOf(values, values.length);
}
/*

View File

@@ -0,0 +1,64 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.data.annotation.QueryAnnotation;
/**
* @author Christoph Strobl
* @since 1.6
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@Documented
@QueryAnnotation
public @interface Meta {
/**
* Set the maximum time limit in milliseconds for processing operations.
*
* @return
*/
long maxExcecutionTime() default -1;
/**
* Only scan the specified number of documents.
*
* @return
*/
long maxScanDocuments() default -1;
/**
* Add a comment to the query.
*
* @return
*/
String comment() default "";
/**
* Using snapshot prevents the cursor from returning a document more than once.
*
* @return
*/
boolean snapshot() default false;
}
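
A hypothetical repository showing how the annotation is meant to be applied; the attribute names, including maxExcecutionTime, follow the declaration above:

import java.util.List;

import org.springframework.data.mongodb.repository.Meta;
import org.springframework.data.repository.Repository;

interface PersonRepository extends Repository<Person, String> {

    // Non-positive limits and empty comments are ignored when the attributes are read later on.
    @Meta(maxExcecutionTime = 100, maxScanDocuments = 1000, comment = "by-lastname", snapshot = true)
    List<Person> findByLastname(String lastname);
}

class Person {}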

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,8 @@ import org.springframework.data.repository.PagingAndSortingRepository;
* Mongo specific {@link org.springframework.data.repository.Repository} interface.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
@NoRepositoryBean
public interface MongoRepository<T, ID extends Serializable> extends PagingAndSortingRepository<T, ID> {
@@ -47,4 +49,26 @@ public interface MongoRepository<T, ID extends Serializable> extends PagingAndSo
* @see org.springframework.data.repository.PagingAndSortingRepository#findAll(org.springframework.data.domain.Sort)
*/
List<T> findAll(Sort sort);
/**
* Inserts the given entity. Assumes the instance to be new to be able to apply insertion optimizations. Use

* the returned instance for further operations as the save operation might have changed the entity instance
* completely. Prefer using {@link #save(Object)} instead to avoid the usage of store-specific API.
*
* @param entity must not be {@literal null}.
* @return the saved entity
* @since 1.7
*/
<S extends T> S insert(S entity);
/**
* Inserts the given entities. Assumes the given entities to have not been persisted yet and thus will optimize the
* insert over a call to {@link #save(Iterable)}. Prefer using {@link #save(Iterable)} to avoid the usage of store
* specific API.
*
* @param entities must not be {@literal null}.
* @return the saved entities
* @since 1.7
*/
<S extends T> List<S> insert(Iterable<S> entities);
}
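
A hypothetical CustomerRepository using the new insert methods:

import java.util.Arrays;
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

interface CustomerRepository extends MongoRepository<Customer, String> {}

class Customer {}

class InsertExample {

    static void storeNewCustomers(CustomerRepository repository) {
        Customer saved = repository.insert(new Customer());       // optimized single insert
        List<Customer> savedBatch = repository.insert(Arrays.asList(new Customer(), new Customer())); // optimized batch insert
    }
}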

View File

@@ -38,7 +38,7 @@ import org.springframework.data.annotation.QueryAnnotation;
public @interface Query {
/**
* Takes a MongoDB JSON string to define the actual query to be executed. This one will take precendece over the
* Takes a MongoDB JSON string to define the actual query to be executed. This one will take precedence over the
* method name then.
*
* @return

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,12 +25,14 @@ import javax.enterprise.inject.spi.BeanManager;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactory;
import org.springframework.data.repository.cdi.CdiRepositoryBean;
import org.springframework.data.repository.config.CustomRepositoryImplementationDetector;
import org.springframework.util.Assert;
/**
* {@link CdiRepositoryBean} to create Mongo repository instances.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
@@ -43,11 +45,13 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
* @param qualifiers must not be {@literal null}.
* @param repositoryType must not be {@literal null}.
* @param beanManager must not be {@literal null}.
* @param detector detector for the custom {@link org.springframework.data.repository.Repository} implementations
* {@link CustomRepositoryImplementationDetector}, can be {@literal null}.
*/
public MongoRepositoryBean(Bean<MongoOperations> operations, Set<Annotation> qualifiers, Class<T> repositoryType,
BeanManager beanManager) {
BeanManager beanManager, CustomRepositoryImplementationDetector detector) {
super(qualifiers, repositoryType, beanManager);
super(qualifiers, repositoryType, beanManager, detector);
Assert.notNull(operations);
this.operations = operations;
@@ -58,11 +62,11 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
* @see org.springframework.data.repository.cdi.CdiRepositoryBean#create(javax.enterprise.context.spi.CreationalContext, java.lang.Class)
*/
@Override
protected T create(CreationalContext<T> creationalContext, Class<T> repositoryType) {
protected T create(CreationalContext<T> creationalContext, Class<T> repositoryType, Object customImplementation) {
MongoOperations mongoOperations = getDependencyInstance(operations, MongoOperations.class);
MongoRepositoryFactory factory = new MongoRepositoryFactory(mongoOperations);
return factory.getRepository(repositoryType);
return factory.getRepository(repositoryType, customImplementation);
}
}

View File

@@ -40,6 +40,7 @@ import org.springframework.data.repository.cdi.CdiRepositoryExtensionSupport;
* CDI extension to export Mongo repositories.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class MongoRepositoryExtension extends CdiRepositoryExtensionSupport {
@@ -110,6 +111,7 @@ public class MongoRepositoryExtension extends CdiRepositoryExtensionSupport {
}
// Construct and return the repository bean.
return new MongoRepositoryBean<T>(mongoOperations, qualifiers, repositoryType, beanManager);
return new MongoRepositoryBean<T>(mongoOperations, qualifiers, repositoryType, beanManager,
getCustomImplementationDetector());
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.repository.config;
import java.lang.annotation.Annotation;
import java.util.Collection;
import java.util.Collections;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -22,7 +26,9 @@ import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.config.BeanNames;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
@@ -43,6 +49,15 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
private boolean fallbackMappingContextCreated = false;
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModuleName()
*/
@Override
public String getModuleName() {
return "MongoDB";
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModulePrefix()
@@ -60,6 +75,24 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
return MongoRepositoryFactoryBean.class.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getIdentifyingAnnotations()
*/
@Override
protected Collection<Class<? extends Annotation>> getIdentifyingAnnotations() {
return Collections.<Class<? extends Annotation>> singleton(Document.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getIdentifyingTypes()
*/
@Override
protected Collection<Class<?>> getIdentifyingTypes() {
return Collections.<Class<?>> singleton(MongoRepository.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.RepositoryConfigurationSource)

View File

@@ -21,11 +21,11 @@ import java.util.List;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.SliceImpl;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoPage;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Point;
@@ -85,39 +85,35 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(operations.getConverter(), accessor));
Object result = null;
applyQueryMetaAttributesWhenPresent(query);
if (isDeleteQuery()) {
result = new DeleteExecution().execute(query);
return new DeleteExecution().execute(query);
} else if (method.isGeoNearQuery() && method.isPageQuery()) {
MongoParameterAccessor countAccessor = new MongoParametersParameterAccessor(method, parameters);
Query countQuery = createCountQuery(new ConvertingParameterAccessor(operations.getConverter(), countAccessor));
result = new GeoNearExecution(accessor).execute(query, countQuery);
return new GeoNearExecution(accessor).execute(query, countQuery);
} else if (method.isGeoNearQuery()) {
result = new GeoNearExecution(accessor).execute(query);
return new GeoNearExecution(accessor).execute(query);
} else if (method.isSliceQuery()) {
result = new SlicedExecution(accessor.getPageable()).execute(query);
return new SlicedExecution(accessor.getPageable()).execute(query);
} else if (method.isCollectionQuery()) {
result = new CollectionExecution(accessor.getPageable()).execute(query);
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
result = new PagedExecution(accessor.getPageable()).execute(query);
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
result = new SingleEntityExecution(isCountQuery()).execute(query);
return new SingleEntityExecution(isCountQuery()).execute(query);
}
}
if (result == null) {
return result;
private Query applyQueryMetaAttributesWhenPresent(Query query) {
if (method.hasQueryMetaAttributes()) {
query.setMeta(method.getQueryMetaAttributes());
}
Class<?> expectedReturnType = method.getReturnType().getType();
if (expectedReturnType.isAssignableFrom(result.getClass())) {
return result;
}
return CONVERSION_SERVICE.convert(result, expectedReturnType);
return query;
}
/**
@@ -129,7 +125,12 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* @return
*/
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return createQuery(accessor);
Query query = createQuery(accessor);
applyQueryMetaAttributesWhenPresent(query);
return query;
}
/**
@@ -195,6 +196,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* {@link Execution} for {@link Slice} query methods.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.5
*/
@@ -216,9 +218,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
int pageSize = pageable.getPageSize();
Pageable slicePageable = new PageRequest(pageable.getPageNumber(), pageSize + 1, pageable.getSort());
List result = operations.find(query.with(slicePageable), metadata.getJavaType(), metadata.getCollectionName());
// Apply Pageable but tweak limit to peek into next page
Query modifiedQuery = query.with(pageable).limit(pageSize + 1);
List result = operations.find(modifiedQuery, metadata.getJavaType(), metadata.getCollectionName());
boolean hasNext = result.size() > pageSize;
@@ -255,12 +259,14 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
Class<?> type = metadata.getJavaType();
int overallLimit = query.getLimit();
long count = operations.count(query, metadata.getCollectionName());
long count = operations.count(query, type, collectionName);
count = overallLimit != 0 ? Math.min(count, query.getLimit()) : count;
boolean pageableOutOfScope = pageable.getOffset() > query.getLimit();
boolean pageableOutOfScope = pageable.getOffset() > count;
if (pageableOutOfScope) {
return new PageImpl<Object>(Collections.emptyList(), pageable, count);
@@ -274,7 +280,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
query.limit(overallLimit - pageable.getOffset());
}
List<?> result = operations.find(query, metadata.getJavaType(), metadata.getCollectionName());
List<?> result = operations.find(query, type, collectionName);
return new PageImpl(result, pageable, count);
}
}
@@ -301,7 +307,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return countProjection ? operations.count(query, metadata.getJavaType()) : operations.findOne(query,
metadata.getJavaType());
metadata.getJavaType(), metadata.getCollectionName());
}
}
@@ -310,7 +316,6 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
final class GeoNearExecution extends Execution {
private final MongoParameterAccessor accessor;
@@ -342,12 +347,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(countQuery, metadata.getCollectionName());
return new org.springframework.data.mongodb.core.geo.GeoPage<Object>(doExecuteQuery(query),
accessor.getPageable(), count);
return new GeoPage<Object>(doExecuteQuery(query), accessor.getPageable(), count);
}
@SuppressWarnings("unchecked")
private org.springframework.data.mongodb.core.geo.GeoResults<Object> doExecuteQuery(Query query) {
private GeoResults<Object> doExecuteQuery(Query query) {
Point nearLocation = accessor.getGeoNearLocation();
NearQuery nearQuery = NearQuery.near(nearLocation);
@@ -367,8 +371,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (org.springframework.data.mongodb.core.geo.GeoResults<Object>) operations.geoNear(nearQuery,
metadata.getJavaType(), metadata.getCollectionName());
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}
private boolean isListOfGeoResult() {
@@ -405,7 +408,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private Object deleteAndConvertResult(Query query, MongoEntityMetadata<?> metadata) {
if (method.isCollectionQuery()) {
return operations.findAllAndRemove(query, metadata.getJavaType());
return operations.findAllAndRemove(query, metadata.getJavaType(), metadata.getCollectionName());
}
WriteResult writeResult = operations.remove(query, metadata.getJavaType(), metadata.getCollectionName());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
@@ -38,6 +39,7 @@ import com.mongodb.DBRef;
* Custom {@link ParameterAccessor} that uses a {@link MongoWriter} to serialize parameters into Mongo format.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class ConvertingParameterAccessor implements MongoParameterAccessor {
@@ -110,6 +112,14 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return delegate.getGeoNearLocation();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getFullText()
*/
public TextCriteria getFullText() {
return delegate.getFullText();
}
/**
* Converts the given value with the underlying {@link MongoWriter}.
*

View File

@@ -17,12 +17,14 @@ package org.springframework.data.mongodb.repository.query;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.ParameterAccessor;
/**
* Mongo-specific {@link ParameterAccessor} exposing a maximum distance parameter.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface MongoParameterAccessor extends ParameterAccessor {
@@ -40,4 +42,12 @@ public interface MongoParameterAccessor extends ParameterAccessor {
* @return
*/
Point getGeoNearLocation();
/**
* Returns the {@link TextCriteria} to be used for full text query.
*
* @return null if not set.
* @since 1.6
*/
TextCriteria getFullText();
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import java.util.List;
import org.springframework.core.MethodParameter;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.mongodb.repository.Near;
import org.springframework.data.mongodb.repository.query.MongoParameters.MongoParameter;
import org.springframework.data.repository.query.Parameter;
@@ -31,10 +32,13 @@ import org.springframework.data.repository.query.Parameters;
* Custom extension of {@link Parameters} discovering additional
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParameters extends Parameters<MongoParameters, MongoParameter> {
private final Integer distanceIndex;
private final Integer fullTextIndex;
private Integer nearIndex;
/**
@@ -48,6 +52,7 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
super(method);
List<Class<?>> parameterTypes = Arrays.asList(method.getParameterTypes());
this.distanceIndex = parameterTypes.indexOf(Distance.class);
this.fullTextIndex = parameterTypes.indexOf(TextCriteria.class);
if (this.nearIndex == null && isGeoNearMethod) {
this.nearIndex = getNearIndex(parameterTypes);
@@ -56,19 +61,19 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
}
}
private MongoParameters(List<MongoParameter> parameters, Integer distanceIndex, Integer nearIndex) {
private MongoParameters(List<MongoParameter> parameters, Integer distanceIndex, Integer nearIndex,
Integer fullTextIndex) {
super(parameters);
this.distanceIndex = distanceIndex;
this.nearIndex = nearIndex;
this.fullTextIndex = fullTextIndex;
}
@SuppressWarnings({ "unchecked", "deprecation" })
private final int getNearIndex(List<Class<?>> parameterTypes) {
for (Class<?> reference : Arrays.asList(Point.class, org.springframework.data.mongodb.core.geo.Point.class,
double[].class)) {
for (Class<?> reference : Arrays.asList(Point.class, double[].class)) {
int nearIndex = parameterTypes.indexOf(reference);
@@ -124,13 +129,31 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
return nearIndex;
}
/**
* Returns the index of the parameter to be used as a text query parameter.
*
* @return
* @since 1.6
*/
public int getFullTextParameterIndex() {
return fullTextIndex != null ? fullTextIndex.intValue() : -1;
}
/**
* @return
* @since 1.6
*/
public boolean hasFullTextParameter() {
return this.fullTextIndex != null && this.fullTextIndex.intValue() >= 0;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.Parameters#createFrom(java.util.List)
*/
@Override
protected MongoParameters createFrom(List<MongoParameter> parameters) {
return new MongoParameters(parameters, this.distanceIndex, this.nearIndex);
return new MongoParameters(parameters, this.distanceIndex, this.nearIndex, this.fullTextIndex);
}
/**
@@ -162,7 +185,8 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
*/
@Override
public boolean isSpecialParameter() {
return super.isSpecialParameter() || Distance.class.isAssignableFrom(getType()) || isNearParameter();
return super.isSpecialParameter() || Distance.class.isAssignableFrom(getType()) || isNearParameter()
|| TextCriteria.class.isAssignableFrom(getType());
}
private boolean isNearParameter() {
@@ -181,5 +205,7 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
private boolean hasNearAnnotation() {
return parameter.getParameterAnnotation(Near.class) != null;
}
}
}
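
A sketch of a derived query that relies on the new full-text parameter detection; BlogPostRepository and BlogPost are hypothetical:

import java.util.List;

import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.Repository;

interface BlogPostRepository extends Repository<BlogPost, String> {

    // The TextCriteria argument is recognised as a special parameter via getFullTextParameterIndex()
    // and therefore not bound into the derived part tree.
    List<BlogPost> findAllBy(TextCriteria fullText);
}

class BlogPost {}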

View File

@@ -17,12 +17,17 @@ package org.springframework.data.mongodb.repository.query;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.ParametersParameterAccessor;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
* Mongo-specific {@link ParametersParameterAccessor} to allow access to the {@link Distance} parameter.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParametersParameterAccessor extends ParametersParameterAccessor implements MongoParameterAccessor {
@@ -77,4 +82,35 @@ public class MongoParametersParameterAccessor extends ParametersParameterAccesso
return (Point) value;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getFullText()
*/
@Override
public TextCriteria getFullText() {
int index = method.getParameters().getFullTextParameterIndex();
return index >= 0 ? potentiallyConvertFullText(getValue(index)) : null;
}
protected TextCriteria potentiallyConvertFullText(Object fullText) {
Assert.notNull(fullText, "Fulltext parameter must not be 'null'.");
if (fullText instanceof String) {
return TextCriteria.forDefaultLanguage().matching((String) fullText);
}
if (fullText instanceof Term) {
return TextCriteria.forDefaultLanguage().matching((Term) fullText);
}
if (fullText instanceof TextCriteria) {
return ((TextCriteria) fullText);
}
throw new IllegalArgumentException(String.format(
"Expected full text parameter to be one of String, Term or TextCriteria but found %s.",
ClassUtils.getShortName(fullText.getClass())));
}
}
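
The conversion above roughly amounts to the following, assuming the public Term constructor and the matching(...) overloads shown earlier; a TextCriteria argument is passed through unchanged:

import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.query.TextCriteria;

class FullTextArgumentExample {

    // String argument: wrapped into a default-language criteria.
    static final TextCriteria FROM_STRING = TextCriteria.forDefaultLanguage().matching("coffee");

    // Term argument: added as-is, here as a PHRASE.
    static final TextCriteria FROM_TERM = TextCriteria.forDefaultLanguage().matching(new Term("hot coffee", Term.Type.PHRASE));

    // Any other argument type results in an IllegalArgumentException.
}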

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,13 +20,16 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.util.Arrays;
import java.util.Collection;
import java.util.Iterator;
import java.util.regex.Pattern;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -40,16 +43,19 @@ import org.springframework.data.repository.query.parser.Part.IgnoreCaseType;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Custom query creator to create Mongo criterias.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private static final Logger LOG = LoggerFactory.getLogger(MongoQueryCreator.class);
private static final Pattern PUNCTATION_PATTERN = Pattern.compile("\\p{Punct}");
private final MongoParameterAccessor accessor;
private final boolean isGeoNearQuery;
@@ -102,9 +108,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
MongoPersistentProperty property = path.getLeafProperty();
Criteria criteria = from(part, property,
where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
Criteria criteria = from(part, property, where(path.toDotPath()), (PotentiallyConvertingIterator) iterator);
return criteria;
}
@@ -123,9 +127,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
MongoPersistentProperty property = path.getLeafProperty();
return from(part, property,
base.and(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return from(part, property, base.and(path.toDotPath()), (PotentiallyConvertingIterator) iterator);
}
/*
@@ -146,11 +148,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
@Override
protected Query complete(Criteria criteria, Sort sort) {
if (criteria == null) {
return null;
}
Query query = new Query(criteria).with(sort);
Query query = (criteria == null ? new Query() : new Query(criteria)).with(sort);
if (LOG.isDebugEnabled()) {
LOG.debug("Created query " + query);
@@ -198,7 +196,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case STARTING_WITH:
case ENDING_WITH:
case CONTAINING:
return addAppropriateLikeRegexTo(criteria, part, parameters.next().toString());
return createContainingCriteria(part, property, criteria, parameters);
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
@@ -216,7 +214,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
if (distance == null) {
return criteria.near(point);
} else {
if (distance.getMetric() != null) {
if (!Metrics.NEUTRAL.equals(distance.getMetric())) {
criteria.nearSphere(point);
} else {
criteria.near(point);
@@ -269,19 +267,23 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private Criteria createLikeRegexCriteriaOrThrow(Part part, MongoPersistentProperty property, Criteria criteria,
PotentiallyConvertingIterator parameters, boolean shouldNegateExpression) {
PropertyPath path = part.getProperty().getLeafProperty();
switch (part.shouldIgnoreCase()) {
case ALWAYS:
if (part.getProperty().getType() != String.class) {
throw new IllegalArgumentException(String.format("part %s must be of type String but was %s",
part.getProperty(), part.getType()));
if (path.getType() != String.class) {
throw new IllegalArgumentException(
String.format("Part %s must be of type String but was %s", path, path.getType()));
}
// fall-through
case WHEN_POSSIBLE:
if (shouldNegateExpression) {
criteria = criteria.not();
}
return addAppropriateLikeRegexTo(criteria, part, parameters.nextConverted(property).toString());
case NEVER:
@@ -292,6 +294,27 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
Arrays.asList(IgnoreCaseType.ALWAYS, IgnoreCaseType.WHEN_POSSIBLE), part.shouldIgnoreCase()));
}
/**
* If the target property of the comparison is of type String, then the operator checks for match using regular
* expression. If the target property of the comparison is a {@link Collection} then the operator evaluates to true if
* it finds an exact match within any member of the {@link Collection}.
*
* @param part
* @param property
* @param criteria
* @param parameters
* @return
*/
private Criteria createContainingCriteria(Part part, MongoPersistentProperty property, Criteria criteria,
PotentiallyConvertingIterator parameters) {
if (property.isCollectionLike()) {
return criteria.in(nextAsArray(parameters, property));
}
return addAppropriateLikeRegexTo(criteria, part, parameters.next().toString());
}
/**
* Creates an appropriate like-regex and appends it to the given criteria.
*
@@ -337,8 +360,8 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return (T) parameter;
}
throw new IllegalArgumentException(String.format("Expected parameter type of %s but got %s!", type,
parameter.getClass()));
throw new IllegalArgumentException(
String.format("Expected parameter type of %s but got %s!", type, parameter.getClass()));
}
private Object[] nextAsArray(PotentiallyConvertingIterator iterator, MongoPersistentProperty property) {
@@ -356,23 +379,57 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private String toLikeRegex(String source, Part part) {
Type type = part.getType();
String regex = prepareAndEscapeStringBeforeApplyingLikeRegex(source, part);
switch (type) {
case STARTING_WITH:
source = "^" + source;
regex = "^" + regex;
break;
case ENDING_WITH:
source = source + "$";
regex = regex + "$";
break;
case CONTAINING:
source = "*" + source + "*";
regex = ".*" + regex + ".*";
break;
case SIMPLE_PROPERTY:
case NEGATING_SIMPLE_PROPERTY:
source = "^" + source + "$";
regex = "^" + regex + "$";
default:
}
return source.replaceAll("\\*", ".*");
return regex;
}
private String prepareAndEscapeStringBeforeApplyingLikeRegex(String source, Part qpart) {
if (!ObjectUtils.nullSafeEquals(Type.LIKE, qpart.getType())) {
return PUNCTATION_PATTERN.matcher(source).find() ? Pattern.quote(source) : source;
}
if (source.equals("*")) {
return ".*";
}
StringBuilder sb = new StringBuilder();
boolean leadingWildcard = source.startsWith("*");
boolean trailingWildcard = source.endsWith("*");
String valueToUse = source.substring(leadingWildcard ? 1 : 0,
trailingWildcard ? source.length() - 1 : source.length());
if (PUNCTATION_PATTERN.matcher(valueToUse).find()) {
valueToUse = Pattern.quote(valueToUse);
}
if (leadingWildcard) {
sb.append(".*");
}
sb.append(valueToUse);
if (trailingWildcard) {
sb.append(".*");
}
return sb.toString();
}
}
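
A hypothetical repository sketch of the two paths taken by createContainingCriteria(...):

import java.util.List;

import org.springframework.data.repository.Repository;

interface AccountRepository extends Repository<Account, String> {

    // String property: translated to a like-regex; punctuation in the value is Pattern.quote(...)d.
    List<Account> findByEmailContaining(String part);

    // Collection-like property: translated to an $in criteria instead of a regex.
    List<Account> findByRolesContaining(String role);
}

class Account {
    String email;
    List<String> roles;
}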

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import org.springframework.data.geo.GeoResults;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.Meta;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.QueryMethod;
@@ -39,6 +40,7 @@ import org.springframework.util.StringUtils;
* Mongo specific implementation of {@link QueryMethod}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoQueryMethod extends QueryMethod {
@@ -126,8 +128,7 @@ public class MongoQueryMethod extends QueryMethod {
MongoPersistentEntity<?> collectionEntity = domainClass.isAssignableFrom(returnedObjectType) ? returnedEntity
: managedEntity;
this.metadata = new SimpleMongoEntityMetadata<Object>((Class<Object>) returnedEntity.getType(),
collectionEntity.getCollection());
this.metadata = new SimpleMongoEntityMetadata<Object>((Class<Object>) returnedEntity.getType(), collectionEntity);
}
return this.metadata;
@@ -143,7 +144,7 @@ public class MongoQueryMethod extends QueryMethod {
}
/**
* Returns whether te query is a geo near query.
* Returns whether the query is a geo near query.
*
* @return
*/
@@ -181,4 +182,55 @@ public class MongoQueryMethod extends QueryMethod {
TypeInformation<?> getReturnType() {
return ClassTypeInformation.fromReturnTypeOf(method);
}
/**
* @return {@literal true} if a {@link Meta} annotation is available.
* @since 1.6
*/
public boolean hasQueryMetaAttributes() {
return getMetaAnnotation() != null;
}
/**
* Returns the {@link Meta} annotation that is applied to the method or {@code null} if not available.
*
* @return
* @since 1.6
*/
Meta getMetaAnnotation() {
return method.getAnnotation(Meta.class);
}
/**
* Returns the {@link org.springframework.data.mongodb.core.query.Meta} attributes to be applied.
*
* @return never {@literal null}.
* @since 1.6
*/
public org.springframework.data.mongodb.core.query.Meta getQueryMetaAttributes() {
Meta meta = getMetaAnnotation();
if (meta == null) {
return new org.springframework.data.mongodb.core.query.Meta();
}
org.springframework.data.mongodb.core.query.Meta metaAttributes = new org.springframework.data.mongodb.core.query.Meta();
if (meta.maxExcecutionTime() > 0) {
metaAttributes.setMaxTimeMsec(meta.maxExcecutionTime());
}
if (meta.maxScanDocuments() > 0) {
metaAttributes.setMaxScan(meta.maxScanDocuments());
}
if (StringUtils.hasText(meta.comment())) {
metaAttributes.setComment(meta.comment());
}
if (meta.snapshot()) {
metaAttributes.setSnapshot(meta.snapshot());
}
return metaAttributes;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2014 the original author or authors.
* Copyright 2002-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.QueryMethod;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.parser.PartTree;
@@ -77,6 +78,11 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
query.limit(tree.getMaxResults());
}
TextCriteria textCriteria = accessor.getFullText();
if (textCriteria != null) {
query.addCriteria(textCriteria);
}
String fieldSpec = this.getQueryMethod().getFieldSpecification();
if (!StringUtils.hasText(fieldSpec)) {
@@ -87,11 +93,12 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
BasicQuery result = new BasicQuery(query.getQueryObject().toString(), fieldSpec);
result.setSortObject(query.getSortObject());
return result;
} catch (JSONParseException o_O) {
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod(),
o_O));
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod()),
o_O);
}
}

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
/**
@@ -25,21 +26,22 @@ import org.springframework.util.Assert;
class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
private final Class<T> type;
private final String collectionName;
private final MongoPersistentEntity<?> collectionEntity;
/**
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and collection name.
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and {@link MongoPersistentEntity} to use for
* collection lookups.
*
* @param type must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @param collectionEntity must not be {@literal null} or empty.
*/
public SimpleMongoEntityMetadata(Class<T> type, String collectionName) {
public SimpleMongoEntityMetadata(Class<T> type, MongoPersistentEntity<?> collectionEntity) {
Assert.notNull(type, "Type must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(collectionEntity, "Collection entity must not be null or empty!");
this.type = type;
this.collectionName = collectionName;
this.collectionEntity = collectionEntity;
}
/*
@@ -55,6 +57,6 @@ class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
* @see org.springframework.data.mongodb.repository.query.MongoEntityMetadata#getCollectionName()
*/
public String getCollectionName() {
return collectionName;
return collectionEntity.getCollection();
}
}

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
@@ -23,7 +26,10 @@ import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import com.mongodb.util.JSON;
/**
@@ -31,17 +37,20 @@ import com.mongodb.util.JSON;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class StringBasedMongoQuery extends AbstractMongoQuery {
private static final String COUND_AND_DELETE = "Manually defined query for %s cannot be both a count and delete query at the same time!";
private static final Pattern PLACEHOLDER = Pattern.compile("\\?(\\d+)");
private static final Logger LOG = LoggerFactory.getLogger(StringBasedMongoQuery.class);
private static final ParameterBindingParser PARSER = ParameterBindingParser.INSTANCE;
private final String query;
private final String fieldSpec;
private final boolean isCountQuery;
private final boolean isDeleteQuery;
private final List<ParameterBinding> queryParameterBindings;
private final List<ParameterBinding> fieldSpecParameterBindings;
/**
* Creates a new {@link StringBasedMongoQuery} for the given {@link MongoQueryMethod} and {@link MongoOperations}.
@@ -65,7 +74,11 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
super(method, mongoOperations);
this.query = query;
this.queryParameterBindings = PARSER.parseParameterBindingsFrom(query);
this.fieldSpec = method.getFieldSpecification();
this.fieldSpecParameterBindings = PARSER.parseParameterBindingsFrom(method.getFieldSpecification());
this.isCountQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().count() : false;
this.isDeleteQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().delete() : false;
@@ -81,12 +94,12 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
String queryString = replacePlaceholders(query, accessor);
String queryString = replacePlaceholders(query, accessor, queryParameterBindings);
Query query = null;
if (fieldSpec != null) {
String fieldString = replacePlaceholders(fieldSpec, accessor);
String fieldString = replacePlaceholders(fieldSpec, accessor, fieldSpecParameterBindings);
query = new BasicQuery(queryString, fieldString);
} else {
query = new BasicQuery(queryString);
@@ -119,21 +132,189 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return this.isDeleteQuery;
}
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor) {
/**
* Replaces the parameter place-holders with the actual parameter values from the given {@link ParameterBinding}s.
*
* @param input
* @param accessor
* @param bindings
* @return
*/
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor, List<ParameterBinding> bindings) {
Matcher matcher = PLACEHOLDER.matcher(input);
String result = input;
while (matcher.find()) {
String group = matcher.group();
int index = Integer.parseInt(matcher.group(1));
result = result.replace(group, getParameterWithIndex(accessor, index));
if (bindings.isEmpty()) {
return input;
}
return result;
StringBuilder result = new StringBuilder(input);
for (ParameterBinding binding : bindings) {
String parameter = binding.getParameter();
int idx = result.indexOf(parameter);
if (idx != -1) {
result.replace(idx, idx + parameter.length(), getParameterValueForBinding(accessor, binding));
}
}
return result.toString();
}
private String getParameterWithIndex(ConvertingParameterAccessor accessor, int index) {
return JSON.serialize(accessor.getBindableValue(index));
/**
* Returns the serialized value to be used for the given {@link ParameterBinding}.
*
* @param accessor
* @param binding
* @return
*/
private String getParameterValueForBinding(ConvertingParameterAccessor accessor, ParameterBinding binding) {
Object value = accessor.getBindableValue(binding.getParameterIndex());
if (value instanceof String && binding.isQuoted()) {
return (String) value;
}
return JSON.serialize(value);
}
/**
* A parser that extracts the parameter bindings from a given query string.
*
* @author Thomas Darimont
*/
private static enum ParameterBindingParser {
INSTANCE;
private static final String PARAMETER_PREFIX = "_param_";
private static final String PARSEABLE_PARAMETER = "\"" + PARAMETER_PREFIX + "$1\"";
private static final Pattern PARAMETER_BINDING_PATTERN = Pattern.compile("\\?(\\d+)");
private static final Pattern PARSEABLE_BINDING_PATTERN = Pattern.compile("\"?" + PARAMETER_PREFIX + "(\\d+)\"?");
private final static int PARAMETER_INDEX_GROUP = 1;
/**
* Returns a list of {@link ParameterBinding}s found in the given {@code input} or an
* {@link Collections#emptyList()}.
*
* @param input
* @param conversionService must not be {@literal null}.
* @return
*/
public List<ParameterBinding> parseParameterBindingsFrom(String input) {
if (!StringUtils.hasText(input)) {
return Collections.emptyList();
}
List<ParameterBinding> bindings = new ArrayList<ParameterBinding>();
String parseableInput = makeParameterReferencesParseable(input);
collectParameterReferencesIntoBindings(bindings, JSON.parse(parseableInput));
return bindings;
}
private String makeParameterReferencesParseable(String input) {
Matcher matcher = PARAMETER_BINDING_PATTERN.matcher(input);
String parseableInput = matcher.replaceAll(PARSEABLE_PARAMETER);
return parseableInput;
}
private void collectParameterReferencesIntoBindings(List<ParameterBinding> bindings, Object value) {
if (value instanceof String) {
String string = ((String) value).trim();
potentiallyAddBinding(string, bindings);
} else if (value instanceof Pattern) {
String string = ((Pattern) value).toString().trim();
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(string);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
/*
* The pattern is used as a direct parameter replacement, e.g. 'field': ?1,
* therefore we treat it as not quoted to remain backwards compatible.
*/
boolean quoted = !string.equals(PARAMETER_PREFIX + paramIndex);
bindings.add(new ParameterBinding(paramIndex, quoted));
}
} else if (value instanceof DBRef) {
DBRef dbref = (DBRef) value;
potentiallyAddBinding(dbref.getRef(), bindings);
potentiallyAddBinding(dbref.getId().toString(), bindings);
} else if (value instanceof DBObject) {
DBObject dbo = (DBObject) value;
for (String field : dbo.keySet()) {
collectParameterReferencesIntoBindings(bindings, field);
collectParameterReferencesIntoBindings(bindings, dbo.get(field));
}
}
}
private void potentiallyAddBinding(String source, List<ParameterBinding> bindings) {
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(source);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
boolean quoted = (source.startsWith("'") && source.endsWith("'"))
|| (source.startsWith("\"") && source.endsWith("\""));
bindings.add(new ParameterBinding(paramIndex, quoted));
}
}
}
/**
* A generic parameter binding with name or position information.
*
* @author Thomas Darimont
*/
private static class ParameterBinding {
private final int parameterIndex;
private final boolean quoted;
/**
* Creates a new {@link ParameterBinding} with the given {@code parameterIndex} and {@code quoted} information.
*
* @param parameterIndex
* @param quoted whether or not the parameter is already quoted.
*/
public ParameterBinding(int parameterIndex, boolean quoted) {
this.parameterIndex = parameterIndex;
this.quoted = quoted;
}
public boolean isQuoted() {
return quoted;
}
public int getParameterIndex() {
return parameterIndex;
}
public String getParameter() {
return "?" + parameterIndex;
}
}
}
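
A sketch of the quoted versus unquoted placeholder handling introduced by the parser; UserRepository and User are hypothetical:

import java.util.List;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

interface UserRepository extends Repository<User, String> {

    // Unquoted placeholder: the bound value is rendered via JSON.serialize(...),
    // so a String argument "alice" becomes { 'username' : "alice" }.
    @Query("{ 'username' : ?0 }")
    List<User> findByUsername(String username);

    // Quoted placeholder: ParameterBinding.isQuoted() is true and the raw String is
    // spliced in between the existing quotes, avoiding double quoting.
    @Query("{ 'username' : '?0' }")
    List<User> findQuoted(String username);
}

class User {}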

View File

@@ -35,7 +35,7 @@ import com.mysema.query.apt.Configuration;
import com.mysema.query.apt.DefaultConfiguration;
/**
* Annotation processor to create Querydsl query types for QueryDsl annoated classes.
* Annotation processor to create Querydsl query types for QueryDsl annotated classes.
*
* @author Oliver Gierke
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,8 +27,10 @@ import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.querydsl.EntityPathResolver;
import org.springframework.data.querydsl.QSort;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityInformation;
import org.springframework.data.repository.core.EntityMetadata;
import org.springframework.util.Assert;
@@ -43,18 +45,21 @@ import com.mysema.query.types.path.PathBuilder;
* Special QueryDsl based repository implementation that allows execution {@link Predicate}s in various forms.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleMongoRepository<T, ID> implements
QueryDslPredicateExecutor<T> {
private final PathBuilder<T> builder;
private final EntityInformation<T, ID> entityInformation;
private final MongoOperations mongoOperations;
/**
* Creates a new {@link QueryDslMongoRepository} for the given {@link EntityMetadata} and {@link MongoTemplate}. Uses
* the {@link SimpleEntityPathResolver} to create an {@link EntityPath} for the given domain class.
*
* @param entityInformation
* @param template
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
@@ -64,17 +69,21 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* Creates a new {@link QueryDslMongoRepository} for the given {@link MongoEntityInformation}, {@link MongoTemplate}
* and {@link EntityPathResolver}.
*
* @param entityInformation
* @param mongoOperations
* @param resolver
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations,
EntityPathResolver resolver) {
super(entityInformation, mongoOperations);
Assert.notNull(resolver);
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.entityInformation = entityInformation;
this.mongoOperations = mongoOperations;
}
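The constructors above take a MongoEntityInformation, a MongoOperations instance and optionally an EntityPathResolver; in practice the repository factory supplies all three. The sketch below shows a hand-wired construction purely to illustrate the arguments; the Person type is an assumption, and MappingMongoEntityInformation is used here only as one plausible way to obtain the entity information.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.support.MappingMongoEntityInformation;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository;
import org.springframework.data.querydsl.SimpleEntityPathResolver;

class RepositoryWiringSketch {

	static class Person {
		@Id String id;
	}

	// Normally MongoRepositoryFactory performs this wiring; shown here only to illustrate the constructor arguments.
	@SuppressWarnings("unchecked")
	static QueryDslMongoRepository<Person, String> create(MongoOperations operations) {

		MongoPersistentEntity<?> entity = operations.getConverter().getMappingContext()
				.getPersistentEntity(Person.class);

		MongoEntityInformation<Person, String> information = new MappingMongoEntityInformation<Person, String>(
				(MongoPersistentEntity<Person>) entity);

		return new QueryDslMongoRepository<Person, String>(information, operations, SimpleEntityPathResolver.INSTANCE);
	}
}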
/*
@@ -98,7 +107,6 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, com.mysema.query.types.OrderSpecifier<?>[])
*/
public List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
return createQueryFor(predicate).orderBy(orders).list();
}
@@ -114,6 +122,28 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
return new PageImpl<T>(applyPagination(query, pageable).list(), pageable, countQuery.count());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.SimpleMongoRepository#findAll(org.springframework.data.domain.Pageable)
*/
@Override
public Page<T> findAll(Pageable pageable) {
MongodbQuery<T> countQuery = createQuery();
MongodbQuery<T> query = createQuery();
return new PageImpl<T>(applyPagination(query, pageable).list(), pageable, countQuery.count());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.SimpleMongoRepository#findAll(org.springframework.data.domain.Sort)
*/
@Override
public List<T> findAll(Sort sort) {
return applySorting(createQuery(), sort).list();
}
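The two overrides above route plain Pageable and Sort calls through the Querydsl-backed MongodbQuery instead of the SimpleMongoRepository implementation. A short usage sketch, assuming the repository has been wired by the repository factory and a Person domain type exists:

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository;

class PagingUsageSketch {

	static class Person {
		@Id String id;
		String firstname, lastname;
	}

	// The repository instance is assumed to be created by the repository factory.
	static void usage(QueryDslMongoRepository<Person, String> repository) {

		// Paging without a predicate now also runs through the Querydsl-backed MongodbQuery.
		Page<Person> page = repository.findAll(new PageRequest(0, 10, new Sort("lastname")));

		// Plain sorting is translated into orderBy(…) on the underlying query as well.
		Iterable<Person> sorted = repository.findAll(new Sort(Sort.Direction.ASC, "firstname"));
	}
}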
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#count(com.mysema.query.types.Predicate)
@@ -129,11 +159,16 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @return
*/
private MongodbQuery<T> createQueryFor(Predicate predicate) {
return createQuery().where(predicate);
}
Class<T> domainType = getEntityInformation().getJavaType();
MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), domainType);
return query.where(predicate);
/**
* Creates a {@link MongodbQuery}.
*
* @return
*/
private MongodbQuery<T> createQuery() {
return new SpringDataMongodbQuery<T>(mongoOperations, entityInformation.getJavaType());
}
/**
@@ -166,6 +201,15 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
return query;
}
// TODO: find better solution than instanceof check
if (sort instanceof QSort) {
List<OrderSpecifier<?>> orderSpecifiers = ((QSort) sort).getOrderSpecifiers();
query.orderBy(orderSpecifiers.toArray(new OrderSpecifier<?>[orderSpecifiers.size()]));
return query;
}
for (Order order : sort) {
query.orderBy(toOrder(order));
}
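The instanceof check above lets a QSort carry ready-made Querydsl OrderSpecifiers, while a plain Sort is translated order by order via toOrder(…). A hedged usage sketch; a generated query type (e.g. QPerson) would normally supply the paths, but a PathBuilder keeps the example self-contained:

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository;
import org.springframework.data.querydsl.QSort;

import com.mysema.query.types.path.PathBuilder;

class QSortUsageSketch {

	static class Person {
		@Id String id;
		String firstname, lastname;
	}

	static void usage(QueryDslMongoRepository<Person, String> repository) {

		// Usually the generated query type (e.g. QPerson.person.lastname.asc()) is used here.
		PathBuilder<Person> person = new PathBuilder<Person>(Person.class, "person");

		// Type-safe variant: the OrderSpecifiers inside the QSort are handed to orderBy(…) unchanged.
		repository.findAll(new QSort(person.getString("lastname").asc(), person.getString("firstname").desc()));

		// String-based variant: each Order is converted via toOrder(…) instead.
		repository.findAll(new Sort(Sort.Direction.DESC, "lastname"));
	}
}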

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
@@ -40,6 +41,8 @@ import org.springframework.util.Assert;
* Repository base implementation for Mongo.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class SimpleMongoRepository<T, ID extends Serializable> implements MongoRepository<T, ID> {
@@ -47,7 +50,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
private final MongoEntityInformation<T, ID> entityInformation;
/**
* Creates a ew {@link SimpleMongoRepository} for the given {@link MongoEntityInformation} and {@link MongoTemplate}.
* Creates a new {@link SimpleMongoRepository} for the given {@link MongoEntityInformation} and {@link MongoTemplate}.
*
* @param metadata must not be {@literal null}.
* @param template must not be {@literal null}.
@@ -69,7 +72,12 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
Assert.notNull(entity, "Entity must not be null!");
mongoOperations.save(entity, entityInformation.getCollectionName());
if (entityInformation.isNew(entity)) {
mongoOperations.insert(entity, entityInformation.getCollectionName());
} else {
mongoOperations.save(entity, entityInformation.getCollectionName());
}
return entity;
}
@@ -81,11 +89,22 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
Assert.notNull(entities, "The given Iterable of entities not be null!");
List<S> result = new ArrayList<S>();
List<S> result = convertIterableToList(entities);
boolean allNew = true;
for (S entity : entities) {
save(entity);
result.add(entity);
if (allNew && !entityInformation.isNew(entity)) {
allNew = false;
}
}
if (allNew) {
mongoOperations.insertAll(result);
} else {
for (S entity : result) {
save(entity);
}
}
return result;
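The loop above only falls back to saving each entity individually when at least one of them is not new; if every entity is new, the whole batch goes through a single insertAll(…). A usage sketch of both paths, assuming a Person type whose id is populated on first save:

import java.util.Arrays;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.support.SimpleMongoRepository;

class SaveBatchSketch {

	static class Person {
		@Id String id;
	}

	// 'existing' is assumed to have been saved before, i.e. its id is already populated.
	static void usage(SimpleMongoRepository<Person, String> repository, Person existing) {

		// Every entity is new (no id yet): the whole batch goes through a single insertAll(…).
		repository.save(Arrays.asList(new Person(), new Person()));

		// Mixed batch: at least one entity is not new, so each entity is saved individually.
		repository.save(Arrays.asList(new Person(), existing));
	}
}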
@@ -180,7 +199,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
*/
public Iterable<T> findAll(Iterable<ID> ids) {
Set<ID> parameters = new HashSet<ID>();
Set<ID> parameters = new HashSet<ID>(tryDetermineRealSizeOrReturn(ids, 10));
for (ID id : ids) {
parameters.add(id);
}
@@ -208,6 +227,38 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
return findAll(new Query().with(sort));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#insert(java.lang.Object)
*/
@Override
public <S extends T> S insert(S entity) {
Assert.notNull(entity, "Entity must not be null!");
mongoOperations.insert(entity, entityInformation.getCollectionName());
return entity;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#insert(java.lang.Iterable)
*/
@Override
public <S extends T> List<S> insert(Iterable<S> entities) {
Assert.notNull(entities, "The given Iterable of entities not be null!");
List<S> list = convertIterableToList(entities);
if (list.isEmpty()) {
return list;
}
mongoOperations.insertAll(list);
return list;
}
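Unlike save(…), the new insert(…) methods always issue a MongoDB insert, so re-inserting a document whose id already exists will typically fail with a duplicate key error instead of overwriting the stored document. A short usage sketch under the same Person assumption as above:

import java.util.Arrays;
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.support.SimpleMongoRepository;

class InsertUsageSketch {

	static class Person {
		@Id String id;
	}

	// Repository wiring is assumed; with no id set, MongoDB generates one on insert.
	static void usage(SimpleMongoRepository<Person, String> repository) {

		// Always a MongoDB insert, never an upsert-style save of an existing document.
		Person fresh = repository.insert(new Person());

		// Batch variant: one insertAll(…) round trip for the whole list.
		List<Person> batch = repository.insert(Arrays.asList(new Person(), new Person()));
	}
}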
private List<T> findAll(Query query) {
if (query == null) {
@@ -217,19 +268,27 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/**
* Returns the underlying {@link MongoOperations} instance.
*
* @return
*/
protected MongoOperations getMongoOperations() {
return this.mongoOperations;
private static <T> List<T> convertIterableToList(Iterable<T> entities) {
if (entities instanceof List) {
return (List<T>) entities;
}
int capacity = tryDetermineRealSizeOrReturn(entities, 10);
if (capacity == 0 || entities == null) {
return Collections.<T> emptyList();
}
List<T> list = new ArrayList<T>(capacity);
for (T entity : entities) {
list.add(entity);
}
return list;
}
/**
* @return the entityInformation
*/
protected MongoEntityInformation<T, ID> getEntityInformation() {
return entityInformation;
private static int tryDetermineRealSizeOrReturn(Iterable<?> iterable, int defaultSize) {
return iterable == null ? 0 : (iterable instanceof Collection) ? ((Collection<?>) iterable).size() : defaultSize;
}
}

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.repository.support;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.regex.Pattern;
import org.springframework.data.mapping.context.MappingContext;
@@ -41,7 +44,17 @@ import com.mysema.query.types.PathType;
*/
class SpringDataMongodbSerializer extends MongodbSerializer {
private final String ID_KEY = "_id";
private static final String ID_KEY = "_id";
private static final Set<PathType> PATH_TYPES;
static {
Set<PathType> pathTypes = new HashSet<PathType>();
pathTypes.add(PathType.VARIABLE);
pathTypes.add(PathType.PROPERTY);
PATH_TYPES = Collections.unmodifiableSet(pathTypes);
}
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -138,7 +151,7 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
Path<?> parent = path.getMetadata().getParent();
if (parent == null) {
if (parent == null || !PATH_TYPES.contains(path.getMetadata().getPathType())) {
return null;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,7 @@ import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
@@ -49,11 +50,17 @@ public class ServerAddressPropertyEditorUnitTests {
/**
* @see DATAMONGO-454
* @see DATAMONGO-1062
*/
@Test(expected = IllegalArgumentException.class)
public void rejectsAddressConfigWithoutASingleParsableServerAddress() {
public void rejectsAddressConfigWithoutASingleParsableAndResolvableServerAddress() {
editor.setAsText("foo, bar");
String unknownHost1 = "gugu.nonexistant.example.org";
String unknownHost2 = "gaga.nonexistant.example.org";
assertUnresolveableHostnames(unknownHost1, unknownHost2);
editor.setAsText(unknownHost1 + "," + unknownHost2);
}
/**
@@ -193,4 +200,16 @@ public class ServerAddressPropertyEditorUnitTests {
assertThat(addresses, hasItem(new ServerAddress(InetAddress.getByName(hostAddress), port)));
}
}
private void assertUnresolveableHostnames(String... hostnames) {
for (String hostname : hostnames) {
try {
InetAddress.getByName(hostname);
Assert.fail("Supposedly unresolveable hostname '" + hostname + "' can be resolved.");
} catch (UnknownHostException expected) {
// ok
}
}
}
}

View File

@@ -0,0 +1,93 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* Integration tests for {@link DefaultIndexOperations}.
*
* @author Christoph Strobl
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class DefaultIndexOperationsIntegrationTests {
static final DBObject GEO_SPHERE_2D = new BasicDBObject("loaction", "2dsphere");
@Autowired MongoTemplate template;
DefaultIndexOperations indexOps;
DBCollection collection;
@Before
public void setUp() {
String collectionName = this.template.getCollectionName(DefaultIndexOperationsIntegrationTestsSample.class);
this.collection = this.template.getDb().getCollection(collectionName);
this.collection.dropIndexes();
this.indexOps = new DefaultIndexOperations(template, collectionName);
}
/**
* @see DATAMONGO-1008
*/
@Test
public void getIndexInfoShouldBeAbleToRead2dsphereIndex() {
collection.createIndex(GEO_SPHERE_2D);
IndexInfo info = findAndReturnIndexInfo(GEO_SPHERE_2D);
assertThat(info.getIndexFields().get(0).isGeo(), is(true));
}
private IndexInfo findAndReturnIndexInfo(DBObject keys) {
return findAndReturnIndexInfo(indexOps.getIndexInfo(), keys);
}
@SuppressWarnings("deprecation")
private static IndexInfo findAndReturnIndexInfo(Iterable<IndexInfo> candidates, DBObject keys) {
return findAndReturnIndexInfo(candidates, DBCollection.genIndexName(keys));
}
private static IndexInfo findAndReturnIndexInfo(Iterable<IndexInfo> candidates, String name) {
for (IndexInfo info : candidates) {
if (ObjectUtils.nullSafeEquals(name, info.getName())) {
return info;
}
}
throw new AssertionError(String.format("Index with %s was not found", name));
}
static class DefaultIndexOperationsIntegrationTestsSample {}
}
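The test drives createIndex(…) directly on the driver collection; through Spring Data the same 2dsphere index could be declared via IndexOperations, roughly as sketched below. The collection and field names are assumptions for illustration.

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeospatialIndex;
import org.springframework.data.mongodb.core.index.IndexInfo;

class GeoIndexSketch {

	static void createAndInspect(MongoTemplate template) {

		// Declare a 2dsphere index on the (assumed) "location" field of the "venues" collection.
		template.indexOps("venues").ensureIndex(
				new GeospatialIndex("location").typed(GeoSpatialIndexType.GEO_2DSPHERE));

		// IndexInfo now reports the field as a geo field, which is what the test above verifies.
		for (IndexInfo info : template.indexOps("venues").getIndexInfo()) {
			System.out.println(info.getName() + " geo=" + info.getIndexFields().get(0).isGeo());
		}
	}
}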

View File

@@ -74,6 +74,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
@@ -185,9 +186,14 @@ public class MongoTemplateTests {
template.dropCollection(DocumentWithCollection.class);
template.dropCollection(DocumentWithCollectionOfSimpleType.class);
template.dropCollection(DocumentWithMultipleCollections.class);
template.dropCollection(DocumentWithNestedCollection.class);
template.dropCollection(DocumentWithEmbeddedDocumentWithCollection.class);
template.dropCollection(DocumentWithNestedList.class);
template.dropCollection(DocumentWithDBRefCollection.class);
template.dropCollection(SomeContent.class);
template.dropCollection(SomeTemplate.class);
template.dropCollection(Address.class);
template.dropCollection(DocumentWithCollectionOfSamples.class);
}
@Test
@@ -2221,6 +2227,243 @@ public class MongoTemplateTests {
assertThat(retrieved.model.value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnDocumentWithNestedCollectionWhenWholeCollectionIsReplaced() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
entry.put("key2", new ModelA("value2"));
Query query = query(where("id").is(doc.id));
Update update = Update.update("models", Collections.singletonList(entry));
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(2));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key2").value(), equalTo("value2"));
}
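The DATAMONGO-1210 tests rely on a small Model hierarchy whose concrete type has to survive the update round trip; ModelA itself is not shown in this excerpt. A minimal shape consistent with the assertions would look like the sketch below; the actual test fixture may differ.

class ModelFixtureSketch {

	// Minimal shape consistent with the assertions above; the real fixture may differ.
	interface Model {
		String value();
	}

	static class ModelA implements Model {

		final String value;

		ModelA(String value) {
			this.value = value;
		}

		public String value() {
			return value;
		}
	}
}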
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnDocumentWithNestedCollectionWhenFirstElementIsReplaced() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
entry.put("key2", new ModelA("value2"));
Query query = query(where("id").is(doc.id));
Update update = Update.update("models.0", entry);
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(2));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationOnDocumentWithNestedCollectionObjectInsertedAtSecondIndex() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("models.1", Collections.singletonMap("key2", new ModelA("value2")));
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(1));
assertThat(retrieved.models.get(1).entrySet(), hasSize(1));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(1).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(1).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenUpdatingPositionedElement()
throws Exception {
List<Model> models = new ArrayList<Model>();
models.add(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument.models.0", new ModelA("value2"));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(1));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenUpdatingSecondElement()
throws Exception {
List<Model> models = new ArrayList<Model>();
models.add(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument.models.1", new ModelA("value2"));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(2));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value1"));
assertThat(retrieved.embeddedDocument.models.get(1).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenRewriting()
throws Exception {
List<Model> models = Arrays.<Model> asList(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument",
new DocumentWithCollection(Arrays.<Model> asList(new ModelA("value2"))));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(1));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnDocumentWithNestedLists() {
DocumentWithNestedList doc = new DocumentWithNestedList();
List<Model> entry = new ArrayList<Model>();
entry.add(new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
Query query = query(where("id").is(doc.id));
assertThat(template.findOne(query, DocumentWithNestedList.class), notNullValue());
Update update = Update.update("models.0.1", new ModelA("value2"));
template.findAndModify(query, update, DocumentWithNestedList.class);
DocumentWithNestedList retrieved = template.findOne(query, DocumentWithNestedList.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0), hasSize(2));
assertThat(retrieved.models.get(0).get(0), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get(0).value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get(1), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get(1).value(), equalTo("value2"));
}
/**
* @see DATAMONGO-407
*/
@@ -2511,6 +2754,37 @@ public class MongoTemplateTests {
assertThat(template.getDb().getCollection("sample").find(new BasicDBObject("field", "data")).count(), is(1));
}
/**
* @see DATAMONGO-1001
*/
@Test
public void shouldAllowSavingOfLazyLoadedDbRefs() {
template.dropCollection(SomeTemplate.class);
template.dropCollection(SomeMessage.class);
template.dropCollection(SomeContent.class);
SomeContent content = new SomeContent();
content.id = "content-1";
content.text = "spring";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "template-1";
tmpl.content = content; // @DBRef(lazy=true) tmpl.content
template.save(tmpl);
SomeTemplate savedTmpl = template.findById(tmpl.id, SomeTemplate.class);
SomeContent loadedContent = savedTmpl.getContent();
loadedContent.setText("data");
template.save(loadedContent);
assertThat(template.findById(content.id, SomeContent.class).getText(), is("data"));
}
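The inline comment on tmpl.content hints that SomeTemplate references SomeContent through a lazily resolved DBRef; the domain types themselves sit outside this excerpt. A shape consistent with the test would look roughly like the sketch below; the field layout is an assumption.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;

class LazyDbRefFixtureSketch {

	static class SomeContent {

		@Id String id;
		String text;

		public String getText() {
			return text;
		}

		public void setText(String text) {
			this.text = text;
		}
	}

	static class SomeTemplate {

		@Id String id;

		// Lazily resolved reference: a proxy is returned and only hydrated on first access.
		@DBRef(lazy = true) SomeContent content;

		public SomeContent getContent() {
			return content;
		}
	}
}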
/**
* @see DATAMONGO-880
*/
@@ -2595,6 +2869,33 @@ public class MongoTemplateTests {
assertThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(3));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyAddToSetWithEachShouldNotAddDuplicatesNorTypeHintForSimpleDocuments() {
DocumentWithCollectionOfSamples doc = new DocumentWithCollectionOfSamples();
doc.samples = Arrays.asList(new Sample(null, "sample1"));
template.save(doc);
Query query = query(where("id").is(doc.id));
assertThat(template.findOne(query, DocumentWithCollectionOfSamples.class), notNullValue());
Update update = new Update().addToSet("samples").each(new Sample(null, "sample2"), new Sample(null, "sample1"));
template.findAndModify(query, update, DocumentWithCollectionOfSamples.class);
DocumentWithCollectionOfSamples retrieved = template.findOne(query, DocumentWithCollectionOfSamples.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.samples, hasSize(2));
assertThat(retrieved.samples.get(0).field, is("sample1"));
assertThat(retrieved.samples.get(1).field, is("sample2"));
}
/**
* @see DATAMONGO-888
*/
@@ -2709,6 +3010,23 @@ public class MongoTemplateTests {
assertThat(template.findAll(DBObject.class, "collection"), hasSize(0));
}
/**
* @see DATAMONGO-1207
*/
@Test
public void ignoresNullElementsForInsertAll() {
Address newYork = new Address("NY", "New York");
Address washington = new Address("DC", "Washington");
template.insertAll(Arrays.asList(newYork, null, washington));
List<Address> result = template.findAll(Address.class);
assertThat(result, hasSize(2));
assertThat(result, hasItems(newYork, washington));
}
static class DoucmentWithNamedIdField {
@Id String someIdKey;
@@ -2760,6 +3078,7 @@ public class MongoTemplateTests {
@Id public String id;
@Field("db_ref_list")/** @see DATAMONGO-1058 */
@org.springframework.data.mongodb.core.mapping.DBRef//
public List<Sample> dbRefAnnotatedList;
@@ -2783,12 +3102,36 @@ public class MongoTemplateTests {
List<String> values;
}
static class DocumentWithCollectionOfSamples {
@Id String id;
List<Sample> samples;
}
static class DocumentWithMultipleCollections {
@Id String id;
List<String> string1;
List<String> string2;
}
static class DocumentWithNestedCollection {
@Id String id;
List<Map<String, Model>> models = new ArrayList<Map<String, Model>>();
}
static class DocumentWithNestedList {
@Id String id;
List<List<Model>> models = new ArrayList<List<Model>>();
}
static class DocumentWithEmbeddedDocumentWithCollection {
@Id String id;
DocumentWithCollection embeddedDocument;
DocumentWithEmbeddedDocumentWithCollection(DocumentWithCollection embeddedDocument) {
this.embeddedDocument = embeddedDocument;
}
}
static interface Model {
String value();
@@ -2894,6 +3237,41 @@ public class MongoTemplateTests {
String state;
String city;
Address() {}
Address(String state, String city) {
this.state = state;
this.city = city;
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof Address)) {
return false;
}
Address that = (Address) obj;
return ObjectUtils.nullSafeEquals(this.city, that.city) && //
ObjectUtils.nullSafeEquals(this.state, that.state);
}
@Override
public int hashCode() {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(this.city);
result += 31 * ObjectUtils.nullSafeHashCode(this.state);
return result;
}
}
static class VersionedPerson {
@@ -2956,6 +3334,11 @@ public class MongoTemplateTests {
return name;
}
public void setText(String text) {
this.text = text;
}
public String getId() {
return id;
}

View File

@@ -43,7 +43,9 @@ import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@@ -52,18 +54,21 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCre
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
/**
* Unit tests for {@link MongoTemplate}.
@@ -89,6 +94,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Before
public void setUp() {
when(cursor.copy()).thenReturn(cursor);
when(factory.getDb()).thenReturn(db);
when(factory.getExceptionTranslator()).thenReturn(exceptionTranslator);
when(db.getCollection(Mockito.any(String.class))).thenReturn(collection);
@@ -352,6 +358,74 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertThat(captor.getValue(), equalTo(new BasicDBObjectBuilder().add("foo", 1).get()));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void aggregateShouldHonorReadPreferenceWhenSet() {
CommandResult result = mock(CommandResult.class);
when(result.get("result")).thenReturn(Collections.emptySet());
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class))).thenReturn(result);
when(db.command(Mockito.any(DBObject.class))).thenReturn(result);
template.setReadPreference(ReadPreference.secondary());
template.aggregate(Aggregation.newAggregation(Aggregation.unwind("foo")), "collection-1", Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class), eq(ReadPreference.secondary()));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void aggregateShouldIgnoreReadPreferenceWhenNotSet() {
CommandResult result = mock(CommandResult.class);
when(result.get("result")).thenReturn(Collections.emptySet());
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class))).thenReturn(result);
when(db.command(Mockito.any(DBObject.class))).thenReturn(result);
template.aggregate(Aggregation.newAggregation(Aggregation.unwind("foo")), "collection-1", Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void geoNearShouldHonorReadPreferenceWhenSet() {
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class)))
.thenReturn(mock(CommandResult.class));
when(db.command(Mockito.any(DBObject.class))).thenReturn(mock(CommandResult.class));
template.setReadPreference(ReadPreference.secondary());
NearQuery query = NearQuery.near(new Point(1, 1));
template.geoNear(query, Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class), eq(ReadPreference.secondary()));
}
/**
* @see DATAMONGO-1166
*/
@Test
public void geoNearShouldIgnoreReadPreferenceWhenNotSet() {
when(db.command(Mockito.any(DBObject.class), Mockito.any(ReadPreference.class)))
.thenReturn(mock(CommandResult.class));
when(db.command(Mockito.any(DBObject.class))).thenReturn(mock(CommandResult.class));
NearQuery query = NearQuery.near(new Point(1, 1));
template.geoNear(query, Wrapper.class);
verify(this.db, times(1)).command(Mockito.any(DBObject.class));
}
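Both aggregate(…) and geoNear(…) are executed through db.command(…), so the read preference configured on the template decides whether the command carries one. Configuring it is a one-liner; the secondary preference below is just an example value.

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.ReadPreference;

class ReadPreferenceSketch {

	static MongoTemplate secondaryReads(MongoDbFactory factory) {

		MongoTemplate template = new MongoTemplate(factory);

		// With a read preference set, aggregate(…) and geoNear(…) issue
		// db.command(command, ReadPreference.secondary()); without it, the plain overload is used.
		template.setReadPreference(ReadPreference.secondary());

		return template;
	}
}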
class AutogenerateableId {
@Id BigInteger id;

View File

@@ -15,16 +15,21 @@
*/
package org.springframework.data.mongodb.core;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.concurrent.TimeUnit;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate.QueryCursorPreparer;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBCursor;
@@ -41,6 +46,13 @@ public class QueryCursorPreparerUnitTests {
@Mock MongoDbFactory factory;
@Mock DBCursor cursor;
@Mock DBCursor cursorToUse;
@Before
public void setUp() {
when(cursor.copy()).thenReturn(cursorToUse);
}
/**
* @see DATAMONGO-185
*/
@@ -49,9 +61,81 @@ public class QueryCursorPreparerUnitTests {
Query query = query(where("foo").is("bar")).withHint("hint");
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query, null);
preparer.prepare(cursor);
pepare(query);
verify(cursor).hint("hint");
verify(cursorToUse).hint("hint");
}
/**
* @see DATAMONGO-957
*/
@Test
public void doesNotApplyMetaWhenEmpty() {
Query query = query(where("foo").is("bar"));
query.setMeta(new Meta());
pepare(query);
verify(cursor, never()).copy();
verify(cursorToUse, never()).addSpecial(any(String.class), anyObject());
}
/**
* @see DATAMONGO-957
*/
@Test
public void appliesMaxScanCorrectly() {
Query query = query(where("foo").is("bar")).maxScan(100);
pepare(query);
verify(cursorToUse).addSpecial(eq("$maxScan"), eq(100L));
}
/**
* @see DATAMONGO-957
*/
@Test
public void appliesMaxTimeCorrectly() {
Query query = query(where("foo").is("bar")).maxTime(1, TimeUnit.SECONDS);
pepare(query);
verify(cursorToUse).addSpecial(eq("$maxTimeMS"), eq(1000L));
}
/**
* @see DATAMONGO-957
*/
@Test
public void appliesCommentCorrectly() {
Query query = query(where("foo").is("bar")).comment("spring data");
pepare(query);
verify(cursorToUse).addSpecial(eq("$comment"), eq("spring data"));
}
/**
* @see DATAMONGO-957
*/
@Test
public void appliesSnapshotCorrectly() {
Query query = query(where("foo").is("bar")).useSnapshot();
pepare(query);
verify(cursorToUse).addSpecial(eq("$snapshot"), eq(true));
}
private DBCursor pepare(Query query) {
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query, null);
return preparer.prepare(cursor);
}
}
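The tests above verify that the $maxScan, $maxTimeMS, $comment and $snapshot specials are applied to the copied cursor. From application code the same options are set on the Query itself, roughly as sketched below; the Person type and the option values are examples only.

import java.util.concurrent.TimeUnit;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class MetaOptionsSketch {

	static class Person {
		@Id String id;
		String lastname;
	}

	// Each option ends up as a $-special applied to the cursor by QueryCursorPreparer.
	static void findWithMetaOptions(MongoTemplate template) {

		Query query = Query.query(Criteria.where("lastname").is("Matthews")) //
				.maxScan(100) //
				.maxTime(1, TimeUnit.SECONDS) //
				.comment("spring data") //
				.useSnapshot();

		template.find(query, Person.class);
	}
}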

Some files were not shown because too many files have changed in this diff.