Compare commits


211 Commits

Author SHA1 Message Date
Spring Buildmaster
798b56055d DATAMONGO-1173 - Release version 1.7.0.RC1. 2015-03-05 07:47:11 -08:00
Oliver Gierke
ce68e4a070 DATAMONGO-1173 - Prepare 1.7.0.RC1 (Fowler RC1). 2015-03-05 16:31:00 +01:00
Oliver Gierke
5da3130d26 DATAMONGO-1173 - Updated changelog. 2015-03-05 15:54:23 +01:00
Oliver Gierke
6687cdc101 DATAMONGO-1110 - Polishing.
Moved to newly introduced Range type in Spring Data Commons to more safely bind minimum and maximum distances. Changed internal APIs to always use a Range<Distance> which gets populated based on the method signature's characteristics: if only one Distance parameter is found it's interpreted as a range with upper bound only.

Removed invalid testcase for minDistance on 2D index.

Original pull request: #277.
2015-03-05 15:35:42 +01:00
Christoph Strobl
7e74ec6b62 DATAMONGO-1110 - Add support for $minDistance.
We now support $minDistance for NearQuery and Criteria. Please keep in mind that minDistance is only available for MongoDB 2.6 and later and can only be combined with the $near or $nearSphere operator, depending on the defined index type. Usage of $minDistance with NearQuery is only possible when a 2dsphere index is present. We also make sure the $minDistance operator gets correctly nested when using GeoJSON types.

It is now possible to use a Range<Distance> parameter within repository queries. This allows defining near queries like:

findByLocationNear(Point point, Range<Distance> distances);

The lower bound of the range is treated as the minimum distance while the upper one defines the maximum distance from the given point. In case a Distance parameter is provided it will serve as maxDistance.
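For illustration (not part of the original commit message), a minimal sketch of such a repository method, assuming a hypothetical Venue domain type with a location property:

interface VenueRepository extends Repository<Venue, String> {
  GeoResults<Venue> findByLocationNear(Point point, Range<Distance> distances);
}

// lower bound maps to $minDistance, upper bound to $maxDistance
Range<Distance> distances = new Range<Distance>(
    new Distance(0.5, Metrics.KILOMETERS), new Distance(2.0, Metrics.KILOMETERS));
GeoResults<Venue> venues = repository.findByLocationNear(new Point(-73.99, 40.73), distances);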

Original pull request: #277.
2015-03-05 15:34:45 +01:00
Thomas Darimont
b887fa70a5 DATAMONGO-1133 - Fixed broken tests.
AggregationTests.shouldHonorFieldAliasesForFieldReferences() now correctly sets up 3 different instances of MeterData and correctly calculates the aggregated counter values.

Original pull request: #279.
2015-03-05 15:30:35 +01:00
Oliver Gierke
1c6ab25253 DATAMONGO-1135 - Polishing.
A few polishing changes to the GeoConverters.
2015-03-05 14:28:11 +01:00
Christoph Strobl
1c43a3d1ee DATAMONGO-1135 - Add support for GeoJson.
We’ve added special types representing GeoJson structures. This allows using them within both queries and domain types.

GeoJson types should only be used in combination with a 2dsphere index as a 2d index is not able to handle the structure. Though legacy coordinate pairs and GeoJson types can be mixed inside MongoDB, we currently do not support conversion of legacy coordinates to GeoJson types.
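For illustration (not part of the original commit message), a hedged sketch using the new GeoJsonPoint type in a domain type and a query; Store, its location field and the static imports for query(…) and where(…) are assumptions:

@Document
class Store {
  GeoJsonPoint location;
}

Query q = query(where("location").near(new GeoJsonPoint(-73.99, 40.73)).maxDistance(0.01));
List<Store> stores = template.find(q, Store.class);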
2015-03-05 14:28:11 +01:00
Thomas Darimont
60ca1b3509 DATAMONGO-1133 - Assert that field aliasing is honored in aggregation operations.
Added some test to show that field aliases are honored during object rendering in aggregation operations.

Original pull request: #279.
2015-03-05 12:21:12 +01:00
Oliver Gierke
39d9312005 DATAMONGO-479 - Polishing.
Removed the ServersideJavaScript abstraction as we still had to resort to instanceof checks and it created more ambiguities than it helped (e.g. in a script with name and code, which of the two gets executed?). We now have an ExecutableMongoScript, which is code only, and a NamedMongoScript, which basically is the former assigned to a name. Execution can be triggered on the former or via a name.

ScriptOperations.exists(…) now returns a primitive boolean to avoid null checks. JavaDoc.

Original pull request: #254.
2015-03-04 15:18:46 +01:00
Christoph Strobl
a0e42f5dfe DATAMONGO-479 - Add support for calling functions.
We added ScriptOperations to MongoTemplate. Those allow storage and execution of JavaScript functions directly on the MongoDB server instance. Having ScriptOperations in place builds the foundation for annotation-driven support in the repository layer.
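For illustration (not part of the original commit message), a hedged sketch of the described API; the exact method names are assumptions based on the description above:

ScriptOperations scriptOps = template.scriptOps();
ExecutableMongoScript echo = new ExecutableMongoScript("function(x) { return x; }");

Object direct = scriptOps.execute(echo, "directly execute script");   // execute ad hoc
scriptOps.register(new NamedMongoScript("echo", echo));                // store on the server
Object byName = scriptOps.call("echo", "execute script via name");     // call by name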

Original pull request: #254.
2015-03-04 15:18:40 +01:00
Oliver Gierke
7a3aff12a5 DATAMONGO-1165 - Polishing.
Renamed MongoOperations executeAsStream(…) to stream(…). Make use of Spring Data Commons StreamUtils in AbstractMongoQuery's StreamExecution. Moved test case from PersonRepositoryIntegrationTests to AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types.

Original pull request: #274.
2015-03-03 22:33:33 +01:00
Thomas Darimont
d4f1ef8704 DATAMONGO-1165 - Add support for Java 8 Stream as return type for repository methods.
Added support for a MongoDB Cursor backed Iterator that allows the usage of a Java 8 Stream at the repository level.
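For illustration (not part of the original commit message), a hedged example of such a method; Person is an assumed domain type and the returned Stream wraps the underlying cursor, so it should be closed, e.g. via try-with-resources:

interface PersonRepository extends Repository<Person, String> {
  Stream<Person> findAllByLastname(String lastname);
}

try (Stream<Person> result = repository.findAllByLastname("Matthews")) {
  result.map(Person::getFirstname).forEach(System.out::println);
}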

Original pull request: #274.
2015-03-03 20:56:47 +01:00
Oliver Gierke
a86d704bec DATAMONGO-1158 - Polishing.
MongoFactoryBean, MongoOptionsFactoryBean, MongoClientFactoryBean and MongoClientOptionsFactoryBean now extend AbstractFactoryBean to get a lot of the lifecycle callbacks without further code.

Added non-null assertions to newly introduced methods on MongoOperations/MongoTemplate.

Moved MongoClientVersion into util package. Introduced static imports for ReflectionUtils and MongoClientVersion for all references in the newly introduced Invoker types.

Some formatting, JavaDoc polishes, suppress deprecation warnings. Added build profile for MongoDB Java driver 3.0 as well as the following snapshot.

Original pull request: #273.
2015-03-02 21:50:27 +01:00
Christoph Strobl
57ab27aa5b DATAMONGO-1158 - Add Support for MongoDB Java driver 3.0.
We now support mongo-java-driver version 2.x and 3.0 along with MongoDB Server 2.6.7 and 3.0.0.

Please note that some of the configuration options might no longer be valid when used with version 3 of the MongoDB Java driver. Have a look at the table below to see some of the major differences between using version 2.x and 3.0.

                      | 2.x                  | 3.0
----------------------+----------------------+-----------------------------------------------
default WriteConcern  | NONE                 | UNACKNOWLEDGED
----------------------+----------------------+-----------------------------------------------
option for slaveOk    | available            | ignored
----------------------+----------------------+-----------------------------------------------
option for autoConnect| available            | ignored
----------------------+----------------------+-----------------------------------------------
write result checking | available            | ignored (errors are exceptions anyway)
----------------------+----------------------+-----------------------------------------------
reset index cache     | available            | throws UnsupportedOperationException
----------------------+----------------------+-----------------------------------------------
DBRef resolution      | via DBRef.fetch      | via collection.findOne
----------------------+----------------------+-----------------------------------------------
MapReduce Options     | applied              | ignored
----------------------+----------------------+-----------------------------------------------
authentication        | via UserCredentials  | via MongoClient
----------------------+----------------------+-----------------------------------------------
WriteConcernException | not available        | translated to DataIntegrityViolationException
----------------------+----------------------+-----------------------------------------------
executeInSession      | available            | requestStart/requestDone commands ignored.
----------------------+----------------------+-----------------------------------------------
index creation        | via createIndex      | via createIndex
----------------------+----------------------+-----------------------------------------------

We need to soften the exception validation a bit since the message is slightly different when using different storage engines in a MongoDB 3.0 environment.

Added an explicit <mongo-client /> element and <client-options /> to the configuration schema. These elements will replace the existing <mongo /> and <options /> elements in a subsequent release. Added a credentials attribute to <mongo-client /> which allows defining a set of credentials used for setting up the MongoClient correctly using authentication data. We now reject <mongo-options /> configuration when using MongoDB Java driver generation 3.0 and above.

Original pull request: #273.
2015-03-02 20:26:50 +01:00
Thomas Darimont
909cc8b5d3 DATAMONGO-1081 - Improve documentation on field mapping semantics.
Added table with examples to identifier field mapping section.

Original pull request: #276.
2015-03-02 18:15:18 +01:00
Thomas Darimont
b7acbc4347 DATAMONGO-1167 - Added QueryDslPredicateExecutor.findAll(Predicate, Sort).
We now support findAll on QueryDslMongoRepository that accepts a Querydsl Predicate and a Sort and returns a List<T>.
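For illustration (not part of the original commit message), a hedged usage sketch assuming a Querydsl-generated QPerson metamodel class:

List<Person> result = repository.findAll(
    QPerson.person.lastname.eq("Matthews"), new Sort(Sort.Direction.ASC, "firstname"));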

Original pull request: #275.
2015-02-24 09:51:39 +01:00
Oliver Gierke
d276306ddc DATAMONGO-1162 - Adapt to API changes in Spring Data Commons. 2015-02-06 12:48:26 +01:00
Oliver Gierke
25b98b7ad2 DATAMONGO-1154 - Upgraded to MongoDB Java driver 2.13.0. 2015-01-29 20:54:39 +01:00
Oliver Gierke
819b424142 DATAMONGO-1153 - Fix documentation build.
Moved jconsole.png to the images folder. Extracted MongoDB-specific auditing documentation into a separate file for inclusion after the general auditing docs.
2015-01-29 14:04:16 +01:00
Oliver Gierke
5d0328ba4b DATAMONGO-1144 - Updated changelog. 2015-01-28 20:46:19 +01:00
Oliver Gierke
b219cff29c DATAMONGO-1143 - Updated changelog. 2015-01-28 10:00:59 +01:00
Oliver Gierke
409eeaf962 DATAMONGO-1148 - Favor EclipseLink’s JPA over the Hibernate one. 2015-01-27 21:43:46 +01:00
Oliver Gierke
4e5e8bd026 DATAMONGO-1146 - Polishing.
Added missing @Override annotations to QueryDslMongoRepository methods.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:14 +01:00
Thomas Darimont
b91ec53ae0 DATAMONGO-1146 - Added QueryDslMongoRepository.exists(…) which accepts a Querydsl predicate.
Added explicit test case for QueryDslMongoRepository.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:06 +01:00
Oliver Gierke
ce0624b8b0 DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw-write-targets so that they’re cached, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:57:56 +01:00
alex-on-java
b4de2769cf DATAMONGO-1147 - Remove manual array copy.
Remove manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 17:51:44 +01:00
Thomas Darimont
3f7b0f1eb6 DATAMONGO-1082 - Improved documentation of alias usage in aggregation framework.
Added missing JavaDoc and added short note to the reference documentation.

Original pull request: #268.
2015-01-22 08:47:15 +01:00
Thomas Darimont
4055365c57 DATAMONGO-1127 - Add support for geoNear queries with distance information.
Made unit tests more robust to small differences in distance calculations between MongoDB versions.
2015-01-20 19:04:01 +01:00
Thomas Darimont
db7f782ca6 DATAMONGO-1127 - Add support for geoNear queries with distance information.
We now support geoNear queries in Aggregations. Exposed GeoNearOperation factory method in Aggregation. Introduced new distanceField property to NearQuery since it is required for geoNear queries in Aggregations.
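For illustration (not part of the original commit message), a hedged sketch of the new stage; the collection name, the Venue type and the static imports from Aggregation are assumptions, and "distance" is the required distanceField:

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
    .maxDistance(new Distance(5, Metrics.KILOMETERS));

Aggregation agg = newAggregation(geoNear(nearQuery, "distance"), limit(10));
AggregationResults<Venue> results = template.aggregate(agg, "venues", Venue.class);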

Original pull request: #261.
2015-01-20 18:16:12 +01:00
Christoph Strobl
cde9d8d23a DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:17:13 +01:00
Oliver Gierke
3dd9b0a2b6 DATAMONGO-1136 - Polishing.
Polished equals(…) / hashCode() methods in GeoCommand.

Original pull request: #263.
2015-01-12 19:42:50 +01:00
Christoph Strobl
59e54cecd2 DATAMONGO-1136 - Use $geoWithin instead of $within for geo queries.
We now use the $geoWithin operator for geospatial criteria which requires running on at least MongoDB 2.4.

Original pull request: #263.
2015-01-12 19:42:15 +01:00
Oliver Gierke
5ed7e8efc2 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSphere if a non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which will never result in true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:10:10 +01:00
Oliver Gierke
fa85adfe0b DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to override that.
2015-01-07 15:02:32 +01:00
Oliver Gierke
a3e4f44a64 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Thomas Darimont
4a7a485e62 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Christoph Strobl
c353e02b3e DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:45:26 +01:00
Christophe Fargette
1c2964cab4 DATACMNS-1132 - Fixed keyword translation table in the reference documentation.
Original pull request: #262.
2015-01-06 13:29:37 +01:00
Oliver Gierke
47e083280a DATACMNS-1131 - We now register the ThreeTen back port converters by default.
Related ticket: DATACMNS-628.
2015-01-05 19:11:54 +01:00
Oliver Gierke
7db003100b DATAMONGO-1129 - Upgraded to MongoDB Java driver 2.12.4.
Added Travis build configuration, too.
2014-12-31 14:20:52 +01:00
Oliver Gierke
f814b1ef47 DATAMONGO-1128 - Added test cases to validate Optional mapping.
Added test cases to make sure Optional instances are handled correctly and the converters are actually applied to the nested value.
2014-12-31 13:59:43 +01:00
Oliver Gierke
f3d2ae366e DATAMONGO-1120 - Fix execution of query methods using pagination and field mapping customizations.
Repository queries that used pagination and referred to a field that was customized were failing as the count query executed was not mapped correctly in MongoOperations.

This resulted from the fix for DATAMONGO-1080 which removed the premature field name translation from AbstractMongoQuery and thus led to unmapped field names being used for the count query.

We now expose the previously existing, but not public count(…) method on MongoOperations that takes both an entity type as well as an explicit collection name to be able to count-query a dedicated collection but still get the query mapping applied for a certain type.

Related ticket: DATAMONGO-1080.
2014-12-18 15:47:54 +01:00
Oliver Gierke
b6ecce3aa2 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:37:33 +01:00
Oliver Gierke
c5235be9a7 DATAMONGO-1106 - After release cleanups. 2014-12-01 13:44:58 +01:00
Spring Buildmaster
23300de9d4 DATAMONGO-1106 - Prepare next development iteration. 2014-12-01 13:36:36 +01:00
Spring Buildmaster
41dc57c84f DATAMONGO-1106 - Release version 1.7.0.M1 (Fowler M1). 2014-12-01 13:36:32 +01:00
Oliver Gierke
85d1fe1ce6 DATAMONGO-1106 - Prepare 1.7.0.M1 (Fowler M1). 2014-12-01 12:26:52 +01:00
Oliver Gierke
ac6067ad53 DATAMONGO-1106 - Updated changelog. 2014-12-01 12:25:53 +01:00
Thomas Darimont
173a62b5ce DATAMONGO-1085 - Fixed sorting with Querydsl in QueryDslMongoRepository.
We now translate QSort's OrderSpecifiers into appropriate sort criteria.
Previously the OrderSpecifiers were not correctly translated to appropriate property path expressions.

We're now overriding findAll(Pageable) and findAll(Sort) in QueryDslMongoRepository to apply special QSort handling.

Original pull request: #236.
2014-12-01 12:09:14 +01:00
Oliver Gierke
cbbafce73d DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:23 +01:00
Oliver Gierke
2e74c19995 DATAMONGO-1054 - Polishing.
Tweaked JavaDoc of the APIs to be less specific about implementation internals and rather point to the save(…) methods. Changed SimpleMongoRepository.save(…) methods to inspect the given entity/entities and use the optimized insert(All)-calls if all entities are considered new.

Original pull request: #253.
2014-11-28 18:33:19 +01:00
Thomas Darimont
a212b7566c DATAMONGO-1054 - Add support for fast insertion via MongoRepository.insert(..).
Introduced new insert(..) method variants on MongoRepository that delegate to MongoTemplate.insert(..). This bypasses id population, save event generation and version checking and allows for fast insertion of bulk data.
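For illustration (not part of the original commit message), a hedged example of the new variants; Person and its constructor are assumptions:

Person dave = new Person("Dave", "Matthews");
repository.insert(dave);                                                    // single document
repository.insert(Arrays.asList(dave, new Person("Carter", "Beauford")));   // bulk insert, bypassing id population, events and version checks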

Original pull request: #253.
2014-11-28 18:33:18 +01:00
Oliver Gierke
08faa52ef4 DATAMONGO-1108 - Performance improvements in BasicMongoPersistentEntity.
BasicMongoPersistentEntity.getCollection() now avoids repeated SpEL-parsing and evaluating in case no SpEL expression is used. Parsing is happening at most once now. Evaluation is skipped entirely if the configured collection String is not or does not contain an expression.
2014-11-28 16:24:25 +01:00
Oliver Gierke
33bc4fffd9 DATAMONGO-1079 - Updated changelog. 2014-11-28 12:06:05 +01:00
Christoph Strobl
eca2108e15 DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:38 +01:00
Christoph Strobl
dab6034eb9 DATAMONGO-943 - Add support for $position to Update $push $each.
We now support $position on update.push.
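For illustration (not part of the original commit message), a hedged sketch; the atPosition(…) step on the push builder is assumed from the description above and may differ in detail:

// prepend the given values to the "scores" array ($push with $each and a $position of 0)
new Update().push("scores").atPosition(Position.FIRST).each(42, 23);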

Original pull request: #248.
2014-11-28 11:41:51 +01:00
Christoph Strobl
461e7d05d7 DATAMONGO-1092 - Ensure compatibility with MongoDB 2.8.0.rc0 and java driver 2.13.0-rc0.
We updated GroupByResults to allow working with the changed data types returned for count and keys and fixed the assertion on the error message for duplicate keys.
Using java-driver 2.12.x when connecting to a 2.8.0-rc0 instance is likely to cause trouble with authentication. This is the intended behavior.

2.8.0-rc0 throws an error when removing elements from a collection that does not yet exist, which is different from what 2.6.x does.

The java-driver 2.13.0-rc0 works perfectly fine with a 2.6.x server instance.
We deprecated Index.Duplicates#DROP since it has been removed in MongoDB 2.8.

Original pull request: #246.
2014-11-28 11:33:12 +01:00
Oliver Gierke
10c37b101d DATAMONGO-1105 - Added implementation of QueryDslPredicateExecutor.findAll(OrderSpecifier<?>... orders).
Renamed QuerydslRepositorySupportUnitTests to QuerydslRepositorySupportTests as it's an integration test.
2014-11-28 10:37:21 +01:00
Christoph Strobl
81f2c910f7 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword so that, if the target property is of type String, we use an expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Added support for NotContaining along the way.
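For illustration (not part of the original commit message), hedged examples of derived queries affected by this change; Person with a String firstname and a collection-like nicknames property is an assumption:

List<Person> findByFirstnameContaining(String part);        // String property: expression-based match
List<Person> findByNicknamesContaining(String nickname);    // collection property: $in match
List<Person> findByNicknamesNotContaining(String nickname);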

Original pull request: #241.
2014-11-27 17:06:39 +01:00
Thomas Darimont
1fd97713c1 DATAMONGO-1093 - Added hashCode() and equals(…) in BasicQuery.
We now have equals(…) and hashCode() methods on BasicQuery. Previously we solely relied on Query.hashCode()/equals(…) which didn't consider the fields of BasicQuery.

Introduced the EqualsVerifier library to automatically test equals contracts.
Added some additional test cases to BasicQueryUnitTests.

Original pull request: #252.
2014-11-27 16:45:35 +01:00
Oliver Gierke
2d3eeed9ec DATAMONGO-1102 - Added support for Java 8 date/time types.
We're now able to persist and read non-time-zoned JDK 8 date/time types (LocalDate, LocalTime, LocalDateTime) to and from Date instances.
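For illustration (not part of the original commit message), a hedged example of a domain type using the newly supported types; Meeting is an assumed example and the values are stored as java.util.Date:

@Document
class Meeting {
  LocalDate day;
  LocalTime startTime;
  LocalDateTime createdAt;
}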
2014-11-27 16:28:36 +01:00
Christoph Strobl
b22eb6f12f DATAMONGO-1101 - Add support for $bit to Update.
We now support bitwise and/or/xor operations for Update.
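For illustration (not part of the original commit message), a hedged sketch; the bitwise(…) builder is assumed from the description above and may differ in detail:

new Update().bitwise("flags").and(4);   // { $bit : { flags : { and : 4 } } }
new Update().bitwise("flags").or(1);
new Update().bitwise("flags").xor(8);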
2014-11-26 11:26:56 +01:00
Mikhail Mikhaylenko
dfb0a2a368 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 09:40:30 +01:00
Thomas Darimont
03bcc56429 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 17:32:16 +01:00
Christoph Strobl
457fda3fc3 DATAMONGO-1097 - Add support for $mul to Update.
We now support multiply on Update, allowing the value of the given key to be multiplied by a multiplier.
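For illustration (not part of the original commit message), a hedged example; the field name is illustrative:

// { $mul : { price : 1.19 } } – multiplies the current value of "price" by 1.19
new Update().multiply("price", 1.19);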
2014-11-24 20:38:44 +01:00
Oliver Gierke
54cee64610 DATAMONGO-1100 - Upgrade to new PersistentPropertyAccessor API. 2014-11-20 15:12:25 +01:00
Christoph Strobl
477499248a DATAMONGO-1086 - Mapping fails for collection with two embedded types that extend a generic abstract.
We now use the type information of the raw property type to check if we need to include _class.
2014-11-20 15:12:25 +01:00
Oliver Gierke
3b70b6aeee DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:03 +01:00
Christoph Strobl
163762e99e DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:37:56 +01:00
Oliver Gierke
b99833df75 DATAMONGO-1080 - AbstractMongoQuery now refrains from eagerly post-processing the query execution results.
To properly support general post processing of query execution results (in QueryExecutorMethodInterceptor) we need to remove the eager post-processing of query execution results in AbstractMongoQuery.

Removed the usage of the local ConversionService all together.
2014-10-30 11:35:51 +01:00
Thomas Darimont
4be6231426 DATAMONGO-1076 - Avoid resolving lazy-loading proxy for DBRefs during finalize.
We now handle intercepted finalize method invocations by not resolving the proxy. Previously the LazyLoadingProxy tried to resolve the proxy during finalization which could lead to unnecessary database accesses.

Original pull request: #234.
2014-10-29 10:16:12 +01:00
Christoph Strobl
4673e3d511 DATAMONGO-1077 - Fix Update removing $ operator for DBRef.
We now retain the positional parameter "$" when mapping field names for associations.

Original pull request: #235.
2014-10-28 14:28:22 +01:00
Christoph Strobl
00e48cc424 DATAMONGO-1050 - Explicitly annotated Field should not be considered Id.
We changed the id resolution to skip properties having an explicit name set via @Field unless they are marked with @Id. This means that

@Field(“id”) String id;

will be stored as “id” within MongoDB. Prior to this change the field name would have been changed to “_id”.
Added tests to ensure proper field mapping for various "id" field variants.

Original pull request: #225.
2014-10-23 11:39:17 +02:00
Christoph Strobl
f8453825fb DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 13:55:50 +02:00
Christoph Strobl
6cda9ab939 DATAMONGO-1068 - Fix getCriteriaObject returns empty DBO when no key defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 11:36:15 +02:00
Oliver Gierke
831d667896 DATAMONGO-1070 - Fixed a few glitches in DBRef binding for repository query methods.
Fixed the query mapping for derived repository queries pointing to the identifier of the referenced document. We now reduce the query field's key from reference.id to reference so that the generated DBRef is applied correctly and also take care that the ids are potentially converted to ObjectIds. This is mainly achieved by using the AssociationConverter pulled up from UpdateMapper in ObjectMapper.getMappedKey().

MongoQueryCreator now refrains from translating the field keys as that will fail the QueryMapper to correctly detect id properties.

Fixed DBRef handling for StringBasedMongoQuery which previously didn't parse the DBRef instance created after JSON parsing for placeholders.
2014-10-15 10:13:53 +02:00
Christoph Strobl
17c342895a DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing Exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:45 +02:00
Christoph Strobl
6ef518e6a0 DATAMONGO-1053 - Type check is now only performed on explicit language properties.
We now only perform a type check on language properties explicitly defined via @Language. Prior to this change non-String properties named language caused errors on entity validation.

Original pull request: #228.
2014-10-10 11:31:30 +02:00
Oliver Gierke
ddee2fbb12 DATAMONGO-1057 - Polishing.
Slightly tweaked the changes in SlicedExecution to simplify the implementation. We now apply the given pageable but tweak the limit the query uses to peek into the next page.

Original pull request: #226.
2014-10-08 07:06:18 +02:00
Christoph Strobl
6512c2cdfb DATAMONGO-1057 - Fix SliceExecution skipping elements.
We now directly set the offset to use instead of reading it from the used pageable. This asserts that every single element is read from the store.
Prior to this change the altered pageSize led to an unintended increase of the number of elements to skip.

Original pull request: #226.
2014-10-08 07:06:14 +02:00
Oliver Gierke
0eee05adaa DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 15:32:18 +02:00
Christoph Strobl
17e0154ff3 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DbRefs. This assures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:06:22 +02:00
Thomas Darimont
2780f60c65 DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress failed because the supposedly non-existing hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable.

Original pull request: #229.
2014-09-30 12:55:24 +02:00
Christoph Strobl
7dd3450362 DATAMONGO-1049 - Check for explicitly declared language field.
We now check for an explicitly declared language field for setting language_override within a text index. Therefore the attribute (even if named with the reserved keyword language) has to be explicitly marked with @Language. Prior to this change having:

@Language String lang;
String language;

would have caused trouble when trying to resolve index structures as one cannot set language override to more than one property.

Original pull request: #224.
2014-09-25 12:40:26 +02:00
Oliver Gierke
ca4b2a61b8 DATAMONGO-1046 - After release cleanups. 2014-09-15 14:30:23 +02:00
Oliver Gierke
d2ecd65ca5 DATAMONGO-1046 - After release cleanups. 2014-09-05 14:27:21 +02:00
Spring Buildmaster
03bd49f6c8 DATAMONGO-1046 - Prepare next development iteration. 2014-09-05 03:12:04 -07:00
Spring Buildmaster
51607c5ed8 DATAMONGO-1046 - Release version 1.6.0.RELEASE (Evans GA). 2014-09-05 03:12:02 -07:00
Oliver Gierke
e2cbd3ee28 DATAMONGO-1046 - Prepare 1.6.0.RELEASE (Evans GA). 2014-09-05 11:43:58 +02:00
Oliver Gierke
5944e6b57e DATAMONGO-1046 - Updated changelog. 2014-09-05 09:23:31 +02:00
Oliver Gierke
efd46498ef DATAMONGO-1033 - Updated changelog. 2014-09-05 07:31:54 +02:00
Christoph Strobl
3d705a737f DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows fine-grained removal of entities from specific collections using derived delete queries.

Original pull request: #223.
2014-09-04 15:47:13 +02:00
Christoph Strobl
996c57bccf DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests

Original pull request: #222.
2014-09-04 11:21:55 +02:00
Oliver Gierke
a31e72ff06 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:51:31 +02:00
Mark Paluch
f07d8fca8c DATAMONGO-1036 - Improved detection of custom implementations for CDI repositories.
Adapted to API changes in CDI extension.

Related ticket: DATACMNS-565.
2014-09-01 13:51:20 +02:00
Christoph Strobl
69dbdee01f DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Add JUnit rule and RunListener taking care of clean up task.

Original pull request: #221.
2014-08-27 11:12:39 +02:00
Oliver Gierke
dedb9f3dc0 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at using a custom Converter to handle the conversion manually.

Improved toString() method on ObjectPath to create more helpful output.
2014-08-26 20:07:46 +02:00
Oliver Gierke
7d69b840fe DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the domain type collection to the query executing a query for a projection type.
2014-08-26 15:16:18 +02:00
Christoph Strobl
4eaef300cb DATAMONGO-1025 - Fix creation of nested named index.
Deprecated collection attribute for @Indexed, @CompoundIndex, @GeoSpatialIndexed. Removed deprecated attribute `expireAfterSeconds` from @CompoundIndex.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Christoph Strobl
ec1a6b5edd DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors caused by equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions where only the property's field name was taken into account although it should have been the full property path.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Oliver Gierke
adc5485c09 DATAMONGO-1032 - Polished Asciidoctor documentation. 2014-08-26 14:24:51 +02:00
Oliver Gierke
f622b2916d DATAMONGO-1021 - After release cleanups. 2014-08-13 16:32:42 +02:00
Spring Buildmaster
26be0cf948 DATAMONGO-1021 - Prepare next development iteration. 2014-08-13 07:02:43 -07:00
Spring Buildmaster
e27c01fe5b DATAMONGO-1021 - Release version 1.6.0.RC1 (Evans RC1). 2014-08-13 07:02:41 -07:00
Oliver Gierke
d639e58fb9 DATAMONGO-1021 - Prepare 1.6.0.RC1 (Evans RC1). 2014-08-13 15:37:48 +02:00
Oliver Gierke
0195c2cb48 DATAMONGO-1021 - Updated changelog. 2014-08-13 15:37:44 +02:00
Oliver Gierke
068e2ec49b DATAMONGO-1024 - Upgraded to MongoDB Java driver 2.12.3. 2014-08-13 15:36:08 +02:00
Christoph Strobl
a9306b99ec DATAMONGO-957 - Add support for query modifiers.
Using Meta allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a query. When executed we add the meta information to the cursor in use.

We’ve introduced the @Meta annotation that allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a repository finder method.
Added tests to verify proper invocation of template methods.
Used DBCursor.copy() for CursorPreparer.
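For illustration (not part of the original commit message), a hedged sketch of the @Meta annotation on a finder method; the attribute names are assumptions derived from the modifiers listed above:

@Meta(maxExecutionTimeMs = 500, comment = "index usage check")
List<Person> findByLastname(String lastname);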

Original pull request: #216.
2014-08-13 14:56:10 +02:00
Thomas Darimont
3597194742 DATAMONGO-1012 - Improved identifier initialization on DBRef proxies.
Identifier initialization is now only triggered if field access is used. Before that, the id initialization would've resolved the proxy eagerly as the getter access performed by the BeanWrapper would've been intercepted by the proxy and is indistinguishable from a normal method call. This would've rendered the entire use case of creating proxies ad absurdum.

Added test case to check for non-initialization in the property access scenario.
2014-08-13 14:34:38 +02:00
Oliver Gierke
6f06ccec8e DATAMONGO-1012 - Identifier initialization for lazy DBRef proxies with field access.
We now initialize the ID property for proxies created for lazily initialized DBRefs. This will allow the lookup of ID properties for types that use field access without initializing the entire proxy.
2014-08-13 14:34:15 +02:00
Oliver Gierke
6fe7f220f9 DATAMONGO-1007 - Updated changelog. 2014-08-13 10:56:02 +02:00
Christoph Strobl
45e70d493d DATAMONGO-1016 - Remove deprecations in geospatial area.
Removed:
 - Box
 - Circle
 - CustomMetric
 - Distance
 - GeoPage
 - GeoResult
 - GeoResults
 - Metric
 - Metrics
 - Point
 - Polygon
 - Shape

Updated api doc.
Removed deprecation warnings.
2014-08-13 09:52:02 +02:00
Thomas Darimont
ce71ab83f2 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:30:41 +02:00
Oliver Gierke
bf85d8facd DATAMONGO-1005 - Polishing introduction of ObjectPath.
Simplified implementation of ObjectPath to use a static root instance and hand the path further down until final resolution in MappingMongoConverter.readValue(…). This removes a bit of boxing and unboxing code both in ObjectPath and the converter.

Introduced ObjectPath.getPathItem(…) to internalize the iteration to find a potentially already resolved object.

Renamed parameters and fields of type ObjectPath to path consistently. Removed obsolete method in MappingMongoConverter.

Original pull request: #209.
2014-08-12 08:12:31 +02:00
Thomas Darimont
c5ff7cdb2b DATAMONGO-1005 - Improve cycle-detection for DbRef's.
Introduced ObjectPath that collects the target objects while converting a DBObject to a domain object. If we detect that a potentially nested DBRef points to an object that is already under construction we simply return a reference to that object in order to avoid StackOverflowErrors.

Original pull request: #209.
2014-08-12 08:10:47 +02:00
Mark Paluch
f9ccf4f532 DATAMONGO-1017 - Add support for custom implementations in CDI repositories.
Original pull request: #215.
2014-08-11 07:47:57 +02:00
Greg Turnquist
ab731f40a7 DATAMONGO-1019 - Corrected examples in reference documentation.
Examples were not properly converted. One table got dropped, so I added it back. Fix IMPORTANT notes.

Original pull requests: #214.
2014-08-10 16:04:45 +02:00
Oliver Gierke
d8434fffa8 DATAMONGO-1015 - Fixed link to Spring Data Commons reference docs. 2014-08-10 15:55:13 +02:00
Christoph Strobl
151b1d4510 DATAMONGO-973 - Add support for deriving full-text queries.
Added support for executing full-text queries on repositories. Query methods can now have a parameter of type TextCriteria which triggers a text search clause for the property annotated with @TextScore.

Retrieving the document score and sorting by score is only possible if the entity holds a property annotated with @TextScore. If present, any find execution will be enriched so that it asserts loading of the according { $meta : textScore } field. The sort object will only be mapped in case the sort property already exists - in that case we replace the existing expression for the property with its $meta representation.

This allows for example the following:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("term");

repository.findAllBy(criteria, new Sort("score"));
repository.findAllBy(criteria, new PageRequest(0, 10, Direction.DESC, "score"));
repository.findByFooOrderByScoreDesc("foo", criteria);

For more details and examples see the "Full text search queries" section in the reference manual.
2014-08-06 22:25:38 +02:00
Greg Turnquist
168cf3e1f6 DATAMONGO-1015 - Migrate reference documentation from Docbook to Asciidoctor. 2014-08-06 21:38:46 +02:00
Oliver Gierke
52dab0fa20 DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check. Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Christoph Strobl
9257bab06e DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Oliver Gierke
27f0a6f27a DATAMONGO-1008 - Added repository type based checks to strict matching algorithm.
Repositories extending MongoRepository are now considered strict matches as well.

Related ticket: DATACMNS-526.
2014-07-31 16:20:26 +02:00
Oliver Gierke
5bedbef2f2 DATAMONGO-1009 - Adapt to new multi-store configuration detection.
We now consider repositories managing domain types annotated with @Document to be MongoDB-specific ones.

Related ticket: DATACMNS-526.
2014-07-28 20:15:40 +02:00
Christoph Strobl
51e7be8aa0 DATAMONGO-1001 - Renamed LazyLoadingProxy.initialize() to getTarget().
Original pull request: #208.
2014-07-24 13:29:27 +02:00
Christoph Strobl
6c85bb39a3 DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazy loaded ones before saving. As it turns out CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:28:36 +02:00
Oliver Gierke
07f7247707 DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:36:15 +02:00
Thomas Darimont
f669711670 DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:15:18 +02:00
Oliver Gierke
5f3671f349 DATAMONGO-996 - Fixed boundary detection in pagination.
The fix for DATAMONGO-950 introduced a tiny glitch so that retrieving pages after the first one was broken in the repository query execution. We now correctly use the previously detected number of elements to detect whether the Pageable given is out of scope.

Related ticket: DATAMONGO-950.
2014-07-18 19:01:44 +02:00
Thomas Darimont
1335cb699b DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.
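For illustration (not part of the original commit message), hedged examples of the distinction the parser now makes; the queries and method names are illustrative:

@Query("{ 'lastname' : ?0 }")                     // unquoted placeholder – the bound value gets quoted
List<Person> findByLastname(String lastname);

@Query("{ 'lastname' : { '$regex' : '^?0' } }")   // already quoted – no additional quoting is applied
List<Person> findByLastnameStartingWith(String prefix);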

Original pull request: #185.
2014-07-17 15:27:46 +02:00
Oliver Gierke
84414b87c0 DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:21 +02:00
Thomas Darimont
a1ecd4a501 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:21 +02:00
Thomas Darimont
d7e6f2ee41 DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
We replaced the constructor that accepted a Criteria with one that accepts a CriteriaDefinition to not force clients to extend Criteria.
Original pull request: #206.
2014-07-17 09:21:29 +02:00
Oliver Gierke
04870fb8b3 DATAMONGO-991 - Adapted to deprecation removals in Spring Data Commons.
Related ticket: DATACMNS-469.
2014-07-16 12:04:10 +02:00
Oliver Gierke
9d196b78f7 DATAMONGO-981 - After release cleanups. 2014-07-10 20:44:20 +02:00
Spring Buildmaster
4229525928 DATAMONGO-981 - Prepare next development iteration. 2014-07-10 10:38:58 -07:00
Spring Buildmaster
d861fecdb8 DATAMONGO-981 - Release version 1.6.0.M1. 2014-07-10 10:38:55 -07:00
Oliver Gierke
f280e23095 DATAMONGO-981 - Prepare 1.6.0.M1 (Evans M1). 2014-07-10 19:28:34 +02:00
Oliver Gierke
ed0e1d92c0 DATAMONGO-981 - Updated changelog. 2014-07-10 17:14:24 +02:00
Christoph Strobl
d82fc22659 DATAMONGO-944 - Add support for $currentDate to Update.
Added currentDate and currentTimestamp to Update.
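For illustration (not part of the original commit message), a hedged example of the new methods; the field names are illustrative:

// { $currentDate : { lastModified : true, lastModifiedTs : { $type : "timestamp" } } }
new Update().currentDate("lastModified").currentTimestamp("lastModifiedTs");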

Original pull request: #200.
2014-07-10 15:13:59 +02:00
Thomas Darimont
6616d6788c DATAMONGO-975 - Add support for extracting date/time components from a field projection.
We added some extract-methods to ProjectionOperationBuilder to be able to extract date / time components from projected fields.
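For illustration (not part of the original commit message), a hedged sketch; the extract method name follows the description above but is an assumption, as are the field names and the static import of project(…):

Aggregation agg = newAggregation(
    project("lastname").and("birthDate").extractYear().as("yearOfBirth"));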

Original pull request: #204.
2014-07-10 12:45:17 +02:00
Christoph Strobl
322a7cf033 DATAMONGO-969 - Fixed nested id handling in SpringDataMongodbSerializer.
SpringDataMongodbSerializer now defensively triggers mapping of the DBObject created by the default serializer. This asserts that ids buried in nested structures like { "_id" : { "$in" : ["x", "y"] } } are converted correctly.

Original pull request: #202.
2014-07-09 21:48:02 +02:00
Christoph Strobl
0f487c10ba DATAMONGO-983 - Remove links to forum.spring.io.
Replace forum links with those to stackoverflow.

Original Pull Request: #205
2014-07-09 21:21:04 +02:00
Christoph Strobl
11417144bd DATAMONGO-980 - Use meta annotations from commons for @Score.
We now use Spring Data Commons' @ReadOnlyProperty to meta-annotate @Score to mark it as read-only property.

Original pull request: #201.
Related tickets: DATACMNS-534.
2014-07-09 14:51:21 +02:00
Christoph Strobl
dafc59b163 DATAMONGO-972 - Querydsl integration now handles references correctly.
SpringDataMongodbSerializer now overrides the necessary methods to create the appropriate DBRef objects when serializing data via Querydsl.

We currently disable the test case as the fix taking effect requires Querydsl 3.4.1 which unfortunately breaks Java 6 compatibility. We include the fix nonetheless to allow users on Java 7 to potentially use the latest Querydsl.

Original pull request: #203.
Related tickets: querydsl/querydsl#803.
2014-07-09 13:47:04 +02:00
Oliver Gierke
566f9a80c4 DATAMONGO-982 - Added build profiles to build against next MongoDB driver versions.
Added build profile for MongoDB Java driver versions 2.12.3-SNAPSHOT and 3.0.0-SNAPSHOT. Added another property to be able to build manifests correctly as the snapshot versions aren't valid OSGi versions.

Adapted MongoExceptionTranslator to convert the new Exceptions being thrown for server timeouts and the deprecated values we currently handle.
2014-07-08 17:36:01 +02:00
Christoph Strobl
89a42c5648 DATAMONGO-976 - Add support for reading $meta projection on textScore into document.
We introduced @TextScore that can be used to mark a property to take { $meta : “textScore” }. In contrast to @Transient the value will be considered when reading documents.
The value can and will only get picked up if the score field is retrieved from the store.

Original pull request: #198.
2014-07-07 11:43:43 +02:00
Christoph Strobl
83ffbb00e8 DATAMONGO-850 - Add support for full text search via $text
Using TextQuery and TextCriteria allows creation of queries using $text $search.

{ $meta : “textScore” } can be included using TextQuery.includeScore. As the field name used for textScore need not be fixed to “score” one can use the overload taking the field name to override the default.
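For illustration (not part of the original commit message), a hedged usage sketch; CookingRecipe and the static factory/builder method names are assumptions based on the description above:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("coffee cake");
Query query = TextQuery.queryText(criteria).sortByScore().includeScore("searchScore");
List<CookingRecipe> recipes = template.find(query, CookingRecipe.class);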

Original pull request: #198.
2014-07-07 11:39:47 +02:00
Christoph Strobl
84913cecab DATAMONGO-937 - Add support for creating text index.
We now support creating text indexes based on information gathered from domain types.

Using @TextIndexed marks properties to be considered for the full text index. Use the weight attribute to influence document scoring during search operations.

Please note that using @TextIndexed on entity properties forces all properties of any sub-document to be considered as part of the text index. Any set weight will in that case be propagated to the siblings, taking the most recent weight information into account, which means that the weight attribute can be overridden for properties in sub-documents.

Setting the index default language can be done via @Document(language) while @Language can be used to define the language_override field.

As text search is disabled by default for MongoDB 2.4 we use a JUnit ClassRule to restrict integration tests potentially creating a text index (as the entities for testing are found in the classpath) to only be executed when a 2.6+ MongoDB server is present.

For usage hints please see section 6.3.4 (Text Indexes) of the reference manual.
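For illustration (not part of the original commit message), a hedged sketch of a text-indexed domain type; BlogPost and its properties are assumptions:

@Document(language = "english")
class BlogPost {

  @TextIndexed(weight = 3) String title;
  @TextIndexed String content;

  @Language String lang;   // maps to the language_override field
}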

Original pull request: #198.
2014-07-07 11:30:08 +02:00
Christoph Strobl
998bb09a92 DATAMONGO-978 - Derived delete query should pass on type information.
We now pass on type information for derived delete queries to the according delete operation. This propagates the information correctly to the according Before and After events.

Before this change the type would have been set to null in case of a non-collection-like method return type.

Original pull request: #199.
2014-07-03 14:16:57 +02:00
Oliver Gierke
cd68a8db54 DATAMONGO-977 - Removed reflective detection of Spring 4 in DBRef proxy creation.
After the Spring 4 upgrade we can now directly use the Objenesis infrastructure of it.
2014-07-02 09:23:19 +02:00
Oliver Gierke
df8477d180 DATAMONGO-955 - Updated changelog. 2014-06-30 10:57:12 +02:00
Christoph Strobl
244fbae0ce DATAMONGO-962 - Cycle guard should respect full path.
We now check for intersections of the given path with existing ones, comparing not only types and contained property names but also the property's full path, which must not be present in already traversed paths.

Additionally we’ll now catch any CyclicPropertyReferenceExceptions on the root level to prevent cycle detection interfering with application startup.

Original pull request: #197.
2014-06-27 19:25:26 +02:00
Oliver Gierke
19e08a52c0 DATAMONGO-970 - MongoTemplate.remove(…) now correctly builds query for DBObjects.
If a DBObject was handed into MongoTemplate.remove(…) we previously failed to look up the id value to create a by-id-query. This commit adds explicit handling of DBObjects by looking up their _id field to obtain the id value.
2014-06-27 16:12:50 +02:00
Thomas Darimont
6389b1bb73 DATAMONGO-954 - Add support for system variables in aggregation operations.
Add assumes for appropriate MongoDB version.
2014-06-26 14:19:24 +02:00
Thomas Darimont
cadcbf6106 DATAMONGO-954 - Add support for system variables in aggregation operations.
We now support referring to system variables like for instance $$ROOT or $$CURRENT from within aggregation framework pipeline projection and group expressions.

Original pull request: #190.
2014-06-25 15:53:44 +02:00
Christoph Strobl
118f007ca6 DATAMONGO-963 - @CompoundIndex should treat expireAfterSeconds correctly.
We added an additional check on the fields used as key, so that TTL is ignored for a CompoundIndex with more than one field (which effectively renders it useless on @CompoundIndex altogether).

Prior to this change potentially invalid index structures would have been created for e.g. @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds=10) leading to MongoDB not being able to clean up the indexes (logs: "ERROR: key for ttl index can only have 1 field")

This fix is related to https://jira.mongodb.org/browse/SERVER-10075.

Original pull request: #196.
2014-06-25 15:12:32 +02:00
Christoph Strobl
cbb32bd29d DATAMONGO-950 - Add support for limiting the query result in the query derivation mechanism.
When deriving the query from its method name we check for the limit set on the PartTree to pass this on to the created query. PagedExecution now takes the overall limit into account, skips a query execution entirely (if the Pageable is completely out of scope) or alters the query limits accordingly.

Note, that there has been significant rework of this compared to the pull request to avoid new API in Query and extensive changes in MongoTemplate's QueryCursorPreparer.

Original pull request: #191.
2014-06-25 14:58:59 +02:00
Thomas Darimont
9858dcd740 DATAMONGO-960 - Allow to pass options to the Aggregation Pipeline.
We introduced AggregationOptions abstraction to conveniently construct option objects that can be handed to an Aggregation via the new Aggregation.withOptions(...) factory method. For more details, see the Builder class' JavaDoc.

Note that we exposed the "rawResults" in AggregationResults and put a null guard in MongoTemplate aggregate in order to support the "explain" option.
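For illustration (not part of the original commit message), a hedged sketch; the pipeline content and the static imports from Aggregation and Criteria are assumptions:

AggregationOptions options = newAggregationOptions().allowDiskUse(true).build();

Aggregation agg = newAggregation(
    match(where("status").is("ACTIVE")),
    group("lastname").count().as("count")).withOptions(options);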

Original pull request: #195.
2014-06-25 13:25:57 +02:00
Thomas Darimont
1fb76d135b DATAMONGO-953 - Add equals(…)/hashCode()/toString() to Update.
We now use the underlying updateObject to implement appropriate equals(…)/hashCode() and toString() methods.

Original pull request: #192.
2014-06-25 12:40:30 +02:00
Oliver Gierke
bb62c8b2f1 DATAMONGO-958 - Switch to FieldNamingStrategy SPI in Spring Data Commons. 2014-06-20 21:27:24 +02:00
Christoph Strobl
2cbe7bf885 DATAMONGO-952 - Derived queries should consider field specification in @Query.
PartTreeMongoQuery now explicitly checks for the presence of a manually defined field spec on the query method and creates a new Query if so.

Original pull request: #188.
2014-06-18 12:53:14 +02:00
Christoph Strobl
6043f6b74d DATAMONGO-949 - CycleGuard should only match properties in word boundaries.
We modified the regular expression used for cycle detection to match on the exact property name within the inspected path using word boundaries. This fix prevents sub sequences of an existing property (like ‘sub’ would have matched ‘substr’) from being matched.

Along the way we fixed the (false) assertion in one of the tests, as we create the +1 cycle reference index before actually breaking the operation.
2014-06-18 08:32:30 +02:00
Christoph Strobl
ef1366592a DATAMONGO-948 - Sort should be taken as is when no type information available.
Object type mapping for sort is skipped in case no type information is present when executing a query using MongoTemplate.
2014-06-18 08:27:57 +02:00
Thomas Darimont
01cf9fb8f3 DATAMONGO-938 - Apply QueryMapper in MongoTemplate.mapReduce(…).
Previously MongoTemplate.mapReduce(...) didn't translate nested objects, e.g. GeoCommand, within the given query. That could lead to exceptions during query serialization. We now pass the query and sort object of the given Query through the QueryMapper to avoid such problems.

Original pull request: #184.
2014-06-18 08:27:54 +02:00
Thomas Darimont
285c406d5d DATAMONGO-745 - Added test cases for custom query with $in and pageable parameter.
Added test cases to verify that this works.

Original pull request: #186.
2014-06-18 07:49:06 +02:00
Oliver Gierke
ad29e52a57 DATAMONGO-936 - After release cleanups. 2014-05-20 19:54:03 +02:00
Spring Buildmaster
3cfe207c83 DATAMONGO-936 - Prepare next development iteration. 2014-05-20 09:35:38 -07:00
Spring Buildmaster
c7e65cbc40 DATAMONGO-936 - Release version 1.5.0.RELEASE. 2014-05-20 09:35:35 -07:00
Oliver Gierke
b8e02efb04 DATAMONGO-936 - Prepare 1.5 GA.
Removed obsolete readme.txt.
2014-05-20 17:14:04 +02:00
Oliver Gierke
c7f20fb836 DATAMONGO-936 - Prepare 1.5 GA. 2014-05-20 16:57:34 +02:00
Oliver Gierke
28a0202ef4 DATAMONGO-936 - Updated changelog. 2014-05-20 16:26:20 +02:00
Christoph Strobl
164e947045 DATAMONGO-926 - Avoid StackOverflowError while resolving index structures.
We now guard cyclic non-transient, non-DBRef property references while inspecting domain types for potential index structures. To do so we check the property's path and owning type to determine potential cycles. If found we log a warning message pointing to the path, entity and property potentially causing problems and skip processing for this path.

Original pull request: #180.
2014-05-19 19:09:40 +02:00
Christoph Strobl
7e65c0c87d DATAMONGO-367 - Nested @Indexed should not trigger creation of separate collection.
The issue has been solved along with DATAMONGO-888 (Pull Request: #162). We have created additional tests to explicitly check it has truly been fixed.
2014-05-19 17:19:28 +02:00
Christoph Strobl
9c1f753f17 DATAMONGO-929 - Use property path for keys of Indexed and CompoundIndex.
Index creation failed for @Indexed and @CompoundIndex as the resolved dotPath was not used for creation. We now not only resolve the dotPath but also use it within the key for index definition. In case of a nested compound index the key definition is enhanced by the provided path.

When leaving the key definition empty for a nested compound index we'll create an index for the whole nested document. Trying to create a compound index on root level without providing key information leads to an InvalidDataAccessApiUsageException.

Original pull request: #179.
2014-05-19 17:14:36 +02:00
Oliver Gierke
0f821eb52d DATAMONGO-925, DATAMONGO-928 - Polishing.
Only reject the attribute setup if abbreviation is activated and a custom strategy is configured. Added test cases for the rejection case and for an over-configuration (explicitly setting abbreviation to false, which is the default anyway).

Related pull request: #177.
2014-05-19 17:07:50 +02:00
Ryan Tenney
e3aadd63ab DATAMONGO-928 - Removed explicit default value for abbreviate-field-names from namespace XSD.
The default for boolean attributes leaks into the evaluation of XML namespace attributes which caused us to be unable to detect whether two attributes have been set in a conflicting way.

Fix the documentation on the field-naming-strategy-ref attribute.

Original pull request: #183.
Related pull request: #177.
Related ticket: DATAMONGO-925.
2014-05-19 16:44:03 +02:00
Christoph Strobl
aa06d520df DATAMONGO-647 - Added test case to show that field names are mapped correctly.
Additional test added to check if the issue has truly been resolved by DATAMONGO-888.

Original pull request: #181.
Related pull Request: #162.
Related ticket: DATAMONGO-888.
2014-05-19 14:38:06 +02:00
Oliver Gierke
4fa1d4ba97 DATAMONGO-919 - After release cleanups. 2014-05-02 15:45:06 +02:00
Spring Buildmaster
ba9f11b345 DATAMONGO-919 - Prepare next development iteration. 2014-05-02 05:58:16 -07:00
Spring Buildmaster
4777dd2e5e DATAMONGO-919 - Release version 1.5.0.RC1. 2014-05-02 05:58:13 -07:00
Oliver Gierke
64b4591b72 DATAMONGO-919 - Prepare 1.5.0.RC1.
Upgraded to Spring Data Parent 1.4.0.RC1 and Spring Data Commons 1.8.0.RC1. Switched to milestone repository.
2014-05-02 14:50:48 +02:00
Oliver Gierke
bac5961fd8 DATAMONGO-919 - Updated changelog. 2014-05-02 14:50:48 +02:00
Thomas Darimont
b3cac862d8 DATAMONGO-924 - Improve aggregation field reference resolving.
Previously we didn't support referring to aliased fields defined in former stages of an aggregation pipeline. We now also consider field aliases during field reference lookup.
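For illustration (not part of the original commit message), a hedged sketch of referring to an aliased field from a later pipeline stage; the field names and the static imports from Aggregation are assumptions:

Aggregation agg = newAggregation(
    project("lastname").and("age").as("years"),    // "years" is an alias introduced here
    group("lastname").avg("years").as("avgYears")  // the alias can now be referenced later
);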

Original pull request: #176.
2014-05-02 14:46:13 +02:00
Oliver Gierke
b8ab2ad539 DATAMONGO-919 - Forward port changelog entries from bugfix branch. 2014-05-02 12:48:18 +01:00
Christoph Strobl
618d5bd5e9 DATAMONGO-919 - Prepare Release 1.5.0.RC1
Upgrade the AspectJ Maven plugin to a recent version, which enables building the module on Java 8 using AspectJ 1.8.
2014-05-01 21:02:01 +02:00
Oliver Gierke
096c3278b3 DATAMONGO-92 - Upgraded to MongoDB Java driver 2.12.1. 2014-04-30 08:51:18 +02:00
Kim Toms
c9d9976c22 DATAMONGO-920 - Improve debug message for delete events in AbstractMongoEventListener.
Adjusted debug message to reflect the actual operation.

Original pull request: #95.
2014-04-29 15:52:12 +02:00
Thomas Darimont
65497f93d4 DATAMONGO-827 - Allow index creation to use MongoDB auto-generated names.
We now support letting MongoDB generate index names by introducing the attribute "useGeneratedName" on the @Indexed, @GeoSpatialIndexed and @CompoundIndex annotations.
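A minimal sketch of using the new attribute on a hypothetical domain type:

```java
import org.springframework.data.mongodb.core.index.CompoundIndex;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
@CompoundIndex(def = "{ 'firstname': 1, 'lastname': 1 }", useGeneratedName = true)
class Person {

	@Indexed(useGeneratedName = true)  // index name is generated by MongoDB instead of being derived from the field
	String lastname;

	String firstname;
}
```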
2014-04-28 19:58:56 +02:00
Christoph Strobl
af8a53bef6 DATAMONGO-909 - Compound index on inherited class should be created correctly.
With the overhaul of the index creation done in DATAMONGO-899 the CompoundIndex annotation is no longer just looked up on the concrete type but also on all its interfaces and super classes. We added an additional test to verify this behaviour.
2014-04-28 19:58:56 +02:00
Oliver Gierke
916b856e97 DATAMONGO-899 - Polished API of new index creation abstractions.
Removed the introduction of IndexDefinition being collection-aware again. The collection an index is created in is now held in the IndexDefinitionHolder. This is mostly due to the fact that the IndexDefinition implementations can be used with MongoTemplate and the index operations take a collection alongside the index definition.

Made the IndexResolver API package protected so that we can further change it going forward. We should think about deprecating the collectionName attributes on index annotations, as it doesn't make much sense to manually configure the collection name for the indexes when the collection is predefined through the domain type. This would allow us to remove the entire collection handling code inside the IndexResolver implementation.

Turned IndexDefinitionHolder into a value object.

Original pull request: #168.
2014-04-28 19:58:50 +02:00
Christoph Strobl
7848da63f2 DATAMONGO-899 - Ensure proper creation of index structures for nested entities.
Index creation did not consider the property's path when creating the index. This led to broken index creation when nesting entities that might require index structures.

From now on, index creation traverses the entity's property path for all top-level entities (namely those carrying the @Document annotation) and creates the index using the full property path.

This is a breaking change, as carrying the @Document annotation has not been required of entities until now.

Original Pull Request: #168
2014-04-28 16:50:26 +02:00
Christoph Strobl
f359a1d31a DATAMONGO-847 - Allow usage of Query within an Update clause.
In case we detect a Query within a value used for an Update, we map the query itself to build the expression to use. This allows forming query statements, e.g. for $pull, using the same API as for the query itself.

Update update = new Update().pull("list", query(where("value").in("foo", "bar")));
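With the static helpers from Query and Criteria imported, the full picture looks roughly like this (the rendered $pull document is an assumption based on the description above):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Update;

// presumably maps to: { "$pull" : { "list" : { "value" : { "$in" : [ "foo", "bar" ] } } } }
Update update = new Update().pull("list", query(where("value").in("foo", "bar")));
```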

Original Pull Request: #172.
2014-04-28 13:30:13 +02:00
Thomas Darimont
72d645feae DATAMONGO-917 - Improve Spring 4.0 framework version detection to avoid NullPointerExceptions.
We now check for the presence of DefaultParameterNameDiscoverer in order to determine whether we are running with Spring 4.0 or later. This avoids potential NullPointerExceptions in cases where the package version information is not available, e.g. when the application was bundled into an uber-jar via the maven-shade-plugin.
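A sketch of this kind of presence-based detection; the holder class and constant are illustrative, not the actual implementation:

```java
import org.springframework.util.ClassUtils;

class SpringVersionDetection {

	// DefaultParameterNameDiscoverer only exists on Spring 4.0+, so its presence on the
	// classpath is a reliable indicator even when package version information is missing.
	static final boolean IS_SPRING_4_OR_LATER = ClassUtils.isPresent(
			"org.springframework.core.DefaultParameterNameDiscoverer",
			SpringVersionDetection.class.getClassLoader());
}
```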

Original pull request: #173.
2014-04-28 13:21:09 +02:00
Thomas Darimont
72fe382bba DATAMONGO-913 - Improve DBRef handling for LazyLoadingProxies.
We now use the captured DBRef of a given LazyLoadingProxy in MappingMongoConverter.toDBRef(..) in order to avoid a new DBRef creation that would fail for the proxy.

Original pull request: #174.
2014-04-28 13:07:57 +02:00
Thomas Darimont
d25e840cf5 DATAMONGO-914 - Improve resolving of lazy-loading proxies for classes that override equals(…)/hashCode().
We now properly resolve lazy-loading proxies for @DBRefs when an overridden equals(…) or hashCode() method is called with Spring 4. We fall back to our old Objenesis proxy generation in order to circumvent the default handling for overridden hashCode() and equals(…) methods in CglibAopProxies generated by Spring 4.

If we detect that we run with Spring 4, we use the repackaged Objenesis that is included in Spring 4. Previously the generated proxy used some generic hashCode() or equals(…) logic that did not trigger proper lazy loading in such cases.

Original pull request: #171.
2014-04-23 09:29:33 +02:00
Thomas Darimont
df1775572a DATAMONGO-912 - Consider custom conversions in all stages of an aggregation pipeline.
We now consider custom Mongo conversions in all stages of an aggregation pipeline. Previously we did this only for the first stage and returned objects basically unmapped in later stages. We now pass the root AggregationOperationContext on to nested ExposedFieldsAggregationOperationContexts so that those can delegate any Mongo mapping to the root context.

Original pull request: #170.
2014-04-23 09:00:56 +02:00
Christoph Strobl
86670cd49f DATAMONGO-893 - Converter must not write "_class" information for known types.
We now actively pass on property type information to MetadataBackedField to ensure type hints get picked up correctly when converting a value to the according DBObject.

This has to be done as the fix for DATAMONGO-812 enforced proper writing of _class information for Updates, which caused trouble when querying documents by nested (complex) properties using an 'in' clause.

Original pull request: #169.
2014-04-15 17:36:11 +02:00
Christoph Strobl
31a4bf906e DATAMONGO-892 - Reject nested MappingMongoConverter declarations in XML.
Mapping information is potentially required by multiple instances and thus must not be registered as a nested bean. We now actively check for such an invalid scenario and explicitly reject it.

Original pull request: #165.
2014-04-15 09:11:12 +02:00
Christoph Strobl
599291e8b7 DATAMONGO-897 - Fixed potential NullPointerException in QueryMapper.
If an association property points to an interface not containing the id property, QueryMapper threw a NullPointerException in isAssociationConversionNecessary(…) as the lookup of the id property failed.

We now check for the presence of an id property on the target type and check for assignability to indicate the need for conversion (usually the case when developers use raw ids in their update clauses instead of the actual target instance).

Original pull request: #164.
2014-04-15 09:11:00 +02:00
Thomas Darimont
f5a04fb9fb DATAMONGO-908 - Support for nested field references in group operations.
We now allow referring to nested field expressions if the root segment of the nested field expression was exposed in earlier stages of the aggregation pipeline.
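A sketch, assuming a hypothetical document with an embedded customer sub-document:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

// the projection exposes the root segment "customer", so the later group stage
// may refer to the nested path "customer.name"
Aggregation agg = newAggregation(
		project("amount", "customer"),
		group("customer.name").sum("amount").as("total"));
```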

Original pull request: #167.
2014-04-15 07:55:57 +02:00
Oliver Gierke
88558b67c3 DATAMONGO-866 - Polishing for new field naming strategy configuration support.
Use the camel case split logic from Spring Data Commons (introduced for DATACMNS-486) in a common CamelCaseSplittingFieldNamingStrategy super class.

MappingMongoConverterParser now also rejects the configuration if both abbreviate-field-names and field-naming-strategy-ref are configured.
2014-04-10 18:36:37 +02:00
Ryan Tenney
d52cb255e0 DATAMONGO-866 - Add means to configure field naming strategy. 2014-04-10 18:03:55 +02:00
Oliver Gierke
6c214cbc37 DATAMONGO-910 - Upgraded to latest MongoDB Java driver 2.12.0.
Removed special mongo-osgi property as the driver version is now a valid OSGi version number.
2014-04-10 16:55:22 +02:00
Jeff Yemin
01012c1448 DATAMONGO-895, DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
Upgrade the next MongoDB driver version to 2.12.0. The actual upgrade is coming in a subsequent commit, to make sure we can backport the compatibility checks to the bugfix branch without forcing users into a driver upgrade.

Relaxed the error message comparison in the assertion so that it still matches the message returned by MongoDB 2.6. When comparing the value of the version field, compare against a Long rather than an Integer, since the generated version field is a Long. This allows the test to pass against the upcoming 2.12.0 release of the Java driver, which has a stricter implementation of BasicDBObject.equals(…).

Original pull requests: #159, #160.
2014-04-10 15:57:45 +02:00
Christoph Strobl
4c7befb910 DATAMONGO-888 - Sorting now considers mapping information.
We now pipe the DBObject containing sorting information for queries through the QueryMapper to make sure potential field mappings are applied.
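A sketch of the effect, assuming a hypothetical mapped field alias:

```java
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Query;

@Document
class Person {

	@Field("last_name")  // stored under a different field name in MongoDB
	String lastname;
}

// sorting by the property name now has the sort key mapped to { "last_name" : 1 }
Query query = new Query().with(new Sort(Sort.Direction.ASC, "lastname"));
```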

Original Pull Request: #162.
2014-04-10 15:43:51 +02:00
Christoph Strobl
b62669ec8f DATAMONGO-907 - Assert compatibility with MongoDB 2.6.
Fix test to only check on the parts of the expected error message common to both 2.4 and 2.6.

Original Pull Request: #166.
2014-04-10 13:33:34 +02:00
Oliver Gierke
0fb74caf9b DATAMONGO-905 - Removed obsolete dependency to CGLib from cross-store support.
We now also optionally depend on the Hibernate JPA API JAR so that using other persistence providers doesn't cause API JAR duplication.
2014-04-09 12:12:29 +02:00
Oliver Gierke
9623dac01f DATAMONGO-901 - Fixed regression of not registering type predicting post processor.
The changes for DATAMONGO-843 introduced a regression by skipping the registration of the RepositoryInterfaceAwareBeanPostProcessor. This can cause the wiring of repository bean definitions to fail, depending on the order in which the bean definitions get instantiated.

This change reintroduces the registration and adds an explicit test case for it.
2014-04-02 17:58:36 +02:00
Spring Buildmaster
695b27968c DATAMONGO-859 - Prepare next development iteration. 2014-03-31 17:10:40 +02:00
299 changed files with 27180 additions and 8868 deletions

22
.travis.yml Normal file
View File

@@ -0,0 +1,22 @@
language: java
jdk:
- oraclejdk8
services:
- mongodb
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
sudo: false
cache:
directories:
- $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

View File

@@ -11,7 +11,7 @@ For a comprehensive treatment of all the Spring Data MongoDB features, please re
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.spring.io/forum/spring-projects/data/nosql).
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
@@ -26,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.1.RELEASE</version>
<version>1.5.0.RELEASE</version>
</dependency>
```
@@ -36,7 +36,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.BUILD-SNAPSHOT</version>
<version>1.6.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
@@ -139,7 +139,7 @@ public class MyService {
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.spring.io/forum/spring-projects/data/nosql) by responding to questions and joining the debate.
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.

57
pom.xml
View File

@@ -2,10 +2,10 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.7.0.RC1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.4.0.M1</version>
<version>1.6.0.RC1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -29,11 +29,11 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.8.0.M1</springdata.commons>
<mongo>2.11.4</mongo>
<mongo-osgi>${mongo}</mongo-osgi>
<springdata.commons>1.10.0.RC1</springdata.commons>
<mongo>2.13.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>
<developers>
<developer>
<id>ogierke</id>
@@ -105,11 +105,44 @@
<profiles>
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.12.0-rc0</mongo>
<mongo-osgi>2.12.0</mongo-osgi>
<mongo>2.13.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.0-beta3</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
</profiles>
@@ -121,14 +154,14 @@
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone/</url>
<url>http://repo.spring.io/libs-milestone</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>

View File

@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<aspects>
<aspect name="org.springframework.beans.factory.aspectj.AnnotationBeanConfigurerAspect" />
<aspect name="org.springframework.data.mongodb.crossstore.MongoDocumentBacking" />
</aspects>
</aspectj>

View File

@@ -2,22 +2,22 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.7.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>1.0.0.Final</jpa>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
<dependencies>
<!-- Spring -->
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.M1</version>
<version>1.7.0.RC1</version>
</dependency>
<dependency>
@@ -56,17 +56,13 @@
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->
@@ -102,7 +98,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.4</version>
<version>1.6</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
@@ -131,8 +127,10 @@
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
<xmlConfigured>aop.xml</xmlConfigured>
</configuration>
</plugin>
</plugins>

View File

@@ -2,12 +2,12 @@ Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",

View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.7.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
@@ -32,6 +32,10 @@
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.7.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,5 +5,5 @@ Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.9.205">
<context version="7.1.10.209">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,6 +32,7 @@
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>

View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
@@ -11,18 +11,19 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.7.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -50,7 +51,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -77,7 +78,7 @@
<version>1.0</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<dependency>
<groupId>javax.enterprise</groupId>
@@ -86,21 +87,21 @@
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
@@ -115,7 +116,7 @@
<version>${validation}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.objenesis</groupId>
<artifactId>objenesis</artifactId>
@@ -129,23 +130,36 @@
<version>4.2.0.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
@@ -189,9 +203,14 @@
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
<properties>
<property>
<name>listener</name>
<value>org.springframework.data.mongodb.test.util.CleanMongoDBJunitRunListener</value>
</property>
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,9 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
@@ -35,7 +38,6 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
@@ -44,6 +46,7 @@ import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
@@ -51,6 +54,8 @@ import com.mongodb.Mongo;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -67,7 +72,10 @@ public abstract class AbstractMongoConfiguration {
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
@@ -126,7 +134,10 @@ public abstract class AbstractMongoConfiguration {
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
@@ -144,10 +155,7 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
if (abbreviateFieldNames()) {
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
}
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
@@ -232,4 +240,15 @@ public abstract class AbstractMongoConfiguration {
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}

View File

@@ -30,6 +30,7 @@ import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.parsing.ReaderContext;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -51,10 +52,10 @@ import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
@@ -71,6 +72,7 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -83,8 +85,11 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (parserContext.isNested()) {
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
}
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
@@ -209,11 +214,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
if ("true".equals(abbreviateFieldNames)) {
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
@@ -222,6 +223,34 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return ctxRef;
}
private static void parseFieldNamingStrategy(Element element, ReaderContext context, BeanDefinitionBuilder builder) {
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
String fieldNamingStrategy = element.getAttribute("field-naming-strategy-ref");
boolean fieldNamingStrategyReferenced = StringUtils.hasText(fieldNamingStrategy);
boolean abbreviationActivated = StringUtils.hasText(abbreviateFieldNames)
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
element);
return;
}
Object value = null;
if ("true".equals(abbreviateFieldNames)) {
value = new RootBeanDefinition(CamelCaseAbbreviatingFieldNamingStrategy.class);
} else if (fieldNamingStrategyReferenced) {
value = new RuntimeBeanReference(fieldNamingStrategy);
}
if (value != null) {
builder.addPropertyValue("fieldNamingStrategy", value);
}
}
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");

View File

@@ -0,0 +1,85 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}

View File

@@ -0,0 +1,132 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
private static final String OPTIONS_DELIMINATOR = "?";
private static final String OPTION_VALUE_DELIMINATOR = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String text) throws IllegalArgumentException {
if (!StringUtils.hasText(text)) {
return;
}
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : text.split(",")) {
if (!text.contains(USERNAME_PASSWORD_DELIMINATOR) || !text.contains(DATABASE_DELIMINATOR)) {
throw new IllegalArgumentException("Credentials need to be in format 'username:password@database'!");
}
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
Properties options = extractOptions(credentialString);
if (!options.isEmpty()) {
if (options.containsKey(AUTH_MECHANISM_KEY)) {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(String.format(
"Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
credentials.add(MongoCredential.createCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private static String[] extractUserNameAndPassword(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
String userNameAndPassword = text.substring(0, dbSeperationIndex);
return userNameAndPassword.split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
return optionsSeperationIndex > -1 ? tmp.substring(0, optionsSeperationIndex) : tmp;
}
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
}
Properties properties = new Properties();
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
properties.put(optionArgs[0], optionArgs[1]);
}
return properties;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
@@ -33,6 +34,7 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,10 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
@@ -36,6 +32,7 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
@@ -64,7 +61,8 @@ public class MongoParser implements BeanDefinitionParser {
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
@@ -75,19 +73,4 @@ public class MongoParser implements BeanDefinitionParser {
return mongoComponent.getBeanDefinition();
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,7 @@ import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -33,13 +34,13 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {
}
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
@@ -54,12 +55,14 @@ abstract class MongoParsingUtils {
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occured, false otherwise
* @return true if parsing actually occured, {@literal false} otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
@@ -80,13 +83,58 @@ abstract class MongoParsingUtils {
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
@@ -103,4 +151,56 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -0,0 +1,66 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String readPreferenceString) throws IllegalArgumentException {
if (readPreferenceString == null) {
return;
}
ReadPreference preference = null;
try {
preference = ReadPreference.valueOf(readPreferenceString);
} catch (IllegalArgumentException ex) {
// ignore this one and try to map it differently
}
if (preference != null) {
setValue(preference);
} else if ("PRIMARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primary());
} else if ("PRIMARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primaryPreferred());
} else if ("SECONDARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondary());
} else if ("SECONDARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondaryPreferred());
} else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.nearest());
} else {
throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
readPreferenceString));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
@@ -36,11 +38,13 @@ import com.mongodb.MongoException;
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
*/
public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -69,9 +73,9 @@ public class DefaultIndexOperations implements IndexOperations {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
collection.createIndex(indexDefinition.getIndexKeys());
}
return null;
}
@@ -104,10 +108,12 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.resetIndexCache();
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
return null;
}
});
@@ -140,8 +146,15 @@ public class DefaultIndexOperations implements IndexOperations {
Object value = keyDbObject.get(key);
if ("2d".equals(value)) {
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {
DBObject weights = (DBObject) ix.get("weights");
for (String fieldName : weights.keySet()) {
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
}
} else {
Double keyValue = new Double(value.toString());
@@ -159,8 +172,8 @@ public class DefaultIndexOperations implements IndexOperations {
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
}
return indexInfoList;

View File

@@ -0,0 +1,188 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static java.util.UUID.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.MongoException;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";
private static final String SCRIPT_NAME_PREFIX = "func_";
private final MongoOperations mongoOperations;
/**
* Creates new {@link DefaultScriptOperations} using given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}.
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(args));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
List<NamedMongoScript> scripts = mongoOperations.findAll(NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
if (CollectionUtils.isEmpty(scripts)) {
return Collections.emptySet();
}
Set<String> scriptNames = new HashSet<String>();
for (NamedMongoScript script : scripts) {
scriptNames.add(script.getName());
}
return scriptNames;
}
private Object[] convertScriptArgs(Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String ? String.format("'%s'", arg) : this.mongoOperations.getConverter()
.convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(args));
}
/**
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
* @return
*/
private static String generateScriptName() {
return SCRIPT_NAME_PREFIX + randomUUID().toString().replaceAll("-", "");
}
}
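A minimal sketch of how the new script support is intended to be consumed through MongoOperations#scriptOps(); the JavaScript body and argument values are illustrative assumptions.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

void runScripts(MongoOperations operations) {
    ScriptOperations scriptOps = operations.scriptOps();

    // execute an ad-hoc script directly on the server
    ExecutableMongoScript echo = new ExecutableMongoScript("function(x) { return x; }");
    Object direct = scriptOps.execute(echo, "directly execute script");

    // or register it under a generated name and call it later by that name
    NamedMongoScript named = scriptOps.register(echo);
    Object called = scriptOps.call(named.getName(), "execute script via name");
}

Note that, as shown above, execution goes through DB#eval(…), so the connected user needs the corresponding server-side privileges.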

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,12 +20,12 @@ import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface IndexOperations {
@@ -51,7 +51,11 @@ public interface IndexOperations {
/**
* Clears all indices that have not yet been applied to this collection.
*
* @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/**

View File

@@ -0,0 +1,189 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Convenient factory for configuring MongoDB.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<MongoCredential> credentials;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
*
* @param mongoClientOptions
*/
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions;
}
/**
* Set the list of credentials to be used when creating {@link MongoClient}.
*
* @param credentials can be {@literal null}.
*/
public void setCredentials(MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials);
}
/**
* Set the list of {@link ServerAddress} to use as seeds for a replica set.
*
* @param replicaSetSeeds can be {@literal null}.
*/
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
}
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
instance.close();
}
private MongoClient createMongoClient() throws UnknownHostException {
if (!CollectionUtils.isEmpty(replicaSetSeeds)) {
return new MongoClient(replicaSetSeeds, credentials, mongoClientOptions);
}
return new MongoClient(createConfiguredOrDefaultServerAddress(), credentials, mongoClientOptions);
}
private ServerAddress createConfiguredOrDefaultServerAddress() throws UnknownHostException {
ServerAddress defaultAddress = new ServerAddress();
return new ServerAddress(StringUtils.hasText(host) ? host : defaultAddress.getHost(),
port != null ? port.intValue() : defaultAddress.getPort());
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter, can be {@literal null}.
* @return a new unmodifiable {@link List} from the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
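A configuration sketch for the new factory bean; host, port, database and credential values are illustrative assumptions, and MongoCredential.createCredential requires a driver version that provides it.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.mongodb.MongoCredential;

@Configuration
class MongoClientConfig {

    @Bean
    public MongoClientFactoryBean mongoClient() {
        MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
        factoryBean.setHost("localhost");
        factoryBean.setPort(27017);
        factoryBean.setCredentials(new MongoCredential[] {
                MongoCredential.createCredential("user", "admin", "secret".toCharArray()) });
        return factoryBean;
    }
}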

View File

@@ -0,0 +1,295 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int maxConnectionIdleTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionIdleTime();
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
* @param description
*/
public void setDescription(String description) {
this.description = description;
}
/**
* Set the minimum number of connections per host.
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
this.minConnectionsPerHost = minConnectionsPerHost;
}
/**
* Set the number of connections allowed per host. Will block if run out of connections. Default is 10. The system
* property {@code MONGO.POOLSIZE} can override this value.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* Set the multiplier for connectionsPerHost for the number of threads that can block waiting for a connection.
* Default is 5. If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50
* threads can block; more than that and an exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* Set the max wait time of a blocking thread for a connection. Default is 120000 ms (2 minutes).
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* The maximum idle time for a pooled connection.
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
this.maxConnectionIdleTime = maxConnectionIdleTime;
}
/**
* Set the maximum life time for a pooled connection.
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
this.maxConnectionLifeTime = maxConnectionLifeTime;
}
/**
* Set the connect timeout in milliseconds. The default is 0, which means infinite.
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Set the socket timeout. The default is 0, which means infinite.
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Set the keep alive flag that controls whether socket keep-alive is enabled. Defaults to false.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* @param socketFactory
*/
public void setSocketFactory(SocketFactory socketFactory) {
this.socketFactory = socketFactory;
}
/**
* Set the frequency that the driver will attempt to determine the current state of each server in the cluster.
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
this.heartbeatFrequency = heartbeatFrequency;
}
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
this.minHeartbeatFrequency = minHeartbeatFrequency;
}
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
this.heartbeatConnectTimeout = heartbeatConnectTimeout;
}
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
this.heartbeatSocketTimeout = heartbeatSocketTimeout;
}
/**
* Configures the name of the replica set.
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
this.requiredReplicaSetName = requiredReplicaSetName;
}
/**
* Controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory
.getDefault()) : this.socketFactory;
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
.maxConnectionIdleTime(maxConnectionIdleTime) //
.maxConnectionLifeTime(maxConnectionLifeTime) //
.maxWaitTime(maxWaitTime) //
.minConnectionsPerHost(minConnectionsPerHost) //
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoClientOptions.class;
}
}
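The options counterpart can be wired the same way and handed to MongoClientFactoryBean#setMongoClientOptions(…); the values below are arbitrary assumptions.

@Bean
public MongoClientOptionsFactoryBean mongoClientOptions() {
    MongoClientOptionsFactoryBean factoryBean = new MongoClientOptionsFactoryBean();
    factoryBean.setConnectionsPerHost(50);
    factoryBean.setConnectTimeout(5000);
    factoryBean.setSocketTimeout(10000);
    factoryBean.setSsl(true); // falls back to SSLSocketFactory.getDefault() unless a factory is set
    return factoryBean;
}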

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,12 +18,13 @@ package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
@@ -34,6 +35,7 @@ import com.mongodb.Mongo;
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -43,9 +45,7 @@ public abstract class MongoDbUtils {
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {
}
private MongoDbUtils() {}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
@@ -65,11 +65,24 @@ public abstract class MongoDbUtils {
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
@@ -109,22 +122,9 @@ public abstract class MongoDbUtils {
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
if (credentialsGiven && !authDb.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
if (requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
}
// TX sync active, bind new database to thread
@@ -181,16 +181,36 @@ public abstract class MongoDbUtils {
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
db.requestDone();
ReflectiveDbInvoker.requestDone(db);
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check if credentials are present. When using mongo-java-driver version 3 or above there is no need for explicit
* authentication here as the auth data has to be provided within the MongoClient itself.
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}
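Since the reflective authentication path is only taken on the 2.x driver, the forward-looking replacement for the deprecated getDB(…, UserCredentials, …) variants is to hand the credentials to the MongoClient itself. A sketch, assuming a driver version that offers MongoCredential.createCredential and the MongoClient(ServerAddress, List&lt;MongoCredential&gt;) constructor; server address, database and credential values are illustrative.

import java.util.Collections;
import com.mongodb.DB;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;

MongoClient authenticatedClient() throws Exception {
    return new MongoClient(new ServerAddress("localhost", 27017),
            Collections.singletonList(MongoCredential.createCredential("user", "admin", "secret".toCharArray())));
}

void useDatabase() throws Exception {
    DB db = authenticatedClient().getDB("mydb"); // no per-DB authenticate(...) call required
}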

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,21 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -36,9 +38,23 @@ import com.mongodb.MongoInternalException;
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("WriteConcernException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -47,28 +63,25 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// Check for well-known MongoException subclasses.
// All other MongoExceptions
if (ex instanceof DuplicateKey) {
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
if (RESOURCE_FAILURE_EXCEPTIONS.contains(exception)) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
// Driver 2.12 throws this to indicate connection problems. String comparison to avoid hard dependency
if (ex.getClass().getName().equals("com.mongodb.MongoServerSelectionException")) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
if (RESOURCE_USAGE_EXCEPTIONS.contains(exception)) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
return new DataIntegrityViolationException(ex.getMessage(), ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,9 +20,7 @@ import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
@@ -40,12 +38,14 @@ import com.mongodb.WriteConcern;
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
* @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
PersistenceExceptionTranslator {
@Deprecated
public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private Mongo mongo;
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoOptions mongoOptions;
private String host;
@@ -53,9 +53,11 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
private WriteConcern writeConcern;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
/**
* @param mongoOptions
*/
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
@@ -66,7 +68,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
@@ -75,30 +76,19 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
}
/**
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
* Configures the host to connect to.
*
* @param host
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
@@ -112,12 +102,13 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
this.writeConcern = writeConcern;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator;
}
public Mongo getObject() throws Exception {
return mongo;
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
@@ -128,14 +119,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -146,10 +129,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@SuppressWarnings("deprecation")
public void afterPropertiesSet() throws Exception {
@Override
protected Mongo createInstance() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
@@ -175,18 +158,42 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongo.setWriteConcern(writeConcern);
}
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
return mongo;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
public void destroy() throws Exception {
this.mongo.close();
@Override
protected void destroyInstance(Mongo mongo) throws Exception {
mongo.close();
}
private static boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter
* @return a new unmodifiable {@link List} from the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,11 +19,11 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -33,10 +33,14 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.ReadPreference;
import com.mongodb.WriteResult;
/**
@@ -52,7 +56,6 @@ import com.mongodb.WriteResult;
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**
@@ -86,9 +89,23 @@ public interface MongoOperations {
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
* version 3 no longer supports this operation.
*/
@Deprecated
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return
* @since 1.7
*/
CommandResult executeCommand(DBObject command, ReadPreference readPreference);
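/*
 * Illustrative usage (not part of the interface; the command document is an assumption):
 *
 *   CommandResult info = operations.executeCommand(new BasicDBObject("buildInfo", 1),
 *       ReadPreference.secondaryPreferred());
 */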
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
@@ -144,9 +161,26 @@ public interface MongoOperations {
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
* @deprecated since 1.7 as the MongoDB Java driver version 3 no longer supports request boundaries via
* {@link DB#requestStart()} and {@link DB#requestDone()}.
*/
@Deprecated
<T> T executeInSession(DbCallback<T> action);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} and needs to be closed.
*
* @param <T> element return type
* @param query
* @param entityType
* @return
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
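/*
 * Illustrative usage (Person and the query are assumptions): the returned iterator wraps a live
 * cursor and should be closed once consumed, e.g.
 *
 *   CloseableIterator<Person> it = operations.stream(Query.query(Criteria.where("active").is(true)), Person.class);
 *   try {
 *     while (it.hasNext()) {
 *       process(it.next()); // process(...) is a hypothetical callback
 *     }
 *   } finally {
 *     it.close();
 *   }
 */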
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -250,6 +284,14 @@ public interface MongoOperations {
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
*
* @return
* @since 1.7
*/
ScriptOperations scriptOps();
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -416,7 +458,9 @@ public interface MongoOperations {
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is ran against.
* information to determine the collection the query is run against. Note that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -425,7 +469,9 @@ public interface MongoOperations {
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -653,14 +699,28 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection, using the given
* entity class to map the query onto document fields.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
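/*
 * Illustrative usage (Person, the property and the collection name are assumptions): "active" is
 * mapped onto the underlying document field via the Person metadata before counting, e.g.
 *
 *   long activeUsers = operations.count(Query.query(Criteria.where("active").is(true)), Person.class, "people");
 */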
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
@@ -942,4 +1002,5 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,41 +17,48 @@ package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.MongoOptions;
/**
* A factory bean for construction of a {@link MongoOptions} instance.
*
* A factory bean for construction of a {@link MongoOptions} instance. When used with MongoDB Java driver version 3,
* properties not supported by the driver will be ignored.
*
* @author Graeme Rocher
* @author Mark Pollack
* @author Mike Saavedra
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
*/
@SuppressWarnings("deprecation")
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
@Deprecated
public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.connectionsPerHost;
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.maxWaitTime;
private int connectTimeout = DEFAULT_MONGO_OPTIONS.connectTimeout;
private int socketTimeout = DEFAULT_MONGO_OPTIONS.socketTimeout;
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.socketKeepAlive;
private boolean autoConnectRetry = DEFAULT_MONGO_OPTIONS.autoConnectRetry;
private long maxAutoConnectRetryTime = DEFAULT_MONGO_OPTIONS.maxAutoConnectRetryTime;
private int writeNumber = DEFAULT_MONGO_OPTIONS.w;
private int writeTimeout = DEFAULT_MONGO_OPTIONS.wtimeout;
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.fsync;
private boolean slaveOk = DEFAULT_MONGO_OPTIONS.slaveOk;
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
private MongoOptions options;
/**
* Configures the maximum number of connections allowed per host until we will block.
*
@@ -144,7 +151,10 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
/**
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
*
* @deprecated since 1.7.
*/
@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
@@ -154,7 +164,9 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
* @deprecated since 1.7
*/
@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
@@ -163,7 +175,9 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
*
* @param slaveOk true if the driver should read from secondaries or slaves.
* @deprecated since 1.7
*/
@Deprecated
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
}
@@ -194,40 +208,41 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
this.sslSocketFactory = sslSocketFactory;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
public void afterPropertiesSet() {
@Override
protected MongoOptions createInstance() throws Exception {
if (MongoClientVersion.isMongo3Driver()) {
throw new IllegalArgumentException(
String
.format("Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
}
MongoOptions options = new MongoOptions();
options.connectionsPerHost = connectionsPerHost;
options.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
options.maxWaitTime = maxWaitTime;
options.connectTimeout = connectTimeout;
options.socketTimeout = socketTimeout;
options.socketKeepAlive = socketKeepAlive;
options.autoConnectRetry = autoConnectRetry;
options.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
options.slaveOk = slaveOk;
options.w = writeNumber;
options.wtimeout = writeTimeout;
options.fsync = writeFsync;
options.setConnectionsPerHost(connectionsPerHost);
options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
options.setMaxWaitTime(maxWaitTime);
options.setConnectTimeout(connectTimeout);
options.setSocketTimeout(socketTimeout);
options.setSocketKeepAlive(socketKeepAlive);
options.setW(writeNumber);
options.setWtimeout(writeTimeout);
options.setFsync(writeFsync);
if (ssl) {
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
}
this.options = options;
}
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
public MongoOptions getObject() {
return this.options;
return options;
}
/*
@@ -237,12 +252,4 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
public Class<?> getObjectType() {
return MongoOptions.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -53,9 +53,11 @@ import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -71,7 +73,6 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -94,14 +95,19 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.util.CloseableIterator;
import org.springframework.jca.cci.core.ConnectionCallback;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.Bytes;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
@@ -115,6 +121,7 @@ import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
*
@@ -310,6 +317,31 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return this.mongoConverter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#stream(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
@Override
public <T> CloseableIterator<T> stream(final Query query, final Class<T> entityType) {
return execute(entityType, new CollectionCallback<CloseableIterator<T>>() {
@Override
public CloseableIterator<T> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityType);
DBObject mappedFields = queryMapper.getMappedFields(query.getFieldsObject(), persistentEntity);
DBObject mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), persistentEntity);
DBCursor cursor = collection.find(mappedQuery, mappedFields);
ReadDbObjectCallback<T> readCallback = new ReadDbObjectCallback<T>(mongoConverter, entityType);
return new CloseableIterableCursorAdapter<T>(cursor, exceptionTranslator, readCallback);
}
});
}
public String getCollectionName(Class<?> entityClass) {
return this.determineCollectionName(entityClass);
}
@@ -330,15 +362,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeCommand(com.mongodb.DBObject, int)
*/
@Deprecated
public CommandResult executeCommand(final DBObject command, final int options) {
return executeCommand(command, (options & Bytes.QUERYOPTION_SLAVEOK) != 0 ? ReadPreference.secondaryPreferred()
: ReadPreference.primary());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeCommand(com.mongodb.DBObject, com.mongodb.ReadPreference)
*/
public CommandResult executeCommand(final DBObject command, final ReadPreference readPreference) {
Assert.notNull(command, "Command must not be null!");
CommandResult result = execute(new DbCallback<CommandResult>() {
public CommandResult doInDB(DB db) throws MongoException, DataAccessException {
return db.command(command, options);
return db.command(command, readPreference);
}
});
logCommandExecutionError(command, result);
return result;
}
@@ -354,7 +403,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query, null));
}
/**
@@ -412,14 +461,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeInSession(org.springframework.data.mongodb.core.DbCallback)
*/
@Deprecated
public <T> T executeInSession(final DbCallback<T> action) {
return execute(new DbCallback<T>() {
public T doInDB(DB db) throws MongoException, DataAccessException {
try {
db.requestStart();
ReflectiveDbInvoker.requestStart(db);
return action.doInDB(db);
} finally {
db.requestDone();
ReflectiveDbInvoker.requestDone(db);
}
}
});
@@ -485,6 +540,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return new DefaultIndexOperations(this, determineCollectionName(entityClass));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#scriptOps()
*/
@Override
public ScriptOperations scriptOps() {
return new DefaultScriptOperations(this);
}
// Find methods that take a Query to express the query and that return a single object.
public <T> T findOne(Query query, Class<T> entityClass) {
@@ -532,7 +596,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query));
new QueryCursorPreparer(query, entityClass));
}
public <T> T findById(Object id, Class<T> entityClass) {
@@ -614,8 +678,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName) {
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass, update, options);
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass, update, options);
}
// Find methods that take a Query to express the query and that return a single object that is also removed from the
@@ -626,8 +690,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> T findAndRemove(Query query, Class<T> entityClass, String collectionName) {
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass);
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass);
}
public long count(Query query, Class<?> entityClass) {
@@ -639,7 +704,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return count(query, null, collectionName);
}
private long count(Query query, Class<?> entityClass, String collectionName) {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#count(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
public long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
@@ -692,13 +761,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Prepare the WriteConcern before any processing is done using it. This allows a convenient way to apply custom
* settings in sub-classes.
* settings in sub-classes. <br />
* When using MongoDB Java driver version 3, the returned {@link WriteConcern} will be defaulted to
* {@link WriteConcern#ACKNOWLEDGED} when {@link WriteResultChecking} is set to {@link WriteResultChecking#EXCEPTION}.
*
* @param mongoAction the {@link MongoAction} to resolve the {@link WriteConcern} for
* @return The prepared WriteConcern or null
*/
protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
return writeConcernResolver.resolve(mongoAction);
WriteConcern wc = writeConcernResolver.resolve(mongoAction);
if (MongoClientVersion.isMongo3Driver()
&& ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking)
&& (wc == null || wc.getW() < 1)) {
return WriteConcern.ACKNOWLEDGED;
}
return wc;
}
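Beyond the driver-3 defaulting described above, per-operation write concerns are resolved through the registered WriteConcernResolver. A minimal sketch, assuming a MongoTemplate instance named template and an illustrative "audit" collection:
WriteConcernResolver resolver = new WriteConcernResolver() {
  @Override
  public WriteConcern resolve(MongoAction action) {
    // acknowledge writes to the (hypothetical) "audit" collection, keep the configured default elsewhere
    return "audit".equals(action.getCollectionName()) ? WriteConcern.ACKNOWLEDGED : action.getDefaultWriteConcern();
  }
};
template.setWriteConcernResolver(resolver);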
protected <T> void doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
@@ -743,8 +822,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
BeanWrapper<Object> wrapper = BeanWrapper.create(entity, this.mongoConverter.getConversionService());
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
ConvertingPropertyAccessor accessor = new ConvertingPropertyAccessor(
mongoPersistentEntity.getPropertyAccessor(entity), mongoConverter.getConversionService());
accessor.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
}
}
@@ -837,11 +917,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private <T> void doSaveVersioned(T objectToSave, MongoPersistentEntity<?> entity, String collectionName) {
BeanWrapper<T> beanWrapper = BeanWrapper.create(objectToSave, this.mongoConverter.getConversionService());
ConvertingPropertyAccessor convertingAccessor = new ConvertingPropertyAccessor(
entity.getPropertyAccessor(objectToSave), mongoConverter.getConversionService());
MongoPersistentProperty idProperty = entity.getIdProperty();
MongoPersistentProperty versionProperty = entity.getVersionProperty();
Number version = beanWrapper.getProperty(versionProperty, Number.class);
Object version = convertingAccessor.getProperty(versionProperty);
Number versionNumber = convertingAccessor.getProperty(versionProperty, Number.class);
// Fresh instance -> initialize version property
if (version == null) {
@@ -851,12 +934,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
assertUpdateableIdIfNotSet(objectToSave);
// Create query for entity with the id and old version
Object id = beanWrapper.getProperty(idProperty);
Object id = convertingAccessor.getProperty(idProperty);
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
// Bump version number
Number number = beanWrapper.getProperty(versionProperty, Number.class);
beanWrapper.setProperty(versionProperty, number.longValue() + 1);
convertingAccessor.setProperty(versionProperty, versionNumber.longValue() + 1);
BasicDBObject dbObject = new BasicDBObject();
@@ -1005,8 +1087,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Calling update using query: %s and update: %s in collection: %s",
serializeToJsonSafely(queryObj), serializeToJsonSafely(updateObj), collectionName));
}
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
@@ -1016,7 +1098,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
: collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
if (entity != null && entity.hasVersionProperty() && !multi) {
if (writeResult.getN() == 0 && dbObjectContainsVersionProperty(queryObj, entity)) {
if (ReflectiveWriteResultInvoker.wasAcknowledged(writeResult) && writeResult.getN() == 0
&& dbObjectContainsVersionProperty(queryObj, entity)) {
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
+ updateObj.toMap().toString() + " to collection " + collectionName);
}
@@ -1068,27 +1151,31 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Returns {@link Entry} containing the {@link MongoPersistentProperty} defining the {@literal id} as
* {@link Entry#getKey()} and the {@link Id}s property value as its {@link Entry#getValue()}.
* Returns {@link Entry} containing the field name of the id property as {@link Entry#getKey()} and the {@link Id}s
* property value as its {@link Entry#getValue()}.
*
* @param object
* @return
*/
private Map.Entry<MongoPersistentProperty, Object> extractIdPropertyAndValue(Object object) {
private Entry<String, Object> extractIdPropertyAndValue(Object object) {
Assert.notNull(object, "Id cannot be extracted from 'null'.");
Class<?> objectType = object.getClass();
if (object instanceof DBObject) {
return Collections.singletonMap(ID_FIELD, ((DBObject) object).get(ID_FIELD)).entrySet().iterator().next();
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
if (idProp == null) {
if (idProp == null || entity == null) {
throw new MappingException("No id property found for object of type " + objectType);
}
Object idValue = BeanWrapper.create(object, mongoConverter.getConversionService())
.getProperty(idProp, Object.class);
return Collections.singletonMap(idProp, idValue).entrySet().iterator().next();
Object idValue = entity.getPropertyAccessor(object).getProperty(idProp);
return Collections.singletonMap(idProp.getFieldName(), idValue).entrySet().iterator().next();
}
/**
@@ -1099,8 +1186,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
private Query getIdQueryFor(Object object) {
Map.Entry<MongoPersistentProperty, Object> id = extractIdPropertyAndValue(object);
return new Query(where(id.getKey().getFieldName()).is(id.getValue()));
Entry<String, Object> id = extractIdPropertyAndValue(object);
return new Query(where(id.getKey()).is(id.getValue()));
}
/**
@@ -1114,7 +1201,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notEmpty(objects, "Cannot create Query for empty collection.");
Iterator<?> it = objects.iterator();
Map.Entry<MongoPersistentProperty, Object> firstEntry = extractIdPropertyAndValue(it.next());
Entry<String, Object> firstEntry = extractIdPropertyAndValue(it.next());
ArrayList<Object> ids = new ArrayList<Object>(objects.size());
ids.add(firstEntry.getValue());
@@ -1123,7 +1210,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
ids.add(extractIdPropertyAndValue(it.next()).getValue());
}
return new Query(where(firstEntry.getKey().getFieldName()).in(ids));
return new Query(where(firstEntry.getKey()).in(ids));
}
private void assertUpdateableIdIfNotSet(Object entity) {
@@ -1131,12 +1218,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entity.getClass());
MongoPersistentProperty idProperty = persistentEntity == null ? null : persistentEntity.getIdProperty();
if (idProperty == null) {
if (idProperty == null || persistentEntity == null) {
return;
}
ConversionService service = mongoConverter.getConversionService();
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class);
Object idValue = persistentEntity.getPropertyAccessor(entity).getProperty(idProperty);
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
throw new InvalidDataAccessApiUsageException(String.format(
@@ -1180,7 +1266,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { dboq, collection.getName() });
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { serializeToJsonSafely(dboq),
collection.getName() });
}
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
@@ -1228,25 +1315,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
String mapFunc = replaceWithResourceIfNecessary(mapFunction);
String reduceFunc = replaceWithResourceIfNecessary(reduceFunction);
DBCollection inputCollection = getCollection(inputCollectionName);
MapReduceCommand command = new MapReduceCommand(inputCollection, mapFunc, reduceFunc,
mapReduceOptions.getOutputCollection(), mapReduceOptions.getOutputType(), null);
DBObject commandObject = copyQuery(query, copyMapReduceOptions(mapReduceOptions, command));
MapReduceCommand command = new MapReduceCommand(inputCollection, mapFunc, reduceFunc,
mapReduceOptions.getOutputCollection(), mapReduceOptions.getOutputType(), query == null
|| query.getQueryObject() == null ? null : queryMapper.getMappedObject(query.getQueryObject(), null));
copyMapReduceOptionsToCommand(query, mapReduceOptions, command);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
}
CommandResult commandResult = command.getOutputType() == MapReduceCommand.OutputType.INLINE ? executeCommand(
commandObject, getDb().getOptions()) : executeCommand(commandObject);
handleCommandError(commandResult, commandObject);
MapReduceOutput mapReduceOutput = inputCollection.mapReduce(command);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [{}]", serializeToJsonSafely(commandObject));
LOGGER.debug("MapReduce command result = [{}]", serializeToJsonSafely(mapReduceOutput.results()));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
@@ -1254,7 +1340,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
mappedResults.add(callback.doWith(dbObject));
}
return new MapReduceResults<T>(mappedResults, commandResult);
return new MapReduceResults<T>(mappedResults, mapReduceOutput);
}
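The reworked map-reduce path above builds the MapReduceCommand directly and runs the mapped query through the QueryMapper. A usage sketch, assuming an existing template plus illustrative collection, script resources and result type:
Query query = new Query(Criteria.where("x").ne(null)).limit(100);
MapReduceResults<ValueObject> results = template.mapReduce(query, "values",
    "classpath:map.js", "classpath:reduce.js", ValueObject.class);
for (ValueObject mapped : results) {
  // MapReduceResults is Iterable over the mapped output documents
}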
public <T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass) {
@@ -1411,17 +1497,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CommandResult commandResult = executeCommand(command);
handleCommandError(commandResult, command);
// map results
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult), commandResult);
}
/**
* Returns the potentially mapped results of the given {@link CommandResult} in case it contained some.
*
* @param outputType
* @param commandResult
* @return
*/
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, CommandResult commandResult) {
@SuppressWarnings("unchecked")
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("result");
List<O> mappedResults = new ArrayList<O>();
if (resultSet == null) {
return Collections.emptyList();
}
DbObjectCallback<O> callback = new UnwrapAndReadDbObjectCallback<O>(mongoConverter, outputType);
List<O> mappedResults = new ArrayList<O>();
for (DBObject dbObject : resultSet) {
mappedResults.add(callback.doWith(dbObject));
}
return new AggregationResults<O>(mappedResults, commandResult);
return mappedResults;
}
protected String replaceWithResourceIfNecessary(String function) {
@@ -1446,51 +1547,40 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return func;
}
private DBObject copyQuery(Query query, DBObject copyMapReduceOptions) {
private void copyMapReduceOptionsToCommand(Query query, MapReduceOptions mapReduceOptions,
MapReduceCommand mapReduceCommand) {
if (query != null) {
if (query.getSkip() != 0 || query.getFieldsObject() != null) {
throw new InvalidDataAccessApiUsageException(
"Can not use skip or field specification with map reduce operations");
}
if (query.getQueryObject() != null) {
copyMapReduceOptions.put("query", query.getQueryObject());
}
if (query.getLimit() > 0) {
copyMapReduceOptions.put("limit", query.getLimit());
mapReduceCommand.setLimit(query.getLimit());
}
if (query.getSortObject() != null) {
copyMapReduceOptions.put("sort", query.getSortObject());
mapReduceCommand.setSort(queryMapper.getMappedObject(query.getSortObject(), null));
}
}
return copyMapReduceOptions;
}
private DBObject copyMapReduceOptions(MapReduceOptions mapReduceOptions, MapReduceCommand command) {
if (mapReduceOptions.getJavaScriptMode() != null) {
command.addExtraOption("jsMode", true);
mapReduceCommand.setJsMode(true);
}
if (!mapReduceOptions.getExtraOptions().isEmpty()) {
for (Map.Entry<String, Object> entry : mapReduceOptions.getExtraOptions().entrySet()) {
command.addExtraOption(entry.getKey(), entry.getValue());
ReflectiveMapReduceInvoker.addExtraOption(mapReduceCommand, entry.getKey(), entry.getValue());
}
}
if (mapReduceOptions.getFinalizeFunction() != null) {
command.setFinalize(this.replaceWithResourceIfNecessary(mapReduceOptions.getFinalizeFunction()));
mapReduceCommand.setFinalize(this.replaceWithResourceIfNecessary(mapReduceOptions.getFinalizeFunction()));
}
if (mapReduceOptions.getOutputDatabase() != null) {
command.setOutputDB(mapReduceOptions.getOutputDatabase());
mapReduceCommand.setOutputDB(mapReduceOptions.getOutputDatabase());
}
if (!mapReduceOptions.getScopeVariables().isEmpty()) {
command.setScope(mapReduceOptions.getScopeVariables());
mapReduceCommand.setScope(mapReduceOptions.getScopeVariables());
}
DBObject commandObject = command.toDBObject();
DBObject outObject = (DBObject) commandObject.get("out");
if (mapReduceOptions.getOutputSharded() != null) {
outObject.put("sharded", mapReduceOptions.getOutputSharded());
}
return commandObject;
}
public Set<String> getCollectionNames() {
@@ -1594,12 +1684,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
DBObject mappedFields = queryMapper.getMappedFields(fields, entity);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
@@ -1637,8 +1728,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Class<T> entityClass) {
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndRemove using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " in collection: " + collectionName);
LOGGER.debug(String.format("findAndRemove using query: %s fields: %s sort: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, sort, entityClass, collectionName));
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
@@ -1662,8 +1753,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + mappedUpdate + " in collection: " + collectionName);
LOGGER.debug(String.format("findAndModify using query: %s fields: %s sort: %s for class: %s and update: %s "
+ "in collection: %s", serializeToJsonSafely(mappedQuery), fields, sort, entityClass,
serializeToJsonSafely(mappedUpdate), collectionName));
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
@@ -1695,15 +1787,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
ConversionService conversionService = mongoConverter.getConversionService();
BeanWrapper<Object> wrapper = BeanWrapper.create(savedObject, conversionService);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(savedObject.getClass());
PersistentPropertyAccessor accessor = entity.getPropertyAccessor(savedObject);
Object idValue = wrapper.getProperty(idProp, idProp.getType());
if (idValue != null) {
if (accessor.getProperty(idProp) != null) {
return;
}
wrapper.setProperty(idProp, id);
new ConvertingPropertyAccessor(accessor, conversionService).setProperty(idProp, id);
}
private DBCollection getAndPrepareCollection(DB db, String collectionName) {
@@ -1870,7 +1961,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
String error = writeResult.getError();
String error = ReflectiveWriteResultInvoker.getError(writeResult);
if (error == null) {
return;
@@ -1941,6 +2032,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return converter;
}
private DBObject getMappedSortObject(Query query, Class<?> type) {
if (query == null || query.getSortObject() == null) {
return null;
}
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
}
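getMappedSortObject(…) translates sort criteria to the mapped field names, just like query and fields objects. A small sketch, assuming a Person class whose firstname property is mapped to a different field name via @Field (illustrative):
Query query = new Query().with(new Sort(Sort.Direction.ASC, "firstname"));
List<Person> people = template.find(query, Person.class);
// the sort document sent to MongoDB uses the mapped field name, e.g. { "fName" : 1 }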
// Callback implementations
/**
@@ -1963,13 +2063,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: " + query + " in db.collection: " + collection.getFullName());
LOGGER.debug(String.format("findOne using query: %s in db.collection: %s", serializeToJsonSafely(query),
collection.getFullName()));
}
return collection.findOne(query);
} else {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: " + query + " fields: " + fields + " in db.collection: "
+ collection.getFullName());
LOGGER.debug(String.format("findOne using query: %s fields: %s in db.collection: %s",
serializeToJsonSafely(query), fields, collection.getFullName()));
}
return collection.findOne(query, fields);
}
@@ -1998,7 +2099,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public DBCursor doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (fields == null || fields.toMap().isEmpty()) {
return collection.find(query);
} else {
return collection.find(query, fields);
@@ -2056,9 +2158,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Simple internal callback to allow operations on a {@link DBObject}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
private interface DbObjectCallback<T> {
static interface DbObjectCallback<T> {
T doWith(DBObject object);
}
@@ -2134,9 +2237,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
private final Class<?> type;
public QueryCursorPreparer(Query query, Class<?> type) {
public QueryCursorPreparer(Query query) {
this.query = query;
this.type = type;
}
/*
@@ -2150,11 +2256,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (query.getSkip() <= 0 && query.getLimit() <= 0 && query.getSortObject() == null
&& !StringUtils.hasText(query.getHint())) {
&& !StringUtils.hasText(query.getHint()) && !query.getMeta().hasValues()) {
return cursor;
}
DBCursor cursorToUse = cursor;
DBCursor cursorToUse = cursor.copy();
try {
if (query.getSkip() > 0) {
@@ -2164,11 +2270,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
DBObject sortDbo = type != null ? getMappedSortObject(query, type) : query.getSortObject();
cursorToUse = cursorToUse.sort(sortDbo);
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
if (query.getMeta().hasValues()) {
for (Entry<String, Object> entry : query.getMeta().values()) {
cursorToUse = cursorToUse.addSpecial(entry.getKey(), entry.getValue());
}
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
@@ -2211,4 +2324,76 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
/**
* A {@link CloseableIterator} that is backed by a MongoDB {@link Cursor}.
*
* @since 1.7
* @author Thomas Darimont
*/
static class CloseableIterableCusorAdapter<T> implements CloseableIterator<T> {
private volatile Cursor cursor;
private PersistenceExceptionTranslator exceptionTranslator;
private DbObjectCallback<T> objectReadCallback;
/**
* Creates a new {@link CloseableIterableCusorAdapter} backed by the given {@link Cursor}.
*
* @param cursor
* @param exceptionTranslator
* @param objectReadCallback
*/
public CloseableIterableCusorAdapter(Cursor cursor, PersistenceExceptionTranslator exceptionTranslator,
DbObjectCallback<T> objectReadCallback) {
this.cursor = cursor;
this.exceptionTranslator = exceptionTranslator;
this.objectReadCallback = objectReadCallback;
}
@Override
public boolean hasNext() {
if (cursor == null) {
return false;
}
try {
return cursor.hasNext();
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
}
}
@Override
public T next() {
if (cursor == null) {
return null;
}
try {
DBObject item = cursor.next();
T converted = objectReadCallback.doWith(item);
return converted;
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
}
}
@Override
public void close() {
Cursor c = cursor;
try {
c.close();
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
} finally {
cursor = null;
exceptionTranslator = null;
objectReadCallback = null;
}
}
}
}
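The CloseableIterableCusorAdapter shown above backs the template's streaming support, converting documents lazily while the cursor stays open. A consumption sketch, assuming a Person document class, a template instance and static imports of Query.query and Criteria.where (all illustrative):
CloseableIterator<Person> iterator = template.stream(query(where("lastname").is("Matthews")), Person.class);
try {
  while (iterator.hasNext()) {
    Person person = iterator.next(); // converted on demand via the DbObjectCallback
  }
} finally {
  iterator.close(); // closes the underlying MongoDB cursor
}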

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
* Convenience method to generate an index name from the set of fields it covers. Falls back to a MongoDB Java
* driver version 2 compatible way of generating the index name when running on driver version 3 (see {@link MongoClientVersion#isMongo3Driver()}).
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2 all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for the MongoDB Java driver version 3 the operation will throw
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The mongo java driver 3 does no loger support resetIndexCache!");
}
invokeMethod(RESET_INDEX_CACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
}
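For illustration only (the invoker is package-internal), the borrowed generation above yields driver-2-style index names independent of the driver version on the classpath:
DBObject keys = new BasicDBObject("lastname", 1).append("age", -1);
String indexName = ReflectiveDBCollectionInvoker.generateIndexName(keys);
// indexName is "lastname_1_age_-1" for both driver generations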

View File

@@ -0,0 +1,134 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticates against the database using the provided credentials in case of MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
* Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}

View File

@@ -0,0 +1,62 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.MapReduceCommand;
/**
* {@link ReflectiveMapReduceInvoker} provides reflective access to {@link MapReduceCommand} API that is not
* consistently available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveMapReduceInvoker {
private static final Method ADD_EXTRA_OPTION_METHOD;
static {
ADD_EXTRA_OPTION_METHOD = findMethod(MapReduceCommand.class, "addExtraOption", String.class, Object.class);
}
private ReflectiveMapReduceInvoker() {}
/**
* Sets the extra option for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3.
*
* @param cmd can be {@literal null} for MongoDB Java driver version 3.
* @param key
* @param value
*/
public static void addExtraOption(MapReduceCommand cmd, String key, Object value) {
if (isMongo3Driver()) {
return;
}
Assert.notNull(cmd, "MapReduceCommand must not be null!");
invokeMethod(ADD_EXTRA_OPTION_METHOD, cmd, key, value);
}
}

View File

@@ -0,0 +1,158 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.util.ReflectionUtils;
import com.mongodb.MongoOptions;
/**
* {@link ReflectiveMongoOptionsInvoker} provides reflective access to {@link MongoOptions} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
@SuppressWarnings("deprecation")
class ReflectiveMongoOptionsInvoker {
private static final Method GET_AUTO_CONNECT_RETRY_METHOD;
private static final Method SET_AUTO_CONNECT_RETRY_METHOD;
private static final Method GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
private static final Method SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
static {
SET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils
.findMethod(MongoOptions.class, "setAutoConnectRetry", boolean.class);
GET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils.findMethod(MongoOptions.class, "isAutoConnectRetry");
SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"setMaxAutoConnectRetryTime", long.class);
GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"getMaxAutoConnectRetryTime");
}
private ReflectiveMongoOptionsInvoker() {}
/**
* Sets the retry connection flag for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param autoConnectRetry
*/
public static void setAutoConnectRetry(MongoOptions options, boolean autoConnectRetry) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_AUTO_CONNECT_RETRY_METHOD, options, autoConnectRetry);
}
/**
* Sets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param maxAutoConnectRetryTime
*/
public static void setMaxAutoConnectRetryTime(MongoOptions options, long maxAutoConnectRetryTime) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options, maxAutoConnectRetryTime);
}
/**
* Sets the slaveOk attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param slaveOk
*/
public static void setSlaveOk(MongoOptions options, boolean slaveOk) {
if (isMongo3Driver()) {
return;
}
new DirectFieldAccessor(options).setPropertyValue("slaveOk", slaveOk);
}
/**
* Gets the slaveOk attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException} for
* MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getSlaveOk(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) new DirectFieldAccessor(options).getPropertyValue("slaveOk")).booleanValue();
}
/**
* Gets the autoConnectRetry attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException}
* for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getAutoConnectRetry(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) invokeMethod(GET_AUTO_CONNECT_RETRY_METHOD, options)).booleanValue();
}
/**
* Gets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Throws
* {@link UnsupportedOperationException} for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static long getMaxAutoConnectRetryTime(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for maxAutoConnectRetryTime which has been removed in MongoDB Java driver version 3.");
}
return ((Long) invokeMethod(GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options)).longValue();
}
}

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.springframework.beans.DirectFieldAccessor;
import com.mongodb.WriteConcern;
/**
* {@link ReflectiveWriteConcernInvoker} provides reflective access to {@link WriteConcern} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveWriteConcernInvoker {
private static final WriteConcern NONE_OR_UNACKNOWLEDGED;
static {
NONE_OR_UNACKNOWLEDGED = isMongo3Driver() ? WriteConcern.UNACKNOWLEDGED : (WriteConcern) new DirectFieldAccessor(
new WriteConcern()).getPropertyValue("NONE");
}
/**
* @return {@link WriteConcern#NONE} for MongoDB Java driver version 2, otherwise {@link WriteConcern#UNACKNOWLEDGED}.
*/
public static WriteConcern noneOrUnacknowledged() {
return NONE_OR_UNACKNOWLEDGED;
}
}

View File

@@ -0,0 +1,67 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* {@link ReflectiveWriteResultInvoker} provides reflective access to {@link WriteResult} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveWriteResultInvoker {
private static final Method GET_ERROR_METHOD;
private static final Method WAS_ACKNOWLEDGED_METHOD;
private ReflectiveWriteResultInvoker() {}
static {
GET_ERROR_METHOD = findMethod(WriteResult.class, "getError");
WAS_ACKNOWLEDGED_METHOD = findMethod(WriteResult.class, "wasAcknowledged");
}
/**
* @param writeResult can be {@literal null} for MongoDB Java driver version 3.
* @return null in case of MongoDB Java driver version 3 since errors are thrown as {@link MongoException}.
*/
public static String getError(WriteResult writeResult) {
if (isMongo3Driver()) {
return null;
}
return (String) invokeMethod(GET_ERROR_METHOD, writeResult);
}
/**
* @param writeResult
* @return always {@literal true} in case of MongoDB Java driver version 2 since the method is not available there.
*/
public static boolean wasAcknowledged(WriteResult writeResult) {
return isMongo3Driver() ? ((Boolean) invokeMethod(WAS_ACKNOWLEDGED_METHOD, writeResult)).booleanValue() : true;
}
}

View File

@@ -0,0 +1,84 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import com.mongodb.DB;
/**
* Script operations on {@link com.mongodb.DB} level. Allows interaction with server side JavaScript functions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public interface ScriptOperations {
/**
* Stores the given {@link ExecutableMongoScript}, generating a synthetic name under which it can be called
* subsequently.
*
* @param script must not be {@literal null}.
* @return {@link NamedMongoScript} with name under which the {@code JavaScript} function can be called.
*/
NamedMongoScript register(ExecutableMongoScript script);
/**
* Registers the given {@link NamedMongoScript} in the database.
*
* @param script the {@link NamedMongoScript} to be registered.
* @return
*/
NamedMongoScript register(NamedMongoScript script);
/**
* Executes the {@literal script} by either calling it via its {@literal name} or directly sending it.
*
* @param script must not be {@literal null}.
* @param args arguments to pass on for script execution.
* @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException
*/
Object execute(ExecutableMongoScript script, Object... args);
/**
* Calls the {@literal JavaScript} function by its name.
*
* @param scriptName must not be {@literal null} or empty.
* @param args
* @return
*/
Object call(String scriptName, Object... args);
/**
* Checks the {@link DB} for the existence of a server-side {@literal JavaScript} function with the given name.
*
* @param scriptName must not be {@literal null} or empty.
* @return {@literal false} if no function with the given name exists.
*/
boolean exists(String scriptName);
/**
* Returns names of {@literal JavaScript} functions that can be called.
*
* @return empty {@link Set} if no scripts found.
*/
Set<String> getScriptNames();
}
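A short usage sketch for the operations above, assuming a configured MongoTemplate named template (the script body is illustrative):
ScriptOperations scriptOps = template.scriptOps();
// execute an ad-hoc server-side function
ExecutableMongoScript echoScript = new ExecutableMongoScript("function(x) { return x; }");
Object direct = scriptOps.execute(echoScript, "directly execute script");
// register it under a generated name and call it by that name later on
NamedMongoScript namedScript = scriptOps.register(echoScript);
Object viaName = scriptOps.call(namedScript.getName(), "execute script via name");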

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,8 @@ import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
@@ -37,6 +39,7 @@ import com.mongodb.WriteConcern;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -54,7 +57,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null);
}
@@ -65,7 +70,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null);
}
@@ -77,7 +84,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
@@ -90,13 +99,36 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@SuppressWarnings("deprecation")
@Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
true, uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
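A construction sketch for the non-deprecated variants introduced above; connection string, host and database name are placeholders:
MongoDbFactory fromUri() throws UnknownHostException {
  // single connection string carrying host and database
  return new SimpleMongoDbFactory(new MongoClientURI("mongodb://localhost:27017/database"));
}
MongoDbFactory fromClient() throws UnknownHostException {
  // reuse an externally managed MongoClient
  return new SimpleMongoDbFactory(new MongoClient("localhost"), "database");
}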
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
@@ -117,6 +149,25 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
@@ -138,6 +189,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -44,9 +45,23 @@ import com.mongodb.DBObject;
*/
public class Aggregation {
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
/**
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
* stage.
*/
public static final String ROOT = SystemVariable.ROOT.toString();
private final List<AggregationOperation> operations;
/**
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
* otherwise, all stages start with {@code CURRENT} bound to the same document as {@code ROOT}.
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
private final AggregationOptions options;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
@@ -66,6 +81,20 @@ public class Aggregation {
return new Aggregation(operations);
}
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
*/
public Aggregation withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new Aggregation(this.operations, options);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
@@ -92,11 +121,43 @@ public class Aggregation {
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
this(asAggregationList(aggregationOperations));
}
/**
* @param aggregationOperations must not be {@literal null} or empty.
* @return
*/
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
return Arrays.asList(aggregationOperations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
this(aggregationOperations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
this.operations = Arrays.asList(aggregationOperations);
this.operations = aggregationOperations;
this.options = options;
}
/**
@@ -231,6 +292,29 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
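A pipeline sketch for the new factory method, assuming a geo-indexed venues collection and an illustrative VenueWithDistance result type (static imports from Aggregation assumed):
NearQuery nearDowntown = NearQuery.near(new Point(-73.99, 40.73))
    .maxDistance(new Distance(5, Metrics.KILOMETERS));
Aggregation aggregation = newAggregation(geoNear(nearDowntown, "distance"), limit(10));
AggregationResults<VenueWithDistance> results =
    mongoTemplate.aggregate(aggregation, "venues", VenueWithDistance.class);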
/**
* Returns a new {@link AggregationOptions.Builder}.
*
* @return
* @since 1.6
*/
public static AggregationOptions.Builder newAggregationOptions() {
return new AggregationOptions.Builder();
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
@@ -248,13 +332,15 @@ public class Aggregation {
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields());
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
command = options.applyAndReturnPotentiallyChangedCommand(command);
return command;
}
@@ -302,4 +388,51 @@ public class Aggregation {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
* @author Thomas Darimont
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
*/
enum SystemVariable {
ROOT, CURRENT;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
* @param fieldRef may be {@literal null}.
* @return
*/
public static boolean isReferingToSystemVariable(String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
}
}
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());
}
}
}
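The ROOT and CURRENT constants simply render MongoDB's variable syntax, and the package-internal SystemVariable enum recognises such references; a behavioural illustration of the code above:
String root = Aggregation.ROOT;        // "$$ROOT"
String current = Aggregation.CURRENT;  // "$$CURRENT"
boolean system = SystemVariable.isReferingToSystemVariable("$$ROOT.price");    // true
boolean plainField = SystemVariable.isReferingToSystemVariable("$ROOT.price"); // false, single '$' prefix
boolean unknown = SystemVariable.isReferingToSystemVariable("$$UNKNOWN");      // false, not a known variable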

View File

@@ -0,0 +1,189 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
private static final String CURSOR = "cursor";
private static final String EXPLAIN = "explain";
private static final String ALLOW_DISK_USE = "allowDiskUse";
private final boolean allowDiskUse;
private final boolean explain;
private final DBObject cursor;
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options to the aggregation.
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = cursor;
}
/**
* Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
* the dbPath directory.
*
* @return
*/
public boolean isAllowDiskUse() {
return allowDiskUse;
}
/**
* Specifies to return the information on the processing of the pipeline.
*
* @return
*/
public boolean isExplain() {
return explain;
}
/**
* Specify a document that contains options that control the creation of the cursor object.
*
* @return
*/
public DBObject getCursor() {
return cursor;
}
/**
* Returns a new, potentially adjusted copy of the given {@code command} with the configuration applied.
*
* @param command the aggregation command.
* @return
*/
DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {
DBObject result = new BasicDBObject(command.toMap());
if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
result.put(ALLOW_DISK_USE, allowDiskUse);
}
if (explain && !result.containsField(EXPLAIN)) {
result.put(EXPLAIN, explain);
}
if (cursor != null && !result.containsField(CURSOR)) {
result.put("cursor", cursor);
}
return result;
}
/**
* Returns a {@link DBObject} representation of this {@link AggregationOptions}.
*
* @return
*/
public DBObject toDbObject() {
DBObject dbo = new BasicDBObject();
dbo.put(ALLOW_DISK_USE, allowDiskUse);
dbo.put(EXPLAIN, explain);
dbo.put(CURSOR, cursor);
return dbo;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDbObject().toString();
}
/**
* A Builder for {@link AggregationOptions}.
*
* @author Thomas Darimont
*/
public static class Builder {
private boolean allowDiskUse;
private boolean explain;
private DBObject cursor;
/**
* Defines whether to off-load intensive sort-operations to disk.
*
* @param allowDiskUse
* @return
*/
public Builder allowDiskUse(boolean allowDiskUse) {
this.allowDiskUse = allowDiskUse;
return this;
}
/**
* Defines whether to get the execution plan for the aggregation instead of the actual results.
*
* @param explain
* @return
*/
public Builder explain(boolean explain) {
this.explain = explain;
return this;
}
/**
* Additional options to the aggregation.
*
* @param cursor
* @return
*/
public Builder cursor(DBObject cursor) {
this.cursor = cursor;
return this;
}
/**
* Returns a new {@link AggregationOptions} instance with the given configuration.
*
* @return
*/
public AggregationOptions build() {
return new AggregationOptions(allowDiskUse, explain, cursor);
}
}
}
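A minimal usage sketch for the options type above; the collection criteria are illustrative, and newAggregation/match are the Aggregation factory methods referenced in the class-level @see tags:
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.BasicDBObject;

class AggregationOptionsSketch {

	Aggregation activeOrdersPipeline() {

		AggregationOptions options = new AggregationOptions.Builder() //
				.allowDiskUse(true) // spill large sorts to the _tmp directory
				.cursor(new BasicDBObject("batchSize", 100)) // passed through to the aggregate command as-is
				.build();

		// withOptions(..) is the public hook referenced in the @see tags above.
		return Aggregation.newAggregation(Aggregation.match(Criteria.where("status").is("A"))).withOptions(options);
	}
}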

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import com.mongodb.DBObject;
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Thomas Darimont
* @param <T> The class in which the results are mapped onto.
* @since 1.3
*/
@@ -90,6 +91,16 @@ public class AggregationResults<T> implements Iterable<T> {
return serverUsed;
}
/**
* Returns the raw result that was returned by the server.
*
* @return
* @since 1.6
*/
public DBObject getRawResults() {
return rawResults;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");

View File

@@ -268,14 +268,21 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.isAliased();
}
/**
* @return whether the field is synthetic.
*/
public boolean isSynthetic() {
return synthetic;
}
/**
* Returns whether the field can be referred to using the given name.
*
* @param input
* @param name
* @return
*/
public boolean canBeReferredToBy(String input) {
return getTarget().equals(input);
public boolean canBeReferredToBy(String name) {
return getName().equals(name) || getTarget().equals(name);
}
/*
@@ -340,6 +347,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
public FieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}
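A small sketch of the relaxed reference check: an aliased field can now be referred to by its alias as well as its backing target (same-package access to ExposedField is assumed here; the field names are illustrative):
import static org.springframework.data.mongodb.core.aggregation.Fields.field;

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;

class ExposedFieldSketch {

	void references() {
		// "total" is the alias exposed to later pipeline stages, "netPrice" the backing document field.
		ExposedField exposed = new ExposedField(field("total", "netPrice"), true);

		boolean byAlias  = exposed.canBeReferredToBy("total");    // true with the added getName() check
		boolean byTarget = exposed.canBeReferredToBy("netPrice"); // true, as before
	}
}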

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,16 +32,22 @@ import com.mongodb.DBObject;
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
private final ExposedFields exposedFields;
private final AggregationOperationContext rootContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}.
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields) {
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
this.exposedFields = exposedFields;
this.rootContext = rootContext;
}
/*
@@ -50,7 +56,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
return rootContext.getMappedObject(dbObject);
}
/*
@@ -59,7 +65,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(Field field) {
return getReference(field.getTarget());
return getReference(field, field.getTarget());
}
/*
@@ -68,11 +74,42 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(String name) {
return getReference(null, name);
}
ExposedField field = exposedFields.getField(name);
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
*/
private FieldReference getReference(Field field, String name) {
if (field != null) {
return new FieldReference(field);
Assert.notNull(name, "Name must not be null!");
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
if (field != null) {
// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
return new FieldReference(new ExposedField(field, exposedField.isSynthetic()));
}
return new FieldReference(exposedField);
}
if (name.contains(".")) {
// for nested field references we only check that the root field exists.
ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);
if (rootField != null) {
// We have to set synthetic to true in order to render the field-name as is.
return new FieldReference(new ExposedField(name, true));
}
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,6 +30,7 @@ import org.springframework.util.StringUtils;
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class Fields implements Iterable<Field> {
@@ -83,6 +84,15 @@ public final class Fields implements Iterable<Field> {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
@@ -186,15 +196,24 @@ public final class Fields implements Iterable<Field> {
private final String target;
/**
* Creates an aggregation fieldwith the given name. As no target is set explicitly, the name will be used as target
* as well.
* Creates an aggregation field with the given {@code name}.
*
* @param key
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
*/
public AggregationField(String key) {
this(key, null);
public AggregationField(String name) {
this(name, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
String nameToSet = cleanUp(name);
@@ -217,6 +236,10 @@ public final class Fields implements Iterable<Field> {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
}
int dollarIndex = source.lastIndexOf('$');
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}
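A short sketch of the aliasing semantics documented above (field names are illustrative):
import org.springframework.data.mongodb.core.aggregation.Field;
import org.springframework.data.mongodb.core.aggregation.Fields;

class FieldAliasSketch {

	void aliases() {
		// "netPrice" is the backing document field, "total" the alias it is exposed under.
		Field aliased = Fields.field("total", "netPrice");

		// Without an explicit target the name doubles as the target.
		Field implicit = Fields.field("netPrice");
	}
}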

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,17 +22,33 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
public GeoNearOperation(NearQuery nearQuery) {
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param nearQuery must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
Assert.notNull(nearQuery);
this.nearQuery = nearQuery;
this.distanceField = distanceField;
}
/*
@@ -41,6 +57,10 @@ public class GeoNearOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
command.put("distanceField", distanceField);
return new BasicDBObject("$geoNear", command);
}
}
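A minimal sketch of the two-argument $geoNear stage; the domain type, coordinates and distance field name are illustrative:
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.NearQuery;

class GeoNearSketch {

	static class Venue {}

	TypedAggregation<Venue> nearbyVenues() {

		NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73)).maxDistance(2.0);

		// "distance" becomes the output field holding the calculated distance for each document.
		return Aggregation.newAggregation(Venue.class, Aggregation.geoNear(nearQuery, "distance"));
	}
}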

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold
@@ -364,7 +367,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
public Object getValue(AggregationOperationContext context) {
return reference == null ? value : context.getReference(reference).toString();
if (reference == null) {
return value;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(reference)) {
return reference;
}
return context.getReference(reference).toString();
}
@Override

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,14 +21,17 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $limit}-operation
* Encapsulates the {@code $limit}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
class LimitOperation implements AggregationOperation {
public class LimitOperation implements AggregationOperation {
private final long maxElements;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,18 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $match}-operation
* Encapsulates the {@code $match}-operation.
* <p>
* We recommend to use the static factory method
* {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold
@@ -32,17 +36,17 @@ import com.mongodb.DBObject;
*/
public class MatchOperation implements AggregationOperation {
private final Criteria criteria;
private final CriteriaDefinition criteriaDefinition;
/**
* Creates a new {@link MatchOperation} for the given {@link Criteria}.
* Creates a new {@link MatchOperation} for the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @param criteriaDefinition must not be {@literal null}.
*/
public MatchOperation(Criteria criteria) {
public MatchOperation(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteria, "Criteria must not be null!");
this.criteria = criteria;
Assert.notNull(criteriaDefinition, "Criteria must not be null!");
this.criteriaDefinition = criteriaDefinition;
}
/*
@@ -51,6 +55,6 @@ public class MatchOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
return new BasicDBObject("$match", context.getMappedObject(criteriaDefinition.getCriteriaObject()));
}
}
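A minimal sketch of a $match stage; since Criteria implements CriteriaDefinition, existing call sites compile unchanged against the widened constructor:
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.MatchOperation;
import org.springframework.data.mongodb.core.query.Criteria;

class MatchSketch {

	MatchOperation activeOrders() {
		// Criteria is a CriteriaDefinition, so it flows straight into the new constructor.
		return Aggregation.match(Criteria.where("status").is("A"));
	}
}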

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,15 +24,17 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $project}-operation. Projection of field to be used in an
* {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful. It can
* generate new fields, change values of given field etc.
* Encapsulates the aggregation framework {@code $project}-operation.
* <p>
* Projection of field to be used in an {@link Aggregation}. A projection is similar to a {@link Field}
* inclusion/exclusion but more powerful. It can generate new fields, change values of given field etc.
* <p>
* We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle
@@ -235,26 +237,53 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
}
/**
* An {@link ProjectionOperationBuilder} that is used for SpEL expression based projections.
*
* @author Thomas Darimont
*/
public static class ExpressionProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
public static class ExpressionProjectionOperationBuilder extends ProjectionOperationBuilder {
private final Object[] params;
private final String expression;
/**
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
* parameters.
*
* @param value must not be {@literal null}.
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
public ExpressionProjectionOperationBuilder(Object value, ProjectionOperation operation, Object[] parameters) {
public ExpressionProjectionOperationBuilder(String expression, ProjectionOperation operation, Object[] parameters) {
super(value, operation);
super(expression, operation, null);
this.expression = expression;
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder#project(java.lang.String, java.lang.Object[])
*/
@Override
public ProjectionOperationBuilder project(String operation, final Object... values) {
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values) {
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.length + 1);
result.add(ExpressionProjection.toMongoExpression(context,
ExpressionProjectionOperationBuilder.this.expression, ExpressionProjectionOperationBuilder.this.params));
result.addAll(Arrays.asList(values));
return result;
}
};
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#as(java.lang.String)
@@ -303,7 +332,11 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(getExposedField().getName(), TRANSFORMER.transform(expression, context, params));
return new BasicDBObject(getExposedField().getName(), toMongoExpression(context, expression, params));
}
protected static Object toMongoExpression(AggregationOperationContext context, String expression, Object[] params) {
return TRANSFORMER.transform(expression, context, params);
}
}
}
@@ -320,7 +353,6 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private static final String FIELD_REFERENCE_NOT_NULL = "Field reference must not be null!";
private final String name;
private final ProjectionOperation operation;
private final OperationProjection previousProjection;
/**
@@ -335,7 +367,23 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
super(name, operation);
this.name = name;
this.operation = operation;
this.previousProjection = previousProjection;
}
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given value on top of the given
* {@link ProjectionOperation}.
*
* @param value
* @param operation
* @param previousProjection
*/
protected ProjectionOperationBuilder(Object value, ProjectionOperation operation,
OperationProjection previousProjection) {
super(value, operation);
this.name = null;
this.previousProjection = previousProjection;
}
@@ -521,8 +569,9 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public ProjectionOperationBuilder project(String operation, Object... values) {
OperationProjection projectionOperation = new OperationProjection(Fields.field(name), operation, values);
return new ProjectionOperationBuilder(name, this.operation.and(projectionOperation), projectionOperation);
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values);
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/**
@@ -627,6 +676,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
// implicit reference or explicit include?
if (value == null || Boolean.TRUE.equals(value)) {
if (Aggregation.SystemVariable.isReferingToSystemVariable(field.getTarget())) {
return field.getTarget();
}
// check whether referenced field exists in the context
return context.getReference(field).getReferenceValue();
@@ -649,7 +702,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link OperationProjection} for the given field.
*
* @param name the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param field the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
@@ -672,18 +725,15 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBList values = new BasicDBList();
values.addAll(buildReferences(context));
DBObject inner = new BasicDBObject("$" + operation, getOperationArguments(context));
DBObject inner = new BasicDBObject("$" + operation, values);
return new BasicDBObject(this.field.getName(), inner);
return new BasicDBObject(getField().getName(), inner);
}
private List<Object> buildReferences(AggregationOperationContext context) {
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
result.add(context.getReference(field.getTarget()).toString());
result.add(context.getReference(getField().getName()).toString());
for (Object element : values) {
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
@@ -692,6 +742,15 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return result;
}
/**
* Returns the field that holds the {@link OperationProjection}.
*
* @return
*/
protected Field getField() {
return field;
}
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*
@@ -699,7 +758,27 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public OperationProjection withAlias(String alias) {
return new OperationProjection(Fields.field(alias, this.field.getName()), operation, values.toArray());
final Field aliasedField = Fields.field(alias, this.field.getName());
return new OperationProjection(aliasedField, operation, values.toArray()) {
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.OperationProjection#getField()
*/
@Override
protected Field getField() {
return aliasedField;
}
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
// We have to make sure that we use the arguments from the "previous" OperationProjection that we replace
// with this new instance.
return OperationProjection.this.getOperationArguments(context);
}
};
}
}
@@ -731,6 +810,96 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return new BasicDBObject(name, nestedObject);
}
}
/**
* Extracts the minute from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMinute() {
return project("minute");
}
/**
* Extracts the hour from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractHour() {
return project("hour");
}
/**
* Extracts the second from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractSecond() {
return project("second");
}
/**
* Extracts the millisecond from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMillisecond() {
return project("millisecond");
}
/**
* Extracts the year from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractYear() {
return project("year");
}
/**
* Extracts the month from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMonth() {
return project("month");
}
/**
* Extracts the week from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractWeek() {
return project("week");
}
/**
* Extracts the dayOfYear from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfYear() {
return project("dayOfYear");
}
/**
* Extracts the dayOfMonth from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfMonth() {
return project("dayOfMonth");
}
/**
* Extracts the dayOfWeek from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfWeek() {
return project("dayOfWeek");
}
}
/**
@@ -771,5 +940,4 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
public abstract DBObject toDBObject(AggregationOperationContext context);
}
}
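A short sketch of the new date-extraction helpers on the projection builder; the field names are illustrative and the comment only gestures at the rendered operators:
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class DateProjectionSketch {

	ProjectionOperation yearAndMonth() {
		// Conceptually projects $year and $month of the "createdAt" field into "year" and "month".
		return Aggregation.project() //
				.and("createdAt").extractYear().as("year") //
				.and("createdAt").extractMonth().as("month");
	}
}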

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $skip}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $sort}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
* instances of this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.List;
import org.springframework.util.Assert;
/**
@@ -30,11 +32,34 @@ public class TypedAggregation<I> extends Aggregation {
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {
this(inputType, asAggregationList(operations));
}
super(operations);
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations) {
this(inputType, operations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s and the given
* {@link AggregationOptions}.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
* @param options must not be {@literal null}.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations, AggregationOptions options) {
super(operations, options);
Assert.notNull(inputType, "Input type must not be null!");
this.inputType = inputType;
@@ -48,4 +73,14 @@ public class TypedAggregation<I> extends Aggregation {
public Class<I> getInputType() {
return inputType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Aggregation#withOptions(org.springframework.data.mongodb.core.aggregation.AggregationOptions)
*/
public TypedAggregation<I> withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new TypedAggregation<I>(inputType, operations, options);
}
}
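A brief sketch combining the new List-based constructor with AggregationOptions; the domain type and criteria are illustrative:
import java.util.Arrays;
import java.util.List;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.Criteria;

class TypedAggregationSketch {

	static class Order {}

	TypedAggregation<Order> pendingOrders() {

		List<AggregationOperation> operations = Arrays.<AggregationOperation> asList( //
				Aggregation.match(Criteria.where("status").is("PENDING")));

		AggregationOptions options = new AggregationOptions.Builder().allowDiskUse(true).build();

		// The covariant withOptions(..) override keeps the TypedAggregation type intact.
		return new TypedAggregation<Order>(Order.class, operations).withOptions(options);
	}
}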

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,9 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $unwind}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,14 +17,15 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -36,17 +37,23 @@ import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.JodaTimeConverters;
import org.springframework.data.convert.Jsr310Converters;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.ThreeTenBackPortConverters;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.NamedMongoScriptToDBObjectConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToNamedMongoScriptCoverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.TermToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.util.CacheValue;
import org.springframework.util.Assert;
/**
@@ -58,6 +65,7 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class CustomConversions {
@@ -69,10 +77,13 @@ public class CustomConversions {
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
private final SimpleTypeHolder simpleTypeHolder;
private final ConcurrentMap<ConvertiblePair, CacheValue> customReadTargetTypes;
private final List<Object> converters;
private final Map<ConvertiblePair, CacheValue<Class<?>>> customReadTargetTypes;
private final Map<ConvertiblePair, CacheValue<Class<?>>> customWriteTargetTypes;
private final Map<Class<?>, CacheValue<Class<?>>> rawWriteTargetTypes;
/**
* Creates an empty {@link CustomConversions} object.
*/
@@ -92,7 +103,9 @@ public class CustomConversions {
this.readingPairs = new LinkedHashSet<ConvertiblePair>();
this.writingPairs = new LinkedHashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
this.customReadTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue<Class<?>>>();
this.customWriteTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue<Class<?>>>();
this.rawWriteTargetTypes = new ConcurrentHashMap<Class<?>, CacheValue<Class<?>>>();
List<Object> toRegister = new ArrayList<Object>();
@@ -106,9 +119,14 @@ public class CustomConversions {
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.add(TermToStringConverter.INSTANCE);
toRegister.add(NamedMongoScriptToDBObjectConverter.INSTANCE);
toRegister.add(DBObjectToNamedMongoScriptCoverter.INSTANCE);
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
toRegister.addAll(GeoConverters.getConvertersToRegister());
toRegister.addAll(Jsr310Converters.getConvertersToRegister());
toRegister.addAll(ThreeTenBackPortConverters.getConvertersToRegister());
for (Object c : toRegister) {
registerConversion(c);
@@ -235,70 +253,103 @@ public class CustomConversions {
* @param sourceType must not be {@literal null}
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> sourceType) {
return getCustomWriteTarget(sourceType, null);
public Class<?> getCustomWriteTarget(final Class<?> sourceType) {
return getOrCreateAndCache(sourceType, rawWriteTargetTypes, new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, null, writingPairs);
}
});
}
/**
* Returns the target type we can write an onject of the given source type to. The returned type might be a subclass
* oth the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
* first target type matching or {@literal null} if no conversion can be found.
* Returns the target type we can write an object of the given source type to. The returned type might
* be a subclass of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply
* return the first target type matching or {@literal null} if no conversion can be found.
*
* @param sourceType must not be {@literal null}
* @param requestedTargetType
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
public Class<?> getCustomWriteTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
Assert.notNull(sourceType);
if (requestedTargetType == null) {
return getCustomWriteTarget(sourceType);
}
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customWriteTargetTypes,
new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
}
});
}
/**
* Returns whether we have a custom conversion registered to write into a Mongo native type. The returned type might
* be a subclass of the given expected type though.
* Returns whether we have a custom conversion registered to write into a Mongo native type. The
* returned type might be a subclass of the given expected type though.
*
* @param sourceType must not be {@literal null}
* @return
*/
public boolean hasCustomWriteTarget(Class<?> sourceType) {
Assert.notNull(sourceType);
return hasCustomWriteTarget(sourceType, null);
}
/**
* Returns whether we have a custom conversion registered to write an object of the given source type into an object
* of the given Mongo native target type.
* Returns whether we have a custom conversion registered to write an object of the given source type
* into an object of the given Mongo native target type.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType
* @return
*/
public boolean hasCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
return getCustomWriteTarget(sourceType, requestedTargetType) != null;
}
/**
* Returns whether we have a custom conversion registered to read the given source into the given target type.
* Returns whether we have a custom conversion registered to read the given source into the given target
* type.
*
* @param sourceType must not be {@literal null}
* @param requestedTargetType must not be {@literal null}
* @return
*/
public boolean hasCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
Assert.notNull(requestedTargetType);
return getCustomReadTarget(sourceType, requestedTargetType) != null;
}
/**
* Inspects the given {@link ConvertiblePair} for ones that have a source compatible type as source. Additionally
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
if (requestedTargetType == null) {
return null;
}
return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customReadTargetTypes,
new Producer() {
@Override
public Class<?> get() {
return getCustomTarget(sourceType, requestedTargetType, readingPairs);
}
});
}
/**
* Inspects the given {@link ConvertiblePair}s for ones that have a source compatible type as source. Additionally
* checks assignability of the target type if one is given.
*
* @param sourceType must not be {@literal null}.
@@ -307,11 +358,15 @@ public class CustomConversions {
* @return
*/
private static Class<?> getCustomTarget(Class<?> sourceType, Class<?> requestedTargetType,
Iterable<ConvertiblePair> pairs) {
Collection<ConvertiblePair> pairs) {
Assert.notNull(sourceType);
Assert.notNull(pairs);
if (requestedTargetType != null && pairs.contains(new ConvertiblePair(sourceType, requestedTargetType))) {
return requestedTargetType;
}
for (ConvertiblePair typePair : pairs) {
if (typePair.getSourceType().isAssignableFrom(sourceType)) {
Class<?> targetType = typePair.getTargetType();
@@ -325,32 +380,31 @@ public class CustomConversions {
}
/**
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
* Will try to find a value for the given key in the given cache or produce one using the given {@link Producer} and
* store it in the cache.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @param key the key to lookup a potentially existing value, must not be {@literal null}.
* @param cache the cache to find the value in, must not be {@literal null}.
* @param producer the {@link Producer} to create values to cache, must not be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
private static <T> Class<?> getOrCreateAndCache(T key, Map<T, CacheValue<Class<?>>> cache, Producer producer) {
Assert.notNull(sourceType);
CacheValue<Class<?>> cacheValue = cache.get(key);
if (requestedTargetType == null) {
return null;
if (cacheValue != null) {
return cacheValue.getValue();
}
ConvertiblePair lookupKey = new ConvertiblePair(sourceType, requestedTargetType);
CacheValue readTargetTypeValue = customReadTargetTypes.get(lookupKey);
Class<?> type = producer.get();
cache.put(key, CacheValue.<Class<?>> ofNullable(type));
if (readTargetTypeValue != null) {
return readTargetTypeValue.getType();
}
return type;
}
readTargetTypeValue = CacheValue.of(getCustomTarget(sourceType, requestedTargetType, readingPairs));
CacheValue cacheValue = customReadTargetTypes.putIfAbsent(lookupKey, readTargetTypeValue);
private interface Producer {
return cacheValue != null ? cacheValue.getType() : readTargetTypeValue.getType();
Class<?> get();
}
@WritingConverter
@@ -370,30 +424,4 @@ public class CustomConversions {
return source.toString();
}
}
/**
* Wrapper to safely store {@literal null} values in the type cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
* @author Thomas Darimont
*/
private static class CacheValue {
private static final CacheValue ABSENT = new CacheValue(null);
private final Class<?> type;
public CacheValue(Class<?> type) {
this.type = type;
}
public Class<?> getType() {
return type;
}
static CacheValue of(Class<?> type) {
return type == null ? ABSENT : new CacheValue(type);
}
}
}
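A minimal sketch of how the cached write-target lookup is typically consulted; the Email type and converter are purely illustrative, and the List-based CustomConversions constructor is assumed from the existing public API:
import java.util.Arrays;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

class CustomConversionsSketch {

	static class Email {
		final String value;
		Email(String value) {
			this.value = value;
		}
	}

	@WritingConverter
	enum EmailToStringConverter implements Converter<Email, String> {
		INSTANCE;

		@Override
		public String convert(Email source) {
			return source.value;
		}
	}

	void lookups() {
		CustomConversions conversions = new CustomConversions(Arrays.asList(EmailToStringConverter.INSTANCE));

		// Repeated calls now hit the ConcurrentHashMap-backed caches instead of re-scanning the converter pairs.
		boolean writable = conversions.hasCustomWriteTarget(Email.class);               // true
		Class<?> target  = conversions.getCustomWriteTarget(Email.class, String.class); // String.class
	}
}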

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,7 +34,7 @@ import com.mongodb.DBObject;
*/
class DBObjectAccessor {
private final DBObject dbObject;
private final BasicDBObject dbObject;
/**
* Creates a new {@link DBObjectAccessor} for the given {@link DBObject}.
@@ -46,7 +46,7 @@ class DBObjectAccessor {
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.isInstanceOf(BasicDBObject.class, dbObject, "Given DBObject must be a BasicDBObject!");
this.dbObject = dbObject;
this.dbObject = (BasicDBObject) dbObject;
}
/**
@@ -62,6 +62,11 @@ class DBObjectAccessor {
Assert.notNull(prop, "MongoPersistentProperty must not be null!");
String fieldName = prop.getFieldName();
if (!fieldName.contains(".")) {
dbObject.put(fieldName, value);
return;
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
DBObject dbObject = this.dbObject;
@@ -87,12 +92,16 @@ class DBObjectAccessor {
* @param property must not be {@literal null}.
* @return
*/
@SuppressWarnings("unchecked")
public Object get(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
if (!fieldName.contains(".")) {
return this.dbObject.get(fieldName);
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
Map<Object, Object> source = this.dbObject.toMap();
Map<String, Object> source = this.dbObject;
Object result = null;
while (source != null && parts.hasNext()) {
@@ -108,14 +117,14 @@ class DBObjectAccessor {
}
@SuppressWarnings("unchecked")
private Map<Object, Object> getAsMap(Object source) {
private Map<String, Object> getAsMap(Object source) {
if (source instanceof BasicDBObject) {
return ((DBObject) source).toMap();
return (BasicDBObject) source;
}
if (source instanceof Map) {
return (Map<Object, Object>) source;
return (Map<String, Object>) source;
}
return null;

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,16 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBRef;
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}
public interface DbRefProxyHandler {
Object populateId(MongoPersistentProperty property, DBRef source, Object proxy);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
@@ -25,6 +26,7 @@ import com.mongodb.DBRef;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.4
*/
public interface DbRefResolver {
@@ -39,7 +41,8 @@ public interface DbRefResolver {
* @param callback will never be {@literal null}.
* @return
*/
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler proxyHandler);
/**
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}
@@ -52,4 +55,13 @@ public interface DbRefResolver {
*/
DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation, MongoPersistentEntity<?> entity,
Object id);
/**
* Actually loads the {@link DBRef} from the datasource.
*
* @param dbRef must not be {@literal null}.
* @return
* @since 1.7
*/
DBObject fetch(DBRef dbRef);
}

View File

@@ -0,0 +1,79 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
*/
class DefaultDbRefProxyHandler implements DbRefProxyHandler {
private final SpELContext spELContext;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ValueResolver resolver;
/**
* @param spELContext must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
public DefaultDbRefProxyHandler(SpELContext spELContext,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, ValueResolver resolver) {
this.spELContext = spELContext;
this.mappingContext = mappingContext;
this.resolver = resolver;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefProxyHandler#populateId(com.mongodb.DBRef, java.lang.Object)
*/
@Override
public Object populateId(MongoPersistentProperty property, DBRef source, Object proxy) {
if (source == null) {
return proxy;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(property);
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty.usePropertyAccess()) {
return proxy;
}
SpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(proxy, spELContext);
PersistentPropertyAccessor accessor = entity.getPropertyAccessor(proxy);
DBObject object = new BasicDBObject(idProperty.getFieldName(), source.getId());
ObjectPath objectPath = ObjectPath.ROOT.push(proxy, entity, null);
accessor.setProperty(idProperty, resolver.getValueInternal(idProperty, object, evaluator, objectPath));
return proxy;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,26 +25,22 @@ import java.lang.reflect.Method;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.objenesis.Objenesis;
import org.objenesis.ObjenesisStd;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.cglib.proxy.Callback;
import org.springframework.cglib.proxy.Enhancer;
import org.springframework.cglib.proxy.Factory;
import org.springframework.cglib.proxy.MethodProxy;
import org.springframework.core.SpringVersion;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.LazyLoadingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.objenesis.ObjenesisStd;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
@@ -53,15 +49,14 @@ import com.mongodb.DBRef;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.4
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final boolean IS_SPRING_4_OR_BETTER = SpringVersion.getVersion().startsWith("4");
private static final boolean OBJENESIS_PRESENT = ClassUtils.isPresent("org.objenesis.Objenesis", null);
private final MongoDbFactory mongoDbFactory;
private final PersistenceExceptionTranslator exceptionTranslator;
private final ObjenesisStd objenesis;
/**
* Creates a new {@link DefaultDbRefResolver} with the given {@link MongoDbFactory}.
@@ -74,6 +69,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
this.mongoDbFactory = mongoDbFactory;
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
this.objenesis = new ObjenesisStd(true);
}
/*
@@ -81,13 +77,14 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
*/
@Override
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, dbref, callback);
return createLazyLoadingProxy(property, dbref, callback, handler);
}
return callback.resolve(property);
@@ -100,11 +97,16 @@ public class DefaultDbRefResolver implements DbRefResolver {
@Override
public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
MongoPersistentEntity<?> entity, Object id) {
return new DBRef(entity.getCollection(), id);
}
DB db = mongoDbFactory.getDb();
db = annotation != null && StringUtils.hasText(annotation.db()) ? mongoDbFactory.getDb(annotation.db()) : db;
return new DBRef(db, entity.getCollection(), id);
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#fetch(com.mongodb.DBRef)
*/
@Override
public DBObject fetch(DBRef dbRef) {
return ReflectiveDBRefResolver.fetch(mongoDbFactory.getDb(), dbRef);
}
/**
@@ -116,34 +118,47 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param callback must not be {@literal null}.
* @return
*/
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Class<?> propertyType = property.getType();
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
if (!propertyType.isInterface()) {
Factory factory = (Factory) objenesis.newInstance(getEnhancedTypeFor(propertyType));
factory.setCallbacks(new Callback[] { interceptor });
return handler.populateId(property, dbref, factory);
}
ProxyFactory proxyFactory = new ProxyFactory();
Class<?> propertyType = property.getType();
for (Class<?> type : propertyType.getInterfaces()) {
proxyFactory.addInterface(type);
}
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
proxyFactory.addInterface(LazyLoadingProxy.class);
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
if (propertyType.isInterface()) {
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
return handler.populateId(property, dbref, proxyFactory.getProxy());
}
proxyFactory.setProxyTargetClass(true);
proxyFactory.setTargetClass(propertyType);
/**
* Returns the CGLib enhanced type for the given source type.
*
* @param type
* @return
*/
private Class<?> getEnhancedTypeFor(Class<?> type) {
if (IS_SPRING_4_OR_BETTER || !OBJENESIS_PRESENT) {
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
return ObjenesisProxyEnhancer.enhanceAndGet(proxyFactory, propertyType, interceptor);
return enhancer.createClass();
}
/**
@@ -163,11 +178,12 @@ public class DefaultDbRefResolver implements DbRefResolver {
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;
private final DbRefResolverCallback callback;
private final MongoPersistentProperty property;
@@ -179,8 +195,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
static {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("getTarget");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
FINALIZE_METHOD = Object.class.getDeclaredMethod("finalize");
} catch (Exception e) {
throw new RuntimeException(e);
}
@@ -244,6 +261,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
if (ReflectionUtils.isHashCodeMethod(method)) {
return proxyHashCode(proxy);
}
// DATAMONGO-1076 - finalize methods should not trigger proxy initialization
if (FINALIZE_METHOD.equals(method)) {
return null;
}
}
Object target = ensureResolved();
@@ -265,7 +287,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
StringBuilder description = new StringBuilder();
if (dbref != null) {
description.append(dbref.getRef());
description.append(dbref.getCollectionName());
description.append(":");
description.append(dbref.getId());
} else {
@@ -372,27 +394,4 @@ public class DefaultDbRefResolver implements DbRefResolver {
return result;
}
}
/**
* Static class to accommodate the optional dependency on Objenesis.
*
* @author Oliver Gierke
*/
private static class ObjenesisProxyEnhancer {
private static final Objenesis OBJENESIS = new ObjenesisStd(true);
public static Object enhanceAndGet(ProxyFactory proxyFactory, Class<?> type,
org.springframework.cglib.proxy.MethodInterceptor interceptor) {
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
Factory factory = (Factory) OBJENESIS.newInstance(enhancer.createClass());
factory.setCallbacks(new Callback[] { interceptor });
return factory;
}
}
}
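For context, a minimal sketch (hypothetical Person/Account domain types, not part of this changeset) of how the lazy DBRef resolution above typically surfaces to user code: a property annotated with @DBRef(lazy = true) is populated with a proxy that also implements LazyLoadingProxy, so the stored reference can be inspected without a fetch and the target is only resolved on first access.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Account {
	@Id String id;
	String number;
}

@Document
class Person {

	@Id String id;

	// lazy = true makes the resolver hand out a proxy instead of fetching eagerly
	@DBRef(lazy = true)
	Account account;
}

class LazyDbRefUsage {

	void inspect(Person person) {

		if (person.account instanceof LazyLoadingProxy) {

			LazyLoadingProxy proxy = (LazyLoadingProxy) person.account;

			// toDBRef() exposes the stored reference without resolving the target ...
			System.out.println(proxy.toDBRef());

			// ... while getTarget() (formerly initialize()) triggers the actual resolution.
			Object target = proxy.getTarget();
			System.out.println(target);
		}
	}
}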

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Default implementation of {@link DbRefResolverCallback}.
*
* @author Oliver Gierke
*/
class DefaultDbRefResolverCallback implements DbRefResolverCallback {
private final DBObject surroundingObject;
private final ObjectPath path;
private final ValueResolver resolver;
private final SpELExpressionEvaluator evaluator;
/**
* Creates a new {@link DefaultDbRefResolverCallback} using the given {@link DBObject}, {@link ObjectPath},
* {@link ValueResolver} and {@link SpELExpressionEvaluator}.
*
* @param surroundingObject must not be {@literal null}.
* @param path must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public DefaultDbRefResolverCallback(DBObject surroundingObject, ObjectPath path, SpELExpressionEvaluator evaluator,
ValueResolver resolver) {
this.surroundingObject = surroundingObject;
this.path = path;
this.resolver = resolver;
this.evaluator = evaluator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolverCallback#resolve(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
@Override
public Object resolve(MongoPersistentProperty property) {
return resolver.getValueInternal(property, surroundingObject, evaluator, path);
}
}


View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,9 +30,18 @@ import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.core.geo.GeoJson;
import org.springframework.data.mongodb.core.geo.GeoJsonGeometryCollection;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPolygon;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -43,6 +52,7 @@ import com.mongodb.DBObject;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.5
*/
abstract class GeoConverters {
@@ -63,16 +73,24 @@ abstract class GeoConverters {
BoxToDbObjectConverter.INSTANCE //
, PolygonToDbObjectConverter.INSTANCE //
, CircleToDbObjectConverter.INSTANCE //
, LegacyCircleToDbObjectConverter.INSTANCE //
, SphereToDbObjectConverter.INSTANCE //
, DbObjectToBoxConverter.INSTANCE //
, DbObjectToPolygonConverter.INSTANCE //
, DbObjectToCircleConverter.INSTANCE //
, DbObjectToLegacyCircleConverter.INSTANCE //
, DbObjectToSphereConverter.INSTANCE //
, DbObjectToPointConverter.INSTANCE //
, PointToDbObjectConverter.INSTANCE //
, GeoCommandToDbObjectConverter.INSTANCE);
, GeoCommandToDbObjectConverter.INSTANCE //
, GeoJsonToDbObjectConverter.INSTANCE //
, GeoJsonPointToDbObjectConverter.INSTANCE //
, GeoJsonPolygonToDbObjectConverter.INSTANCE //
, DbObjectToGeoJsonPointConverter.INSTANCE //
, DbObjectToGeoJsonPolygonConverter.INSTANCE //
, DbObjectToGeoJsonLineStringConverter.INSTANCE //
, DbObjectToGeoJsonMultiLineStringConverter.INSTANCE //
, DbObjectToGeoJsonMultiPointConverter.INSTANCE //
, DbObjectToGeoJsonMultiPolygonConverter.INSTANCE //
, DbObjectToGeoJsonGeometryCollectionConverter.INSTANCE);
}
/**
@@ -82,7 +100,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
INSTANCE;
@@ -91,13 +109,11 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Point convert(DBObject source) {
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"),
(Double) source.get("y"));
return source == null ? null : new Point((Double) source.get("x"), (Double) source.get("y"));
}
}
@@ -107,7 +123,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
public static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
INSTANCE;
@@ -128,7 +144,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@WritingConverter
public static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
INSTANCE;
@@ -151,13 +167,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}.
* Converts a {@link BasicDBList} into a {@link Box}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
INSTANCE;
@@ -166,7 +182,6 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Box convert(DBObject source) {
if (source == null) {
@@ -176,7 +191,7 @@ abstract class GeoConverters {
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
return new org.springframework.data.mongodb.core.geo.Box(first, second);
return new Box(first, second);
}
}
@@ -186,7 +201,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
public static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
INSTANCE;
@@ -210,13 +225,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
* Converts a {@link DBObject} into a {@link Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
INSTANCE;
@@ -251,78 +266,13 @@ abstract class GeoConverters {
}
}
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public static enum LegacyCircleToDbObjectConverter implements
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius());
return result;
}
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
@SuppressWarnings("deprecation")
public static enum DbObjectToLegacyCircleConverter implements
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject centerSource = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(centerSource, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
}
}
/**
* Converts a {@link Sphere} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
public static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
INSTANCE;
@@ -352,7 +302,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
INSTANCE;
@@ -393,7 +343,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
public static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
INSTANCE;
@@ -422,13 +372,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Polygon}.
* Converts a {@link BasicDBList} into a {@link Polygon}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
INSTANCE;
@@ -437,7 +387,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings({ "deprecation", "unchecked" })
@SuppressWarnings({ "unchecked" })
public Polygon convert(DBObject source) {
if (source == null) {
@@ -453,7 +403,7 @@ abstract class GeoConverters {
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
}
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints);
return new Polygon(newPoints);
}
}
@@ -463,7 +413,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
public static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
INSTANCE;
@@ -472,7 +422,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
@SuppressWarnings("rawtypes")
public DBObject convert(GeoCommand source) {
if (source == null) {
@@ -483,6 +433,10 @@ abstract class GeoConverters {
Shape shape = source.getShape();
if (shape instanceof GeoJson) {
return GeoJsonToDbObjectConverter.INSTANCE.convert((GeoJson) shape);
}
if (shape instanceof Box) {
argument.add(toList(((Box) shape).getFirst()));
@@ -493,10 +447,10 @@ abstract class GeoConverters {
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius().getNormalizedValue());
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
} else if (shape instanceof Circle) {
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter()));
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius());
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius());
} else if (shape instanceof Polygon) {
@@ -514,7 +468,377 @@ abstract class GeoConverters {
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
@SuppressWarnings("rawtypes")
static enum GeoJsonToDbObjectConverter implements Converter<GeoJson, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJson source) {
if (source == null) {
return null;
}
DBObject dbo = new BasicDBObject("type", source.getType());
if (source instanceof GeoJsonGeometryCollection) {
BasicDBList dbl = new BasicDBList();
for (GeoJson geometry : ((GeoJsonGeometryCollection) source).getCoordinates()) {
dbl.add(convert(geometry));
}
dbo.put("geometries", dbl);
} else {
dbo.put("coordinates", convertIfNecessarry(source.getCoordinates()));
}
return dbo;
}
private Object convertIfNecessarry(Object candidate) {
if (candidate instanceof GeoJson) {
return convertIfNecessarry(((GeoJson) candidate).getCoordinates());
}
if (candidate instanceof Iterable) {
BasicDBList dbl = new BasicDBList();
for (Object element : (Iterable) candidate) {
dbl.add(convertIfNecessarry(element));
}
return dbl;
}
if (candidate instanceof Point) {
return toList((Point) candidate);
}
return candidate;
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPointToDbObjectConverter implements Converter<GeoJsonPoint, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJsonPoint source) {
return GeoJsonToDbObjectConverter.INSTANCE.convert(source);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPolygonToDbObjectConverter implements Converter<GeoJsonPolygon, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJsonPolygon source) {
return GeoJsonToDbObjectConverter.INSTANCE.convert(source);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonPointConverter implements Converter<DBObject, GeoJsonPoint> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("unchecked")
public GeoJsonPoint convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Point"),
String.format("Cannot convert type '%s' to Point.", source.get("type")));
List<Double> dbl = (List<Double>) source.get("coordinates");
return new GeoJsonPoint(dbl.get(0).doubleValue(), dbl.get(1).doubleValue());
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonPolygonConverter implements Converter<DBObject, GeoJsonPolygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonPolygon convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Polygon"),
String.format("Cannot convert type '%s' to Polygon.", source.get("type")));
return toGeoJsonPolygon((BasicDBList) source.get("coordinates"));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiPolygonConverter implements Converter<DBObject, GeoJsonMultiPolygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiPolygon convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiPolygon"),
String.format("Cannot convert type '%s' to MultiPolygon.", source.get("type")));
BasicDBList dbl = (BasicDBList) source.get("coordinates");
List<GeoJsonPolygon> polygones = new ArrayList<GeoJsonPolygon>();
for (Object polygon : dbl) {
polygones.add(toGeoJsonPolygon((BasicDBList) polygon));
}
return new GeoJsonMultiPolygon(polygones);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonLineStringConverter implements Converter<DBObject, GeoJsonLineString> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonLineString convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "LineString"),
String.format("Cannot convert type '%s' to LineString.", source.get("type")));
BasicDBList cords = (BasicDBList) source.get("coordinates");
return new GeoJsonLineString(toListOfPoint(cords));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiPointConverter implements Converter<DBObject, GeoJsonMultiPoint> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiPoint convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiPoint"),
String.format("Cannot convert type '%s' to MultiPoint.", source.get("type")));
BasicDBList cords = (BasicDBList) source.get("coordinates");
return new GeoJsonMultiPoint(toListOfPoint(cords));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiLineStringConverter implements Converter<DBObject, GeoJsonMultiLineString> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiLineString convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiLineString"),
String.format("Cannot convert type '%s' to MultiLineString.", source.get("type")));
List<GeoJsonLineString> lines = new ArrayList<GeoJsonLineString>();
BasicDBList cords = (BasicDBList) source.get("coordinates");
for (Object line : cords) {
lines.add(new GeoJsonLineString(toListOfPoint((BasicDBList) line)));
}
return new GeoJsonMultiLineString(lines);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonGeometryCollectionConverter implements Converter<DBObject, GeoJsonGeometryCollection> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@SuppressWarnings("rawtypes")
@Override
public GeoJsonGeometryCollection convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "GeometryCollection"),
String.format("Cannot convert type '%s' to GeometryCollection.", source.get("type")));
List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
for (Object o : (List) source.get("geometries")) {
geometries.add(convertGeometries((DBObject) o));
}
return new GeoJsonGeometryCollection(geometries);
}
private static GeoJson<?> convertGeometries(DBObject source) {
Object type = source.get("type");
if (ObjectUtils.nullSafeEquals(type, "Point")) {
return DbObjectToGeoJsonPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPoint")) {
return DbObjectToGeoJsonMultiPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "LineString")) {
return DbObjectToGeoJsonLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiLineString")) {
return DbObjectToGeoJsonMultiLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "Polygon")) {
return DbObjectToGeoJsonPolygonConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPolygon")) {
return DbObjectToGeoJsonMultiPolygonConverter.INSTANCE.convert(source);
}
throw new IllegalArgumentException(String.format("Cannot convert unknown GeoJson type %s", type));
}
}
static List<Double> toList(Point point) {
return Arrays.asList(point.getX(), point.getY());
}
/**
* Converts coordinate pairs nested in a {@link BasicDBList} into {@link GeoJsonPoint}s.
*
* @param listOfCoordinatePairs
* @return
* @since 1.7
*/
@SuppressWarnings("unchecked")
static List<Point> toListOfPoint(BasicDBList listOfCoordinatePairs) {
List<Point> points = new ArrayList<Point>();
for (Object point : listOfCoordinatePairs) {
Assert.isInstanceOf(List.class, point);
List<Double> coordinatesList = (List<Double>) point;
points.add(new GeoJsonPoint(coordinatesList.get(0).doubleValue(), coordinatesList.get(1).doubleValue()));
}
return points;
}
/**
* Converts coordinate pairs nested in a {@link BasicDBList} into a {@link GeoJsonPolygon}.
*
* @param dbList
* @return
* @since 1.7
*/
static GeoJsonPolygon toGeoJsonPolygon(BasicDBList dbList) {
return new GeoJsonPolygon(toListOfPoint((BasicDBList) dbList.get(0)));
}
}
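As a quick reference, a sketch of the GeoJSON document shapes the converters above read and write, assembled by hand with the driver types for illustration; the GeoConverters enums themselves are package-private and are normally invoked through the mapping and query infrastructure.

import java.util.Arrays;

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class GeoJsonShapes {

	public static void main(String[] args) {

		// { "type" : "Point", "coordinates" : [ x, y ] }
		DBObject point = new BasicDBObject("type", "Point") //
				.append("coordinates", Arrays.asList(10.0, 20.0));

		// { "type" : "Polygon", "coordinates" : [ [ [x1, y1], [x2, y2], ... ] ] }
		BasicDBList ring = new BasicDBList();
		ring.add(Arrays.asList(0.0, 0.0));
		ring.add(Arrays.asList(0.0, 5.0));
		ring.add(Arrays.asList(5.0, 5.0));
		ring.add(Arrays.asList(0.0, 0.0));

		BasicDBList rings = new BasicDBList();
		rings.add(ring);

		DBObject polygon = new BasicDBObject("type", "Polygon").append("coordinates", rings);

		System.out.println(point);
		System.out.println(polygon);
	}
}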

View File

@@ -23,6 +23,7 @@ import com.mongodb.DBRef;
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.5
*/
public interface LazyLoadingProxy {
@@ -33,7 +34,7 @@ public interface LazyLoadingProxy {
* @return
* @since 1.5
*/
Object initialize();
Object getTarget();
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,16 +31,17 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.ParameterValueProvider;
@@ -56,6 +57,7 @@ import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import com.mongodb.BasicDBList;
@@ -73,7 +75,9 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
private static final String INCOMPATIBLE_TYPES = "Cannot convert %1$s of type %2$s into an instance of %3$s! Implement a custom Converter<%2$s, %3$s> and register it with the CustomConversions. Parent object was: %4$s";
protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class);
@@ -81,6 +85,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;
@@ -93,11 +98,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(ConversionServiceFactory.createDefaultConversionService());
super(new DefaultConversionService());
Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
@@ -184,11 +188,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo) {
return read(type, dbo, null);
return read(type, dbo, ObjectPath.ROOT);
}
@SuppressWarnings("unchecked")
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo, Object parent) {
private <S extends Object> S read(TypeInformation<S> type, DBObject dbo, ObjectPath path) {
if (null == dbo) {
return null;
@@ -206,11 +210,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, parent);
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, path);
}
if (typeToUse.isMap()) {
return (S) readMap(typeToUse, dbo, parent);
return (S) readMap(typeToUse, dbo, path);
}
if (dbo instanceof BasicDBList) {
throw new MappingException(String.format(INCOMPATIBLE_TYPES, dbo, BasicDBList.class, typeToUse.getType(), path));
}
// Retrieve persistent entity info
@@ -220,41 +228,57 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for " + rawType.getName());
}
return read(persistentEntity, dbo, parent);
return read(persistentEntity, dbo, path);
}
private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity,
DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
DBObject source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, parent);
entity, provider, path.getCurrentObject());
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
parent);
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider, path);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {
final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, parent);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, path);
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider);
final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
final S result = wrapper.getBean();
final PersistentPropertyAccessor accessor = new ConvertingPropertyAccessor(entity.getPropertyAccessor(instance),
conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();
final S result = instance;
// make sure id property is set before all other properties
Object idValue = null;
if (idProperty != null) {
idValue = getValueInternal(idProperty, dbo, evaluator, path);
accessor.setProperty(idProperty, idValue);
}
final ObjectPath currentPath = path.push(result, entity, idValue);
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
// we skip the id property since it was already set
if (idProperty != null && idProperty.equals(prop)) {
return;
}
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
return;
}
Object obj = getValueInternal(prop, dbo, evaluator, result);
wrapper.setProperty(prop, obj);
accessor.setProperty(prop, getValueInternal(prop, dbo, evaluator, currentPath));
}
});
@@ -262,19 +286,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty property = association.getInverse();
final MongoPersistentProperty property = association.getInverse();
Object value = dbo.get(property.getFieldName());
if (value == null) {
return;
}
Object value = dbo.get(property.getName());
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
@Override
public Object resolve(MongoPersistentProperty property) {
return getValueInternal(property, dbo, evaluator, parent);
}
});
DbRefProxyHandler handler = new DefaultDbRefProxyHandler(spELContext, mappingContext,
MappingMongoConverter.this);
DbRefResolverCallback callback = new DefaultDbRefResolverCallback(dbo, currentPath, evaluator,
MappingMongoConverter.this);
wrapper.setProperty(property, obj);
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
}
});
@@ -294,6 +320,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.isTrue(annotation != null, "The referenced property has to be mapped with @DBRef!");
}
// @see DATAMONGO-913
if (object instanceof LazyLoadingProxy) {
return ((LazyLoadingProxy) object).toDBRef();
}
return createDBRef(object, referingProperty);
}
@@ -309,14 +340,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
boolean handledByCustomConverter = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class) != null;
TypeInformation<? extends Object> type = ClassTypeInformation.from(obj.getClass());
Class<?> entityType = obj.getClass();
boolean handledByCustomConverter = conversions.getCustomWriteTarget(entityType, DBObject.class) != null;
TypeInformation<? extends Object> type = ClassTypeInformation.from(entityType);
if (!handledByCustomConverter && !(dbo instanceof BasicDBList)) {
typeMapper.writeType(type, dbo);
}
writeInternal(obj, dbo, type);
Object target = obj instanceof LazyLoadingProxy ? ((LazyLoadingProxy) obj).getTarget() : obj;
writeInternal(target, dbo, type);
}
/**
@@ -332,7 +366,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
Class<?> customTarget = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class);
Class<?> entityType = obj.getClass();
Class<?> customTarget = conversions.getCustomWriteTarget(entityType, DBObject.class);
if (customTarget != null) {
DBObject result = conversionService.convert(obj, DBObject.class);
@@ -340,17 +375,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
if (Map.class.isAssignableFrom(obj.getClass())) {
if (Map.class.isAssignableFrom(entityType)) {
writeMapInternal((Map<Object, Object>) obj, dbo, ClassTypeInformation.MAP);
return;
}
if (Collection.class.isAssignableFrom(obj.getClass())) {
if (Collection.class.isAssignableFrom(entityType)) {
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) dbo);
return;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(obj.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityType);
writeInternal(obj, dbo, entity);
addCustomTypeKeyIfNecessary(typeHint, obj, dbo);
}
@@ -365,13 +400,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for entity of type " + obj.getClass().getName());
}
final BeanWrapper<Object> wrapper = BeanWrapper.create(obj, conversionService);
final PersistentPropertyAccessor accessor = entity.getPropertyAccessor(obj);
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
try {
Object id = wrapper.getProperty(idProperty, Object.class);
Object id = accessor.getProperty(idProperty);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {}
}
@@ -380,11 +415,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
if (prop.equals(idProperty)) {
if (prop.equals(idProperty) || !prop.isWritable()) {
return;
}
Object propertyObj = wrapper.getProperty(prop);
Object propertyObj = accessor.getProperty(prop);
if (null != propertyObj) {
@@ -398,10 +433,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
});
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Class<?> type = inverseProp.getType();
Object propertyObj = wrapper.getProperty(inverseProp, type);
Object propertyObj = accessor.getProperty(inverseProp);
if (null != propertyObj) {
writePropertyInternal(propertyObj, dbo, inverseProp);
}
@@ -457,7 +494,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).initialize();
obj = ((LazyLoadingProxy) obj).getTarget();
}
// Lookup potential custom target type
@@ -471,7 +508,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object existingValue = accessor.get(prop);
BasicDBObject propDbObj = existingValue instanceof BasicDBObject ? (BasicDBObject) existingValue
: new BasicDBObject();
addCustomTypeKeyIfNecessary(type, obj, propDbObj);
addCustomTypeKeyIfNecessary(ClassTypeInformation.from(prop.getRawType()), obj, propDbObj);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
@@ -554,7 +591,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.isSimpleType(key.getClass())) {
String simpleKey = potentiallyEscapeMapKey(key.toString());
String simpleKey = prepareMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property) : null);
} else {
@@ -606,12 +643,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected DBObject writeMapInternal(Map<Object, Object> obj, DBObject dbo, TypeInformation<?> propertyType) {
for (Map.Entry<Object, Object> entry : obj.entrySet()) {
Object key = entry.getKey();
Object val = entry.getValue();
if (conversions.isSimpleType(key.getClass())) {
// Don't use conversion service here as removal of ObjectToString converter results in some primitive types not
// being convertible
String simpleKey = potentiallyEscapeMapKey(key.toString());
String simpleKey = prepareMapKey(key);
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection || val.getClass().isArray()) {
@@ -632,6 +670,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbo;
}
/**
* Prepares the given {@link Map} key to be converted into a {@link String}. Will invoke potentially registered custom
* conversions and escape dots from the result as they're not supported as {@link Map} keys in MongoDB.
*
* @param key must not be {@literal null}.
* @return
*/
private String prepareMapKey(Object key) {
Assert.notNull(key, "Map key must not be null!");
String convertedKey = potentiallyConvertMapKey(key);
return potentiallyEscapeMapKey(convertedKey);
}
/**
* Potentially replaces dots in the given map key with the configured map key replacement if configured or aborts
* conversion if none is configured.
@@ -654,6 +707,22 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return source.replaceAll("\\.", mapKeyDotReplacement);
}
/**
* Returns a {@link String} representation of the given {@link Map} key.
*
* @param key
* @return
*/
private String potentiallyConvertMapKey(Object key) {
if (key instanceof String) {
return (String) key;
}
return conversions.hasCustomWriteTarget(key.getClass(), String.class) ? (String) getPotentiallyConvertedSimpleWrite(key)
: key.toString();
}
/**
* Translates the map key replacements in the given key just read with a dot in case a map key replacement has been
* configured.
@@ -668,7 +737,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Adds custom type information to the given {@link DBObject} if necessary. That is, if the value is not the same as
* the one given. This is usually the case if you store a subtype of the actual declared type of the property.
*
*
* @param type
* @param value must not be {@literal null}.
* @param dbObject must not be {@literal null}.
@@ -677,10 +746,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
TypeInformation<?> actualType = type != null ? type.getActualType() : null;
Class<?> reference = actualType == null ? Object.class : actualType.getType();
Class<?> valueType = ClassUtils.getUserClass(value.getClass());
boolean notTheSameClass = !value.getClass().equals(reference);
boolean notTheSameClass = !valueType.equals(reference);
if (notTheSameClass) {
typeMapper.writeType(value.getClass(), dbObject);
typeMapper.writeType(valueType, dbObject);
}
}
@@ -733,7 +803,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) {
if (value == null || target == null) {
if (value == null || target == null || target.isAssignableFrom(value.getClass())) {
return value;
}
@@ -745,7 +815,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return Enum.valueOf((Class<Enum>) target, value.toString());
}
return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
return conversionService.convert(value, target);
}
protected DBRef createDBRef(Object target, MongoPersistentProperty property) {
@@ -774,8 +844,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (target.getClass().equals(idProperty.getType())) {
id = target;
} else {
BeanWrapper<Object> wrapper = BeanWrapper.create(target, conversionService);
id = wrapper.getProperty(idProperty, Object.class);
PersistentPropertyAccessor accessor = targetEntity.getPropertyAccessor(target);
id = accessor.getProperty(idProperty);
}
if (null == id) {
@@ -786,11 +856,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
idMapper.convertId(id));
}
protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
Object parent) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(dbo, spELContext, parent);
return provider.getPropertyValue(prop);
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.ValueResolver#getValueInternal(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, com.mongodb.DBObject, org.springframework.data.mapping.model.SpELExpressionEvaluator, java.lang.Object)
*/
@Override
public Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath path) {
return new MongoDbPropertyValueProvider(dbo, evaluator, path).getPropertyValue(prop);
}
/**
@@ -798,11 +871,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, ObjectPath path) {
Assert.notNull(targetType);
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.getType();
@@ -823,9 +898,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
parent));
path));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
items.add(read(componentType, (DBObject) dbObjItem, path));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
}
@@ -838,13 +913,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Reads the given {@link DBObject} into a {@link Map}. Will recursively resolve nested {@link Map}s as well.
*
* @param type the {@link Map} {@link TypeInformation} to be used to unmarshall this {@link DBObject}.
* @param dbObject
* @param dbObject must not be {@literal null}
* @param path must not be {@literal null}
* @return
*/
@SuppressWarnings("unchecked")
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, Object parent) {
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, ObjectPath path) {
Assert.notNull(dbObject);
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
@@ -871,7 +948,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object value = entry.getValue();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
map.put(key, read(valueType, (DBObject) value, path));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
} else {
@@ -883,21 +960,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return map;
}
protected <T> List<?> unwrapList(BasicDBList dbList, TypeInformation<T> targetType) {
List<Object> rootList = new ArrayList<Object>();
for (int i = 0; i < dbList.size(); i++) {
Object obj = dbList.get(i);
if (obj instanceof BasicDBList) {
rootList.add(unwrapList((BasicDBList) obj, targetType.getComponentType()));
} else if (obj instanceof DBObject) {
rootList.add(read(targetType, (DBObject) obj));
} else {
rootList.add(obj);
}
}
return rootList;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
@@ -919,7 +981,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? null : ClassTypeInformation.OBJECT;
TypeInformation<?> typeHint = typeInformation;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);
@@ -1007,24 +1069,34 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbObject;
}
/**
* {@link PropertyValueProvider} to evaluate a SpEL expression if present on the property or simply access the field
* of the configured source {@link DBObject}.
*
* @author Oliver Gierke
*/
private class MongoDbPropertyValueProvider implements PropertyValueProvider<MongoPersistentProperty> {
private final DBObjectAccessor source;
private final SpELExpressionEvaluator evaluator;
private final Object parent;
private final ObjectPath path;
public MongoDbPropertyValueProvider(DBObject source, SpELContext factory, Object parent) {
this(source, new DefaultSpELExpressionEvaluator(source, factory), parent);
}
public MongoDbPropertyValueProvider(DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
/**
* Creates a new {@link MongoDbPropertyValueProvider} for the given source, {@link SpELExpressionEvaluator} and
* {@link ObjectPath}.
*
* @param source must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param path can be {@literal null}.
*/
public MongoDbPropertyValueProvider(DBObject source, SpELExpressionEvaluator evaluator, ObjectPath path) {
Assert.notNull(source);
Assert.notNull(evaluator);
this.source = new DBObjectAccessor(source);
this.evaluator = evaluator;
this.parent = parent;
this.path = path;
}
/*
@@ -1040,7 +1112,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
return readValue(value, property.getTypeInformation(), parent);
return readValue(value, property.getTypeInformation(), path);
}
}
@@ -1053,7 +1125,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final Object parent;
private final ObjectPath path;
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
@@ -1063,10 +1135,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, Object parent) {
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, ObjectPath path) {
super(evaluator, conversionService, delegate);
this.parent = parent;
this.path = path;
}
/*
@@ -1075,28 +1147,44 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
@Override
protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
return readValue(object, parameter.getType(), parent);
return readValue(object, parameter.getType(), path);
}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
private <T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {
Class<?> rawType = type.getType();
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, readRef((DBRef) value), parent));
return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
return (T) readCollectionOrArray(type, (BasicDBList) value, path);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
return (T) read(type, (DBObject) value, path);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}
@SuppressWarnings("unchecked")
private <T> T potentiallyReadOrResolveDbRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) {
if (rawType.equals(DBRef.class)) {
return (T) dbref;
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName());
if (object != null) {
return (T) object;
}
return (T) (object != null ? object : read(type, readRef(dbref), path));
}
/**
* Performs the fetch operation for the given {@link DBRef}.
*
@@ -1104,6 +1192,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @return
*/
DBObject readRef(DBRef ref) {
return ref.fetch();
return dbRefResolver.fetch(ref);
}
}
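A hypothetical pair of mutually referencing documents (not part of this changeset) illustrating why read(...) now threads an ObjectPath through the conversion: when the Department's DBRef to its Manager is resolved and the Manager in turn references the same Department (same id and collection), the back-reference is answered from the path via potentiallyReadOrResolveDbRef(...) instead of being fetched and converted again, which also prevents endless recursion.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Department {

	@Id String id;

	@DBRef
	Manager manager;
}

@Document
class Manager {

	@Id String id;

	// back-reference to the surrounding Department; served from the ObjectPath during a read
	@DBRef
	Department department;
}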

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,13 +20,19 @@ import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import org.bson.types.Code;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
@@ -34,6 +40,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
abstract class MongoConverters {
@@ -160,4 +167,65 @@ abstract class MongoConverters {
return source == null ? null : source.toString();
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
@WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> {
INSTANCE;
@Override
public String convert(Term source) {
return source == null ? null : source.getFormatted();
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
public static enum DBObjectToNamedMongoScriptCoverter implements Converter<DBObject, NamedMongoScript> {
INSTANCE;
@Override
public NamedMongoScript convert(DBObject source) {
if (source == null) {
return null;
}
String id = source.get("_id").toString();
Object rawValue = source.get("value");
return new NamedMongoScript(id, ((Code) rawValue).getCode());
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
public static enum NamedMongoScriptToDBObjectConverter implements Converter<NamedMongoScript, DBObject> {
INSTANCE;
@Override
public DBObject convert(NamedMongoScript source) {
if (source == null) {
return new BasicDBObject();
}
BasicDBObjectBuilder builder = new BasicDBObjectBuilder();
builder.append("_id", source.getName());
builder.append("value", new Code(source.getCode()));
return builder.get();
}
}
}
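For orientation, a sketch of the document shape the NamedMongoScript converters above exchange with MongoDB's system.js collection; the round trip is assembled manually here because MongoConverters and its nested enums are package-private.

import org.bson.types.Code;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class NamedMongoScriptShape {

	public static void main(String[] args) {

		NamedMongoScript script = new NamedMongoScript("echo", "function(x) { return x; }");

		// equivalent of NamedMongoScriptToDBObjectConverter: { "_id" : name, "value" : Code }
		DBObject dbo = new BasicDBObject("_id", script.getName()) //
				.append("value", new Code(script.getCode()));

		// equivalent of DBObjectToNamedMongoScriptCoverter reading the document back
		NamedMongoScript read = new NamedMongoScript(dbo.get("_id").toString(), ((Code) dbo.get("value")).getCode());

		System.out.println(read.getName() + " -> " + read.getCode());
	}
}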

View File

@@ -0,0 +1,182 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
/**
* A path of objects nested into each other. The type allows access to all parent objects currently being created, even
* when resolving more deeply nested objects. This makes it possible to avoid re-resolving object instances that are
* logically equivalent to already resolved ones.
* <p>
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths can be
* constructed by the {@link #toObjectPath(Object)} method and extended via {@link #push(Object)}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.6
*/
class ObjectPath {
public static final ObjectPath ROOT = new ObjectPath();
private final List<ObjectPathItem> items;
private ObjectPath() {
this.items = Collections.emptyList();
}
/**
* Creates a new {@link ObjectPath} from the given parent {@link ObjectPath} by adding the provided
* {@link ObjectPathItem} to it.
*
* @param parent can be {@literal null}.
* @param item
*/
private ObjectPath(ObjectPath parent, ObjectPath.ObjectPathItem item) {
List<ObjectPath.ObjectPathItem> items = new ArrayList<ObjectPath.ObjectPathItem>(parent.items);
items.add(item);
this.items = Collections.unmodifiableList(items);
}
/**
* Returns a copy of the {@link ObjectPath} with the given {@link Object} as current object.
*
* @param object must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param id must not be {@literal null}.
* @return
*/
public ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
ObjectPathItem item = new ObjectPathItem(object, id, entity.getCollection());
return new ObjectPath(this, item);
}
/**
* Returns the object with the given id stored in the given collection if it is contained in the {@link ObjectPath}.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
public Object getPathItem(Object id, String collection) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null!");
for (ObjectPathItem item : items) {
Object object = item.getObject();
if (object == null) {
continue;
}
if (item.getIdValue() == null) {
continue;
}
if (collection.equals(item.getCollection()) && id.equals(item.getIdValue())) {
return object;
}
}
return null;
}
/**
* Returns the current object of the {@link ObjectPath} or {@literal null} if the path is empty.
*
* @return
*/
public Object getCurrentObject() {
return items.isEmpty() ? null : items.get(items.size() - 1).getObject();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
if (items.isEmpty()) {
return "[empty]";
}
List<String> strings = new ArrayList<String>(items.size());
for (ObjectPathItem item : items) {
strings.add(item.object.toString());
}
return StringUtils.collectionToDelimitedString(strings, " -> ");
}
/**
* An item in an {@link ObjectPath}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class ObjectPathItem {
private final Object object;
private final Object idValue;
private final String collection;
/**
* Creates a new {@link ObjectPathItem}.
*
* @param object
* @param idValue
* @param collection
*/
ObjectPathItem(Object object, Object idValue, String collection) {
this.object = object;
this.idValue = idValue;
this.collection = collection;
}
public Object getObject() {
return object;
}
public Object getIdValue() {
return idValue;
}
public String getCollection() {
return collection;
}
}
}
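The class above is internal to the mapping subsystem; as a rough, hypothetical sketch of how a reading converter would use it (the Person type, the personEntity variable and the "persons" collection name are assumed placeholders, not part of this change):

ObjectPath path = ObjectPath.ROOT;
Object alreadyResolved = path.getPathItem(id, "persons");   // null unless this id was already seen on the path
if (alreadyResolved == null) {
    Person person = new Person();                           // freshly instantiated target object
    path = path.push(person, personEntity, id);             // remember it while nested properties are resolved
}
Object current = path.getCurrentObject();                   // most recently pushed instance, or null for ROOT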

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -57,6 +57,11 @@ import com.mongodb.DBRef;
public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
}
private final ConversionService conversionService;
private final MongoConverter converter;
@@ -119,6 +124,61 @@ public class QueryMapper {
return result;
}
/**
* Maps fields used for sorting to the {@link MongoPersistentEntity}s properties. <br />
* Also converts properties to their {@code $meta} representation if present.
*
* @param sortObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedSort(DBObject sortObject, MongoPersistentEntity<?> entity) {
if (sortObject == null) {
return null;
}
DBObject mappedSort = getMappedObject(sortObject, entity);
mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
return mappedSort;
}
/**
* Maps fields to retrieve to the {@link MongoPersistentEntity}s properties. <br />
* Also converts and potentially adds a missing property {@code $meta} representation.
*
* @param fieldsObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedFields(DBObject fieldsObject, MongoPersistentEntity<?> entity) {
DBObject mappedFields = fieldsObject != null ? getMappedObject(fieldsObject, entity) : new BasicDBObject();
mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
return mappedFields.keySet().isEmpty() ? null : mappedFields;
}
private void mapMetaAttributes(DBObject source, MongoPersistentEntity<?> entity, MetaMapping metaMapping) {
if (entity == null || source == null) {
return;
}
if (entity.hasTextScoreProperty() && !MetaMapping.IGNORE.equals(metaMapping)) {
MongoPersistentProperty textScoreProperty = entity.getTextScoreProperty();
if (MetaMapping.FORCE.equals(metaMapping)
|| (MetaMapping.WHEN_PRESENT.equals(metaMapping) && source.containsField(textScoreProperty.getFieldName()))) {
source.putAll(getMappedTextScoreField(textScoreProperty));
}
}
}
private DBObject getMappedTextScoreField(MongoPersistentProperty property) {
return new BasicDBObject(property.getFieldName(), META_TEXT_SCORE);
}
/**
* Extracts the mapped object value for given field out of rawValue taking nested {@link Keyword}s into account
*
@@ -250,14 +310,32 @@ public class QueryMapper {
* type of the given value is compatible with the type of the given document field in order to deal with potential
* query field exclusions, since MongoDB uses the {@code int} {@literal 0} as an indicator for an excluded field.
*
* @param documentField
* @param documentField must not be {@literal null}.
* @param value
* @return
*/
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
return documentField.isAssociation() && value != null
&& (documentField.getProperty().getActualType().isAssignableFrom(value.getClass()) //
|| documentField.getPropertyEntity().getIdProperty().getActualType().isAssignableFrom(value.getClass()));
Assert.notNull(documentField, "Document field must not be null!");
if (value == null) {
return false;
}
if (!documentField.isAssociation()) {
return false;
}
Class<? extends Object> type = value.getClass();
MongoPersistentProperty property = documentField.getProperty();
if (property.getActualType().isAssignableFrom(type)) {
return true;
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty()
&& (type.equals(DBRef.class) || entity.getIdProperty().getActualType().isAssignableFrom(type));
}
/**
@@ -289,7 +367,7 @@ public class QueryMapper {
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source);
return converter.convertToMongoType(source, entity == null ? null : entity.getTypeInformation());
}
protected Object convertAssociation(Object source, Field field) {
@@ -305,10 +383,16 @@ public class QueryMapper {
*/
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
if (property == null || source == null || source instanceof DBObject) {
return source;
}
if (source instanceof DBRef) {
DBRef ref = (DBRef) source;
return new DBRef(ref.getCollectionName(), convertId(ref.getId()));
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
@@ -380,13 +464,20 @@ public class QueryMapper {
*/
public Object convertId(Object id) {
try {
return conversionService.convert(id, ObjectId.class);
} catch (ConversionException e) {
// Ignore
if (id == null) {
return null;
}
return delegateConvertToMongoType(id, null);
if (id instanceof String) {
return ObjectId.isValid(id.toString()) ? conversionService.convert(id, ObjectId.class) : id;
}
try {
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService
.convert(id, ObjectId.class) : delegateConvertToMongoType(id, null);
} catch (ConversionException o_O) {
return delegateConvertToMongoType(id, null);
}
}
/**
@@ -594,6 +685,21 @@ public class QueryMapper {
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this(name, entity, context, null);
}
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
* {@link MappingContext} with the given {@link MongoPersistentProperty}.
*
* @param name must not be {@literal null} or empty.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
* @param property may be {@literal null}.
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
MongoPersistentProperty property) {
super(name);
@@ -603,7 +709,7 @@ public class QueryMapper {
this.mappingContext = context;
this.path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
this.property = path == null ? property : path.getLeafProperty();
this.association = findAssociation();
}
@@ -613,7 +719,7 @@ public class QueryMapper {
*/
@Override
public MetadataBackedField with(String name) {
return new MetadataBackedField(name, entity, mappingContext);
return new MetadataBackedField(name, entity, mappingContext, property);
}
/*
@@ -693,7 +799,7 @@ public class QueryMapper {
*/
@Override
public String getMappedKey() {
return path == null ? name : path.toDotPath(getPropertyConverter());
return path == null ? name : path.toDotPath(isAssociation() ? getAssociationConverter() : getPropertyConverter());
}
protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
@@ -745,5 +851,56 @@ public class QueryMapper {
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return PropertyToFieldNameConverter.INSTANCE;
}
/**
* Return the {@link Converter} to use for creating the mapped key of an association. Default implementation is
* {@link AssociationConverter}.
*
* @return
* @since 1.7
*/
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new AssociationConverter(getAssociation());
}
}
/**
* Converter to skip all properties after an association property was rendered.
*
* @author Oliver Gierke
*/
protected static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
}
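To make the revised convertId(…) rules above concrete, a small hedged illustration (converter stands for any existing MongoConverter instance; the literal values are arbitrary):

QueryMapper mapper = new QueryMapper(converter);
Object keptAsString = mapper.convertId("4711");                     // not a valid ObjectId, passed through as String
Object asObjectId = mapper.convertId("5305838781a16fd79d4b6ee8");   // valid 24-character hex, converted to ObjectId
Object nonString = mapper.convertId(4711L);                         // non-String ids use the ConversionService if
                                                                    // possible, otherwise convertToMongoType(…)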

View File

@@ -0,0 +1,64 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* {@link ReflectiveDBRefResolver} provides reflective access to {@link DBRef} API that is not consistently available
* for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBRefResolver {
private static final Method FETCH_METHOD;
static {
FETCH_METHOD = findMethod(DBRef.class, "fetch");
}
/**
* Fetches the referenced object from the database either by directly calling {@link DBRef#fetch()} or
* {@link DBCollection#findOne(Object)}.
*
* @param db can be {@literal null} when using MongoDB Java driver in version 2.x.
* @param ref must not be {@literal null}.
* @return the document that this references.
*/
public static DBObject fetch(DB db, DBRef ref) {
Assert.notNull(ref, "DBRef to fetch must not be null!");
if (isMongo3Driver()) {
return db.getCollection(ref.getCollectionName()).findOne(ref.getId());
}
return (DBObject) invokeMethod(FETCH_METHOD, ref);
}
}
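Callers inside the framework simply hand over the current DB and the reference; the helper picks the strategy matching the driver generation (db and ref are assumed to come from the surrounding conversion code):

DBObject referenced = ReflectiveDBRefResolver.fetch(db, ref);   // DBCollection.findOne(…) on the 3.x driver, DBRef.fetch() on 2.x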

View File

@@ -25,6 +25,7 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
@@ -79,10 +80,19 @@ public class UpdateMapper extends QueryMapper {
return createMapEntry(field, convertSimpleOrDBObject(rawValue, field.getPropertyEntity()));
}
if (!isUpdateModifier(rawValue)) {
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
if (isQuery(rawValue)) {
return createMapEntry(field,
super.getMappedObject(((Query) rawValue).getQueryObject(), field.getPropertyEntity()));
}
if (isUpdateModifier(rawValue)) {
return getMappedUpdateModifier(field, rawValue);
}
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
}
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
Object value = null;
if (rawValue instanceof Modifier) {
@@ -99,7 +109,6 @@ public class UpdateMapper extends QueryMapper {
value = modificationOperations;
} else {
throw new IllegalArgumentException(String.format("Unable to map value of type '%s'!", rawValue.getClass()));
}
@@ -119,6 +128,10 @@ public class UpdateMapper extends QueryMapper {
return value instanceof Modifier || value instanceof Modifiers;
}
private boolean isQuery(Object value) {
return value instanceof Query;
}
private DBObject getMappedValue(Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
@@ -181,47 +194,48 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
return new UpdatePropertyConverter(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getAssociationConverter()
*/
@Override
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new UpdateAssociationConverter(getAssociation(), key);
}
/**
* Converter to skip all properties after an association property was rendered.
* Special mapper handling positional parameter {@literal $} within property names.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.7
*/
private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private static class UpdateKeyMapper {
private final MongoPersistentProperty property;
private boolean associationFound;
private final Iterator<String> iterator;
protected UpdateKeyMapper(String rawKey) {
Assert.hasText(rawKey, "Key must not be null or empty!");
this.iterator = Arrays.asList(rawKey.split("\\.")).iterator();
this.iterator.next();
}
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
* Maps the property name while retaining potential positional operator {@literal $}.
*
* @param association must not be {@literal null}.
* @param property
* @return
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
protected String mapPropertyName(MongoPersistentProperty property) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
/**
@@ -229,10 +243,11 @@ public class UpdateMapper extends QueryMapper {
* contained in the source update key.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
private final Iterator<String> iterator;
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link UpdatePropertyConverter} with the given update key.
@@ -243,8 +258,7 @@ public class UpdateMapper extends QueryMapper {
Assert.hasText(updateKey, "Update key must not be null or empty!");
this.iterator = Arrays.asList(updateKey.split("\\.")).iterator();
this.iterator.next();
this.mapper = new UpdateKeyMapper(updateKey);
}
/*
@@ -253,9 +267,37 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
public String convert(MongoPersistentProperty property) {
return mapper.mapPropertyName(property);
}
}
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
/**
* {@link Converter} retaining positional parameter {@literal $} for {@link Association}s.
*
* @author Christoph Strobl
*/
protected static class UpdateAssociationConverter extends AssociationConverter {
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public UpdateAssociationConverter(Association<MongoPersistentProperty> association, String key) {
super(association);
this.mapper = new UpdateKeyMapper(key);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
return super.convert(source) == null ? null : mapper.mapPropertyName(source);
}
}
}
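As a sketch of what the UpdateKeyMapper above preserves: the positional operator survives the translation of property names to field names (the grades/score properties are made-up examples, not part of this change):

Update update = new Update().set("grades.$.score", 42);
// mapping this update against a domain type keeps the "$" marker, so the rendered
// update still looks roughly like { "$set" : { "grades.$.score" : 42 } }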

View File

@@ -0,0 +1,42 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Internal API to trigger the resolution of properties.
*
* @author Oliver Gierke
*/
interface ValueResolver {
/**
* Resolves the value for the given {@link MongoPersistentProperty} within the given {@link DBObject} using the given
* {@link SpELExpressionEvaluator} and {@link ObjectPath}.
*
* @param prop
* @param dbo
* @param evaluator
* @param parent
* @return
*/
Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath parent);
}

View File

@@ -1,75 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Represents a geospatial box value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Box extends org.springframework.data.geo.Box implements Shape {
public static final String COMMAND = "$box";
public Box(Point lowerLeft, Point upperRight) {
super(lowerLeft, upperRight);
}
public Box(double[] lowerLeft, double[] upperRight) {
super(lowerLeft, upperRight);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
return list;
}
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
}
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
}

View File

@@ -1,154 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value.
* <p>
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing its distance concept
* would break the clients that use the old Circle API.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
* removed in the next major release.
*/
@Deprecated
public class Circle implements Shape {
public static final String COMMAND = "$center";
private final Point center;
private final double radius;
/**
* Creates a new {@link Circle} from the given {@link Point} and radius.
*
* @param center must not be {@literal null}.
* @param radius must be greater or equal to zero.
*/
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center);
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
* {@link Metrics#NEUTRAL}.
*
* @param centerX
* @param centerY
* @param radius must be greater or equal to zero.
*/
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return center;
}
/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public double getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
result.add(getRadius());
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%f]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Circle that = (Circle) obj;
return this.center.equals(that.center) && this.radius == that.radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius;
return result;
}
}

View File

@@ -1,37 +0,0 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
*/
public CustomMetric(double multiplier) {
super(multiplier);
}
}

View File

@@ -1,44 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* Value object to represent distances in a given metric.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Distance extends org.springframework.data.geo.Distance {
/**
* Creates a new {@link Distance}.
*
* @param value
*/
public Distance(double value) {
this(value, Metrics.NEUTRAL);
}
public Distance(double value, Metric metric) {
super(value, metric);
}
}

View File

@@ -0,0 +1,42 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Interface definition for structures defined in GeoJSON ({@link http://geojson.org/}) format.
*
* @author Christoph Strobl
* @since 1.7
*/
public interface GeoJson<T extends Iterable<?>> {
/**
* String value representing the type of the {@link GeoJson} object.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geojson-objects
*/
String getType();
/**
* The value of the coordinates member is always an {@link Iterable}. The structure for the elements within is
* determined by {@link #getType()} of geometry.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geometry-objects
*/
T getCoordinates();
}
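Each implementation introduced in this change pairs a fixed type string with a matching coordinates structure; for example, using the GeoJsonPoint shown further below in this diff (coordinates are arbitrary sample values):

GeoJson<List<Double>> point = new GeoJsonPoint(-73.97, 40.77);
String type = point.getType();                      // "Point"
List<Double> coordinates = point.getCoordinates();  // [-73.97, 40.77], x before y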

View File

@@ -0,0 +1,96 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Defines a {@link GeoJsonGeometryCollection} that consists of a {@link List} of {@link GeoJson} objects.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#geometry-collection
*/
public class GeoJsonGeometryCollection implements GeoJson<Iterable<GeoJson<?>>> {
private static final String TYPE = "GeometryCollection";
private final List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
/**
* Creates a new {@link GeoJsonGeometryCollection} for the given {@link GeoJson} instances.
*
* @param geometries
*/
public GeoJsonGeometryCollection(List<GeoJson<?>> geometries) {
Assert.notNull(geometries, "Geometries must not be null!");
this.geometries.addAll(geometries);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public Iterable<GeoJson<?>> getCoordinates() {
return Collections.unmodifiableList(this.geometries);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.geometries);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonGeometryCollection)) {
return false;
}
GeoJsonGeometryCollection other = (GeoJsonGeometryCollection) obj;
return ObjectUtils.nullSafeEquals(this.geometries, other.geometries);
}
}

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* {@link GeoJsonLineString} is defined as list of at least 2 {@link Point}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#linestring
*/
public class GeoJsonLineString extends GeoJsonMultiPoint {
private static final String TYPE = "LineString";
/**
* Creates a new {@link GeoJsonLineString} for the given {@link Point}s.
*
* @param points must not be {@literal null} and have at least 2 entries.
*/
public GeoJsonLineString(List<Point> points) {
super(points);
}
/**
* Creates a new {@link GeoJsonLineString} for the given {@link Point}s.
*
* @param first must not be {@literal null}
* @param second must not be {@literal null}
* @param others can be {@literal null}
*/
public GeoJsonLineString(Point first, Point second, Point... others) {
super(first, second, others);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint#getType()
*/
@Override
public String getType() {
return TYPE;
}
}
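A minimal example of the constraint stated above — at least two points are required (sample coordinates only):

GeoJsonLineString line = new GeoJsonLineString(new Point(0, 0), new Point(10, 10), new Point(20, 0));
// line.getType() returns "LineString"; the inherited coordinates are the three points in order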

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiLineString} is defined as list of {@link GeoJsonLineString}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multilinestring
*/
public class GeoJsonMultiLineString implements GeoJson<Iterable<GeoJsonLineString>> {
private static final String TYPE = "MultiLineString";
private List<GeoJsonLineString> coordinates = new ArrayList<GeoJsonLineString>();
/**
* Creates new {@link GeoJsonMultiLineString} for the given {@link Point}s.
*
* @param lines must not be {@literal null}.
*/
public GeoJsonMultiLineString(List<Point>... lines) {
Assert.notEmpty(lines, "Points for MultiLineString must not be null!");
for (List<Point> line : lines) {
this.coordinates.add(new GeoJsonLineString(line));
}
}
/**
* Creates new {@link GeoJsonMultiLineString} for the given {@link GeoJsonLineString}s.
*
* @param lines must not be {@literal null}.
*/
public GeoJsonMultiLineString(List<GeoJsonLineString> lines) {
Assert.notNull(lines, "Lines for MultiLineString must not be null!");
this.coordinates.addAll(lines);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public Iterable<GeoJsonLineString> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiLineString)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.coordinates, ((GeoJsonMultiLineString) obj).coordinates);
}
}
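Both constructors above wrap each line into a GeoJsonLineString; a brief sketch (assumes java.util.Arrays is imported and uses arbitrary coordinates):

GeoJsonMultiLineString multiLine = new GeoJsonMultiLineString(
    Arrays.asList(new Point(0, 0), new Point(1, 1)),
    Arrays.asList(new Point(5, 5), new Point(6, 6)));
// multiLine.getType() returns "MultiLineString"; getCoordinates() exposes two GeoJsonLineStrings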

View File

@@ -0,0 +1,116 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiPoint} is defined as list of {@link Point}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multipoint
*/
public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {
private static final String TYPE = "MultiPoint";
private final List<Point> points;
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param points points must not be {@literal null} and have at least 2 entries.
*/
public GeoJsonMultiPoint(List<Point> points) {
Assert.notNull(points, "Points must not be null.");
Assert.isTrue(points.size() >= 2, "Minimum of 2 Points required.");
this.points = new ArrayList<Point>(points);
}
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param first must not be {@literal null}.
* @param second must not be {@literal null}.
* @param others must not be {@literal null}.
*/
public GeoJsonMultiPoint(Point first, Point second, Point... others) {
Assert.notNull(first, "First point must not be null!");
Assert.notNull(second, "Second point must not be null!");
Assert.notNull(others, "Additional points must not be null!");
this.points = new ArrayList<Point>();
this.points.add(first);
this.points.add(second);
this.points.addAll(Arrays.asList(others));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<Point> getCoordinates() {
return Collections.unmodifiableList(this.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiPoint)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.points, ((GeoJsonMultiPoint) obj).points);
}
}

View File

@@ -0,0 +1,93 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiPolygon} is defined as a list of {@link GeoJsonPolygon}s.
*
* @author Christoph Strobl
* @since 1.7
*/
public class GeoJsonMultiPolygon implements GeoJson<Iterable<GeoJsonPolygon>> {
private static final String TYPE = "MultiPolygon";
private List<GeoJsonPolygon> coordinates = new ArrayList<GeoJsonPolygon>();
/**
* Creates a new {@link GeoJsonMultiPolygon} for the given {@link GeoJsonPolygon}s.
*
* @param polygons must not be {@literal null}.
*/
public GeoJsonMultiPolygon(List<GeoJsonPolygon> polygons) {
Assert.notNull(polygons, "Polygons for MultiPolygon must not be null!");
this.coordinates.addAll(polygons);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<GeoJsonPolygon> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiPolygon)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.coordinates, ((GeoJsonMultiPolygon) obj).coordinates);
}
}

View File

@@ -0,0 +1,72 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* {@link GeoJson} representation of {@link Point}.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#point
*/
public class GeoJsonPoint extends Point implements GeoJson<List<Double>> {
private static final long serialVersionUID = -8026303425147474002L;
private static final String TYPE = "Point";
/**
* Creates {@link GeoJsonPoint} for given coordinates.
*
* @param x
* @param y
*/
public GeoJsonPoint(double x, double y) {
super(x, y);
}
/**
* Creates {@link GeoJsonPoint} for given {@link Point}.
*
* @param point must not be {@literal null}.
*/
public GeoJsonPoint(Point point) {
super(point);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<Double> getCoordinates() {
return Arrays.asList(Double.valueOf(getX()), Double.valueOf(getY()));
}
}

View File

@@ -0,0 +1,95 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
/**
* {@link GeoJson} representation of {@link Polygon}. Unlike {@link Polygon} the {@link GeoJsonPolygon} requires a
* closed border, which means that the first and the last {@link Point} must have the same coordinates.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#polygon
*/
public class GeoJsonPolygon extends Polygon implements GeoJson<List<GeoJsonLineString>> {
private static final long serialVersionUID = 3936163018187247185L;
private static final String TYPE = "Polygon";
private List<GeoJsonLineString> coordinates = new ArrayList<GeoJsonLineString>();
/**
* Creates new {@link GeoJsonPolygon} from the given {@link Point}s.
*
* @param first must not be {@literal null}.
* @param second must not be {@literal null}.
* @param third must not be {@literal null}.
* @param fourth must not be {@literal null}.
* @param others can be {@literal null}.
*/
public GeoJsonPolygon(Point first, Point second, Point third, Point fourth, final Point... others) {
this(asList(first, second, third, fourth, others));
}
/**
* Creates new {@link GeoJsonPolygon} from the given {@link Point}s.
*
* @param points must not be {@literal null}.
*/
public GeoJsonPolygon(List<Point> points) {
super(points);
this.coordinates.add(new GeoJsonLineString(points));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<GeoJsonLineString> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
private static List<Point> asList(Point first, Point second, Point third, Point fourth, final Point... others) {
ArrayList<Point> result = new ArrayList<Point>(3 + others.length);
result.add(first);
result.add(second);
result.add(third);
result.add(fourth);
result.addAll(Arrays.asList(others));
return result;
}
}
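Because the ring has to be closed, the first and the last point must be identical; a minimal sketch with arbitrary coordinates:

GeoJsonPolygon polygon = new GeoJsonPolygon(
    new Point(0, 0), new Point(0, 5), new Point(5, 5), new Point(0, 0));
// getCoordinates() exposes the single exterior ring as one GeoJsonLineString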

View File

@@ -1,54 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
/**
* Custom {@link Page} to carry the average distance retrieved from the {@link GeoResults} the {@link GeoPage} is set up
* from.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoPage}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
private static final long serialVersionUID = 23421312312412L;
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}.
*
* @param content must not be {@literal null}.
*/
public GeoPage(GeoResults<T> results) {
super(results);
}
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}, {@link Pageable} and total.
*
* @param results must not be {@literal null}.
* @param pageable must not be {@literal null}.
* @param total
*/
public GeoPage(GeoResults<T> results, Pageable pageable, long total) {
super(results, pageable, total);
}
}

View File

@@ -1,38 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object capturing some arbitrary object plus a distance.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResult}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
/**
* Creates a new {@link GeoResult} for the given content and distance.
*
* @param content must not be {@literal null}.
* @param distance must not be {@literal null}.
*/
public GeoResult(T content, Distance distance) {
super(content, distance);
}
}

View File

@@ -1,60 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
/**
* Value object to capture {@link GeoResult}s as well as the average distance they have.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResults}. This class is scheduled
* to be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
/**
* Creates a new {@link GeoResults} instance manually calculating the average distance from the distance values of the
* given {@link GeoResult}s.
*
* @param results must not be {@literal null}.
*/
public GeoResults(List<? extends GeoResult<T>> results) {
super(results);
}
public GeoResults(List<? extends GeoResult<T>> results, Metric metric) {
super(results, metric);
}
/**
* Creates a new {@link GeoResults} instance from the given {@link GeoResult}s and average distance.
*
* @param results must not be {@literal null}.
* @param averageDistance
*/
@PersistenceConstructor
public GeoResults(List<? extends GeoResult<T>> results, Distance averageDistance) {
super(results, averageDistance);
}
}

View File

@@ -1,48 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.mongodb.core.query.NearQuery;
/**
* Commonly used {@link Metrics} for {@link NearQuery}s.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metrics}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public enum Metrics implements Metric {
KILOMETERS(org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier()), //
MILES(org.springframework.data.geo.Metrics.MILES.getMultiplier()), //
NEUTRAL(org.springframework.data.geo.Metrics.NEUTRAL.getMultiplier()); //
private final double multiplier;
private Metrics(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
}
}

View File

@@ -1,55 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
/**
* Represents a geospatial point value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Point extends org.springframework.data.geo.Point {
@PersistenceConstructor
public Point(double x, double y) {
super(x, y);
}
public Point(org.springframework.data.geo.Point point) {
super(point);
}
public double[] asArray() {
return new double[] { getX(), getY() };
}
public List<Double> asList() {
return asList(this);
}
public static List<Double> asList(org.springframework.data.geo.Point point) {
return Arrays.asList(point.getX(), point.getY());
}
}

View File

@@ -1,92 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Simple value object to represent a {@link Polygon}.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Polygon}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Polygon extends org.springframework.data.geo.Polygon implements Shape {
public static final String COMMAND = "$polygon";
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param x
* @param y
* @param z
* @param others
*/
public <P extends Point> Polygon(P x, P y, P z, P... others) {
super(x, y, z, others);
}
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param points
*/
public <P extends Point> Polygon(List<P> points) {
super(points);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return asList(this);
}
/**
* Returns a {@link List} of x,y-coordinate tuples of {@link Point}s from the given {@link Polygon}.
*
* @param polygon
* @return
*/
public static List<? extends Object> asList(org.springframework.data.geo.Polygon polygon) {
List<Point> points = polygon.getPoints();
List<List<Double>> tuples = new ArrayList<List<Double>>(points.size());
for (Point point : points) {
tuples.add(Arrays.asList(point.getX(), point.getY()));
}
return tuples;
}
}

View File

@@ -1,45 +0,0 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Shape}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Shape extends org.springframework.data.geo.Shape {
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
*/
List<? extends Object> asList();
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
String getCommand();
}


@@ -22,6 +22,7 @@ import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.util.Assert;
/**
@@ -30,7 +31,6 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class Sphere implements Shape {
public static final String COMMAND = "$centerSphere";
@@ -73,23 +73,13 @@ public class Sphere implements Shape {
this(circle.getCenter(), circle.getRadius());
}
/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
@Deprecated
public Sphere(org.springframework.data.mongodb.core.geo.Circle circle) {
this(circle.getCenter(), circle.getRadius());
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public org.springframework.data.mongodb.core.geo.Point getCenter() {
return new org.springframework.data.mongodb.core.geo.Point(this.center);
public Point getCenter() {
return new Point(this.center);
}
/**
@@ -141,20 +131,21 @@ public class Sphere implements Shape {
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
*/
@Override
public List<? extends Object> asList() {
return Arrays.asList(Arrays.asList(center.getX(), center.getY()), this.radius.getValue());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
@Override
public String getCommand() {
return COMMAND;
}
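
A usage sketch for the cleaned-up Sphere (not part of this diff): it assumes the Sphere(Point, Distance) constructor implied by the Circle delegation above and the Commons Distance/Metrics types; the coordinates and radius are illustrative.

import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.Sphere;

class SphereSketch {

	void shape() {
		Sphere sphere = new Sphere(new Point(8.68, 50.11), new Distance(10, Metrics.KILOMETERS));
		sphere.getCommand(); // "$centerSphere"
		sphere.asList();     // [[8.68, 50.11], 10.0], the centre tuple followed by the radius value
	}
}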


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import java.lang.annotation.Target;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Christoph Strobl
*/
@Target({ ElementType.TYPE })
@Documented
@@ -36,11 +37,12 @@ public @interface CompoundIndex {
/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending).
* define the index direction (1 for ascending, -1 for descending). <br />
* If left empty on a nested document, the whole document will be indexed.
*
* @return
*/
String def();
String def() default "";
/**
* It does not actually make sense to use that attribute as the direction has to be defined in the {@link #def()}
@@ -72,18 +74,66 @@ public @interface CompoundIndex {
boolean dropDups() default false;
/**
* The name of the index to be created.
* The name of the index to be created. <br />
* <br />
* The name will only be applied as is when defined on root level. For usage on nested or embedded structures the
* provided name will be prefixed with the path leading to the entity. <br />
* <br />
* The structure below
*
* <pre>
* <code>
* &#64;Document
* class Root {
* Hybrid hybrid;
* Nested nested;
* }
*
* &#64;Document
* &#64;CompoundIndex(name = "compound_index", def = "{'h1': 1, 'h2': 1}")
* class Hybrid {
* String h1, h2;
* }
*
* &#64;CompoundIndex(name = "compound_index", def = "{'n1': 1, 'n2': 1}")
* class Nested {
* String n1, n2;
* }
* </code>
* </pre>
*
* resolves to the following index structures
*
* <pre>
* <code>
* db.root.createIndex( { hybrid.h1: 1, hybrid.h2: 1 } , { name: "hybrid.compound_index" } )
* db.root.createIndex( { nested.n1: 1, nested.n2: 1 } , { name: "nested.compound_index" } )
* db.hybrid.createIndex( { h1: 1, h2: 1 } , { name: "compound_index" } )
* </code>
* </pre>
*
* @return
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**
@@ -94,11 +144,4 @@ public @interface CompoundIndex {
*/
boolean background() default false;
/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
int expireAfterSeconds() default -1;
}


@@ -0,0 +1,56 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Index definition to span multiple keys.
*
* @author Christoph Strobl
* @since 1.5
*/
public class CompoundIndexDefinition extends Index {
private DBObject keys;
/**
* Creates a new {@link CompoundIndexDefinition} for the given keys.
*
* @param keys must not be {@literal null}.
*/
public CompoundIndexDefinition(DBObject keys) {
Assert.notNull(keys, "Keys must not be null!");
this.keys = keys;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.Index#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
BasicDBObject dbo = new BasicDBObject();
dbo.putAll(this.keys);
dbo.putAll(super.getIndexKeys());
return dbo;
}
}
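
A minimal sketch of registering such a definition programmatically. The MongoTemplate, collection name and keys are assumptions for illustration; named(…) is inherited from Index.

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.CompoundIndexDefinition;

class CompoundIndexSketch {

	void ensure(MongoTemplate template) {
		DBObject keys = new BasicDBObject("lastname", 1).append("age", -1);
		// ensures { lastname: 1, age: -1 } named "lastname_age_idx" on the "people" collection
		template.indexOps("people").ensureIndex(new CompoundIndexDefinition(keys).named("lastname_age_idx"));
	}
}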


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,23 +25,71 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Laurent Canet
* @author Thomas Darimont
* @author Christoph Strobl
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
public @interface GeoSpatialIndexed {
/**
* Name of the property in the document that contains the [x, y] or radial coordinates to index.
* Index name. <br />
* <br />
* The name will only be applied as is when defined on root level. For usage on nested or embedded structures the
* provided name will be prefixed with the path leading to the entity. <br />
* <br />
* The structure below
*
* <pre>
* <code>
* &#64;Document
* class Root {
* Hybrid hybrid;
* Nested nested;
* }
*
* &#64;Document
* class Hybrid {
* &#64;GeoSpatialIndexed(name="index") Point h1;
* }
*
* class Nested {
* &#64;GeoSpatialIndexed(name="index") Point n1;
* }
* </code>
* </pre>
*
* resolves to the following index structures
*
* <pre>
* <code>
* db.root.createIndex( { hybrid.h1: "2d" } , { name: "hybrid.index" } )
* db.root.createIndex( { nested.n1: "2d" } , { name: "nested.index" } )
* db.hybrid.createIndex( { h1: "2d" } , { name: "index" } )
* </code>
* </pre>
*
* @return
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* Name of the collection in which to create the index.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import com.mongodb.DBObject;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Laurent Canet
* @author Christoph Strobl
*/
public class GeospatialIndex implements IndexDefinition {
@@ -57,8 +58,6 @@ public class GeospatialIndex implements IndexDefinition {
*/
public GeospatialIndex named(String name) {
Assert.hasText(name, "Name must have text!");
this.name = name;
return this;
}
@@ -151,12 +150,12 @@ public class GeospatialIndex implements IndexDefinition {
public DBObject getIndexOptions() {
if (name == null && min == null && max == null && bucketSize == null) {
if (!StringUtils.hasText(name) && min == null && max == null && bucketSize == null) {
return null;
}
DBObject dbo = new BasicDBObject();
if (name != null) {
if (StringUtils.hasText(name)) {
dbo.put("name", name);
}
@@ -186,6 +185,7 @@ public class GeospatialIndex implements IndexDefinition {
}
break;
}
return dbo;
}
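
A small builder sketch for GeospatialIndex (illustrative only): the field and index names are made up, and the keys/options documents in the comments assume the usual rendering of a 2dsphere index.

import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeospatialIndex;

class GeospatialIndexSketch {

	GeospatialIndex locationIndex() {
		// getIndexKeys()    -> { "location" : "2dsphere" }
		// getIndexOptions() -> { "name" : "location_2dsphere" }
		return new GeospatialIndex("location")
				.typed(GeoSpatialIndexType.GEO_2DSPHERE)
				.named("location_2dsphere");
	}
}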


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,18 +17,35 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {
public enum Duplicates {
RETAIN, DROP
RETAIN, //
/**
* Dropping Duplicates was removed in MongoDB Server 2.8.0-rc0.
* <p>
* See https://jira.mongodb.org/browse/SERVER-14710
*
* @deprecated since 1.7.
*/
@Deprecated//
DROP
}
private final Map<String, Direction> fieldSpec = new LinkedHashMap<String, Direction>();
@@ -41,6 +58,10 @@ public class Index implements IndexDefinition {
private boolean sparse = false;
private boolean background = false;
private long expire = -1;
public Index() {}
public Index(String key, Direction direction) {
@@ -104,6 +125,44 @@ public class Index implements IndexDefinition {
return this;
}
/**
* Build the index in the background (non-blocking).
*
* @return
* @since 1.5
*/
public Index background() {
this.background = true;
return this;
}
/**
* Specifies TTL in seconds.
*
* @param value
* @return
* @since 1.5
*/
public Index expire(long value) {
return expire(value, TimeUnit.SECONDS);
}
/**
* Specifies TTL with given {@link TimeUnit}.
*
* @param value
* @param unit
* @return
* @since 1.5
*/
public Index expire(long value, TimeUnit unit) {
Assert.notNull(unit, "TimeUnit for expiration must not be null.");
this.expire = unit.toSeconds(value);
return this;
}
/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @param duplicates
@@ -125,11 +184,9 @@ public class Index implements IndexDefinition {
}
public DBObject getIndexOptions() {
if (name == null && !unique) {
return null;
}
DBObject dbo = new BasicDBObject();
if (name != null) {
if (StringUtils.hasText(name)) {
dbo.put("name", name);
}
if (unique) {
@@ -141,6 +198,13 @@ public class Index implements IndexDefinition {
if (sparse) {
dbo.put("sparse", true);
}
if (background) {
dbo.put("background", true);
}
if (expire >= 0) {
dbo.put("expireAfterSeconds", expire);
}
return dbo;
}
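
The new background() and expire(…) builders compose with the existing ones; a short sketch (the field and index names are illustrative, named(…) is the pre-existing builder):

import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.Index;

class TtlIndexSketch {

	Index lastAccessTtl() {
		return new Index("lastAccessedAt", Direction.ASC)
				.named("last_access_ttl")
				.background()              // adds { "background" : true }
				.expire(1, TimeUnit.DAYS); // adds { "expireAfterSeconds" : 86400 }
	}
}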


@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright (c) 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,11 +20,11 @@ import com.mongodb.DBObject;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Christoph Strobl
*/
public interface IndexDefinition {
DBObject getIndexKeys();
DBObject getIndexOptions();
}


@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,22 +24,33 @@ import org.springframework.util.ObjectUtils;
* Value object for an index field.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public final class IndexField {
enum Type {
GEO, TEXT, DEFAULT;
}
private final String key;
private final Direction direction;
private final boolean isGeo;
private final Type type;
private final Float weight;
private IndexField(String key, Direction direction, boolean isGeo) {
private IndexField(String key, Direction direction, Type type) {
this(key, direction, type, Float.NaN);
}
private IndexField(String key, Direction direction, Type type, Float weight) {
Assert.hasText(key);
Assert.isTrue(direction != null ^ isGeo);
Assert.isTrue(direction != null ^ (Type.GEO.equals(type) || Type.TEXT.equals(type)));
this.key = key;
this.direction = direction;
this.isGeo = isGeo;
this.type = type == null ? Type.DEFAULT : type;
this.weight = weight == null ? Float.NaN : weight;
}
/**
@@ -53,12 +64,12 @@ public final class IndexField {
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), false);
return new IndexField(key, order.toDirection(), Type.DEFAULT);
}
public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, false);
return new IndexField(key, order, Type.DEFAULT);
}
/**
@@ -68,7 +79,16 @@ public final class IndexField {
* @return
*/
public static IndexField geo(String key) {
return new IndexField(key, null, true);
return new IndexField(key, null, Type.GEO);
}
/**
* Creates a text {@link IndexField} for the given key.
*
* @since 1.6
*/
public static IndexField text(String key, Float weight) {
return new IndexField(key, null, Type.TEXT, weight);
}
/**
@@ -101,10 +121,20 @@ public final class IndexField {
/**
* Returns whether the {@link IndexField} is a geo index field.
*
* @return the isGeo
* @return true if type is {@link Type#GEO}.
*/
public boolean isGeo() {
return isGeo;
return Type.GEO.equals(type);
}
/**
* Returns whether the {@link IndexField} is a text index field.
*
* @return true if type is {@link Type#TEXT}
* @since 1.6
*/
public boolean isText() {
return Type.TEXT.equals(type);
}
/*
@@ -125,7 +155,7 @@ public final class IndexField {
IndexField that = (IndexField) obj;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.isGeo == that.isGeo;
&& this.type == that.type;
}
/*
@@ -138,7 +168,8 @@ public final class IndexField {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
result += 31 * ObjectUtils.nullSafeHashCode(type);
result += 31 * ObjectUtils.nullSafeHashCode(weight);
return result;
}
@@ -148,6 +179,7 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, direction: %s, isGeo: %s]", key, direction, isGeo);
return String.format("IndexField [ key: %s, direction: %s, type: %s, weight: %s]", key, direction, type, weight);
}
}
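
The new factory and the type-based checks in use, directly mirroring the code above; the key names are illustrative.

import org.springframework.data.mongodb.core.index.IndexField;

class IndexFieldSketch {

	void factories() {
		IndexField weighted = IndexField.text("description", 2F);
		weighted.isText(); // true
		weighted.isGeo();  // false

		IndexField geo = IndexField.geo("location");
		geo.isGeo();       // true
	}
}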


@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2002-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,6 +23,11 @@ import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class IndexInfo {
private final List<IndexField> indexFields;
@@ -31,14 +36,30 @@ public class IndexInfo {
private final boolean unique;
private final boolean dropDuplicates;
private final boolean sparse;
private final String language;
/**
* @deprecated Will be removed in 1.7. Please use {@link #IndexInfo(List, String, boolean, boolean, boolean, String)}
* @param indexFields
* @param name
* @param unique
* @param dropDuplicates
* @param sparse
*/
@Deprecated
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
this(indexFields, name, unique, dropDuplicates, sparse, "");
}
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse,
String language) {
this.indexFields = Collections.unmodifiableList(indexFields);
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
this.language = language;
}
/**
@@ -84,14 +105,23 @@ public class IndexInfo {
return sparse;
}
/**
* @return
* @since 1.6
*/
public String getLanguage() {
return language;
}
@Override
public String toString() {
return "IndexInfo [indexFields=" + indexFields + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + "]";
+ dropDuplicates + ", sparse=" + sparse + ", language=" + language + "]";
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
@@ -99,6 +129,7 @@ public class IndexInfo {
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
result = prime * result + ObjectUtils.nullSafeHashCode(language);
return result;
}
@@ -137,6 +168,9 @@ public class IndexInfo {
if (unique != other.unique) {
return false;
}
if (!ObjectUtils.nullSafeEquals(language, other.language)) {
return false;
}
return true;
}
}
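
A sketch of the widened constructor; the field list, index name and language value are made up.

import java.util.Arrays;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;

class IndexInfoSketch {

	IndexInfo textIndexInfo() {
		IndexInfo info = new IndexInfo(Arrays.asList(IndexField.text("lyrics", 1F)), "lyrics_text", false, false, false, "spanish");
		info.getLanguage(); // "spanish"
		return info;
	}
}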


@@ -0,0 +1,37 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
/**
* {@link IndexResolver} finds those {@link IndexDefinition}s to be created for a given class.
*
* @author Christoph Strobl
* @since 1.5
*/
interface IndexResolver {
/**
* Find and create {@link IndexDefinition}s for properties of the given {@code type}. {@link IndexDefinition}s are created
* for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param type
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
*/
Iterable<? extends IndexDefinitionHolder> resolveIndexForClass(Class<?> type);
}
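
Resolution is typically driven through MongoPersistentEntityIndexResolver, which is wired into the index creator below. A sketch under the assumption that the resolver and the mapping-context setup look roughly like this, with the domain type supplied by the caller:

import java.util.Collections;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class IndexResolverSketch {

	void resolve(Class<?> domainType) {
		MongoMappingContext context = new MongoMappingContext();
		context.setInitialEntitySet(Collections.<Class<?>> singleton(domainType));
		context.initialize();

		MongoPersistentEntityIndexResolver resolver = new MongoPersistentEntityIndexResolver(context);
		for (IndexDefinitionHolder holder : resolver.resolveIndexForClass(domainType)) {
			holder.getCollection();   // target collection, including any path prefix for nested structures
			holder.getIndexKeys();    // keys document
			holder.getIndexOptions(); // options document (name, unique, ...)
		}
	}
}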


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,8 @@ import java.lang.annotation.Target;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Thomas Darimont
* @author Christoph Strobl
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -57,17 +59,63 @@ public @interface Indexed {
boolean dropDups() default false;
/**
* Index name.
* Index name. <br />
* <br />
* The name will only be applied as is when defined on root level. For usage on nested or embedded structures the
* provided name will be prefixed with the path leading to the entity. <br />
* <br />
* The structure below
*
* <pre>
* <code>
* &#64;Document
* class Root {
* Hybrid hybrid;
* Nested nested;
* }
*
* &#64;Document
* class Hybrid {
* &#64;Indexed(name="index") String h1;
* }
*
* class Nested {
* &#64;Indexed(name="index") String n1;
* }
* </code>
* </pre>
*
* resolves to the following index structures
*
* <pre>
* <code>
* db.root.createIndex( { hybrid.h1: 1 } , { name: "hybrid.index" } )
* db.root.createIndex( { nested.n1: 1 } , { name: "nested.index" } )
* db.hybrid.createIndex( { h1: 1} , { name: "index" } )
* </code>
* </pre>
*
* @return
*/
String name() default "";
/**
* Colleciton name for index to be created on.
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* Collection name for index to be created on.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**


@@ -22,19 +22,15 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
@@ -45,6 +41,7 @@ import com.mongodb.util.JSON;
* @author Philipp Schneider
* @author Johno Crawford
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -54,21 +51,37 @@ public class MongoPersistentEntityIndexCreator implements
private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<Class<?>, Boolean>();
private final MongoDbFactory mongoDbFactory;
private final MongoMappingContext mappingContext;
private final IndexResolver indexResolver;
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}
* @param mongoDbFactory must not be {@literal null}
* @param mappingContext must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
this(mappingContext, mongoDbFactory, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
* @param indexResolver must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory,
IndexResolver indexResolver) {
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);
Assert.notNull(indexResolver);
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.indexResolver = indexResolver;
for (MongoPersistentEntity<?> entity : mappingContext.getPersistentEntities()) {
checkForIndexes(entity);
@@ -93,87 +106,36 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void checkForIndexes(final MongoPersistentEntity<?> entity) {
final Class<?> type = entity.getType();
private void checkForIndexes(final MongoPersistentEntity<?> entity) {
Class<?> type = entity.getType();
if (!classesSeen.containsKey(type)) {
this.classesSeen.put(type, Boolean.TRUE);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Analyzing class " + type + " for index information.");
}
// Make sure indexes get created
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created compound index " + index);
}
}
}
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.isAnnotationPresent(Indexed.class)) {
Indexed index = property.findAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = property.getFieldName();
} else {
if (!name.equals(property.getName()) && index.unique() && !index.sparse()) {
// Names don't match, and sparse is not true. This situation will generate an error on the server.
if (LOGGER.isWarnEnabled()) {
LOGGER.warn("The index name " + name + " doesn't match this property name: " + property.getName()
+ ". Setting sparse=true on this index will prevent errors when inserting documents.");
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(property.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created property index " + index);
}
} else if (property.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = property.findAnnotation(GeoSpatialIndexed.class);
GeospatialIndex indexObject = new GeospatialIndex(property.getFieldName());
indexObject.withMin(index.min()).withMax(index.max());
indexObject.named(StringUtils.hasText(index.name()) ? index.name() : property.getName());
indexObject.typed(index.type()).withBucketSize(index.bucketSize())
.withAdditionalField(index.additionalField());
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
mongoDbFactory.getDb().getCollection(collection)
.ensureIndex(indexObject.getIndexKeys(), indexObject.getIndexOptions());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Created %s for entity %s in collection %s! ", indexObject, entity.getType(),
collection));
}
}
}
});
classesSeen.put(type, true);
checkForAndCreateIndexes(entity);
}
}
private void checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
if (entity.findAnnotation(Document.class) != null) {
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexForClass(entity.getType())) {
createIndex(indexToCreate);
}
}
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
}
/**
* Returns whether the current index creator was registered for the given {@link MappingContext}.
*
@@ -183,33 +145,4 @@ public class MongoPersistentEntityIndexCreator implements
public boolean isIndexCreatorFor(MappingContext<?, ?> context) {
return this.mappingContext.equals(context);
}
/**
* Triggers the actual index creation.
*
* @param collection the collection to create the index in
* @param name the name of the index about to be created
* @param indexDefinition the index definition
* @param unique whether it shall be a unique index
* @param dropDups whether to drop duplicates
* @param sparse sparse or not
* @param background whether the index will be created in the background
* @param expireAfterSeconds the time to live for documents in the collection
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background, int expireAfterSeconds) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
opts.put("background", background);
if (expireAfterSeconds != -1) {
opts.put("expireAfterSeconds", expireAfterSeconds);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}
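
Wiring stays the same from a user's perspective; a sketch of the two-argument constructor shown above, which now delegates resolution to a MongoPersistentEntityIndexResolver internally and creates the resolved definitions via createIndex(…).

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class IndexCreatorWiringSketch {

	MongoPersistentEntityIndexCreator indexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
		// Implements ApplicationListener for MappingContextEvents, so when registered as a bean,
		// indexes for entities discovered later are created as well.
		return new MongoPersistentEntityIndexCreator(mappingContext, mongoDbFactory);
	}
}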

Some files were not shown because too many files have changed in this diff.