Compare commits

..

103 Commits

Author SHA1 Message Date
Oliver Gierke
13823690fc DATAMONGO-1246 - After release cleanups. 2015-07-01 09:17:13 +02:00
Spring Buildmaster
fc0187eb8e DATAMONGO-1246 - Prepare next development iteration. 2015-06-30 23:18:05 -07:00
Spring Buildmaster
07bf5a4f7e DATAMONGO-1246 - Release version 1.5.6.RELEASE. 2015-06-30 23:18:02 -07:00
Oliver Gierke
274adf320d DATAMONGO-1246 - Prepare 1.5.6.RELEASE (Dijkstra SR6). 2015-07-01 08:06:03 +02:00
Oliver Gierke
aed983462f DATAMONGO-1246 - Updated changelog. 2015-07-01 08:06:03 +02:00
Oliver Gierke
0f5179eb26 DATAMONGO-1247 - Updated changelog. 2015-07-01 07:48:40 +02:00
Oliver Gierke
df40adb618 DATAMONGO-1248 - Updated changelog. 2015-06-30 14:03:28 +02:00
Oliver Gierke
339d8e6f14 DATAMONGO-1228 - Updated changelog. 2015-06-30 14:03:21 +02:00
Oliver Gierke
292e645387 DATAMONGO-1189 - Updated changelog. 2015-06-30 14:02:56 +02:00
Oliver Gierke
df7c007e9e DATAMONGO-1173 - Updated changelog. 2015-06-30 14:01:38 +02:00
Oliver Gierke
9f3465f63c DATAMONGO-1144 - Updated changelog. 2015-06-30 14:01:00 +02:00
Christoph Strobl
2e7c166d38 DATAMONGO-1232 - IgnoreCase in criteria now escapes query.
We now quote the original criteria value before wrapping it in a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like; in that case we quote the part between the leading and trailing wildcard if required.

Original pull request: #301.
2015-06-24 20:19:22 +02:00
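A minimal sketch of the quoting idea described in DATAMONGO-1232, assuming a hypothetical helper rather than the actual Criteria internals: Pattern.quote(…) escapes the user-supplied value before it is embedded in a case-insensitive regular expression.

```java
import java.util.regex.Pattern;

// Hypothetical helper, not the actual Spring Data code: quote the raw value
// before wrapping it into a case-insensitive "starts with" expression.
class IgnoreCaseQuotingExample {

	static Pattern startingWithIgnoreCase(String source) {
		// Pattern.quote(…) turns regex metacharacters into literals, so a value
		// like "fo.o*" matches the literal string instead of acting as a pattern.
		return Pattern.compile("^" + Pattern.quote(source), Pattern.CASE_INSENSITIVE);
	}

	public static void main(String[] args) {
		System.out.println(startingWithIgnoreCase("fo.o*").matcher("FO.O*bar").lookingAt()); // true
	}
}
```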
Christoph Strobl
ac7e3a708a DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:47:01 +02:00
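A hedged sketch of how a read preference configured on the template would come into play; the collection name, pipeline and result type below are made up.

```java
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

import com.mongodb.ReadPreference;

// Sketch only: with a read preference set on the template, commands issued by
// aggregate(…) and geoNear(…) are expected to honour it as well.
class ReadPreferenceAggregationExample {

	static void runAggregation(MongoTemplate template) {
		template.setReadPreference(ReadPreference.secondaryPreferred());

		// "orders", the pipeline and OrderSummary are illustrative only.
		Aggregation aggregation = Aggregation.newAggregation(
				Aggregation.group("customerId").count().as("orderCount"));

		template.aggregate(aggregation, "orders", OrderSummary.class);
	}

	static class OrderSummary {}
}
```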
Oliver Gierke
fa50f74729 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:55:07 +02:00
Eddú Meléndez
2aba831389 DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:39:22 +02:00
Oliver Gierke
33c665fb98 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as a no-op.
2015-05-25 13:13:49 +02:00
Oliver Gierke
55f83fa0f2 DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:08:29 +02:00
Oliver Gierke
13f2472e2f DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values was handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:49:08 +02:00
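An illustrative guard along the lines described above (not the actual MongoTemplate.doInsertAll(…) code): null elements are skipped instead of being dereferenced for their class.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative guard only: skip null elements instead of dereferencing them.
class InsertAllNullGuardExample {

	static void insertAll(List<?> objectsToSave) {
		for (Object element : objectsToSave) {
			if (element == null) {
				continue; // previously element.getClass() was called unguarded here
			}
			// ... group the element by its class and insert it as before
		}
	}

	public static void main(String[] args) {
		insertAll(Arrays.asList("a", null, "b")); // no NullPointerException anymore
	}
}
```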
Oliver Gierke
5de9994093 DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:24:43 +01:00
Thomas Darimont
4d24742181 DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:23:36 +01:00
Oliver Gierke
4af876152c DATAMONGO-1155 - Upgraded mongo-next build profile to driver version 2.13.0.
Related ticket: DATAMONGO-1154.
2015-01-29 21:17:19 +01:00
Oliver Gierke
872e706d2a DATAMONGO-1143 - After release cleanups. 2015-01-27 18:33:15 +01:00
Spring Buildmaster
51c66723e9 DATAMONGO-1143 - Prepare next development iteration. 2015-01-27 06:39:05 -08:00
Spring Buildmaster
ccfab5d91a DATAMONGO-1143 - Release version 1.5.5.RELEASE (Dijkstra SR5). 2015-01-27 06:38:56 -08:00
Oliver Gierke
c94c3f3791 DATAMONGO-1143 - Prepare 1.5.5.RELEASE (Dijkstra SR5). 2015-01-27 15:10:01 +01:00
Oliver Gierke
4924ff54b6 DATAMONGO-1143 - Updated changelog. 2015-01-27 14:35:12 +01:00
Oliver Gierke
edd9b58d1c DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw write targets, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:03:12 +01:00
alex-on-java
dc6cbde842 DATAMONGO-1147 - Remove manual array copy.
Remove manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 18:05:59 +01:00
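For reference, the before/after shape of such a change; both methods are illustrative only.

```java
import java.util.Arrays;

// Illustrative before/after: the manual loop is replaced by Arrays.copyOf(…).
class ArrayCopyExample {

	static Object[] copiedManually(Object[] values) {
		Object[] copy = new Object[values.length];
		for (int i = 0; i < values.length; i++) {
			copy[i] = values[i];
		}
		return copy;
	}

	static Object[] copiedWithArraysCopyOf(Object[] values) {
		return Arrays.copyOf(values, values.length);
	}
}
```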
Christoph Strobl
aaaf7e57de DATAMONGO-1143 - Upgrade dependencies.
mongo-java-driver: 2.12.1 > 2.12.5
2015-01-21 20:52:35 +01:00
Christoph Strobl
76538aeff1 DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:18:01 +01:00
Oliver Gierke
1181a21e51 DATAMONGO-1106 - Updated changelog. 2015-01-20 07:40:28 +01:00
Oliver Gierke
c7f07d2386 DATAMONGO-1079 - Updated changelog. 2015-01-20 07:40:03 +01:00
Oliver Gierke
054ad6fc12 DATAMONGO-1046 - Updated changelog. 2015-01-20 07:39:35 +01:00
Oliver Gierke
1b4ecac732 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSpherical if non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which would never be true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:22:16 +01:00
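A sketch of the corrected check using the Spring Data geo types; the helper name is made up. A Distance created without an explicit metric carries Metrics.NEUTRAL rather than null, so that is what has to be tested.

```java
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;

// Illustrative only: decide on $nearSphere based on the metric, not on null.
class NearClauseMetricExample {

	static boolean useSphericalNear(Distance maxDistance) {
		// new Distance(5) reports Metrics.NEUTRAL, so a null check never fires.
		return maxDistance != null && !Metrics.NEUTRAL.equals(maxDistance.getMetric());
	}

	public static void main(String[] args) {
		System.out.println(useSphericalNear(new Distance(5)));                     // false
		System.out.println(useSphericalNear(new Distance(5, Metrics.KILOMETERS))); // true
	}
}
```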
Oliver Gierke
1ef03bf707 DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to disable that behavior.
2015-01-07 15:02:49 +01:00
Oliver Gierke
a1febdb541 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:51:53 +01:00
Thomas Darimont
79fdf44d24 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:51:46 +01:00
Christoph Strobl
d8bb644b30 DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:51:38 +01:00
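A sketch of how such a map-key conversion could be registered; FooKey and FooKeyToStringConverter are made-up names, the point being that a Converter targeting String can now be applied to map keys.

```java
import java.util.Arrays;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

// FooKey and its converter are hypothetical; a Converter whose target type is
// String can now be used to convert map keys on write.
class MapKeyConversionExample {

	static class FooKey {
		final String value;
		FooKey(String value) { this.value = value; }
	}

	enum FooKeyToStringConverter implements Converter<FooKey, String> {
		INSTANCE;

		@Override
		public String convert(FooKey source) {
			return "foo-" + source.value;
		}
	}

	static CustomConversions customConversions() {
		return new CustomConversions(Arrays.asList(FooKeyToStringConverter.INSTANCE));
	}
}
```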
Oliver Gierke
73db7b06b9 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:38:23 +01:00
Oliver Gierke
8483f36f48 DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:45 +01:00
Christoph Strobl
b426bdd64a DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for the presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:59 +01:00
Christoph Strobl
3349454a51 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword: if the target property is of type String we use a regular expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Original pull request: #241.
2014-11-27 17:14:46 +01:00
Mikhail Mikhaylenko
c04343a764 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely(…) to avoid potential RuntimeExceptions during debug query logging in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 10:05:46 +01:00
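The logging pattern in a nutshell, as an illustrative sketch: serializeToJsonSafely(…) produces a safe representation where a plain toString() on the query object could throw.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.query.SerializationUtils;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// Sketch of the pattern: log the safely serialized query rather than relying
// on DBObject.toString(), which may fail for non-native types.
class SafeQueryLoggingExample {

	private static final Logger LOGGER = LoggerFactory.getLogger(SafeQueryLoggingExample.class);

	static void logQuery(DBObject query) {
		LOGGER.debug("find using query: {}", SerializationUtils.serializeToJsonSafely(query));
	}

	public static void main(String[] args) {
		logQuery(new BasicDBObject("name", "spring"));
	}
}
```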
Thomas Darimont
05311ea118 DATAMONGO-1094 - Fixed tests.
Due to changed behaviour in Spring Framework 3.2.12, the order in which nested config classes are analysed has changed. Therefore we need to configure the @EnableMongoRepositories annotation on SimpleConfigWithRepositories as well.
2014-11-26 10:03:04 +01:00
Thomas Darimont
a18633f847 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 19:06:44 +01:00
Oliver Gierke
29e2f87ee1 DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:38 +01:00
Christoph Strobl
5c5accd12d DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:38:35 +01:00
Thomas Darimont
08e534adcd DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 14:16:31 +02:00
Christoph Strobl
914acfea16 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 12:09:37 +02:00
Christoph Strobl
a6728f851b DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an Exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:51 +02:00
Oliver Gierke
81a7515cb7 DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 16:29:59 +02:00
Christoph Strobl
3f4087dc13 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DBRefs. This ensures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:04:14 +02:00
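Illustrative domain types (made up for this example) showing the effect: with an explicit @Field name the DBRef is stored under "pub" instead of the property name "publisher".

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

// Made-up domain types: the reference is now stored under the explicit field
// name "pub" instead of the property name "publisher".
@Document
class Book {

	@Id String id;

	@Field("pub")
	@DBRef
	Publisher publisher;
}

@Document
class Publisher {

	@Id String id;
	String name;
}
```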
Thomas Darimont
0430962861 DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress failed because the supposedly non-existing hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" should not be resolvable.

Original pull request: #229.
2014-10-01 09:48:12 +02:00
Christoph Strobl
8bb62a65db DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows fine-grained removal of entities from specific collections using derived delete queries.

Original pull request: #223.
2014-09-04 15:49:28 +02:00
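A hypothetical repository sketch of a derived delete query; with the collection metadata in place the delete is executed against the collection declared on the entity.

```java
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical entity and repository: the derived delete query now removes
// documents from the collection declared on the entity ("people").
@Document(collection = "people")
class Person {

	@Id String id;
	String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

	List<Person> deleteByLastname(String lastname);
}
```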
Christoph Strobl
d6defbcb56 DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests

Original pull request: #222.
2014-09-04 11:23:51 +02:00
Oliver Gierke
ed47850f71 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:52:15 +02:00
Oliver Gierke
ca1cfdf659 DATAMONGO-1033 - After release cleanups. 2014-08-27 12:40:22 +02:00
Spring Buildmaster
95d31d5bb7 DATAMONGO-1033 - Prepare next development iteration. 2014-08-27 03:11:38 -07:00
Spring Buildmaster
a7c3ef2aa8 DATAMONGO-1033 - Release version 1.5.4.RELEASE (Dijkstra SR4). 2014-08-27 03:11:35 -07:00
Oliver Gierke
3320e49c0b DATAMONGO-1033 - Prepare 1.5.4.RELEASE (Dijkstra SR4). 2014-08-27 11:57:52 +02:00
Oliver Gierke
286efca52d DATAMONGO-1033 - Updated changelog. 2014-08-27 11:28:08 +02:00
Christoph Strobl
92926befc9 DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Add JUnit rule and RunListener taking care of clean up task.

Original pull request: #221.
2014-08-27 11:12:58 +02:00
Oliver Gierke
703f24ae1c DATAMONGO-1021 - Updated changelog. 2014-08-27 07:18:28 +02:00
Oliver Gierke
febe703954 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at registering a custom Converter to handle the conversion manually.
2014-08-26 20:05:58 +02:00
Oliver Gierke
440d16ebc6 DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the collection of the domain type to the query when executing a query for a projection type.
2014-08-26 15:27:39 +02:00
Christoph Strobl
11c2e90736 DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors caused by equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions where only the property's field name was taken into account when it should have been the full property path.

Original pull request: #219.
2014-08-26 15:23:59 +02:00
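Made-up types illustrating the naming change: the explicitly named index on the embedded property is now expected to be prefixed with the path to the property (something like "address.city_index") rather than using the plain name alone.

```java
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

// Made-up types: the named index on the embedded property is now created with
// the property path as a prefix, so two usages of Address no longer clash.
@Document
class Customer {

	Address address;
}

class Address {

	@Indexed(name = "city_index")
	String city;
}
```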
Thomas Darimont
816a567f29 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:32:21 +02:00
Oliver Gierke
e35486759b DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check. Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:26:02 +02:00
Christoph Strobl
b80c81f861 DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:25:53 +02:00
Oliver Gierke
005d21c0b6 DATAMONGO-1007 - After release cleanups. 2014-07-28 12:29:37 +02:00
Spring Buildmaster
3c8b7a54d6 DATAMONGO-1007 - Prepare next development iteration. 2014-07-28 02:23:57 -07:00
Spring Buildmaster
b2d59f2539 DATAMONGO-1007 - Release version 1.5.2 (Dijkstra SR2). 2014-07-28 02:14:12 -07:00
Oliver Gierke
1cc2830e95 DATAMONGO-1007 - Prepare 1.5.2.RELEASE (Dijkstra SR2). 2014-07-28 10:43:09 +02:00
Oliver Gierke
9a4d6f6fb7 DATAMONGO-1007 - Updated changelog. 2014-07-28 10:19:43 +02:00
Oliver Gierke
e96db8b69b DATAMONGO-981 - Updated changelog. 2014-07-25 06:20:22 +02:00
Christoph Strobl
13ce75779f DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazily loaded ones before saving. As it turns out, CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:42:05 +02:00
Oliver Gierke
cc785ecf4f DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:35:39 +02:00
Thomas Darimont
41fe1809da DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:16:53 +02:00
Thomas Darimont
fd5d39f4d9 DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:53 +02:00
Oliver Gierke
71d97ff53a DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:55 +02:00
Thomas Darimont
f2f3faef08 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:47 +02:00
Thomas Darimont
b1a488098b DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
Added an additional constructor that accepts a CriteriaDefinition so clients are not forced to extend Criteria. We deprecated the original constructor and delegate to the new one. We keep the old one to remain binary compatible.

Original pull request: #206.
2014-07-17 09:01:48 +02:00
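A sketch of the new constructor in use, assuming the CriteriaDefinition-based signature this commit describes; the field and value are illustrative.

```java
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.MatchOperation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;

// Sketch: any CriteriaDefinition can back the $match stage, so clients no
// longer need to extend Criteria to plug in their own implementation.
class MatchOperationExample {

	static Aggregation matching(CriteriaDefinition criteria) {
		return Aggregation.newAggregation(new MatchOperation(criteria));
	}

	public static void main(String[] args) {
		// Criteria itself implements CriteriaDefinition.
		matching(Criteria.where("lastname").is("Darimont"));
	}
}
```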
Christoph Strobl
fd18a8b82c DATAMONGO-983 - Remove links to forum.spring.io.
Replace forum links with those to stackoverflow.

Original Pull Request: #205
2014-07-10 07:08:43 +02:00
Christoph Strobl
761d2748c5 DATAMONGO-969 - Fixed nested id handling in SpringDataMongodbSerializer.
SpringDataMongodbSerializer now defensively triggers mapping of the DBObject created by the default serializer. This ensures that ids buried in nested structures like { "_id" : { "$in" : ["x", "y"] } } are converted correctly.

Original pull request: #202.
2014-07-09 21:47:03 +02:00
Christoph Strobl
e2f3966763 DATAMONGO-972 - Querydsl integration now handles references correctly.
SpringDataMongodbSerializer now overrides the necessary methods to create the appropriate DBRef objects when serializing data via Querydsl.

We currently disable the test case as the fix requires Querydsl 3.4.1 to take effect, which unfortunately breaks Java 6 compatibility. We include the fix nonetheless to allow users on Java 7 to potentially use the latest Querydsl.

Original pull request: #203.
Related tickets: querydsl/querydsl#803.
2014-07-09 13:47:25 +02:00
Oliver Gierke
2b3e5461c2 DATAMONGO-982 - Added build profiles to build against next MongoDB driver versions.
Added build profile for MongoDB Java driver versions 2.12.3-SNAPSHOT and 3.0.0-SNAPSHOT. Added another property to be able to build manifests correctly as the snapshot versions aren't valid OSGi versions.

Adapted MongoExceptionTranslator to convert the new Exceptions being thrown for server timeouts and the deprecated values we currently handle.
2014-07-08 17:34:48 +02:00
Christoph Strobl
ca4db459c4 DATAMONGO-978 - Derived delete query should pass on type information.
We now pass on type information for derived delete queries to the corresponding delete operation. This propagates the information correctly to the corresponding Before and After events.

Before this change the type would have been set to null in case of a non-collection-like method return type.

Original pull request: #199.
2014-07-03 14:19:21 +02:00
Oliver Gierke
e0d0a5dc31 DATAMONGO-971 - After release cleanups. 2014-06-30 16:47:27 +02:00
Spring Buildmaster
18761a39c4 DATAMONGO-971 - Prepare next development iteration. 2014-06-30 07:02:21 -07:00
Spring Buildmaster
ad25751dbb DATAMONGO-971 - Release version 1.5.1.RELEASE (Dijkstra SR1). 2014-06-30 07:02:16 -07:00
Oliver Gierke
879ca6d149 DATAMONGO-971 - Prepare 1.5.1.RELEASE (Dijkstra SR1). 2014-06-30 15:01:20 +02:00
Oliver Gierke
8a40cf421c DATAMONGO-971 - Updated changelog. 2014-06-30 15:01:02 +02:00
Oliver Gierke
fbbb7b6bf7 DATAMONGO-955 - Updated changelog. 2014-06-30 10:56:34 +02:00
Christoph Strobl
0596403081 DATAMONGO-962 - Cycle guard should respect full path.
We now check for intersections of the given path with existing ones, verifying not only types and contained property names but also the property's full path, which must not be present in already traversed paths.

Additionally we’ll now catch any CyclicPropertyReferenceExceptions on the root level to prevent cycle detection from interfering with application startup.

Original pull request: #197.
2014-06-27 19:26:51 +02:00
Oliver Gierke
d9eb18df4d DATAMONGO-970 - MongoTemplate.remove(…) now correctly builds query for DBObjects.
If a DBObject was handed into MongoTemplate.remove(…) we previously failed to look up the id value to create a by-id-query. This commit adds explicit handling of DBObjects by looking up their _id field to obtain the id value.
2014-06-27 16:13:17 +02:00
Christoph Strobl
132e4a9839 DATAMONGO-963 - @CompoundIndex should treat expireAfterSeconds correctly.
We added an additional check on the fields used as key, so that TTL is ignored for a CompoundIndex with more than one field (which effectively renders it useless on @CompoundIndex altogether).

Prior to this change potentially invalid index structures would have been created, e.g. for @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds = 10), leading to MongoDB not being able to clean up the indexes (logs: "ERROR: key for ttl index can only have 1 field").

This fix is related to https://jira.mongodb.org/browse/SERVER-10075.

Original pull request: #196.
2014-06-25 15:54:24 +02:00
Thomas Darimont
4acf8caac1 DATAMONGO-953 - Add equals(…)/hashCode()/toString() to Update.
We now use the underlying updateObject to implement appropriate equals(…)/hashCode() and toString() methods.

Original pull request: #192.
2014-06-25 12:40:44 +02:00
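A small usage sketch of the observable effect; the field names are made up.

```java
import org.springframework.data.mongodb.core.query.Update;

// Usage sketch: structurally equal updates now compare equal, and toString()
// renders the underlying update document.
class UpdateEqualityExample {

	public static void main(String[] args) {

		Update first = new Update().set("name", "spring").inc("visits", 1);
		Update second = new Update().set("name", "spring").inc("visits", 1);

		System.out.println(first.equals(second)); // true, based on the update object
		System.out.println(first);                // renders the $set / $inc document
	}
}
```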
Christoph Strobl
9de3c88e6b DATAMONGO-952 - Derived queries should consider field specification in @Query.
PartTreeMongoQuery now explicitly checks for the presence of a manually defined field spec on the query method and creates a new Query if so.

Original pull request: #188.
2014-06-18 12:53:25 +02:00
Christoph Strobl
6115c9562b DATAMONGO-949 - CycleGuard should only match properties in word boundaries.
We modified the regular expression used for cycle detection to match the exact property name within the inspected path using word boundaries. This fix prevents subsequences of an existing property name (e.g. ‘sub’ previously matched inside ‘substr’) from being matched.

Along the way we fixed the (false) assertion in one of the tests, as we create the +1 cycle reference index before actually breaking the operation.
2014-06-18 08:32:40 +02:00
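An illustrative word-boundary match (not the actual CycleGuard code) showing why 'sub' no longer matches inside 'substr'.

```java
import java.util.regex.Pattern;

// Illustrative only: matching within word boundaries keeps "sub" from
// matching inside "substr" in a dot path.
class WordBoundaryMatchExample {

	static boolean pathContainsProperty(String path, String propertyName) {
		return Pattern.compile("\\b" + Pattern.quote(propertyName) + "\\b").matcher(path).find();
	}

	public static void main(String[] args) {
		System.out.println(pathContainsProperty("outer.substr.inner", "sub")); // false
		System.out.println(pathContainsProperty("outer.sub.inner", "sub"));    // true
	}
}
```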
Christoph Strobl
963a222616 DATAMONGO-948 - Sort should be taken as is when no type information available.
Object type mapping for sort is skipped in case no type information is present when executing a query via the MongoTemplate.
2014-06-18 08:20:55 +02:00
Thomas Darimont
7a2de49ac1 DATAMONGO-938 - Apply QueryMapper in MongoTemplate.mapReduce(…).
Previously MongoTemplate.mapReduce(...) didn't translate nested objects, e.g. GeoCommand, within the given query. That could lead to exceptions during query serialization. We now pass the query and sort object of the given Query through the QueryMapper to avoid such problems.

Original pull request: #184.
2014-06-18 08:20:53 +02:00
Thomas Darimont
8380806d7a DATAMONGO-745 - Added test cases for custom query with $in and pageable parameter.
Added test cases to verify that this works.

Original pull request: #186.
2014-06-18 07:49:46 +02:00
Oliver Gierke
e60479e47a DATAMONGO-936 - Prepare next development iteration. 2014-05-22 12:22:35 +02:00
276 changed files with 9354 additions and 20353 deletions

View File

@@ -1,22 +0,0 @@
language: java
jdk:
- oraclejdk8
services:
- mongodb
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
sudo: false
cache:
directories:
- $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

37
pom.xml
View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,8 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.6.0.RC1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
<version>1.4.7.BUILD-SNAPSHOT</version>
</parent>
<modules>
@@ -29,9 +28,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.10.0.RC1</springdata.commons>
<mongo>2.13.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<springdata.commons>1.8.7.BUILD-SNAPSHOT</springdata.commons>
<mongo>2.12.5</mongo>
<mongo.osgi>2.12.5</mongo.osgi>
</properties>
<developers>
@@ -108,30 +107,14 @@
<id>mongo-next</id>
<properties>
<mongo>2.13.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.0-beta3</mongo>
<mongo>2.13.0</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<id>mongo-3-next</id>
<properties>
<mongo>3.0.0-SNAPSHOT</mongo>
</properties>
@@ -157,8 +140,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone</url>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
</repositories>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -14,7 +14,7 @@
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>2.0.0</jpa>
<jpa>1.0.0.Final</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.7.0.RC1</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
</dependency>
<dependency>
@@ -59,8 +59,8 @@
<!-- JPA -->
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>

View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
@@ -32,10 +32,6 @@
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.10.209">
<context version="7.1.9.205">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,7 +32,6 @@
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>

View File

@@ -11,19 +11,18 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -51,7 +50,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -138,13 +137,6 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
@@ -152,12 +144,6 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,9 +28,6 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
@@ -38,15 +35,17 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.FieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.PropertyNameFieldNamingStrategy;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
@@ -55,7 +54,6 @@ import com.mongodb.MongoClient;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -72,10 +70,7 @@ public abstract class AbstractMongoConfiguration {
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
@@ -134,10 +129,7 @@ public abstract class AbstractMongoConfiguration {
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}

View File

@@ -52,10 +52,10 @@ import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;

View File

@@ -1,85 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}

View File

@@ -1,132 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
private static final String OPTIONS_DELIMINATOR = "?";
private static final String OPTION_VALUE_DELIMINATOR = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String text) throws IllegalArgumentException {
if (!StringUtils.hasText(text)) {
return;
}
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : text.split(",")) {
if (!text.contains(USERNAME_PASSWORD_DELIMINATOR) || !text.contains(DATABASE_DELIMINATOR)) {
throw new IllegalArgumentException("Credentials need to be in format 'username:password@database'!");
}
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
Properties options = extractOptions(credentialString);
if (!options.isEmpty()) {
if (options.containsKey(AUTH_MECHANISM_KEY)) {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(String.format(
"Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
credentials.add(MongoCredential.createCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private static String[] extractUserNameAndPassword(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
String userNameAndPassword = text.substring(0, dbSeperationIndex);
return userNameAndPassword.split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
return optionsSeperationIndex > -1 ? tmp.substring(0, optionsSeperationIndex) : tmp;
}
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
}
Properties properties = new Properties();
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
properties.put(optionArgs[0], optionArgs[1]);
}
return properties;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,7 +22,6 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
@@ -34,7 +33,6 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,10 +15,14 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
@@ -32,7 +36,6 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
@@ -61,8 +64,7 @@ public class MongoParser implements BeanDefinitionParser {
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
@@ -73,4 +75,19 @@ public class MongoParser implements BeanDefinitionParser {
return mongoComponent.getBeanDefinition();
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,7 +24,6 @@ import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -34,13 +33,13 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
private MongoParsingUtils() {
}
/**
* Parses the mongo replica-set element.
@@ -55,14 +54,12 @@ abstract class MongoParsingUtils {
}
/**
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occured, {@literal false} otherwise
* @return true if parsing actually occured, false otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
@@ -83,58 +80,13 @@ abstract class MongoParsingUtils {
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
@@ -151,56 +103,4 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,66 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String readPreferenceString) throws IllegalArgumentException {
if (readPreferenceString == null) {
return;
}
ReadPreference preference = null;
try {
preference = ReadPreference.valueOf(readPreferenceString);
} catch (IllegalArgumentException ex) {
// ignore this one and try to map it differently
}
if (preference != null) {
setValue(preference);
} else if ("PRIMARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primary());
} else if ("PRIMARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primaryPreferred());
} else if ("SECONDARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondary());
} else if ("SECONDARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondaryPreferred());
} else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.nearest());
} else {
throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
readPreferenceString));
}
}
}

View File

@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,7 +38,6 @@ import com.mongodb.MongoException;
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
*/
public class DefaultIndexOperations implements IndexOperations {
@@ -73,9 +72,9 @@ public class DefaultIndexOperations implements IndexOperations {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.createIndex(indexDefinition.getIndexKeys());
collection.ensureIndex(indexDefinition.getIndexKeys());
}
return null;
}
@@ -108,12 +107,10 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
collection.resetIndexCache();
return null;
}
});
@@ -148,13 +145,6 @@ public class DefaultIndexOperations implements IndexOperations {
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {
DBObject weights = (DBObject) ix.get("weights");
for (String fieldName : weights.keySet()) {
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
}
} else {
Double keyValue = new Double(value.toString());
@@ -172,8 +162,8 @@ public class DefaultIndexOperations implements IndexOperations {
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
}
return indexInfoList;
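
The surrounding change tracks the driver rename from ensureIndex(…) to createIndex(…); from application code the index is still declared through IndexOperations. A minimal sketch, assuming an existing MongoTemplate and a hypothetical Person document class:

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.Index;

class IndexCreationSketch {

    static void createLastnameIndex(MongoTemplate template) {
        // Delegates to DefaultIndexOperations, which issues the appropriate driver call.
        template.indexOps(Person.class).ensureIndex(new Index().on("lastname", Direction.ASC));
    }

    static class Person {} // hypothetical document type
}
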

View File

@@ -1,188 +0,0 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static java.util.UUID.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.MongoException;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";
private static final String SCRIPT_NAME_PREFIX = "func_";
private final MongoOperations mongoOperations;
/**
* Creates new {@link DefaultScriptOperations} using given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}.
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(args));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
List<NamedMongoScript> scripts = mongoOperations.findAll(NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
if (CollectionUtils.isEmpty(scripts)) {
return Collections.emptySet();
}
Set<String> scriptNames = new HashSet<String>();
for (NamedMongoScript script : scripts) {
scriptNames.add(script.getName());
}
return scriptNames;
}
private Object[] convertScriptArgs(Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String ? String.format("'%s'", arg) : this.mongoOperations.getConverter()
.convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(args));
}
/**
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
* @return
*/
private static String generateScriptName() {
return SCRIPT_NAME_PREFIX + randomUUID().toString().replaceAll("-", "");
}
}
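
A small usage sketch for the operations above, assuming an existing MongoTemplate; the script bodies are arbitrary examples:

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptOperationsSketch {

    static void run(MongoTemplate template) {
        ScriptOperations scriptOps = template.scriptOps();

        // Execute an ad-hoc script directly ...
        Object sum = scriptOps.execute(new ExecutableMongoScript("function(x, y) { return x + y; }"), 1, 2);

        // ... or register it under a generated "func_<uuid>" name and call it later.
        NamedMongoScript named = scriptOps.register(new ExecutableMongoScript("function(x) { return x; }"));
        Object echoed = scriptOps.call(named.getName(), "echo me");
    }
}
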

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,12 +20,12 @@ import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface IndexOperations {
@@ -51,11 +51,7 @@ public interface IndexOperations {
/**
* Clears all indices that have not yet been applied to this collection.
*
* @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/**

View File

@@ -49,7 +49,7 @@ public class MongoAction {
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBOjbect from the Spring Query object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityType, DBObject document, DBObject query) {
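
MongoAction, shown above, carries the per-operation context that a WriteConcernResolver can inspect when choosing a WriteConcern; a hedged sketch (the collection name and the chosen concern are arbitrary):

import com.mongodb.WriteConcern;
import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.WriteConcernResolver;

class CollectionAwareWriteConcernResolver implements WriteConcernResolver {

    public WriteConcern resolve(MongoAction action) {
        // Escalate writes against a hypothetical "audit" collection, otherwise keep the default.
        return "audit".equals(action.getCollectionName())
                ? WriteConcern.ACKNOWLEDGED
                : action.getDefaultWriteConcern();
    }
}
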

View File

@@ -1,189 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Convenient factory for configuring MongoDB.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<MongoCredential> credentials;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
*
* @param mongoClientOptions
*/
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions;
}
/**
* Set the list of credentials to be used when creating {@link MongoClient}.
*
* @param credentials can be {@literal null}.
*/
public void setCredentials(MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials);
}
/**
* Set the list of {@link ServerAddress} to build up a replica set for.
*
* @param replicaSetSeeds can be {@literal null}.
*/
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
}
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
instance.close();
}
private MongoClient createMongoClient() throws UnknownHostException {
if (!CollectionUtils.isEmpty(replicaSetSeeds)) {
return new MongoClient(replicaSetSeeds, credentials, mongoClientOptions);
}
return new MongoClient(createConfiguredOrDefaultServerAddress(), credentials, mongoClientOptions);
}
private ServerAddress createConfiguredOrDefaultServerAddress() throws UnknownHostException {
ServerAddress defaultAddress = new ServerAddress();
return new ServerAddress(StringUtils.hasText(host) ? host : defaultAddress.getHost(),
port != null ? port.intValue() : defaultAddress.getPort());
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter <T>, can be {@literal null}.
* @return a new unmodifiable {@link List} from the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
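
A programmatic usage sketch; host, port, database and credential values are placeholders, and MongoCredential.createCredential(…) assumes a driver version that provides it:

import com.mongodb.Mongo;
import com.mongodb.MongoCredential;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;

class MongoClientFactoryBeanSketch {

    static Mongo createMongo() throws Exception {
        MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
        factoryBean.setHost("localhost");
        factoryBean.setPort(27017);
        factoryBean.setCredentials(new MongoCredential[] {
                MongoCredential.createCredential("user", "database", "secret".toCharArray()) });
        factoryBean.afterPropertiesSet(); // AbstractFactoryBean lifecycle
        return factoryBean.getObject();
    }
}
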

View File

@@ -1,295 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int maxConnectionIdleTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionIdleTime();
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
* @param description
*/
public void setDescription(String description) {
this.description = description;
}
/**
* Set the minimum number of connections per host.
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
this.minConnectionsPerHost = minConnectionsPerHost;
}
/**
* Set the number of connections allowed per host. The pool will block if connections run out. Default is 10; the
* system property {@code MONGO.POOLSIZE} can override this.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* Set the multiplier for connectionsPerHost for the number of threads that can block. Default is 5. If
* connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50 threads can block;
* any more and an exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* Set the max wait time in milliseconds of a blocking thread waiting for a connection. Default is 120,000 ms (2 minutes).
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* The maximum idle time for a pooled connection.
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
this.maxConnectionIdleTime = maxConnectionIdleTime;
}
/**
* Set the maximum life time for a pooled connection.
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
this.maxConnectionLifeTime = maxConnectionLifeTime;
}
/**
* Set the connect timeout in milliseconds. 0 is the default and means an infinite timeout.
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Set the socket timeout. 0 is the default and means an infinite timeout.
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Set the keep-alive flag, which controls whether socket keep-alive is enabled. Defaults to false.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* @param socketFactory
*/
public void setSocketFactory(SocketFactory socketFactory) {
this.socketFactory = socketFactory;
}
/**
* Set the frequency at which the driver will attempt to determine the current state of each server in the cluster.
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
this.heartbeatFrequency = heartbeatFrequency;
}
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
this.minHeartbeatFrequency = minHeartbeatFrequency;
}
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
this.heartbeatConnectTimeout = heartbeatConnectTimeout;
}
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
this.heartbeatSocketTimeout = heartbeatSocketTimeout;
}
/**
* Configures the name of the replica set.
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
this.requiredReplicaSetName = requiredReplicaSetName;
}
/**
* Controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory
.getDefault()) : this.socketFactory;
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
.maxConnectionIdleTime(maxConnectionIdleTime) //
.maxConnectionLifeTime(maxConnectionLifeTime) //
.maxWaitTime(maxWaitTime) //
.minConnectionsPerHost(minConnectionsPerHost) //
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoClientOptions.class;
}
}
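
A small configuration sketch; the values are arbitrary:

import com.mongodb.MongoClientOptions;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;

class MongoClientOptionsFactoryBeanSketch {

    static MongoClientOptions createOptions() throws Exception {
        MongoClientOptionsFactoryBean factoryBean = new MongoClientOptionsFactoryBean();
        factoryBean.setConnectionsPerHost(50);
        factoryBean.setSocketTimeout(5000);
        factoryBean.setSsl(true); // falls back to SSLSocketFactory.getDefault() as described above
        factoryBean.afterPropertiesSet();
        return factoryBean.getObject();
    }
}
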

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,13 +18,12 @@ package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
@@ -35,7 +34,6 @@ import com.mongodb.MongoClient;
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -45,7 +43,9 @@ public abstract class MongoDbUtils {
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {}
private MongoDbUtils() {
}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
@@ -65,24 +65,11 @@ public abstract class MongoDbUtils {
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
@@ -122,9 +109,22 @@ public abstract class MongoDbUtils {
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
if (requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
if (credentialsGiven && !authDb.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
// TX sync active, bind new database to thread
@@ -181,36 +181,16 @@ public abstract class MongoDbUtils {
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
ReflectiveDbInvoker.requestDone(db);
db.requestDone();
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check if credentials are present. When using mongo-java-driver version 3 or above there is no need for separate
* authentication, as the auth data has to be provided within the MongoClient.
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}
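
A minimal sketch of obtaining a transaction-aware DB handle through the helper above; host and database name are placeholders:

import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import org.springframework.data.mongodb.core.MongoDbUtils;

class MongoDbUtilsSketch {

    static DB lookupDatabase() throws Exception {
        Mongo mongo = new MongoClient("localhost", 27017);
        // When transaction synchronization is active, the DB is bound to the current thread,
        // so repeated lookups within the same unit of work return the same instance.
        return MongoDbUtils.getDB(mongo, "database");
    }
}
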

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,21 +15,23 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoCursorNotFoundException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import com.mongodb.MongoServerSelectionException;
import com.mongodb.MongoSocketException;
import com.mongodb.MongoTimeoutException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -38,23 +40,9 @@ import com.mongodb.MongoException;
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("WriteConcernException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -63,22 +51,28 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// Check for well-known MongoException subclasses.
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
if (ex instanceof DuplicateKey || ex instanceof DuplicateKeyException) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (RESOURCE_FAILURE_EXCEPTIONS.contains(exception)) {
if (ex instanceof Network || ex instanceof MongoSocketException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (RESOURCE_USAGE_EXCEPTIONS.contains(exception)) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
if (ex instanceof CursorNotFound || ex instanceof MongoCursorNotFoundException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
return new DataIntegrityViolationException(ex.getMessage(), ex);
if (ex instanceof MongoServerSelectionException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoTimeoutException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
// All other MongoExceptions
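
For illustration, the translator can also be invoked directly; a minimal sketch:

import com.mongodb.MongoTimeoutException;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

class MongoExceptionTranslatorSketch {

    public static void main(String[] args) {
        MongoExceptionTranslator translator = new MongoExceptionTranslator();
        // A timeout maps to DataAccessResourceFailureException in the branches shown above.
        DataAccessException translated =
                translator.translateExceptionIfPossible(new MongoTimeoutException("server selection timed out"));
        System.out.println(translated.getClass().getSimpleName());
    }
}
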

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,9 @@ import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
@@ -38,14 +40,12 @@ import com.mongodb.WriteConcern;
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
* @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
*/
@Deprecated
public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private Mongo mongo;
private MongoOptions mongoOptions;
private String host;
@@ -53,11 +53,9 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
private WriteConcern writeConcern;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* @param mongoOptions
*/
private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
@@ -68,6 +66,7 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
@@ -76,19 +75,30 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
}
/**
* Configures the host to connect to.
*
* @param host
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
@@ -102,13 +112,12 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
this.writeConcern = writeConcern;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
this.exceptionTranslator = exceptionTranslator;
}
public Mongo getObject() throws Exception {
return mongo;
}
/*
@@ -119,6 +128,14 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -129,10 +146,10 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
protected Mongo createInstance() throws Exception {
@SuppressWarnings("deprecation")
public void afterPropertiesSet() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
@@ -158,42 +175,18 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
mongo.setWriteConcern(writeConcern);
}
return mongo;
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
@Override
protected void destroyInstance(Mongo mongo) throws Exception {
mongo.close();
}
private static boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
public void destroy() throws Exception {
this.mongo.close();
}
}
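
Both variants shown above expose the same programmatic lifecycle; a brief sketch with placeholder connection values (as of 1.7, MongoClientFactoryBean is the suggested replacement):

import com.mongodb.Mongo;
import org.springframework.data.mongodb.core.MongoFactoryBean;

class MongoFactoryBeanSketch {

    static Mongo createLegacyMongo() throws Exception {
        MongoFactoryBean factoryBean = new MongoFactoryBean();
        factoryBean.setHost("localhost");
        factoryBean.setPort(27017);
        factoryBean.afterPropertiesSet(); // creates the underlying Mongo instance
        return factoryBean.getObject();
    }
}
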

View File

@@ -19,11 +19,11 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -33,14 +33,10 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.ReadPreference;
import com.mongodb.WriteResult;
/**
@@ -56,6 +52,7 @@ import com.mongodb.WriteResult;
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**
@@ -89,23 +86,9 @@ public interface MongoOperations {
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
* version 3 no longer supports this operation.
*/
@Deprecated
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return
* @since 1.7
*/
CommandResult executeCommand(DBObject command, ReadPreference readPreference);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
@@ -161,26 +144,9 @@ public interface MongoOperations {
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
* @deprecated since 1.7 as the MongoDB Java driver version 3 no longer supports request boundaries via
* {@link DB#requestStart()} and {@link DB#requestDone()}.
*/
@Deprecated
<T> T executeInSession(DbCallback<T> action);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps a Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query
* @param entityType
* @return
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -284,14 +250,6 @@ public interface MongoOperations {
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
*
* @return
* @since 1.7
*/
ScriptOperations scriptOps();
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -699,27 +657,13 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
* Returns the number of documents for the given {@link Query} querying the given collection.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
long count(Query query, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
@@ -1002,5 +946,4 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}
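
The stream(…) contract described above hands back a CloseableIterator that the caller must close; a minimal sketch assuming a hypothetical Person document class:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.CloseableIterator;

class StreamingQuerySketch {

    static void streamAll(MongoOperations operations) {
        CloseableIterator<Person> iterator = operations.stream(new Query(), Person.class);
        try {
            while (iterator.hasNext()) {
                Person person = iterator.next();
                // process each document without materializing the whole result set
            }
        } finally {
            iterator.close(); // releases the underlying cursor
        }
    }

    static class Person {} // hypothetical document type
}
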

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,48 +17,41 @@ package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import com.mongodb.MongoOptions;
/**
* A factory bean for construction of a {@link MongoOptions} instance. When used with MongoDB Java driver version 3,
* properties not supported by the driver will be ignored.
*
* A factory bean for construction of a {@link MongoOptions} instance.
*
* @author Graeme Rocher
* @author Mark Pollack
* @author Mike Saavedra
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
*/
@Deprecated
public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
@SuppressWarnings("deprecation")
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.connectionsPerHost;
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.maxWaitTime;
private int connectTimeout = DEFAULT_MONGO_OPTIONS.connectTimeout;
private int socketTimeout = DEFAULT_MONGO_OPTIONS.socketTimeout;
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.socketKeepAlive;
private boolean autoConnectRetry = DEFAULT_MONGO_OPTIONS.autoConnectRetry;
private long maxAutoConnectRetryTime = DEFAULT_MONGO_OPTIONS.maxAutoConnectRetryTime;
private int writeNumber = DEFAULT_MONGO_OPTIONS.w;
private int writeTimeout = DEFAULT_MONGO_OPTIONS.wtimeout;
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.fsync;
private boolean slaveOk = DEFAULT_MONGO_OPTIONS.slaveOk;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
private MongoOptions options;
/**
* Configures the maximum number of connections allowed per host before the pool blocks.
*
@@ -151,10 +144,7 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
/**
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
*
* @deprecated since 1.7.
*/
@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
@@ -164,9 +154,7 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
* @deprecated since 1.7
*/
@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
@@ -175,9 +163,7 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
*
* @param slaveOk true if the driver should read from secondaries or slaves.
* @deprecated since 1.7
*/
@Deprecated
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
}
@@ -208,41 +194,40 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
this.sslSocketFactory = sslSocketFactory;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
protected MongoOptions createInstance() throws Exception {
if (MongoClientVersion.isMongo3Driver()) {
throw new IllegalArgumentException(
String
.format("Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
}
public void afterPropertiesSet() {
MongoOptions options = new MongoOptions();
options.setConnectionsPerHost(connectionsPerHost);
options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
options.setMaxWaitTime(maxWaitTime);
options.setConnectTimeout(connectTimeout);
options.setSocketTimeout(socketTimeout);
options.setSocketKeepAlive(socketKeepAlive);
options.setW(writeNumber);
options.setWtimeout(writeTimeout);
options.setFsync(writeFsync);
options.connectionsPerHost = connectionsPerHost;
options.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
options.maxWaitTime = maxWaitTime;
options.connectTimeout = connectTimeout;
options.socketTimeout = socketTimeout;
options.socketKeepAlive = socketKeepAlive;
options.autoConnectRetry = autoConnectRetry;
options.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
options.slaveOk = slaveOk;
options.w = writeNumber;
options.wtimeout = writeTimeout;
options.fsync = writeFsync;
if (ssl) {
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
}
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
this.options = options;
}
return options;
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
public MongoOptions getObject() {
return this.options;
}
/*
@@ -252,4 +237,12 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
public Class<?> getObjectType() {
return MongoOptions.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
}
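
A brief legacy-configuration sketch; the value is arbitrary and the setter name is inferred from the connectionsPerHost property shown above (with MongoDB Java driver 3 the 1.7 variant rejects this setup and points to MongoClientOptionsFactoryBean instead):

import com.mongodb.MongoOptions;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;

class MongoOptionsFactoryBeanSketch {

    static MongoOptions createOptions() throws Exception {
        MongoOptionsFactoryBean factoryBean = new MongoOptionsFactoryBean();
        factoryBean.setConnectionsPerHost(40); // setter inferred from the field above
        factoryBean.afterPropertiesSet();
        return factoryBean.getObject();
    }
}
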

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -53,11 +53,9 @@ import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -73,6 +71,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -95,19 +94,14 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.util.CloseableIterator;
import org.springframework.jca.cci.core.ConnectionCallback;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.Bytes;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
@@ -317,31 +311,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return this.mongoConverter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeAsStream(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
@Override
public <T> CloseableIterator<T> stream(final Query query, final Class<T> entityType) {
return execute(entityType, new CollectionCallback<CloseableIterator<T>>() {
@Override
public CloseableIterator<T> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityType);
DBObject mappedFields = queryMapper.getMappedFields(query.getFieldsObject(), persistentEntity);
DBObject mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), persistentEntity);
DBCursor cursor = collection.find(mappedQuery, mappedFields);
ReadDbObjectCallback<T> readCallback = new ReadDbObjectCallback<T>(mongoConverter, entityType);
return new CloseableIterableCusorAdapter<T>(cursor, exceptionTranslator, readCallback);
}
});
}
public String getCollectionName(Class<?> entityClass) {
return this.determineCollectionName(entityClass);
}
@@ -362,32 +331,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeCommand(com.mongodb.DBObject, int)
*/
@Deprecated
public CommandResult executeCommand(final DBObject command, final int options) {
return executeCommand(command, (options & Bytes.QUERYOPTION_SLAVEOK) != 0 ? ReadPreference.secondaryPreferred()
: ReadPreference.primary());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeCommand(com.mongodb.DBObject, com.mongodb.ReadPreference)
*/
public CommandResult executeCommand(final DBObject command, final ReadPreference readPreference) {
Assert.notNull(command, "Command must not be null!");
CommandResult result = execute(new DbCallback<CommandResult>() {
public CommandResult doInDB(DB db) throws MongoException, DataAccessException {
return db.command(command, readPreference);
return readPreference != null ? db.command(command, readPreference) : db.command(command);
}
});
logCommandExecutionError(command, result);
return result;
}
@@ -461,20 +413,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeInSession(org.springframework.data.mongodb.core.DbCallback)
*/
@Deprecated
public <T> T executeInSession(final DbCallback<T> action) {
return execute(new DbCallback<T>() {
public T doInDB(DB db) throws MongoException, DataAccessException {
try {
ReflectiveDbInvoker.requestStart(db);
db.requestStart();
return action.doInDB(db);
} finally {
ReflectiveDbInvoker.requestDone(db);
db.requestDone();
}
}
});
@@ -540,15 +486,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return new DefaultIndexOperations(this, determineCollectionName(entityClass));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#scriptOps()
*/
@Override
public ScriptOperations scriptOps() {
return new DefaultScriptOperations(this);
}
// Find methods that take a Query to express the query and that return a single object.
public <T> T findOne(Query query, Class<T> entityClass) {
@@ -629,7 +566,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
BasicDBObject command = new BasicDBObject("geoNear", collection);
command.putAll(near.toDBObject());
CommandResult commandResult = executeCommand(command);
CommandResult commandResult = executeCommand(command, getDb().getOptions());
List<Object> results = (List<Object>) commandResult.get("results");
results = results == null ? Collections.emptyList() : results;
@@ -704,11 +641,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return count(query, null, collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#count(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
public long count(Query query, Class<?> entityClass, String collectionName) {
private long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
@@ -761,23 +694,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Prepare the WriteConcern before any processing is done using it. This allows a convenient way to apply custom
* settings in sub-classes. <br />
* In case of using MongoDB Java driver version 3 the returned {@link WriteConcern} will be defaulted to
* {@link WriteConcern#ACKNOWLEDGED} when {@link WriteResultChecking} is set to {@link WriteResultChecking#EXCEPTION}.
* settings in sub-classes.
*
* @param writeConcern any WriteConcern already configured or null
* @return The prepared WriteConcern or null
*/
protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
WriteConcern wc = writeConcernResolver.resolve(mongoAction);
if (MongoClientVersion.isMongo3Driver()
&& ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking)
&& (wc == null || wc.getW() < 1)) {
return WriteConcern.ACKNOWLEDGED;
}
return wc;
return writeConcernResolver.resolve(mongoAction);
}
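Since prepareWriteConcern(…) is the designated hook for customizing write concerns in subclasses, here is a hedged sketch of an override; the subclass name and the policy it applies are purely illustrative:

public class SafeWritesMongoTemplate extends MongoTemplate {

	public SafeWritesMongoTemplate(MongoDbFactory mongoDbFactory) {
		super(mongoDbFactory);
	}

	@Override
	protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
		WriteConcern wc = super.prepareWriteConcern(mongoAction);
		// never allow unacknowledged writes from this template (illustrative policy)
		return wc == null || wc.getW() < 1 ? WriteConcern.ACKNOWLEDGED : wc;
	}
}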
protected <T> void doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
@@ -822,9 +745,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
ConvertingPropertyAccessor accessor = new ConvertingPropertyAccessor(
mongoPersistentEntity.getPropertyAccessor(entity), mongoConverter.getConversionService());
accessor.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
BeanWrapper<Object> wrapper = BeanWrapper.create(entity, this.mongoConverter.getConversionService());
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
}
}
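The version initialization above and the optimistic locking logic in doSaveVersioned(…) further down assume an entity carrying a version property; a minimal, hypothetical mapped class for reference:

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class Account {

	@Id private String id;
	@Version private Long version; // set to 0 on the first save, bumped on every subsequent save
	private String owner;
}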
@@ -841,27 +763,33 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doInsertAll(Collection<? extends T> listToSave, MongoWriter<T> writer) {
Map<String, List<T>> objs = new HashMap<String, List<T>>();
for (T o : listToSave) {
Map<String, List<T>> elementsByCollection = new HashMap<String, List<T>>();
for (T element : listToSave) {
if (element == null) {
continue;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(element.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(o.getClass());
if (entity == null) {
throw new InvalidDataAccessApiUsageException("No Persitent Entity information found for the class "
+ o.getClass().getName());
throw new InvalidDataAccessApiUsageException("No PersistentEntity information found for " + element.getClass());
}
String collection = entity.getCollection();
List<T> collectionElements = elementsByCollection.get(collection);
List<T> objList = objs.get(collection);
if (null == objList) {
objList = new ArrayList<T>();
objs.put(collection, objList);
if (null == collectionElements) {
collectionElements = new ArrayList<T>();
elementsByCollection.put(collection, collectionElements);
}
objList.add(o);
collectionElements.add(element);
}
for (Map.Entry<String, List<T>> entry : objs.entrySet()) {
for (Map.Entry<String, List<T>> entry : elementsByCollection.entrySet()) {
doInsertBatch(entry.getKey(), entry.getValue(), this.mongoConverter);
}
}
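As a rough illustration of the grouping performed in doInsertAll(…): elements mapping to different collections are inserted in separate batches (Person and Account are hypothetical mapped classes using default collection names):

// results in two batch inserts, one per target collection:
//   "person"  -> [person1, person2]
//   "account" -> [account1]
template.insertAll(Arrays.asList(person1, account1, person2));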
@@ -917,14 +845,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private <T> void doSaveVersioned(T objectToSave, MongoPersistentEntity<?> entity, String collectionName) {
ConvertingPropertyAccessor convertingAccessor = new ConvertingPropertyAccessor(
entity.getPropertyAccessor(objectToSave), mongoConverter.getConversionService());
BeanWrapper<T> beanWrapper = BeanWrapper.create(objectToSave, this.mongoConverter.getConversionService());
MongoPersistentProperty idProperty = entity.getIdProperty();
MongoPersistentProperty versionProperty = entity.getVersionProperty();
Object version = convertingAccessor.getProperty(versionProperty);
Number versionNumber = convertingAccessor.getProperty(versionProperty, Number.class);
Number version = beanWrapper.getProperty(versionProperty, Number.class);
// Fresh instance -> initialize version property
if (version == null) {
@@ -934,11 +859,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
assertUpdateableIdIfNotSet(objectToSave);
// Create query for entity with the id and old version
Object id = convertingAccessor.getProperty(idProperty);
Object id = beanWrapper.getProperty(idProperty);
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
// Bump version number
convertingAccessor.setProperty(versionProperty, versionNumber.longValue() + 1);
Number number = beanWrapper.getProperty(versionProperty, Number.class);
beanWrapper.setProperty(versionProperty, number.longValue() + 1);
BasicDBObject dbObject = new BasicDBObject();
@@ -1098,8 +1024,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
: collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
if (entity != null && entity.hasVersionProperty() && !multi) {
if (ReflectiveWriteResultInvoker.wasAcknowledged(writeResult) && writeResult.getN() == 0
&& dbObjectContainsVersionProperty(queryObj, entity)) {
if (writeResult.getN() == 0 && dbObjectContainsVersionProperty(queryObj, entity)) {
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
+ updateObj.toMap().toString() + " to collection " + collectionName);
}
@@ -1170,11 +1095,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
if (idProp == null || entity == null) {
if (idProp == null) {
throw new MappingException("No id property found for object of type " + objectType);
}
Object idValue = entity.getPropertyAccessor(object).getProperty(idProp);
Object idValue = BeanWrapper.create(object, mongoConverter.getConversionService())
.getProperty(idProp, Object.class);
return Collections.singletonMap(idProp.getFieldName(), idValue).entrySet().iterator().next();
}
@@ -1218,11 +1144,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entity.getClass());
MongoPersistentProperty idProperty = persistentEntity == null ? null : persistentEntity.getIdProperty();
if (idProperty == null || persistentEntity == null) {
if (idProperty == null) {
return;
}
Object idValue = persistentEntity.getPropertyAccessor(entity).getProperty(idProperty);
ConversionService service = mongoConverter.getConversionService();
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class);
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
throw new InvalidDataAccessApiUsageException(String.format(
@@ -1315,24 +1242,25 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
String mapFunc = replaceWithResourceIfNecessary(mapFunction);
String reduceFunc = replaceWithResourceIfNecessary(reduceFunction);
DBCollection inputCollection = getCollection(inputCollectionName);
MapReduceCommand command = new MapReduceCommand(inputCollection, mapFunc, reduceFunc,
mapReduceOptions.getOutputCollection(), mapReduceOptions.getOutputType(), query == null
|| query.getQueryObject() == null ? null : queryMapper.getMappedObject(query.getQueryObject(), null));
mapReduceOptions.getOutputCollection(), mapReduceOptions.getOutputType(), null);
copyMapReduceOptionsToCommand(query, mapReduceOptions, command);
DBObject commandObject = copyQuery(query, copyMapReduceOptions(mapReduceOptions, command));
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
}
MapReduceOutput mapReduceOutput = inputCollection.mapReduce(command);
CommandResult commandResult = command.getOutputType() == MapReduceCommand.OutputType.INLINE ? executeCommand(
commandObject, getDb().getOptions()) : executeCommand(commandObject);
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [{}]", serializeToJsonSafely(mapReduceOutput.results()));
LOGGER.debug("MapReduce command result = [{}]", serializeToJsonSafely(commandObject));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
@@ -1340,7 +1268,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
mappedResults.add(callback.doWith(dbObject));
}
return new MapReduceResults<T>(mappedResults, mapReduceOutput);
return new MapReduceResults<T>(mappedResults, commandResult);
}
public <T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass) {
@@ -1494,35 +1422,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
CommandResult commandResult = executeCommand(command);
CommandResult commandResult = executeCommand(command, getDb().getOptions());
handleCommandError(commandResult, command);
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult), commandResult);
}
/**
* Returns the potentially mapped results of the given {@code commandResult} if it contained some.
*
* @param outputType
* @param commandResult
* @return
*/
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, CommandResult commandResult) {
// map results
@SuppressWarnings("unchecked")
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("result");
if (resultSet == null) {
return Collections.emptyList();
}
List<O> mappedResults = new ArrayList<O>();
DbObjectCallback<O> callback = new UnwrapAndReadDbObjectCallback<O>(mongoConverter, outputType);
List<O> mappedResults = new ArrayList<O>();
for (DBObject dbObject : resultSet) {
mappedResults.add(callback.doWith(dbObject));
}
return mappedResults;
return new AggregationResults<O>(mappedResults, commandResult);
}
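For context, a minimal aggregation call that exercises the command execution and result mapping shown above; template and OwnerTotal are hypothetical placeholders:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation aggregation = newAggregation(
		match(Criteria.where("status").is("ACTIVE")),
		group("owner").count().as("total"));

AggregationResults<OwnerTotal> results = template.aggregate(aggregation, "accounts", OwnerTotal.class);
List<OwnerTotal> totals = results.getMappedResults();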
protected String replaceWithResourceIfNecessary(String function) {
@@ -1547,40 +1460,51 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return func;
}
private void copyMapReduceOptionsToCommand(Query query, MapReduceOptions mapReduceOptions,
MapReduceCommand mapReduceCommand) {
private DBObject copyQuery(Query query, DBObject copyMapReduceOptions) {
if (query != null) {
if (query.getSkip() != 0 || query.getFieldsObject() != null) {
throw new InvalidDataAccessApiUsageException(
"Can not use skip or field specification with map reduce operations");
}
if (query.getQueryObject() != null) {
copyMapReduceOptions.put("query", queryMapper.getMappedObject(query.getQueryObject(), null));
}
if (query.getLimit() > 0) {
mapReduceCommand.setLimit(query.getLimit());
copyMapReduceOptions.put("limit", query.getLimit());
}
if (query.getSortObject() != null) {
mapReduceCommand.setSort(queryMapper.getMappedObject(query.getSortObject(), null));
copyMapReduceOptions.put("sort", queryMapper.getMappedObject(query.getSortObject(), null));
}
}
return copyMapReduceOptions;
}
private DBObject copyMapReduceOptions(MapReduceOptions mapReduceOptions, MapReduceCommand command) {
if (mapReduceOptions.getJavaScriptMode() != null) {
mapReduceCommand.setJsMode(true);
command.addExtraOption("jsMode", true);
}
if (!mapReduceOptions.getExtraOptions().isEmpty()) {
for (Map.Entry<String, Object> entry : mapReduceOptions.getExtraOptions().entrySet()) {
ReflectiveMapReduceInvoker.addExtraOption(mapReduceCommand, entry.getKey(), entry.getValue());
command.addExtraOption(entry.getKey(), entry.getValue());
}
}
if (mapReduceOptions.getFinalizeFunction() != null) {
mapReduceCommand.setFinalize(this.replaceWithResourceIfNecessary(mapReduceOptions.getFinalizeFunction()));
command.setFinalize(this.replaceWithResourceIfNecessary(mapReduceOptions.getFinalizeFunction()));
}
if (mapReduceOptions.getOutputDatabase() != null) {
mapReduceCommand.setOutputDB(mapReduceOptions.getOutputDatabase());
command.setOutputDB(mapReduceOptions.getOutputDatabase());
}
if (!mapReduceOptions.getScopeVariables().isEmpty()) {
mapReduceCommand.setScope(mapReduceOptions.getScopeVariables());
command.setScope(mapReduceOptions.getScopeVariables());
}
DBObject commandObject = command.toDBObject();
DBObject outObject = (DBObject) commandObject.get("out");
if (mapReduceOptions.getOutputSharded() != null) {
outObject.put("sharded", mapReduceOptions.getOutputSharded());
}
return commandObject;
}
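Roughly, the command document assembled by copyQuery(…) and copyMapReduceOptions(…) above ends up with the following shape for an inline map-reduce; all values are illustrative only:

{
  "mapreduce" : "orders",
  "map"       : "function() { emit(this.customer, 1); }",
  "reduce"    : "function(key, values) { return Array.sum(values); }",
  "out"       : { "inline" : 1 },
  "query"     : { "status" : "A" },
  "limit"     : 100,
  "sort"      : { "created" : -1 },
  "jsMode"    : true
}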
public Set<String> getCollectionNames() {
@@ -1684,8 +1608,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedFields = queryMapper.getMappedFields(fields, entity);
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
@@ -1787,14 +1710,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
ConversionService conversionService = mongoConverter.getConversionService();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(savedObject.getClass());
PersistentPropertyAccessor accessor = entity.getPropertyAccessor(savedObject);
BeanWrapper<Object> wrapper = BeanWrapper.create(savedObject, conversionService);
if (accessor.getProperty(idProp) != null) {
Object idValue = wrapper.getProperty(idProp, idProp.getType());
if (idValue != null) {
return;
}
new ConvertingPropertyAccessor(accessor, conversionService).setProperty(idProp, id);
wrapper.setProperty(idProp, id);
}
private DBCollection getAndPrepareCollection(DB db, String collectionName) {
@@ -1961,7 +1885,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
String error = ReflectiveWriteResultInvoker.getError(writeResult);
String error = writeResult.getError();
if (error == null) {
return;
@@ -2038,7 +1962,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return null;
}
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
return queryMapper.getMappedObject(query.getSortObject(), entity);
}
// Callback implementations
@@ -2099,8 +2024,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public DBCursor doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null || fields.toMap().isEmpty()) {
if (fields == null) {
return collection.find(query);
} else {
return collection.find(query, fields);
@@ -2158,10 +2082,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Simple internal callback to allow operations on a {@link DBObject}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
static interface DbObjectCallback<T> {
private interface DbObjectCallback<T> {
T doWith(DBObject object);
}
@@ -2256,11 +2179,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (query.getSkip() <= 0 && query.getLimit() <= 0 && query.getSortObject() == null
&& !StringUtils.hasText(query.getHint()) && !query.getMeta().hasValues()) {
&& !StringUtils.hasText(query.getHint())) {
return cursor;
}
DBCursor cursorToUse = cursor.copy();
DBCursor cursorToUse = cursor;
try {
if (query.getSkip() > 0) {
@@ -2276,12 +2199,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
if (query.getMeta().hasValues()) {
for (Entry<String, Object> entry : query.getMeta().values()) {
cursorToUse = cursorToUse.addSpecial(entry.getKey(), entry.getValue());
}
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
@@ -2324,76 +2241,4 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
/**
* A {@link CloseableIterator} that is backed by a MongoDB {@link Cursor}.
*
* @since 1.7
* @author Thomas Darimont
*/
static class CloseableIterableCusorAdapter<T> implements CloseableIterator<T> {
private volatile Cursor cursor;
private PersistenceExceptionTranslator exceptionTranslator;
private DbObjectCallback<T> objectReadCallback;
/**
* Creates a new {@link CloseableIterableCusorAdapter} backed by the given {@link Cursor}.
*
* @param cursor
* @param exceptionTranslator
* @param objectReadCallback
*/
public CloseableIterableCusorAdapter(Cursor cursor, PersistenceExceptionTranslator exceptionTranslator,
DbObjectCallback<T> objectReadCallback) {
this.cursor = cursor;
this.exceptionTranslator = exceptionTranslator;
this.objectReadCallback = objectReadCallback;
}
@Override
public boolean hasNext() {
if (cursor == null) {
return false;
}
try {
return cursor.hasNext();
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
}
}
@Override
public T next() {
if (cursor == null) {
return null;
}
try {
DBObject item = cursor.next();
T converted = objectReadCallback.doWith(item);
return converted;
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
}
}
@Override
public void close() {
Cursor c = cursor;
try {
c.close();
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
} finally {
cursor = null;
exceptionTranslator = null;
objectReadCallback = null;
}
}
}
}

View File

@@ -1,109 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CHACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CHACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
* Convenience method to generate an index name from the set of fields it is over. Will fall back to a MongoDB Java
* driver version 2 compatible way of generating index name in case of {@link MongoClientVersion#isMongo3Driver()}.
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2 all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for the MongoDB Java driver version 3 the operation will throw
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The mongo java driver 3 does no loger support resetIndexCache!");
}
invokeMethod(RESET_INDEX_CHACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
}
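A short worked example of the driver-2 compatible name generation above; for a compound index over { "a" : 1, "b" : -1 } the field names and directions are simply concatenated:

DBObject keys = new BasicDBObject("a", 1).append("b", -1);
String name = ReflectiveDBCollectionInvoker.generateIndexName(keys);
// name == "a_1_b_-1"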

View File

@@ -1,134 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticate against database using provided credentials in case of a MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
* Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}

View File

@@ -1,62 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.MapReduceCommand;
/**
* {@link ReflectiveMapReduceInvoker} provides reflective access to {@link MapReduceCommand} API that is not
* consistently available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveMapReduceInvoker {
private static final Method ADD_EXTRA_OPTION_METHOD;
static {
ADD_EXTRA_OPTION_METHOD = findMethod(MapReduceCommand.class, "addExtraOption", String.class, Object.class);
}
private ReflectiveMapReduceInvoker() {}
/**
* Sets the extra option for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3 since
* the method has been removed.
*
* @param cmd can be {@literal null} for MongoDB Java driver version 3.
* @param key
* @param value
*/
public static void addExtraOption(MapReduceCommand cmd, String key, Object value) {
if (isMongo3Driver()) {
return;
}
Assert.notNull(cmd, "MapReduceCommand must not be null!");
invokeMethod(ADD_EXTRA_OPTION_METHOD, cmd, key, value);
}
}

View File

@@ -1,158 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.util.ReflectionUtils;
import com.mongodb.MongoOptions;
/**
* {@link ReflectiveMongoOptionsInvoker} provides reflective access to {@link MongoOptions} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
@SuppressWarnings("deprecation")
class ReflectiveMongoOptionsInvoker {
private static final Method GET_AUTO_CONNECT_RETRY_METHOD;
private static final Method SET_AUTO_CONNECT_RETRY_METHOD;
private static final Method GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
private static final Method SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
static {
SET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils
.findMethod(MongoOptions.class, "setAutoConnectRetry", boolean.class);
GET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils.findMethod(MongoOptions.class, "isAutoConnectRetry");
SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"setMaxAutoConnectRetryTime", long.class);
GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"getMaxAutoConnectRetryTime");
}
private ReflectiveMongoOptionsInvoker() {}
/**
* Sets the retry connection flag for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param autoConnectRetry
*/
public static void setAutoConnectRetry(MongoOptions options, boolean autoConnectRetry) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_AUTO_CONNECT_RETRY_METHOD, options, autoConnectRetry);
}
/**
* Sets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param maxAutoConnectRetryTime
*/
public static void setMaxAutoConnectRetryTime(MongoOptions options, long maxAutoConnectRetryTime) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options, maxAutoConnectRetryTime);
}
/**
* Sets the slaveOk attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param slaveOk
*/
public static void setSlaveOk(MongoOptions options, boolean slaveOk) {
if (isMongo3Driver()) {
return;
}
new DirectFieldAccessor(options).setPropertyValue("slaveOk", slaveOk);
}
/**
* Gets the slaveOk attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException} for
* MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getSlaveOk(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) new DirectFieldAccessor(options).getPropertyValue("slaveOk")).booleanValue();
}
/**
* Gets the autoConnectRetry attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException}
* for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getAutoConnectRetry(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) invokeMethod(GET_AUTO_CONNECT_RETRY_METHOD, options)).booleanValue();
}
/**
* Gets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Throws
* {@link UnsupportedOperationException} for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static long getMaxAutoConnectRetryTime(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for maxAutoConnectRetryTime which has been removed in MongoDB Java driver version 3.");
}
return ((Long) invokeMethod(GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options)).longValue();
}
}

View File

@@ -1,48 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.springframework.beans.DirectFieldAccessor;
import com.mongodb.WriteConcern;
/**
* {@link ReflectiveWriteConcernInvoker} provides reflective access to {@link WriteConcern} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveWriteConcernInvoker {
private static final WriteConcern NONE_OR_UNACKNOWLEDGED;
static {
NONE_OR_UNACKNOWLEDGED = isMongo3Driver() ? WriteConcern.UNACKNOWLEDGED : (WriteConcern) new DirectFieldAccessor(
new WriteConcern()).getPropertyValue("NONE");
}
/**
* @return {@link WriteConcern#NONE} for MongoDB Java driver version 2, otherwise {@link WriteConcern#UNACKNOWLEDGED}.
*/
public static WriteConcern noneOrUnacknowledged() {
return NONE_OR_UNACKNOWLEDGED;
}
}

View File

@@ -1,67 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* {@link ReflectiveWriteResultInvoker} provides reflective access to {@link WriteResult} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveWriteResultInvoker {
private static final Method GET_ERROR_METHOD;
private static final Method WAS_ACKNOWLEDGED_METHOD;
private ReflectiveWriteResultInvoker() {}
static {
GET_ERROR_METHOD = findMethod(WriteResult.class, "getError");
WAS_ACKNOWLEDGED_METHOD = findMethod(WriteResult.class, "wasAcknowledged");
}
/**
* @param writeResult can be {@literal null} for MongoDB Java driver version 3.
* @return null in case of MongoDB Java driver version 3 since errors are thrown as {@link MongoException}.
*/
public static String getError(WriteResult writeResult) {
if (isMongo3Driver()) {
return null;
}
return (String) invokeMethod(GET_ERROR_METHOD, writeResult);
}
/**
* @param writeResult
* @return always {@literal true} in case of MongoDB Java driver version 2, the actual acknowledgement state for version 3.
*/
public static boolean wasAcknowledged(WriteResult writeResult) {
return isMongo3Driver() ? ((Boolean) invokeMethod(WAS_ACKNOWLEDGED_METHOD, writeResult)).booleanValue() : true;
}
}

View File

@@ -1,84 +0,0 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import com.mongodb.DB;
/**
* Script operations on {@link com.mongodb.DB} level. Allows interaction with server side JavaScript functions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public interface ScriptOperations {
/**
* Stores the given {@link ExecutableMongoScript}, generating a synthetic name so that it can be called by that name
* subsequently.
*
* @param script must not be {@literal null}.
* @return {@link NamedMongoScript} with name under which the {@code JavaScript} function can be called.
*/
NamedMongoScript register(ExecutableMongoScript script);
/**
* Registers the given {@link NamedMongoScript} in the database.
*
* @param script the {@link NamedMongoScript} to be registered.
* @return
*/
NamedMongoScript register(NamedMongoScript script);
/**
* Executes the {@literal script} by either calling it via its {@literal name} or directly sending it.
*
* @param script must not be {@literal null}.
* @param args arguments to pass on for script execution.
* @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException
*/
Object execute(ExecutableMongoScript script, Object... args);
/**
* Call the {@literal JavaScript} by its name.
*
* @param scriptName must not be {@literal null} or empty.
* @param args
* @return
*/
Object call(String scriptName, Object... args);
/**
* Checks {@link DB} for existence of {@link ServerSideJavaScript} with given name.
*
* @param scriptName must not be {@literal null} or empty.
* @return false if no {@link ServerSideJavaScript} with given name exists.
*/
boolean exists(String scriptName);
/**
* Returns names of {@literal JavaScript} functions that can be called.
*
* @return empty {@link Set} if no scripts found.
*/
Set<String> getScriptNames();
}
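A brief, hypothetical usage sketch of the ScriptOperations API above, assuming a configured MongoTemplate named template:

ScriptOperations scriptOps = template.scriptOps();

// register a server-side function under a generated name and call it by that name
NamedMongoScript echo = scriptOps.register(new ExecutableMongoScript("function(x) { return x; }"));
Object result = scriptOps.call(echo.getName(), 42);

// check whether a script with the given name exists on the server
boolean present = scriptOps.exists(echo.getName());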

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,8 +27,6 @@ import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
@@ -39,7 +37,6 @@ import com.mongodb.WriteConcern;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -57,9 +54,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null);
}
@@ -70,9 +65,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null);
}
@@ -84,9 +77,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
@@ -99,36 +90,13 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@Deprecated
@SuppressWarnings("deprecation")
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
true, uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
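For illustration, the MongoClient-based constructors above would typically be used along these lines; host and database names are placeholders, and the URI-based variant declares UnknownHostException:

MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClient("localhost"), "database");

// or driven by a connection string
MongoDbFactory uriFactory = new SimpleMongoDbFactory(new MongoClientURI("mongodb://localhost/database"));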
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
@@ -149,25 +117,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
@@ -189,7 +138,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,6 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -45,23 +44,9 @@ import com.mongodb.DBObject;
*/
public class Aggregation {
/**
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
* stage.
*/
public static final String ROOT = SystemVariable.ROOT.toString();
/**
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
* otherwise, all stages start with CURRENT the same as ROOT.
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
private final AggregationOptions options;
private final List<AggregationOperation> operations;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
@@ -81,20 +66,6 @@ public class Aggregation {
return new Aggregation(operations);
}
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
*/
public Aggregation withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new Aggregation(this.operations, options);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
@@ -121,43 +92,11 @@ public class Aggregation {
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
this(asAggregationList(aggregationOperations));
}
/**
* @param aggregationOperations must not be {@literal null} or empty.
* @return
*/
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
return Arrays.asList(aggregationOperations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
this(aggregationOperations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
this.operations = aggregationOperations;
this.options = options;
this.operations = Arrays.asList(aggregationOperations);
}
/**
@@ -292,29 +231,6 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
/**
* Returns a new {@link AggregationOptions.Builder}.
*
* @return
* @since 1.6
*/
public static AggregationOptions.Builder newAggregationOptions() {
return new AggregationOptions.Builder();
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
@@ -339,8 +255,6 @@ public class Aggregation {
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
command = options.applyAndReturnPotentiallyChangedCommand(command);
return command;
}
@@ -388,51 +302,4 @@ public class Aggregation {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
* @author Thomas Darimont
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
*/
enum SystemVariable {
ROOT, CURRENT;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
* @param fieldRef may be {@literal null}.
* @return
*/
public static boolean isReferingToSystemVariable(String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
}
}
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());
}
}
}
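A few worked cases for isReferingToSystemVariable(…) above, with the expected results as comments:

SystemVariable.isReferingToSystemVariable("$$ROOT");         // true
SystemVariable.isReferingToSystemVariable("$$CURRENT.name"); // true, the part before the first dot matches CURRENT
SystemVariable.isReferingToSystemVariable("$name");          // false, a single dollar is a plain field reference
SystemVariable.isReferingToSystemVariable(null);             // false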

View File

@@ -1,189 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
private static final String CURSOR = "cursor";
private static final String EXPLAIN = "explain";
private static final String ALLOW_DISK_USE = "allowDiskUse";
private final boolean allowDiskUse;
private final boolean explain;
private final DBObject cursor;
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options to the aggregation.
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = cursor;
}
/**
* Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
* the dbPath directory.
*
* @return
*/
public boolean isAllowDiskUse() {
return allowDiskUse;
}
/**
* Specifies to return the information on the processing of the pipeline.
*
* @return
*/
public boolean isExplain() {
return explain;
}
/**
* Specify a document that contains options that control the creation of the cursor object.
*
* @return
*/
public DBObject getCursor() {
return cursor;
}
/**
* Returns a new potentially adjusted copy for the given {@code aggregationCommandObject} with the configuration
* applied.
*
* @param command the aggregation command.
* @return
*/
DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {
DBObject result = new BasicDBObject(command.toMap());
if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
result.put(ALLOW_DISK_USE, allowDiskUse);
}
if (explain && !result.containsField(EXPLAIN)) {
result.put(EXPLAIN, explain);
}
if (cursor != null && !result.containsField(CURSOR)) {
result.put("cursor", cursor);
}
return result;
}
/**
* Returns a {@link DBObject} representation of this {@link AggregationOptions}.
*
* @return
*/
public DBObject toDbObject() {
DBObject dbo = new BasicDBObject();
dbo.put(ALLOW_DISK_USE, allowDiskUse);
dbo.put(EXPLAIN, explain);
dbo.put(CURSOR, cursor);
return dbo;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDbObject().toString();
}
/**
* A Builder for {@link AggregationOptions}.
*
* @author Thomas Darimont
*/
public static class Builder {
private boolean allowDiskUse;
private boolean explain;
private DBObject cursor;
/**
* Defines whether to off-load intensive sort-operations to disk.
*
* @param allowDiskUse
* @return
*/
public Builder allowDiskUse(boolean allowDiskUse) {
this.allowDiskUse = allowDiskUse;
return this;
}
/**
* Defines whether to get the execution plan for the aggregation instead of the actual results.
*
* @param explain
* @return
*/
public Builder explain(boolean explain) {
this.explain = explain;
return this;
}
/**
* Additional options to the aggregation.
*
* @param cursor
* @return
*/
public Builder cursor(DBObject cursor) {
this.cursor = cursor;
return this;
}
/**
* Returns a new {@link AggregationOptions} instance with the given configuration.
*
* @return
*/
public AggregationOptions build() {
return new AggregationOptions(allowDiskUse, explain, cursor);
}
}
}
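A compact, hypothetical usage sketch of the Builder above and of how the resulting options travel into an aggregate command:

AggregationOptions options = new AggregationOptions.Builder()
		.allowDiskUse(true)
		.explain(false)
		.cursor(new BasicDBObject("batchSize", 100))
		.build();

// options.toDbObject() -> { "allowDiskUse" : true , "explain" : false , "cursor" : { "batchSize" : 100 } }
// Aggregation.newAggregation(…).withOptions(options) merges these flags into the aggregate command document.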

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,6 @@ import com.mongodb.DBObject;
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Thomas Darimont
* @param <T> The class in which the results are mapped onto.
* @since 1.3
*/
@@ -91,16 +90,6 @@ public class AggregationResults<T> implements Iterable<T> {
return serverUsed;
}
/**
* Returns the raw result that was returned by the server.
*
* @return
* @since 1.6
*/
public DBObject getRawResults() {
return rawResults;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");

View File

@@ -88,7 +88,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either sythetic or non-synthetic way.
* Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
@@ -107,7 +107,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} with the given orignals and synthetics.
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
@@ -363,7 +363,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Returns the referenve value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,7 +30,6 @@ import org.springframework.util.StringUtils;
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class Fields implements Iterable<Field> {
@@ -84,15 +83,6 @@ public final class Fields implements Iterable<Field> {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
@@ -196,24 +186,15 @@ public final class Fields implements Iterable<Field> {
private final String target;
/**
* Creates an aggregation field with the given {@code name}.
* Creates an aggregation fieldwith the given name. As no target is set explicitly, the name will be used as target
* as well.
*
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
* @param key
*/
public AggregationField(String name) {
this(name, null);
public AggregationField(String key) {
this(key, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
String nameToSet = cleanUp(name);
@@ -236,10 +217,6 @@ public final class Fields implements Iterable<Field> {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
}
int dollarIndex = source.lastIndexOf('$');
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}
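A brief sketch of the two factory styles covered by this hunk; the property names are illustrative, and Fields.from(...) is assumed as the usual way to bundle individual fields:

Field age      = Fields.field("age");                   // no explicit target, so the name is used as the target
Field customer = Fields.field("customer", "user.id");   // "customer" aliases the backing document field "user.id"
Fields grouping = Fields.from(age, customer);           // e.g. handed to Aggregation.group(Fields)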

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,33 +22,17 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
public GeoNearOperation(NearQuery nearQuery) {
Assert.notNull(nearQuery);
this.nearQuery = nearQuery;
this.distanceField = distanceField;
}
/*
@@ -57,10 +41,6 @@ public class GeoNearOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
command.put("distanceField", distanceField);
return new BasicDBObject("$geoNear", command);
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
}
}
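A hedged usage sketch for the two-argument variant documented above (its Javadoc recommends the Aggregation.geoNear(NearQuery, String) factory); the coordinates and the "distance" field name are illustrative:

NearQuery near = NearQuery.near(new Point(13.405, 52.52)).maxDistance(new Distance(10, Metrics.KILOMETERS));
GeoNearOperation geoNear = new GeoNearOperation(near, "distance"); // writes the computed distance into "distance"
// or, via the recommended factory: Aggregation.geoNear(near, "distance")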

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,9 +31,6 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold
@@ -367,16 +364,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
public Object getValue(AggregationOperationContext context) {
if (reference == null) {
return value;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(reference)) {
return reference;
}
return context.getReference(reference).toString();
return reference == null ? value : context.getReference(reference).toString();
}
@Override

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,10 +21,7 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $limit}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
* class directly.
* Encapsulates the {@code $limit}-operation
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
@@ -22,11 +23,7 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $match}-operation.
* <p>
* We recommend to use the static factory method
* {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
* class directly.
* Encapsulates the {@code $match}-operation
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold
@@ -38,6 +35,18 @@ public class MatchOperation implements AggregationOperation {
private final CriteriaDefinition criteriaDefinition;
/**
* Creates a new {@link MatchOperation} for the given {@link Criteria}.
*
* @param criteria must not be {@literal null}.
* @deprecated Use {@link MatchOperation#MatchOperation(CriteriaDefinition)} instead. This constructor is scheduled
* for removal in the next versions.
*/
@Deprecated
public MatchOperation(Criteria criteria) {
this((CriteriaDefinition) criteria);
}
/**
* Creates a new {@link MatchOperation} for the given {@link CriteriaDefinition}.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,17 +24,15 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedFi
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $project}-operation.
* Encapsulates the aggregation framework {@code $project}-operation. Projection of field to be used in an
* {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful. It can
* generate new fields, change values of given field etc.
* <p>
* Projection of field to be used in an {@link Aggregation}. A projection is similar to a {@link Field}
* inclusion/exclusion but more powerful. It can generate new fields, change values of given field etc.
* <p>
* We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle
@@ -237,53 +235,26 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
}
/**
* An {@link ProjectionOperationBuilder} that is used for SpEL expression based projections.
*
* @author Thomas Darimont
*/
public static class ExpressionProjectionOperationBuilder extends ProjectionOperationBuilder {
public static class ExpressionProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
private final Object[] params;
private final String expression;
/**
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
* parameters.
*
* @param expression must not be {@literal null}.
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
public ExpressionProjectionOperationBuilder(String expression, ProjectionOperation operation, Object[] parameters) {
public ExpressionProjectionOperationBuilder(Object value, ProjectionOperation operation, Object[] parameters) {
super(expression, operation, null);
this.expression = expression;
super(value, operation);
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder#project(java.lang.String, java.lang.Object[])
*/
@Override
public ProjectionOperationBuilder project(String operation, final Object... values) {
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values) {
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.length + 1);
result.add(ExpressionProjection.toMongoExpression(context,
ExpressionProjectionOperationBuilder.this.expression, ExpressionProjectionOperationBuilder.this.params));
result.addAll(Arrays.asList(values));
return result;
}
};
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#as(java.lang.String)
@@ -332,11 +303,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(getExposedField().getName(), toMongoExpression(context, expression, params));
}
protected static Object toMongoExpression(AggregationOperationContext context, String expression, Object[] params) {
return TRANSFORMER.transform(expression, context, params);
return new BasicDBObject(getExposedField().getName(), TRANSFORMER.transform(expression, context, params));
}
}
}
@@ -353,6 +320,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private static final String FIELD_REFERENCE_NOT_NULL = "Field reference must not be null!";
private final String name;
private final ProjectionOperation operation;
private final OperationProjection previousProjection;
/**
@@ -367,23 +335,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
super(name, operation);
this.name = name;
this.previousProjection = previousProjection;
}
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given value on top of the given
* {@link ProjectionOperation}.
*
* @param value
* @param operation
* @param previousProjection
*/
protected ProjectionOperationBuilder(Object value, ProjectionOperation operation,
OperationProjection previousProjection) {
super(value, operation);
this.name = null;
this.operation = operation;
this.previousProjection = previousProjection;
}
@@ -569,9 +521,8 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public ProjectionOperationBuilder project(String operation, Object... values) {
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values);
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
OperationProjection projectionOperation = new OperationProjection(Fields.field(name), operation, values);
return new ProjectionOperationBuilder(name, this.operation.and(projectionOperation), projectionOperation);
}
/**
@@ -676,10 +627,6 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
// implicit reference or explicit include?
if (value == null || Boolean.TRUE.equals(value)) {
if (Aggregation.SystemVariable.isReferingToSystemVariable(field.getTarget())) {
return field.getTarget();
}
// check whether referenced field exists in the context
return context.getReference(field).getReferenceValue();
@@ -702,7 +649,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link OperationProjection} for the given field.
*
* @param field the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param name the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
@@ -725,15 +672,18 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject inner = new BasicDBObject("$" + operation, getOperationArguments(context));
BasicDBList values = new BasicDBList();
values.addAll(buildReferences(context));
return new BasicDBObject(getField().getName(), inner);
DBObject inner = new BasicDBObject("$" + operation, values);
return new BasicDBObject(this.field.getName(), inner);
}
protected List<Object> getOperationArguments(AggregationOperationContext context) {
private List<Object> buildReferences(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
result.add(context.getReference(getField().getName()).toString());
result.add(context.getReference(field.getTarget()).toString());
for (Object element : values) {
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
@@ -742,15 +692,6 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return result;
}
/**
* Returns the field that holds the {@link OperationProjection}.
*
* @return
*/
protected Field getField() {
return field;
}
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*
@@ -758,27 +699,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public OperationProjection withAlias(String alias) {
final Field aliasedField = Fields.field(alias, this.field.getName());
return new OperationProjection(aliasedField, operation, values.toArray()) {
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.OperationProjection#getField()
*/
@Override
protected Field getField() {
return aliasedField;
}
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
// We have to make sure that we use the arguments from the "previous" OperationProjection that we replace
// with this new instance.
return OperationProjection.this.getOperationArguments(context);
}
};
return new OperationProjection(Fields.field(alias, this.field.getName()), operation, values.toArray());
}
}
@@ -810,96 +731,6 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return new BasicDBObject(name, nestedObject);
}
}
/**
* Extracts the minute from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMinute() {
return project("minute");
}
/**
* Extracts the hour from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractHour() {
return project("hour");
}
/**
* Extracts the second from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractSecond() {
return project("second");
}
/**
* Extracts the millisecond from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMillisecond() {
return project("millisecond");
}
/**
* Extracts the year from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractYear() {
return project("year");
}
/**
* Extracts the month from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMonth() {
return project("month");
}
/**
* Extracts the week from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractWeek() {
return project("week");
}
/**
* Extracts the dayOfYear from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfYear() {
return project("dayOfYear");
}
/**
* Extracts the dayOfMonth from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfMonth() {
return project("dayOfMonth");
}
/**
* Extracts the dayOfWeek from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfWeek() {
return project("dayOfWeek");
}
}
/**
@@ -940,4 +771,5 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
public abstract DBObject toDBObject(AggregationOperationContext context);
}
}
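A hedged sketch of how the date-extraction helpers touched in this hunk are typically chained; "birthDate" and the aliases are placeholder field names:

ProjectionOperation projection = Aggregation.project("firstName")
        .and("birthDate").extractYear().as("birthYear")           // renders birthYear : { $year : "$birthDate" }
        .and("birthDate").extractDayOfWeek().as("birthWeekday");  // renders birthWeekday : { $dayOfWeek : "$birthDate" }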

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,9 +22,6 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $skip}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,9 +26,6 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $sort}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
* instances of this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.List;
import org.springframework.util.Assert;
/**
@@ -32,34 +30,11 @@ public class TypedAggregation<I> extends Aggregation {
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {
this(inputType, asAggregationList(operations));
}
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations) {
this(inputType, operations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s and the given
* {@link AggregationOptions}.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
* @param options must not be {@literal null}.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations, AggregationOptions options) {
super(operations, options);
super(operations);
Assert.notNull(inputType, "Input type must not be null!");
this.inputType = inputType;
@@ -73,14 +48,4 @@ public class TypedAggregation<I> extends Aggregation {
public Class<I> getInputType() {
return inputType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Aggregation#withOptions(org.springframework.data.mongodb.core.aggregation.AggregationOptions)
*/
public TypedAggregation<I> withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new TypedAggregation<I>(inputType, operations, options);
}
}
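Pulling the operations from this compare together, a hedged pipeline sketch; Person, the field names and the criteria are placeholders, and Aggregation.newAggregation(Class, AggregationOperation...) is assumed as the usual static factory for a TypedAggregation:

TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class,
        Aggregation.match(Criteria.where("age").gte(18)),
        Aggregation.group("city").count().as("adults"),
        Aggregation.sort(Sort.Direction.DESC, "adults"),
        Aggregation.limit(5));
// On branches that still carry AggregationOptions (see the diff above), options can be attached afterwards:
// aggregation = aggregation.withOptions(new AggregationOptions.Builder().allowDiskUse(true).build());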

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,9 +23,6 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $unwind}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
@@ -46,10 +46,8 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
}
/**

View File

@@ -37,23 +37,17 @@ import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.JodaTimeConverters;
import org.springframework.data.convert.Jsr310Converters;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.ThreeTenBackPortConverters;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.NamedMongoScriptToDBObjectConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToNamedMongoScriptCoverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.TermToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.util.CacheValue;
import org.springframework.util.Assert;
/**
@@ -65,7 +59,6 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class CustomConversions {
@@ -80,9 +73,9 @@ public class CustomConversions {
private final List<Object> converters;
private final Map<ConvertiblePair, CacheValue<Class<?>>> customReadTargetTypes;
private final Map<ConvertiblePair, CacheValue<Class<?>>> customWriteTargetTypes;
private final Map<Class<?>, CacheValue<Class<?>>> rawWriteTargetTypes;
private final Map<ConvertiblePair, CacheValue> customReadTargetTypes;
private final Map<ConvertiblePair, CacheValue> customWriteTargetTypes;
private final Map<Class<?>, CacheValue> rawWriteTargetTypes;
/**
* Creates an empty {@link CustomConversions} object.
@@ -103,9 +96,9 @@ public class CustomConversions {
this.readingPairs = new LinkedHashSet<ConvertiblePair>();
this.writingPairs = new LinkedHashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.customReadTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue<Class<?>>>();
this.customWriteTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue<Class<?>>>();
this.rawWriteTargetTypes = new ConcurrentHashMap<Class<?>, CacheValue<Class<?>>>();
this.customReadTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue>();
this.customWriteTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue>();
this.rawWriteTargetTypes = new ConcurrentHashMap<Class<?>, CacheValue>();
List<Object> toRegister = new ArrayList<Object>();
@@ -119,14 +112,9 @@ public class CustomConversions {
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.add(TermToStringConverter.INSTANCE);
toRegister.add(NamedMongoScriptToDBObjectConverter.INSTANCE);
toRegister.add(DBObjectToNamedMongoScriptCoverter.INSTANCE);
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
toRegister.addAll(GeoConverters.getConvertersToRegister());
toRegister.addAll(Jsr310Converters.getConvertersToRegister());
toRegister.addAll(ThreeTenBackPortConverters.getConvertersToRegister());
for (Object c : toRegister) {
registerConversion(c);
@@ -388,16 +376,16 @@ public class CustomConversions {
* @param producer the {@link Producer} to create values to cache, must not be {@literal null}.
* @return
*/
private static <T> Class<?> getOrCreateAndCache(T key, Map<T, CacheValue<Class<?>>> cache, Producer producer) {
private static <T> Class<?> getOrCreateAndCache(T key, Map<T, CacheValue> cache, Producer producer) {
CacheValue<Class<?>> cacheValue = cache.get(key);
CacheValue cacheValue = cache.get(key);
if (cacheValue != null) {
return cacheValue.getValue();
return cacheValue.getType();
}
Class<?> type = producer.get();
cache.put(key, CacheValue.<Class<?>> ofNullable(type));
cache.put(key, CacheValue.of(type));
return type;
}
@@ -424,4 +412,30 @@ public class CustomConversions {
return source.toString();
}
}
/**
* Wrapper to safely store {@literal null} values in the type cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
* @author Thomas Darimont
*/
private static class CacheValue {
private static final CacheValue ABSENT = new CacheValue(null);
private final Class<?> type;
public CacheValue(Class<?> type) {
this.type = type;
}
public Class<?> getType() {
return type;
}
static CacheValue of(Class<?> type) {
return type == null ? ABSENT : new CacheValue(type);
}
}
}
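As a usage sketch: custom converters are handed to this class as a list and then applied to the mapping converter; MyColorToStringConverter is a hypothetical Converter<Color, String>, and setCustomConversions is the usual hook on MappingMongoConverter:

List<Object> converters = Arrays.asList(new MyColorToStringConverter()); // hypothetical custom converter
CustomConversions conversions = new CustomConversions(converters);       // registered next to the defaults listed above
mappingMongoConverter.setCustomConversions(conversions);                 // assuming an existing MappingMongoConverter instance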

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,6 @@ package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
@@ -26,7 +25,6 @@ import com.mongodb.DBRef;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.4
*/
public interface DbRefResolver {
@@ -41,8 +39,7 @@ public interface DbRefResolver {
* @param callback will never be {@literal null}.
* @return
*/
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler proxyHandler);
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
/**
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}
@@ -55,13 +52,4 @@ public interface DbRefResolver {
*/
DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation, MongoPersistentEntity<?> entity,
Object id);
/**
* Actually loads the {@link DBRef} from the datasource.
*
* @param dbRef must not be {@literal null}.
* @return
* @since 1.7
*/
DBObject fetch(DBRef dbRef);
}

View File

@@ -1,79 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
*/
class DefaultDbRefProxyHandler implements DbRefProxyHandler {
private final SpELContext spELContext;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ValueResolver resolver;
/**
* @param spELContext must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
public DefaultDbRefProxyHandler(SpELContext spELContext,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, ValueResolver resolver) {
this.spELContext = spELContext;
this.mappingContext = mappingContext;
this.resolver = resolver;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefProxyHandler#populateId(com.mongodb.DBRef, java.lang.Object)
*/
@Override
public Object populateId(MongoPersistentProperty property, DBRef source, Object proxy) {
if (source == null) {
return proxy;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(property);
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty.usePropertyAccess()) {
return proxy;
}
SpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(proxy, spELContext);
PersistentPropertyAccessor accessor = entity.getPropertyAccessor(proxy);
DBObject object = new BasicDBObject(idProperty.getFieldName(), source.getId());
ObjectPath objectPath = ObjectPath.ROOT.push(proxy, entity, null);
accessor.setProperty(idProperty, resolver.getValueInternal(idProperty, object, evaluator, objectPath));
return proxy;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,10 @@ import java.lang.reflect.Method;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.objenesis.Objenesis;
import org.objenesis.ObjenesisStd;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.BeanUtils;
import org.springframework.cglib.proxy.Callback;
import org.springframework.cglib.proxy.Enhancer;
import org.springframework.cglib.proxy.Factory;
@@ -36,11 +39,12 @@ import org.springframework.data.mongodb.LazyLoadingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.objenesis.ObjenesisStd;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.DB;
import com.mongodb.DBRef;
/**
@@ -49,14 +53,14 @@ import com.mongodb.DBRef;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.4
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final boolean OBJENESIS_PRESENT = ClassUtils.isPresent("org.objenesis.Objenesis", null);
private final MongoDbFactory mongoDbFactory;
private final PersistenceExceptionTranslator exceptionTranslator;
private final ObjenesisStd objenesis;
/**
* Creates a new {@link DefaultDbRefResolver} with the given {@link MongoDbFactory}.
@@ -69,7 +73,6 @@ public class DefaultDbRefResolver implements DbRefResolver {
this.mongoDbFactory = mongoDbFactory;
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
this.objenesis = new ObjenesisStd(true);
}
/*
@@ -77,14 +80,13 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
*/
@Override
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, dbref, callback, handler);
return createLazyLoadingProxy(property, dbref, callback);
}
return callback.resolve(property);
@@ -97,16 +99,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
@Override
public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
MongoPersistentEntity<?> entity, Object id) {
return new DBRef(entity.getCollection(), id);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#fetch(com.mongodb.DBRef)
*/
@Override
public DBObject fetch(DBRef dbRef) {
return ReflectiveDBRefResolver.fetch(mongoDbFactory.getDb(), dbRef);
DB db = mongoDbFactory.getDb();
db = annotation != null && StringUtils.hasText(annotation.db()) ? mongoDbFactory.getDb(annotation.db()) : db;
return new DBRef(db, entity.getCollection(), id);
}
/**
@@ -118,47 +115,34 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param callback must not be {@literal null}.
* @return
*/
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler handler) {
Class<?> propertyType = property.getType();
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
if (!propertyType.isInterface()) {
Factory factory = (Factory) objenesis.newInstance(getEnhancedTypeFor(propertyType));
factory.setCallbacks(new Callback[] { interceptor });
return handler.populateId(property, dbref, factory);
}
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
ProxyFactory proxyFactory = new ProxyFactory();
Class<?> propertyType = property.getType();
for (Class<?> type : propertyType.getInterfaces()) {
proxyFactory.addInterface(type);
}
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
proxyFactory.addInterface(LazyLoadingProxy.class);
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
return handler.populateId(property, dbref, proxyFactory.getProxy());
}
if (propertyType.isInterface()) {
proxyFactory.addInterface(propertyType);
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
/**
* Returns the CGLib enhanced type for the given source type.
*
* @param type
* @return
*/
private Class<?> getEnhancedTypeFor(Class<?> type) {
proxyFactory.setProxyTargetClass(true);
proxyFactory.setTargetClass(propertyType);
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
if (!OBJENESIS_PRESENT) {
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
return enhancer.createClass();
return ObjenesisProxyEnhancer.enhanceAndGet(proxyFactory, propertyType, interceptor);
}
/**
@@ -178,12 +162,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
private final DbRefResolverCallback callback;
private final MongoPersistentProperty property;
@@ -195,9 +178,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
static {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("getTarget");
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
FINALIZE_METHOD = Object.class.getDeclaredMethod("finalize");
} catch (Exception e) {
throw new RuntimeException(e);
}
@@ -261,11 +243,6 @@ public class DefaultDbRefResolver implements DbRefResolver {
if (ReflectionUtils.isHashCodeMethod(method)) {
return proxyHashCode(proxy);
}
// DATAMONGO-1076 - finalize methods should not trigger proxy initialization
if (FINALIZE_METHOD.equals(method)) {
return null;
}
}
Object target = ensureResolved();
@@ -287,7 +264,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
StringBuilder description = new StringBuilder();
if (dbref != null) {
description.append(dbref.getCollectionName());
description.append(dbref.getRef());
description.append(":");
description.append(dbref.getId());
} else {
@@ -394,4 +371,109 @@ public class DefaultDbRefResolver implements DbRefResolver {
return result;
}
}
/**
* Static class to accommodate optional dependency on Objenesis.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.4
*/
private static class ObjenesisProxyEnhancer {
private static final boolean IS_SPRING_4_OR_BETTER = ClassUtils.isPresent(
"org.springframework.core.DefaultParameterNameDiscoverer", null);
private static final InstanceCreatorStrategy INSTANCE_CREATOR;
static {
if (IS_SPRING_4_OR_BETTER) {
INSTANCE_CREATOR = new Spring4ObjenesisInstanceCreatorStrategy();
} else {
INSTANCE_CREATOR = new DefaultObjenesisInstanceCreatorStrategy();
}
}
public static Object enhanceAndGet(ProxyFactory proxyFactory, Class<?> type,
org.springframework.cglib.proxy.MethodInterceptor interceptor) {
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
Factory factory = (Factory) INSTANCE_CREATOR.newInstance(enhancer.createClass());
factory.setCallbacks(new Callback[] { interceptor });
return factory;
}
/**
* Strategy for constructing new instances of a given {@link Class}.
*
* @author Thomas Darimont
*/
interface InstanceCreatorStrategy {
Object newInstance(Class<?> clazz);
}
/**
* An {@link InstanceCreatorStrategy} that uses Objenesis from the classpath.
*
* @author Thomas Darimont
*/
private static class DefaultObjenesisInstanceCreatorStrategy implements InstanceCreatorStrategy {
private static final Objenesis OBJENESIS = new ObjenesisStd(true);
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.ObjenesisProxyEnhancer.InstanceCreatorStrategy#newInstance(java.lang.Class)
*/
@Override
public Object newInstance(Class<?> clazz) {
return OBJENESIS.newInstance(clazz);
}
}
/**
* An {@link InstanceCreatorStrategy} that uses a repackaged version of Objenesis from Spring 4.
*
* @author Thomas Darimont
*/
private static class Spring4ObjenesisInstanceCreatorStrategy implements InstanceCreatorStrategy {
private static final String SPRING4_OBJENESIS_CLASS_NAME = "org.springframework.objenesis.ObjenesisStd";
private static final Object OBJENESIS;
private static final Method NEW_INSTANCE_METHOD;
static {
try {
Class<?> objenesisClass = ClassUtils.forName(SPRING4_OBJENESIS_CLASS_NAME,
ObjenesisProxyEnhancer.class.getClassLoader());
OBJENESIS = BeanUtils.instantiateClass(objenesisClass.getConstructor(boolean.class), true);
NEW_INSTANCE_METHOD = objenesisClass.getMethod("newInstance", Class.class);
} catch (Exception e) {
throw new RuntimeException("Could not setup Objenesis infrastructure with Spring 4 ", e);
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.ObjenesisProxyEnhancer.InstanceCreatorStrategy#newInstance(java.lang.Class)
*/
@Override
public Object newInstance(Class<?> clazz) {
try {
return NEW_INSTANCE_METHOD.invoke(OBJENESIS, clazz);
} catch (Exception e) {
throw new RuntimeException("Could not created instance for " + clazz, e);
}
}
}
}
}
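For orientation, a hedged mapping sketch that exercises the lazy-loading path above; Customer and Account are placeholder domain types, and lazy = true is the @DBRef attribute that makes isLazyDbRef(property) return true:

class Customer {

        @org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) // resolved through the proxy machinery shown above
        private Account account;

        Account getAccount() {
                return account; // returns the proxy; invoking a method on it triggers callback.resolve(property)
        }
}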

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Default implementation of {@link DbRefResolverCallback}.
*
* @author Oliver Gierke
*/
class DefaultDbRefResolverCallback implements DbRefResolverCallback {
private final DBObject surroundingObject;
private final ObjectPath path;
private final ValueResolver resolver;
private final SpELExpressionEvaluator evaluator;
/**
* Creates a new {@link DefaultDbRefResolverCallback} using the given {@link DBObject}, {@link ObjectPath},
* {@link ValueResolver} and {@link SpELExpressionEvaluator}.
*
* @param surroundingObject must not be {@literal null}.
* @param path must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public DefaultDbRefResolverCallback(DBObject surroundingObject, ObjectPath path, SpELExpressionEvaluator evaluator,
ValueResolver resolver) {
this.surroundingObject = surroundingObject;
this.path = path;
this.resolver = resolver;
this.evaluator = evaluator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolverCallback#resolve(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
@Override
public Object resolve(MongoPersistentProperty property) {
return resolver.getValueInternal(property, surroundingObject, evaluator, path);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,18 +30,9 @@ import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.core.geo.GeoJson;
import org.springframework.data.mongodb.core.geo.GeoJsonGeometryCollection;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPolygon;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -52,7 +43,6 @@ import com.mongodb.DBObject;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.5
*/
abstract class GeoConverters {
@@ -73,24 +63,16 @@ abstract class GeoConverters {
BoxToDbObjectConverter.INSTANCE //
, PolygonToDbObjectConverter.INSTANCE //
, CircleToDbObjectConverter.INSTANCE //
, LegacyCircleToDbObjectConverter.INSTANCE //
, SphereToDbObjectConverter.INSTANCE //
, DbObjectToBoxConverter.INSTANCE //
, DbObjectToPolygonConverter.INSTANCE //
, DbObjectToCircleConverter.INSTANCE //
, DbObjectToLegacyCircleConverter.INSTANCE //
, DbObjectToSphereConverter.INSTANCE //
, DbObjectToPointConverter.INSTANCE //
, PointToDbObjectConverter.INSTANCE //
, GeoCommandToDbObjectConverter.INSTANCE //
, GeoJsonToDbObjectConverter.INSTANCE //
, GeoJsonPointToDbObjectConverter.INSTANCE //
, GeoJsonPolygonToDbObjectConverter.INSTANCE //
, DbObjectToGeoJsonPointConverter.INSTANCE //
, DbObjectToGeoJsonPolygonConverter.INSTANCE //
, DbObjectToGeoJsonLineStringConverter.INSTANCE //
, DbObjectToGeoJsonMultiLineStringConverter.INSTANCE //
, DbObjectToGeoJsonMultiPointConverter.INSTANCE //
, DbObjectToGeoJsonMultiPolygonConverter.INSTANCE //
, DbObjectToGeoJsonGeometryCollectionConverter.INSTANCE);
, GeoCommandToDbObjectConverter.INSTANCE);
}
/**
@@ -100,7 +82,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@ReadingConverter
static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
public static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
INSTANCE;
@@ -109,11 +91,13 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Point convert(DBObject source) {
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new Point((Double) source.get("x"), (Double) source.get("y"));
return source == null ? null : new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"),
(Double) source.get("y"));
}
}
@@ -123,7 +107,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
public static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
INSTANCE;
@@ -144,7 +128,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@WritingConverter
static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
public static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
INSTANCE;
@@ -167,13 +151,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link Box}.
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
public static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
INSTANCE;
@@ -182,6 +166,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Box convert(DBObject source) {
if (source == null) {
@@ -191,7 +176,7 @@ abstract class GeoConverters {
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
return new Box(first, second);
return new org.springframework.data.mongodb.core.geo.Box(first, second);
}
}
@@ -201,7 +186,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
public static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
INSTANCE;
@@ -225,13 +210,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link DBObject} into a {@link Circle}.
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
public static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
INSTANCE;
@@ -266,13 +251,78 @@ abstract class GeoConverters {
}
}
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public static enum LegacyCircleToDbObjectConverter implements
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius());
return result;
}
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
@SuppressWarnings("deprecation")
public static enum DbObjectToLegacyCircleConverter implements
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject centerSource = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(centerSource, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
}
}
/**
* Converts a {@link Sphere} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
public static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
INSTANCE;
@@ -302,7 +352,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@ReadingConverter
static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
public static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
INSTANCE;
@@ -343,7 +393,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
public static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
INSTANCE;
@@ -372,13 +422,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link Polygon}.
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Polygon}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
public static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
INSTANCE;
@@ -387,7 +437,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings({ "unchecked" })
@SuppressWarnings({ "deprecation", "unchecked" })
public Polygon convert(DBObject source) {
if (source == null) {
@@ -403,7 +453,7 @@ abstract class GeoConverters {
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
}
return new Polygon(newPoints);
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints);
}
}
@@ -413,7 +463,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
public static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
INSTANCE;
@@ -422,7 +472,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("rawtypes")
@SuppressWarnings("deprecation")
public DBObject convert(GeoCommand source) {
if (source == null) {
@@ -433,10 +483,6 @@ abstract class GeoConverters {
Shape shape = source.getShape();
if (shape instanceof GeoJson) {
return GeoJsonToDbObjectConverter.INSTANCE.convert((GeoJson) shape);
}
if (shape instanceof Box) {
argument.add(toList(((Box) shape).getFirst()));
@@ -447,10 +493,10 @@ abstract class GeoConverters {
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius().getNormalizedValue());
} else if (shape instanceof Circle) {
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius());
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter()));
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius());
} else if (shape instanceof Polygon) {
@@ -468,377 +514,7 @@ abstract class GeoConverters {
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
@SuppressWarnings("rawtypes")
static enum GeoJsonToDbObjectConverter implements Converter<GeoJson, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJson source) {
if (source == null) {
return null;
}
DBObject dbo = new BasicDBObject("type", source.getType());
if (source instanceof GeoJsonGeometryCollection) {
BasicDBList dbl = new BasicDBList();
for (GeoJson geometry : ((GeoJsonGeometryCollection) source).getCoordinates()) {
dbl.add(convert(geometry));
}
dbo.put("geometries", dbl);
} else {
dbo.put("coordinates", convertIfNecessarry(source.getCoordinates()));
}
return dbo;
}
private Object convertIfNecessarry(Object candidate) {
if (candidate instanceof GeoJson) {
return convertIfNecessarry(((GeoJson) candidate).getCoordinates());
}
if (candidate instanceof Iterable) {
BasicDBList dbl = new BasicDBList();
for (Object element : (Iterable) candidate) {
dbl.add(convertIfNecessarry(element));
}
return dbl;
}
if (candidate instanceof Point) {
return toList((Point) candidate);
}
return candidate;
}
}
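As a rough, illustrative sketch (not taken from the diff): the documents this converter is expected to emit follow the GeoJSON layout, e.g. a point becomes { type : "Point", coordinates : [ x, y ] }, while a geometry collection nests its converted members under "geometries".

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class GeoJsonDocumentShapeSketch {

	// Hand-built equivalent of the converter output for a single point.
	static DBObject pointDocument(double x, double y) {
		BasicDBList coordinates = new BasicDBList();
		coordinates.add(x);
		coordinates.add(y);
		return new BasicDBObject("type", "Point").append("coordinates", coordinates);
	}
}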
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPointToDbObjectConverter implements Converter<GeoJsonPoint, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJsonPoint source) {
return GeoJsonToDbObjectConverter.INSTANCE.convert(source);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPolygonToDbObjectConverter implements Converter<GeoJsonPolygon, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJsonPolygon source) {
return GeoJsonToDbObjectConverter.INSTANCE.convert(source);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonPointConverter implements Converter<DBObject, GeoJsonPoint> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("unchecked")
public GeoJsonPoint convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Point"),
String.format("Cannot convert type '%s' to Point.", source.get("type")));
List<Double> dbl = (List<Double>) source.get("coordinates");
return new GeoJsonPoint(dbl.get(0).doubleValue(), dbl.get(1).doubleValue());
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonPolygonConverter implements Converter<DBObject, GeoJsonPolygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonPolygon convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Polygon"),
String.format("Cannot convert type '%s' to Polygon.", source.get("type")));
return toGeoJsonPolygon((BasicDBList) source.get("coordinates"));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiPolygonConverter implements Converter<DBObject, GeoJsonMultiPolygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiPolygon convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiPolygon"),
String.format("Cannot convert type '%s' to MultiPolygon.", source.get("type")));
BasicDBList dbl = (BasicDBList) source.get("coordinates");
List<GeoJsonPolygon> polygones = new ArrayList<GeoJsonPolygon>();
for (Object polygon : dbl) {
polygones.add(toGeoJsonPolygon((BasicDBList) polygon));
}
return new GeoJsonMultiPolygon(polygones);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonLineStringConverter implements Converter<DBObject, GeoJsonLineString> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonLineString convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "LineString"),
String.format("Cannot convert type '%s' to LineString.", source.get("type")));
BasicDBList cords = (BasicDBList) source.get("coordinates");
return new GeoJsonLineString(toListOfPoint(cords));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiPointConverter implements Converter<DBObject, GeoJsonMultiPoint> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiPoint convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiPoint"),
String.format("Cannot convert type '%s' to MultiPoint.", source.get("type")));
BasicDBList cords = (BasicDBList) source.get("coordinates");
return new GeoJsonMultiPoint(toListOfPoint(cords));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiLineStringConverter implements Converter<DBObject, GeoJsonMultiLineString> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiLineString convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiLineString"),
String.format("Cannot convert type '%s' to MultiLineString.", source.get("type")));
List<GeoJsonLineString> lines = new ArrayList<GeoJsonLineString>();
BasicDBList cords = (BasicDBList) source.get("coordinates");
for (Object line : cords) {
lines.add(new GeoJsonLineString(toListOfPoint((BasicDBList) line)));
}
return new GeoJsonMultiLineString(lines);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonGeometryCollectionConverter implements Converter<DBObject, GeoJsonGeometryCollection> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@SuppressWarnings("rawtypes")
@Override
public GeoJsonGeometryCollection convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "GeometryCollection"),
String.format("Cannot convert type '%s' to GeometryCollection.", source.get("type")));
List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
for (Object o : (List) source.get("geometries")) {
geometries.add(convertGeometries((DBObject) o));
}
return new GeoJsonGeometryCollection(geometries);
}
private static GeoJson<?> convertGeometries(DBObject source) {
Object type = source.get("type");
if (ObjectUtils.nullSafeEquals(type, "Point")) {
return DbObjectToGeoJsonPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPoint")) {
return DbObjectToGeoJsonMultiPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "LineString")) {
return DbObjectToGeoJsonLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiLineString")) {
return DbObjectToGeoJsonMultiLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "Polygon")) {
return DbObjectToGeoJsonPolygonConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPolygon")) {
return DbObjectToGeoJsonMultiPolygonConverter.INSTANCE.convert(source);
}
throw new IllegalArgumentException(String.format("Cannot convert unknown GeoJson type %s", type));
}
}
static List<Double> toList(Point point) {
return Arrays.asList(point.getX(), point.getY());
}
/**
* Converts coordinate pairs nested in a {@link BasicDBList} into {@link GeoJsonPoint}s.
*
* @param listOfCoordinatePairs
* @return
* @since 1.7
*/
@SuppressWarnings("unchecked")
static List<Point> toListOfPoint(BasicDBList listOfCoordinatePairs) {
List<Point> points = new ArrayList<Point>();
for (Object point : listOfCoordinatePairs) {
Assert.isInstanceOf(List.class, point);
List<Double> coordinatesList = (List<Double>) point;
points.add(new GeoJsonPoint(coordinatesList.get(0).doubleValue(), coordinatesList.get(1).doubleValue()));
}
return points;
}
/**
* Converts coordinate pairs nested in a {@link BasicDBList} into a {@link GeoJsonPolygon}.
*
* @param dbList
* @return
* @since 1.7
*/
static GeoJsonPolygon toGeoJsonPolygon(BasicDBList dbList) {
return new GeoJsonPolygon(toListOfPoint((BasicDBList) dbList.get(0)));
}
}
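A small sketch of the coordinate layout the reading helpers above expect, assuming the usual GeoJSON nesting: toListOfPoint(...) takes a list of [x, y] pairs, and toGeoJsonPolygon(...) takes a list whose first element is the exterior ring.

import com.mongodb.BasicDBList;

class GeoJsonCoordinateListSketch {

	// [ [ [0,0], [0,1], [1,1], [0,0] ] ] - one exterior ring, closed on itself.
	static BasicDBList polygonCoordinates() {
		BasicDBList ring = new BasicDBList();
		ring.add(pair(0d, 0d));
		ring.add(pair(0d, 1d));
		ring.add(pair(1d, 1d));
		ring.add(pair(0d, 0d));

		BasicDBList coordinates = new BasicDBList();
		coordinates.add(ring);
		return coordinates;
	}

	private static BasicDBList pair(double x, double y) {
		BasicDBList pair = new BasicDBList();
		pair.add(x);
		pair.add(y);
		return pair;
	}
}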

View File

@@ -23,7 +23,6 @@ import com.mongodb.DBRef;
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.5
*/
public interface LazyLoadingProxy {
@@ -34,7 +33,7 @@ public interface LazyLoadingProxy {
* @return
* @since 1.5
*/
Object getTarget();
Object initialize();
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 by the original author(s).
* Copyright 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,17 +31,16 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.ParameterValueProvider;
@@ -75,7 +74,7 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
private static final String INCOMPATIBLE_TYPES = "Cannot convert %1$s of type %2$s into an instance of %3$s! Implement a custom Converter<%2$s, %3$s> and register it with the CustomConversions. Parent object was: %4$s";
@@ -85,7 +84,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;
@@ -98,10 +96,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(new DefaultConversionService());
super(ConversionServiceFactory.createDefaultConversionService());
Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
@@ -188,11 +187,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo) {
return read(type, dbo, ObjectPath.ROOT);
return read(type, dbo, null);
}
@SuppressWarnings("unchecked")
private <S extends Object> S read(TypeInformation<S> type, DBObject dbo, ObjectPath path) {
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo, Object parent) {
if (null == dbo) {
return null;
@@ -210,15 +209,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, path);
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, parent);
}
if (typeToUse.isMap()) {
return (S) readMap(typeToUse, dbo, path);
return (S) readMap(typeToUse, dbo, parent);
}
if (dbo instanceof BasicDBList) {
throw new MappingException(String.format(INCOMPATIBLE_TYPES, dbo, BasicDBList.class, typeToUse.getType(), path));
throw new MappingException(String.format(INCOMPATIBLE_TYPES, dbo, BasicDBList.class, typeToUse.getType(), parent));
}
// Retrieve persistent entity info
@@ -228,57 +227,40 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for " + rawType.getName());
}
return read(persistentEntity, dbo, path);
return read(persistentEntity, dbo, parent);
}
private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity,
DBObject source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) {
DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path);
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, path.getCurrentObject());
entity, provider, parent);
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider, path);
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
parent);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, path);
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, parent);
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider);
final PersistentPropertyAccessor accessor = new ConvertingPropertyAccessor(entity.getPropertyAccessor(instance),
conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();
final S result = instance;
// make sure id property is set before all other properties
Object idValue = null;
if (idProperty != null) {
idValue = getValueInternal(idProperty, dbo, evaluator, path);
accessor.setProperty(idProperty, idValue);
}
final ObjectPath currentPath = path.push(result, entity, idValue);
final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
final S result = wrapper.getBean();
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
// we skip the id property since it was already set
if (idProperty != null && idProperty.equals(prop)) {
return;
}
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
return;
}
accessor.setProperty(prop, getValueInternal(prop, dbo, evaluator, currentPath));
wrapper.setProperty(prop, getValueInternal(prop, dbo, evaluator, result));
}
});
@@ -294,13 +276,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
wrapper.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
DbRefProxyHandler handler = new DefaultDbRefProxyHandler(spELContext, mappingContext,
MappingMongoConverter.this);
DbRefResolverCallback callback = new DefaultDbRefResolverCallback(dbo, currentPath, evaluator,
MappingMongoConverter.this);
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
@Override
public Object resolve(MongoPersistentProperty property) {
return getValueInternal(property, dbo, evaluator, parent);
}
}));
}
});
@@ -348,7 +330,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
typeMapper.writeType(type, dbo);
}
Object target = obj instanceof LazyLoadingProxy ? ((LazyLoadingProxy) obj).getTarget() : obj;
Object target = obj instanceof LazyLoadingProxy ? ((LazyLoadingProxy) obj).initialize() : obj;
writeInternal(target, dbo, type);
}
@@ -400,13 +382,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for entity of type " + obj.getClass().getName());
}
final PersistentPropertyAccessor accessor = entity.getPropertyAccessor(obj);
final BeanWrapper<Object> wrapper = BeanWrapper.create(obj, conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
try {
Object id = accessor.getProperty(idProperty);
Object id = wrapper.getProperty(idProperty, Object.class);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {}
}
@@ -415,11 +397,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
if (prop.equals(idProperty) || !prop.isWritable()) {
if (prop.equals(idProperty)) {
return;
}
Object propertyObj = accessor.getProperty(prop);
Object propertyObj = wrapper.getProperty(prop);
if (null != propertyObj) {
@@ -433,12 +415,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
});
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Object propertyObj = accessor.getProperty(inverseProp);
Class<?> type = inverseProp.getType();
Object propertyObj = wrapper.getProperty(inverseProp, type);
if (null != propertyObj) {
writePropertyInternal(propertyObj, dbo, inverseProp);
}
@@ -494,7 +474,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).getTarget();
obj = ((LazyLoadingProxy) obj).initialize();
}
// Lookup potential custom target type
@@ -508,7 +488,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object existingValue = accessor.get(prop);
BasicDBObject propDbObj = existingValue instanceof BasicDBObject ? (BasicDBObject) existingValue
: new BasicDBObject();
addCustomTypeKeyIfNecessary(ClassTypeInformation.from(prop.getRawType()), obj, propDbObj);
addCustomTypeKeyIfNecessary(type, obj, propDbObj);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
@@ -737,7 +717,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Adds custom type information to the given {@link DBObject} if necessary. That is, if the value is not the same as
* the one given. This is usually the case if you store a subtype of the actual declared type of the property.
*
*
* @param type
* @param value must not be {@literal null}.
* @param dbObject must not be {@literal null}.
@@ -844,8 +824,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (target.getClass().equals(idProperty.getType())) {
id = target;
} else {
PersistentPropertyAccessor accessor = targetEntity.getPropertyAccessor(target);
id = accessor.getProperty(idProperty);
BeanWrapper<Object> wrapper = BeanWrapper.create(target, conversionService);
id = wrapper.getProperty(idProperty, Object.class);
}
if (null == id) {
@@ -856,14 +836,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
idMapper.convertId(id));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.ValueResolver#getValueInternal(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, com.mongodb.DBObject, org.springframework.data.mapping.model.SpELExpressionEvaluator, java.lang.Object)
*/
@Override
public Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath path) {
return new MongoDbPropertyValueProvider(dbo, evaluator, path).getPropertyValue(prop);
protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
Object parent) {
return new MongoDbPropertyValueProvider(dbo, evaluator, parent).getPropertyValue(prop);
}
/**
@@ -871,13 +847,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, ObjectPath path) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Assert.notNull(targetType);
Class<?> collectionType = targetType.getType();
@@ -898,9 +872,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
path));
parent));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, path));
items.add(read(componentType, (DBObject) dbObjItem, parent));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
}
@@ -913,15 +887,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Reads the given {@link DBObject} into a {@link Map}. Will recursively resolve nested {@link Map}s as well.
*
* @param type the {@link Map} {@link TypeInformation} to be used to unmarshall this {@link DBObject}.
* @param dbObject must not be {@literal null}
* @param path must not be {@literal null}
* @param dbObject
* @return
*/
@SuppressWarnings("unchecked")
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, ObjectPath path) {
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, Object parent) {
Assert.notNull(dbObject, "DBObject must not be null!");
Assert.notNull(path, "Object path must not be null!");
Assert.notNull(dbObject);
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
@@ -948,7 +920,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object value = entry.getValue();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, path));
map.put(key, read(valueType, (DBObject) value, parent));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
} else {
@@ -960,6 +932,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return map;
}
protected <T> List<?> unwrapList(BasicDBList dbList, TypeInformation<T> targetType) {
List<Object> rootList = new ArrayList<Object>();
for (int i = 0; i < dbList.size(); i++) {
Object obj = dbList.get(i);
if (obj instanceof BasicDBList) {
rootList.add(unwrapList((BasicDBList) obj, targetType.getComponentType()));
} else if (obj instanceof DBObject) {
rootList.add(read(targetType, (DBObject) obj));
} else {
rootList.add(obj);
}
}
return rootList;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
@@ -1079,24 +1066,24 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private final DBObjectAccessor source;
private final SpELExpressionEvaluator evaluator;
private final ObjectPath path;
private final Object parent;
/**
* Creates a new {@link MongoDbPropertyValueProvider} for the given source, {@link SpELExpressionEvaluator} and
* {@link ObjectPath}.
* parent object.
*
* @param source must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param path can be {@literal null}.
* @param parent can be {@literal null}.
*/
public MongoDbPropertyValueProvider(DBObject source, SpELExpressionEvaluator evaluator, ObjectPath path) {
public MongoDbPropertyValueProvider(DBObject source, SpELExpressionEvaluator evaluator, Object parent) {
Assert.notNull(source);
Assert.notNull(evaluator);
this.source = new DBObjectAccessor(source);
this.evaluator = evaluator;
this.path = path;
this.parent = parent;
}
/*
@@ -1112,7 +1099,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
return readValue(value, property.getTypeInformation(), path);
return readValue(value, property.getTypeInformation(), parent);
}
}
@@ -1125,7 +1112,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final ObjectPath path;
private final Object parent;
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
@@ -1135,10 +1122,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, ObjectPath path) {
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, Object parent) {
super(evaluator, conversionService, delegate);
this.path = path;
this.parent = parent;
}
/*
@@ -1147,44 +1134,28 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
@Override
protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
return readValue(object, parameter.getType(), path);
return readValue(object, parameter.getType(), parent);
}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
Class<?> rawType = type.getType();
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
return (T) (rawType.equals(DBRef.class) ? value : read(type, readRef((DBRef) value), parent));
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, path);
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, path);
return (T) read(type, (DBObject) value, parent);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}
@SuppressWarnings("unchecked")
private <T> T potentiallyReadOrResolveDbRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) {
if (rawType.equals(DBRef.class)) {
return (T) dbref;
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName());
if (object != null) {
return (T) object;
}
return (T) (object != null ? object : read(type, readRef(dbref), path));
}
/**
* Performs the fetch operation for the given {@link DBRef}.
*
@@ -1192,6 +1163,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @return
*/
DBObject readRef(DBRef ref) {
return dbRefResolver.fetch(ref);
return ref.fetch();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,19 +20,13 @@ import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import org.bson.types.Code;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
@@ -40,7 +34,6 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
abstract class MongoConverters {
@@ -167,65 +160,4 @@ abstract class MongoConverters {
return source == null ? null : source.toString();
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
@WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> {
INSTANCE;
@Override
public String convert(Term source) {
return source == null ? null : source.getFormatted();
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
public static enum DBObjectToNamedMongoScriptCoverter implements Converter<DBObject, NamedMongoScript> {
INSTANCE;
@Override
public NamedMongoScript convert(DBObject source) {
if (source == null) {
return null;
}
String id = source.get("_id").toString();
Object rawValue = source.get("value");
return new NamedMongoScript(id, ((Code) rawValue).getCode());
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
public static enum NamedMongoScriptToDBObjectConverter implements Converter<NamedMongoScript, DBObject> {
INSTANCE;
@Override
public DBObject convert(NamedMongoScript source) {
if (source == null) {
return new BasicDBObject();
}
BasicDBObjectBuilder builder = new BasicDBObjectBuilder();
builder.append("_id", source.getName());
builder.append("value", new Code(source.getCode()));
return builder.get();
}
}
}

View File

@@ -1,182 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
/**
* A path of objects nested into each other. The type allows access to all parent objects currently in creation even
when resolving more nested objects. This makes it possible to avoid re-resolving object instances that are logically
equivalent to already resolved ones.
* <p>
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths can be
* constructed by the {@link #toObjectPath(Object)} method and extended via {@link #push(Object)}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.6
*/
class ObjectPath {
public static final ObjectPath ROOT = new ObjectPath();
private final List<ObjectPathItem> items;
private ObjectPath() {
this.items = Collections.emptyList();
}
/**
* Creates a new {@link ObjectPath} from the given parent {@link ObjectPath} by adding the provided
* {@link ObjectPathItem} to it.
*
* @param parent can be {@literal null}.
* @param item
*/
private ObjectPath(ObjectPath parent, ObjectPath.ObjectPathItem item) {
List<ObjectPath.ObjectPathItem> items = new ArrayList<ObjectPath.ObjectPathItem>(parent.items);
items.add(item);
this.items = Collections.unmodifiableList(items);
}
/**
* Returns a copy of the {@link ObjectPath} with the given {@link Object} as current object.
*
* @param object must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param id must not be {@literal null}.
* @return
*/
public ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
ObjectPathItem item = new ObjectPathItem(object, id, entity.getCollection());
return new ObjectPath(this, item);
}
/**
* Returns the object stored in the given collection under the given id if it is contained in the {@link ObjectPath}.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
public Object getPathItem(Object id, String collection) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null!");
for (ObjectPathItem item : items) {
Object object = item.getObject();
if (object == null) {
continue;
}
if (item.getIdValue() == null) {
continue;
}
if (collection.equals(item.getCollection()) && id.equals(item.getIdValue())) {
return object;
}
}
return null;
}
/**
* Returns the current object of the {@link ObjectPath} or {@literal null} if the path is empty.
*
* @return
*/
public Object getCurrentObject() {
return items.isEmpty() ? null : items.get(items.size() - 1).getObject();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
if (items.isEmpty()) {
return "[empty]";
}
List<String> strings = new ArrayList<String>(items.size());
for (ObjectPathItem item : items) {
strings.add(item.object.toString());
}
return StringUtils.collectionToDelimitedString(strings, " -> ");
}
/**
* An item in an {@link ObjectPath}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class ObjectPathItem {
private final Object object;
private final Object idValue;
private final String collection;
/**
* Creates a new {@link ObjectPathItem}.
*
* @param object
* @param idValue
* @param collection
*/
ObjectPathItem(Object object, Object idValue, String collection) {
this.object = object;
this.idValue = idValue;
this.collection = collection;
}
public Object getObject() {
return object;
}
public Object getIdValue() {
return idValue;
}
public String getCollection() {
return collection;
}
}
}
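A standalone illustration of the idea behind ObjectPath (not the class itself): a small append-only list of (collection, id, object) entries lets the converter hand back an already materialized parent instead of re-reading it, which is what avoids redundant resolution of mutually referencing documents.

import java.util.ArrayList;
import java.util.List;

class ReadPathSketch {

	private static final class Entry {
		final String collection;
		final Object id;
		final Object object;

		Entry(String collection, Object id, Object object) {
			this.collection = collection;
			this.id = id;
			this.object = object;
		}
	}

	private final List<Entry> entries = new ArrayList<Entry>();

	void push(String collection, Object id, Object object) {
		entries.add(new Entry(collection, id, object));
	}

	// Returns the instance already being read for the given (collection, id) pair, or
	// null so that the caller falls back to actually fetching the DBRef.
	Object lookup(String collection, Object id) {
		for (Entry entry : entries) {
			if (entry.collection.equals(collection) && entry.id.equals(id)) {
				return entry.object;
			}
		}
		return null;
	}
}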

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -57,11 +57,6 @@ import com.mongodb.DBRef;
public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
}
private final ConversionService conversionService;
private final MongoConverter converter;
@@ -124,61 +119,6 @@ public class QueryMapper {
return result;
}
/**
* Maps fields used for sorting to the {@link MongoPersistentEntity}s properties. <br />
* Also converts properties to their {@code $meta} representation if present.
*
* @param sortObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedSort(DBObject sortObject, MongoPersistentEntity<?> entity) {
if (sortObject == null) {
return null;
}
DBObject mappedSort = getMappedObject(sortObject, entity);
mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
return mappedSort;
}
/**
* Maps fields to retrieve to the {@link MongoPersistentEntity}s properties. <br />
* Also converts and potentially adds a missing property {@code $meta} representation.
*
* @param fieldsObject
* @param entity
* @return
* @since 1.6
*/
public DBObject getMappedFields(DBObject fieldsObject, MongoPersistentEntity<?> entity) {
DBObject mappedFields = fieldsObject != null ? getMappedObject(fieldsObject, entity) : new BasicDBObject();
mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
return mappedFields.keySet().isEmpty() ? null : mappedFields;
}
private void mapMetaAttributes(DBObject source, MongoPersistentEntity<?> entity, MetaMapping metaMapping) {
if (entity == null || source == null) {
return;
}
if (entity.hasTextScoreProperty() && !MetaMapping.IGNORE.equals(metaMapping)) {
MongoPersistentProperty textScoreProperty = entity.getTextScoreProperty();
if (MetaMapping.FORCE.equals(metaMapping)
|| (MetaMapping.WHEN_PRESENT.equals(metaMapping) && source.containsField(textScoreProperty.getFieldName()))) {
source.putAll(getMappedTextScoreField(textScoreProperty));
}
}
}
private DBObject getMappedTextScoreField(MongoPersistentProperty property) {
return new BasicDBObject(property.getFieldName(), META_TEXT_SCORE);
}
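Illustrative only: for a text-score property mapped to a field called "score" (the name is made up), the mapper is expected to contribute a projection/sort entry of the form built below.

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class TextScoreFieldSketch {

	// Equivalent to { "score" : { "$meta" : "textScore" } }.
	static DBObject scoreProjection() {
		return new BasicDBObject("score", new BasicDBObject("$meta", "textScore"));
	}
}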
/**
* Extracts the mapped object value for the given field out of rawValue, taking nested {@link Keyword}s into account
*
@@ -334,8 +274,7 @@ public class QueryMapper {
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty()
&& (type.equals(DBRef.class) || entity.getIdProperty().getActualType().isAssignableFrom(type));
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
}
/**
@@ -383,16 +322,10 @@ public class QueryMapper {
*/
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || source == null || source instanceof DBObject) {
if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
return source;
}
if (source instanceof DBRef) {
DBRef ref = (DBRef) source;
return new DBRef(ref.getCollectionName(), convertId(ref.getId()));
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
@@ -799,7 +732,7 @@ public class QueryMapper {
*/
@Override
public String getMappedKey() {
return path == null ? name : path.toDotPath(isAssociation() ? getAssociationConverter() : getPropertyConverter());
return path == null ? name : path.toDotPath(getPropertyConverter());
}
protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
@@ -851,56 +784,5 @@ public class QueryMapper {
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return PropertyToFieldNameConverter.INSTANCE;
}
/**
* Return the {@link Converter} to use for creating the mapped key of an association. Default implementation is
* {@link AssociationConverter}.
*
* @return
* @since 1.7
*/
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new AssociationConverter(getAssociation());
}
}
/**
* Converter to skip all properties after an association property was rendered.
*
* @author Oliver Gierke
*/
protected static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
}

View File

@@ -1,64 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* {@link ReflectiveDBRefResolver} provides reflective access to the {@link DBRef} API that is not consistently
* available across driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBRefResolver {
private static final Method FETCH_METHOD;
static {
FETCH_METHOD = findMethod(DBRef.class, "fetch");
}
/**
* Fetches the object referenced from the database either by directly calling {@link DBRef#fetch()} or
* {@link DBCollection#findOne(Object)}.
*
* @param db can be {@literal null} when using MongoDB Java driver in version 2.x.
* @param ref must not be {@literal null}.
* @return the document that this references.
*/
public static DBObject fetch(DB db, DBRef ref) {
Assert.notNull(ref, "DBRef to fetch must not be null!");
if (isMongo3Driver()) {
return db.getCollection(ref.getCollectionName()).findOne(ref.getId());
}
return (DBObject) invokeMethod(FETCH_METHOD, ref);
}
}
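A minimal usage sketch, assuming same-package access (the class is package-private) and that a database handle and DBRef are already available; the resolver chooses the fetch strategy matching the driver version at runtime.

import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;

class DbRefFetchSketch {

	// On a 3.x driver this runs a findOne against the referenced collection;
	// on 2.x it reflectively invokes DBRef.fetch().
	static DBObject fetchReferencedDocument(DB db, DBRef ref) {
		return ReflectiveDBRefResolver.fetch(db, ref);
	}
}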

View File

@@ -194,48 +194,47 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return new UpdatePropertyConverter(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getAssociationConverter()
*/
@Override
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new UpdateAssociationConverter(getAssociation(), key);
return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
}
/**
* Special mapper handling positional parameter {@literal $} within property names.
* Converter to skip all properties after an association property was rendered.
*
* @author Christoph Strobl
* @since 1.7
* @author Oliver Gierke
*/
private static class UpdateKeyMapper {
private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final Iterator<String> iterator;
protected UpdateKeyMapper(String rawKey) {
Assert.hasText(rawKey, "Key must not be null or empty!");
this.iterator = Arrays.asList(rawKey.split("\\.")).iterator();
this.iterator.next();
}
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Maps the property name while retaining potential positional operator {@literal $}.
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param property
* @return
* @param association must not be {@literal null}.
*/
protected String mapPropertyName(MongoPersistentProperty property) {
public AssociationConverter(Association<MongoPersistentProperty> association) {
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
/**
@@ -243,11 +242,10 @@ public class UpdateMapper extends QueryMapper {
* contained in the source update key.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
private final UpdateKeyMapper mapper;
private final Iterator<String> iterator;
/**
* Creates a new {@link UpdatePropertyConverter} with the given update key.
@@ -258,7 +256,8 @@ public class UpdateMapper extends QueryMapper {
Assert.hasText(updateKey, "Update key must not be null or empty!");
this.mapper = new UpdateKeyMapper(updateKey);
this.iterator = Arrays.asList(updateKey.split("\\.")).iterator();
this.iterator.next();
}
/*
@@ -267,37 +266,9 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
public String convert(MongoPersistentProperty property) {
return mapper.mapPropertyName(property);
}
}
/**
* {@link Converter} retaining positional parameter {@literal $} for {@link Association}s.
*
* @author Christoph Strobl
*/
protected static class UpdateAssociationConverter extends AssociationConverter {
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public UpdateAssociationConverter(Association<MongoPersistentProperty> association, String key) {
super(association);
this.mapper = new UpdateKeyMapper(key);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
return super.convert(source) == null ? null : mapper.mapPropertyName(source);
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
}
}
}
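A hedged example of the positional-operator handling above; the domain type and its @Field mapping are invented for illustration. The point is that the $ segment of the update key survives while the property names around it are mapped to their field names.

import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Update;

class PositionalUpdateSketch {

	static class Item {
		@Field("n") String name;
	}

	// With "name" mapped to "n", the update is expected to render as
	// { "$set" : { "items.$.n" : newName } }.
	static Update renameMatchedItem(String newName) {
		return new Update().set("items.$.name", newName);
	}
}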

View File

@@ -1,42 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Internal API to trigger the resolution of properties.
*
* @author Oliver Gierke
*/
interface ValueResolver {
/**
* Resolves the value for the given {@link MongoPersistentProperty} within the given {@link DBObject} using the given
* {@link SpELExpressionEvaluator} and {@link ObjectPath}.
*
* @param prop
* @param dbo
* @param evaluator
* @param parent
* @return
*/
Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath parent);
}

View File

@@ -0,0 +1,75 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Represents a geospatial box value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Box extends org.springframework.data.geo.Box implements Shape {
public static final String COMMAND = "$box";
public Box(Point lowerLeft, Point upperRight) {
super(lowerLeft, upperRight);
}
public Box(double[] lowerLeft, double[] upperRight) {
super(lowerLeft, upperRight);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
return list;
}
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
}
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
}

View File

@@ -0,0 +1,154 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value.
* <p>
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing its distance concept
* would break clients that use the old Circle API.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
* removed in the next major release.
*/
@Deprecated
public class Circle implements Shape {
public static final String COMMAND = "$center";
private final Point center;
private final double radius;
/**
* Creates a new {@link Circle} from the given {@link Point} and radius.
*
* @param center must not be {@literal null}.
* @param radius must be greater or equal to zero.
*/
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center);
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
* {@link Metrics#NEUTRAL}.
*
* @param centerX
* @param centerY
* @param radius must be greater or equal to zero.
*/
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return center;
}
/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public double getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
result.add(getRadius());
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%f]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Circle that = (Circle) obj;
return this.center.equals(that.center) && this.radius == that.radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius;
return result;
}
}
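Illustrative usage of the legacy shape API shown above, assuming the old $within query syntax: getCommand() supplies the operator ($center) and asList() the [[x, y], radius] argument.

import org.springframework.data.mongodb.core.geo.Circle;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class LegacyCircleQuerySketch {

	// Builds { field : { "$within" : { "$center" : [ [ x, y ], radius ] } } }.
	static DBObject withinCircle(String field, Circle circle) {
		DBObject shape = new BasicDBObject(circle.getCommand(), circle.asList());
		return new BasicDBObject(field, new BasicDBObject("$within", shape));
	}
}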

View File

@@ -0,0 +1,37 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
*/
public CustomMetric(double multiplier) {
super(multiplier);
}
}

View File

@@ -0,0 +1,44 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* Value object to represent distances in a given metric.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Distance extends org.springframework.data.geo.Distance {
/**
* Creates a new {@link Distance}.
*
* @param value
*/
public Distance(double value) {
this(value, Metrics.NEUTRAL);
}
public Distance(double value, Metric metric) {
super(value, metric);
}
}
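A short sketch of how the deprecated Distance and CustomMetric combine; the 3443.918 multiplier is an illustrative value only (roughly the earth radius in nautical miles), not a constant defined by the library.

import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metrics;

public class DistanceUsageSketch {
    public static void main(String[] args) {
        // Without an explicit metric the value falls back to Metrics.NEUTRAL.
        Distance neutral = new Distance(2.5);
        // The same value interpreted in kilometres.
        Distance kilometres = new Distance(2.5, Metrics.KILOMETERS);
        // A custom metric only needs a multiplier relative to the neutral base scale.
        Distance nautical = new Distance(2.5, new CustomMetric(3443.918));
        System.out.println(neutral.getMetric() + " / " + kilometres.getMetric() + " / " + nautical.getMetric());
    }
}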

View File

@@ -1,42 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Interface definition for structures defined in GeoJSON ({@link http://geojson.org/}) format.
*
* @author Christoph Strobl
* @since 1.7
*/
public interface GeoJson<T extends Iterable<?>> {
/**
* String value representing the type of the {@link GeoJson} object.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geojson-objects
*/
String getType();
/**
* The value of the coordinates member is always an {@link Iterable}. The structure for the elements within is
* determined by {@link #getType()} of geometry.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geometry-objects
*/
T getCoordinates();
}

View File

@@ -1,96 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Defines a {@link GeoJsonGeometryCollection} that consists of a {@link List} of {@link GeoJson} objects.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#geometry-collection
*/
public class GeoJsonGeometryCollection implements GeoJson<Iterable<GeoJson<?>>> {
private static final String TYPE = "GeometryCollection";
private final List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
/**
* Creates a new {@link GeoJsonGeometryCollection} for the given {@link GeoJson} instances.
*
* @param geometries
*/
public GeoJsonGeometryCollection(List<GeoJson<?>> geometries) {
Assert.notNull(geometries, "Geometries must not be null!");
this.geometries.addAll(geometries);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public Iterable<GeoJson<?>> getCoordinates() {
return Collections.unmodifiableList(this.geometries);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.geometries);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonGeometryCollection)) {
return false;
}
GeoJsonGeometryCollection other = (GeoJsonGeometryCollection) obj;
return ObjectUtils.nullSafeEquals(this.geometries, other.geometries);
}
}
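A minimal sketch of assembling a GeoJsonGeometryCollection from the GeoJSON types shown elsewhere in this change set; the coordinates are arbitrary sample values.

import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJson;
import org.springframework.data.mongodb.core.geo.GeoJsonGeometryCollection;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;

public class GeometryCollectionSketch {
    public static void main(String[] args) {
        List<GeoJson<?>> geometries = Arrays.<GeoJson<?>> asList(
                new GeoJsonPoint(100.0, 0.0),
                new GeoJsonLineString(new Point(101.0, 0.0), new Point(102.0, 1.0)));
        GeoJsonGeometryCollection collection = new GeoJsonGeometryCollection(geometries);
        // getType() is the fixed GeoJSON discriminator, getCoordinates() the wrapped members.
        System.out.println(collection.getType() + ": " + collection.getCoordinates());
    }
}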

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* {@link GeoJsonLineString} is defined as list of at least 2 {@link Point}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#linestring
*/
public class GeoJsonLineString extends GeoJsonMultiPoint {
private static final String TYPE = "LineString";
/**
* Creates a new {@link GeoJsonLineString} for the given {@link Point}s.
*
* @param points must not be {@literal null} and have at least 2 entries.
*/
public GeoJsonLineString(List<Point> points) {
super(points);
}
/**
* Creates a new {@link GeoJsonLineString} for the given {@link Point}s.
*
* @param first must not be {@literal null}
* @param second must not be {@literal null}
* @param others can be {@literal null}
*/
public GeoJsonLineString(Point first, Point second, Point... others) {
super(first, second, others);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint#getType()
*/
@Override
public String getType() {
return TYPE;
}
}
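A small usage sketch of GeoJsonLineString with illustrative coordinates.

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;

public class LineStringSketch {
    public static void main(String[] args) {
        // At least two positions are required; further ones follow as varargs.
        GeoJsonLineString route = new GeoJsonLineString(
                new Point(8.68, 50.11), new Point(8.81, 50.12), new Point(9.18, 48.78));
        System.out.println(route.getType());        // "LineString"
        System.out.println(route.getCoordinates()); // the points in insertion order
    }
}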

View File

@@ -1,109 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiLineString} is defined as list of {@link GeoJsonLineString}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multilinestring
*/
public class GeoJsonMultiLineString implements GeoJson<Iterable<GeoJsonLineString>> {
private static final String TYPE = "MultiLineString";
private List<GeoJsonLineString> coordinates = new ArrayList<GeoJsonLineString>();
/**
* Creates new {@link GeoJsonMultiLineString} for the given {@link Point}s.
*
* @param lines must not be {@literal null}.
*/
public GeoJsonMultiLineString(List<Point>... lines) {
Assert.notEmpty(lines, "Points for MultiLineString must not be null!");
for (List<Point> line : lines) {
this.coordinates.add(new GeoJsonLineString(line));
}
}
/**
* Creates new {@link GeoJsonMultiLineString} for the given {@link GeoJsonLineString}s.
*
* @param lines must not be {@literal null}.
*/
public GeoJsonMultiLineString(List<GeoJsonLineString> lines) {
Assert.notNull(lines, "Lines for MultiLineString must not be null!");
this.coordinates.addAll(lines);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public Iterable<GeoJsonLineString> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiLineString)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.coordinates, ((GeoJsonMultiLineString) obj).coordinates);
}
}
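A corresponding sketch for GeoJsonMultiLineString, wrapping two line strings built from sample points.

import java.util.Arrays;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiLineString;

public class MultiLineStringSketch {
    public static void main(String[] args) {
        GeoJsonLineString first = new GeoJsonLineString(new Point(0, 0), new Point(1, 1));
        GeoJsonLineString second = new GeoJsonLineString(new Point(2, 2), new Point(3, 3));
        GeoJsonMultiLineString multi = new GeoJsonMultiLineString(Arrays.asList(first, second));
        System.out.println(multi.getType()); // "MultiLineString"
    }
}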

View File

@@ -1,116 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiPoint} is defined as list of {@link Point}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multipoint
*/
public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {
private static final String TYPE = "MultiPoint";
private final List<Point> points;
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param points points must not be {@literal null} and have at least 2 entries.
*/
public GeoJsonMultiPoint(List<Point> points) {
Assert.notNull(points, "Points must not be null.");
Assert.isTrue(points.size() >= 2, "Minimum of 2 Points required.");
this.points = new ArrayList<Point>(points);
}
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param first must not be {@literal null}.
* @param second must not be {@literal null}.
* @param others must not be {@literal null}.
*/
public GeoJsonMultiPoint(Point first, Point second, Point... others) {
Assert.notNull(first, "First point must not be null!");
Assert.notNull(second, "Second point must not be null!");
Assert.notNull(others, "Additional points must not be null!");
this.points = new ArrayList<Point>();
this.points.add(first);
this.points.add(second);
this.points.addAll(Arrays.asList(others));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<Point> getCoordinates() {
return Collections.unmodifiableList(this.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiPoint)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.points, ((GeoJsonMultiPoint) obj).points);
}
}
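A short GeoJsonMultiPoint sketch with sample coordinates; as the constructors above show, fewer than two points are rejected.

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint;

public class MultiPointSketch {
    public static void main(String[] args) {
        GeoJsonMultiPoint cities = new GeoJsonMultiPoint(
                new Point(13.40, 52.52), new Point(11.58, 48.14), new Point(8.68, 50.11));
        System.out.println(cities.getCoordinates().size()); // 3
    }
}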

View File

@@ -1,93 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiPolygon} is defined as a list of {@link GeoJsonPolygon}s.
*
* @author Christoph Strobl
* @since 1.7
*/
public class GeoJsonMultiPolygon implements GeoJson<Iterable<GeoJsonPolygon>> {
private static final String TYPE = "MultiPolygon";
private List<GeoJsonPolygon> coordinates = new ArrayList<GeoJsonPolygon>();
/**
* Creates a new {@link GeoJsonMultiPolygon} for the given {@link GeoJsonPolygon}s.
*
* @param polygons must not be {@literal null}.
*/
public GeoJsonMultiPolygon(List<GeoJsonPolygon> polygons) {
Assert.notNull(polygons, "Polygons for MultiPolygon must not be null!");
this.coordinates.addAll(polygons);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<GeoJsonPolygon> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiPolygon)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.coordinates, ((GeoJsonMultiPolygon) obj).coordinates);
}
}
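A GeoJsonMultiPolygon sketch using a single, illustrative unit-square polygon; each member ring is closed by repeating the first point.

import java.util.Arrays;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPolygon;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;

public class MultiPolygonSketch {
    public static void main(String[] args) {
        GeoJsonPolygon unitSquare = new GeoJsonPolygon(
                new Point(0, 0), new Point(1, 0), new Point(1, 1), new Point(0, 1), new Point(0, 0));
        GeoJsonMultiPolygon multi = new GeoJsonMultiPolygon(Arrays.asList(unitSquare));
        System.out.println(multi.getType()); // "MultiPolygon"
    }
}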

View File

@@ -1,72 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* {@link GeoJson} representation of {@link Point}.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#point
*/
public class GeoJsonPoint extends Point implements GeoJson<List<Double>> {
private static final long serialVersionUID = -8026303425147474002L;
private static final String TYPE = "Point";
/**
* Creates {@link GeoJsonPoint} for given coordinates.
*
* @param x
* @param y
*/
public GeoJsonPoint(double x, double y) {
super(x, y);
}
/**
* Creates {@link GeoJsonPoint} for given {@link Point}.
*
* @param point must not be {@literal null}.
*/
public GeoJsonPoint(Point point) {
super(point);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<Double> getCoordinates() {
return Arrays.asList(Double.valueOf(getX()), Double.valueOf(getY()));
}
}
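A minimal GeoJsonPoint sketch; the coordinates are sample values in the GeoJSON longitude/latitude ordering.

import org.springframework.data.mongodb.core.geo.GeoJsonPoint;

public class GeoJsonPointSketch {
    public static void main(String[] args) {
        GeoJsonPoint point = new GeoJsonPoint(13.3777, 52.5163);
        System.out.println(point.getType());        // "Point"
        System.out.println(point.getCoordinates()); // [13.3777, 52.5163]
    }
}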

View File

@@ -1,95 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
/**
* {@link GeoJson} representation of {@link Polygon}. Unlike {@link Polygon} the {@link GeoJsonPolygon} requires a
* closed border. Which means that the first and last {@link Point} have to have same coordinate pairs.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#polygon
*/
public class GeoJsonPolygon extends Polygon implements GeoJson<List<GeoJsonLineString>> {
private static final long serialVersionUID = 3936163018187247185L;
private static final String TYPE = "Polygon";
private List<GeoJsonLineString> coordinates = new ArrayList<GeoJsonLineString>();
/**
* Creates new {@link GeoJsonPolygon} from the given {@link Point}s.
*
* @param first must not be {@literal null}.
* @param second must not be {@literal null}.
* @param third must not be {@literal null}.
* @param fourth must not be {@literal null}.
* @param others can be {@literal null}.
*/
public GeoJsonPolygon(Point first, Point second, Point third, Point fourth, final Point... others) {
this(asList(first, second, third, fourth, others));
}
/**
* Creates new {@link GeoJsonPolygon} from the given {@link Point}s.
*
* @param points must not be {@literal null}.
*/
public GeoJsonPolygon(List<Point> points) {
super(points);
this.coordinates.add(new GeoJsonLineString(points));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<GeoJsonLineString> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
private static List<Point> asList(Point first, Point second, Point third, Point fourth, final Point... others) {
ArrayList<Point> result = new ArrayList<Point>(3 + others.length);
result.add(first);
result.add(second);
result.add(third);
result.add(fourth);
result.addAll(Arrays.asList(others));
return result;
}
}
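A GeoJsonPolygon sketch with an illustrative, explicitly closed ring (first and last point identical), as the Javadoc above requires.

import java.util.Arrays;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;

public class GeoJsonPolygonSketch {
    public static void main(String[] args) {
        GeoJsonPolygon polygon = new GeoJsonPolygon(Arrays.asList(
                new Point(0, 0), new Point(4, 0), new Point(4, 3), new Point(0, 3), new Point(0, 0)));
        // The exterior ring is wrapped as a single GeoJsonLineString.
        System.out.println(polygon.getCoordinates().size()); // 1
    }
}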

View File

@@ -0,0 +1,54 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
/**
* Custom {@link Page} to carry the average distance retrieved from the {@link GeoResults} the {@link GeoPage} is set up
* from.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoPage}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
private static final long serialVersionUID = 23421312312412L;
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}.
*
* @param content must not be {@literal null}.
*/
public GeoPage(GeoResults<T> results) {
super(results);
}
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}, {@link Pageable} and total.
*
* @param results must not be {@literal null}.
* @param pageable must not be {@literal null}.
* @param total
*/
public GeoPage(GeoResults<T> results, Pageable pageable, long total) {
super(results, pageable, total);
}
}

View File

@@ -0,0 +1,38 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object capturing some arbitrary object plus a distance.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResult}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
/**
* Creates a new {@link GeoResult} for the given content and distance.
*
* @param content must not be {@literal null}.
* @param distance must not be {@literal null}.
*/
public GeoResult(T content, Distance distance) {
super(content, distance);
}
}

View File

@@ -0,0 +1,60 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
/**
* Value object to capture {@link GeoResult}s as well as the average distance they have.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResults}. This class is scheduled
* to be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
/**
* Creates a new {@link GeoResults} instance manually calculating the average distance from the distance values of the
* given {@link GeoResult}s.
*
* @param results must not be {@literal null}.
*/
public GeoResults(List<? extends GeoResult<T>> results) {
super(results);
}
public GeoResults(List<? extends GeoResult<T>> results, Metric metric) {
super(results, metric);
}
/**
* Creates a new {@link GeoResults} instance from the given {@link GeoResult}s and average distance.
*
* @param results must not be {@literal null}.
* @param averageDistance
*/
@PersistenceConstructor
public GeoResults(List<? extends GeoResult<T>> results, Distance averageDistance) {
super(results, averageDistance);
}
}
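How the deprecated GeoResult, GeoResults and GeoPage types fit together, sketched with illustrative content and distances.

import java.util.Arrays;
import java.util.List;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Metrics;

public class GeoResultsSketch {
    public static void main(String[] args) {
        GeoResult<String> first = new GeoResult<String>("Berlin", new Distance(0.5, Metrics.KILOMETERS));
        GeoResult<String> second = new GeoResult<String>("Potsdam", new Distance(27.0, Metrics.KILOMETERS));
        List<GeoResult<String>> content = Arrays.asList(first, second);
        // The average distance is derived from the individual results.
        GeoResults<String> results = new GeoResults<String>(content);
        GeoPage<String> page = new GeoPage<String>(results);
        System.out.println(results.getAverageDistance() + " / " + page.getTotalElements());
    }
}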

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,16 +13,15 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBRef;
package org.springframework.data.mongodb.core.geo;
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public interface DbRefProxyHandler {
Object populateId(MongoPersistentProperty property, DBRef source, Object proxy);
}
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.mongodb.core.query.NearQuery;
/**
* Commonly used {@link Metrics} for {@link NearQuery}s.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metrics}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public enum Metrics implements Metric {
KILOMETERS(org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier()), //
MILES(org.springframework.data.geo.Metrics.MILES.getMultiplier()), //
NEUTRAL(org.springframework.data.geo.Metrics.NEUTRAL.getMultiplier()); //
private final double multiplier;
private Metrics(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
}
}
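As the enum constants above show, the deprecated type simply mirrors the multipliers of org.springframework.data.geo.Metrics; a quick sketch:

import org.springframework.data.mongodb.core.geo.Metrics;

public class MetricsSketch {
    public static void main(String[] args) {
        System.out.println(Metrics.KILOMETERS.getMultiplier()); // earth radius in kilometres
        System.out.println(Metrics.MILES.getMultiplier());      // earth radius in miles
        System.out.println(Metrics.NEUTRAL.getMultiplier());    // 1.0
    }
}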

View File

@@ -0,0 +1,55 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
/**
* Represents a geospatial point value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Point extends org.springframework.data.geo.Point {
@PersistenceConstructor
public Point(double x, double y) {
super(x, y);
}
public Point(org.springframework.data.geo.Point point) {
super(point);
}
public double[] asArray() {
return new double[] { getX(), getY() };
}
public List<Double> asList() {
return asList(this);
}
public static List<Double> asList(org.springframework.data.geo.Point point) {
return Arrays.asList(point.getX(), point.getY());
}
}
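A quick sketch of the two conversion views offered by the deprecated Point; the coordinates are arbitrary.

import java.util.Arrays;
import org.springframework.data.mongodb.core.geo.Point;

public class PointSketch {
    public static void main(String[] args) {
        Point point = new Point(8.68, 50.11);
        // Both views expose the same x/y pair, just in different container types.
        System.out.println(Arrays.toString(point.asArray())); // [8.68, 50.11]
        System.out.println(point.asList());                   // [8.68, 50.11]
    }
}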

View File

@@ -0,0 +1,92 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Simple value object to represent a {@link Polygon}.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Polygon}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Polygon extends org.springframework.data.geo.Polygon implements Shape {
public static final String COMMAND = "$polygon";
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param x
* @param y
* @param z
* @param others
*/
public <P extends Point> Polygon(P x, P y, P z, P... others) {
super(x, y, z, others);
}
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param points
*/
public <P extends Point> Polygon(List<P> points) {
super(points);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return asList(this);
}
/**
* Returns a {@link List} of x,y-coordinate tuples of {@link Point}s from the given {@link Polygon}.
*
* @param polygon
* @return
*/
public static List<? extends Object> asList(org.springframework.data.geo.Polygon polygon) {
List<Point> points = polygon.getPoints();
List<List<Double>> tuples = new ArrayList<List<Double>>(points.size());
for (Point point : points) {
tuples.add(Arrays.asList(point.getX(), point.getY()));
}
return tuples;
}
}
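A minimal sketch of the deprecated Polygon and its $polygon list representation; the triangle coordinates are illustrative.

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.Polygon;

public class PolygonSketch {
    public static void main(String[] args) {
        Polygon triangle = new Polygon(new Point(0, 0), new Point(4, 0), new Point(2, 3));
        // asList() produces the [[x, y], ...] tuples used for the $polygon criterion.
        System.out.println(triangle.getCommand() + " " + triangle.asList());
    }
}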

View File

@@ -0,0 +1,45 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Shape}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Shape extends org.springframework.data.geo.Shape {
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
*/
List<? extends Object> asList();
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
String getCommand();
}
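A sketch of how the two Shape methods can be combined to build a legacy $within criterion by hand, assuming the mongo-java-driver DBObject API; the within(…) helper, the "location" field name and the sample shapes are illustrative and not part of the library.

import java.util.Arrays;
import java.util.List;
import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Polygon;
import org.springframework.data.mongodb.core.geo.Shape;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

public class ShapeSketch {

    // Builds { <field>: { $within: { <command>: <coordinates> } } } for any legacy Shape.
    static DBObject within(String field, Shape shape) {
        return new BasicDBObject(field,
                new BasicDBObject("$within", new BasicDBObject(shape.getCommand(), shape.asList())));
    }

    public static void main(String[] args) {
        List<Shape> shapes = Arrays.<Shape> asList(
                new Circle(new Point(0, 0), 5),
                new Polygon(new Point(0, 0), new Point(4, 0), new Point(2, 3)));
        for (Shape shape : shapes) {
            System.out.println(within("location", shape));
        }
    }
}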

View File

@@ -22,7 +22,6 @@ import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.util.Assert;
/**
@@ -31,6 +30,7 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class Sphere implements Shape {
public static final String COMMAND = "$centerSphere";
@@ -73,13 +73,23 @@ public class Sphere implements Shape {
this(circle.getCenter(), circle.getRadius());
}
/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
@Deprecated
public Sphere(org.springframework.data.mongodb.core.geo.Circle circle) {
this(circle.getCenter(), circle.getRadius());
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return new Point(this.center);
public org.springframework.data.mongodb.core.geo.Point getCenter() {
return new org.springframework.data.mongodb.core.geo.Point(this.center);
}
/**
@@ -131,21 +141,20 @@ public class Sphere implements Shape {
return result;
}
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return Arrays.asList(Arrays.asList(center.getX(), center.getY()), this.radius.getValue());
}
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
@Override
public String getCommand() {
return COMMAND;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,6 @@ import java.lang.annotation.Target;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Christoph Strobl
*/
@Target({ ElementType.TYPE })
@Documented
@@ -106,9 +105,9 @@ public @interface CompoundIndex {
*
* <pre>
* <code>
* db.root.createIndex( { hybrid.h1: 1, hybrid.h2: 1 } , { name: "hybrid.compound_index" } )
* db.root.createIndex( { nested.n1: 1, nested.n2: 1 } , { name: "nested.compound_index" } )
* db.hybrid.createIndex( { h1: 1, h2: 1 } , { name: "compound_index" } )
* db.root.ensureIndex( { hybrid.h1: 1, hybrid.h2: 1 } , { name: "hybrid.compound_index" } )
* db.root.ensureIndex( { nested.n1: 1, nested.n2: 1 } , { name: "nested.compound_index" } )
* db.hybrid.ensureIndex( { h1: 1, h2: 1 } , { name: "compound_index" } )
* </code>
* </pre>
*
@@ -130,10 +129,7 @@ public @interface CompoundIndex {
* stored in.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**
@@ -144,4 +140,14 @@ public @interface CompoundIndex {
*/
boolean background() default false;
/**
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @deprecated TTL cannot be defined for {@link CompoundIndex} having more than one field as key. Will be removed in
* 1.6.
* @see http://docs.mongodb.org/manual/tutorial/expire-data/
* @return
*/
@Deprecated
int expireAfterSeconds() default -1;
}
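For reference, a minimal sketch of how the annotation is typically applied to a domain type; the Person class, field names and index name are illustrative only.

import org.springframework.data.mongodb.core.index.CompoundIndex;
import org.springframework.data.mongodb.core.index.CompoundIndexes;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
@CompoundIndexes({
        @CompoundIndex(name = "age_name_idx", def = "{ 'lastname': 1, 'age': -1 }", background = true)
})
public class Person {
    String lastname;
    int age;
}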

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,7 +26,6 @@ import java.lang.annotation.Target;
* @author Jon Brisbin
* @author Laurent Canet
* @author Thomas Darimont
* @author Christoph Strobl
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -63,9 +62,9 @@ public @interface GeoSpatialIndexed {
*
* <pre>
* <code>
* db.root.createIndex( { hybrid.h1: "2d" } , { name: "hybrid.index" } )
* db.root.createIndex( { nested.n1: "2d" } , { name: "nested.index" } )
* db.hybrid.createIndex( { h1: "2d" } , { name: "index" } )
* db.root.ensureIndex( { hybrid.h1: "2d" } , { name: "hybrid.index" } )
* db.root.ensureIndex( { nested.n1: "2d" } , { name: "nested.index" } )
* db.hybrid.ensureIndex( { h1: "2d" } , { name: "index" } )
* </code>
* </pre>
*
@@ -86,10 +85,7 @@ public @interface GeoSpatialIndexed {
* Name of the collection in which to create the index.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**

View File

@@ -35,17 +35,7 @@ import com.mongodb.DBObject;
public class Index implements IndexDefinition {
public enum Duplicates {
RETAIN, //
/**
* Dropping Duplicates was removed in MongoDB Server 2.8.0-rc0.
* <p>
* See https://jira.mongodb.org/browse/SERVER-14710
*
* @deprecated since 1.7.
*/
@Deprecated//
DROP
RETAIN, DROP
}
private final Map<String, Direction> fieldSpec = new LinkedHashMap<String, Direction>();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,33 +24,22 @@ import org.springframework.util.ObjectUtils;
* Value object for an index field.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public final class IndexField {
enum Type {
GEO, TEXT, DEFAULT;
}
private final String key;
private final Direction direction;
private final Type type;
private final Float weight;
private final boolean isGeo;
private IndexField(String key, Direction direction, Type type) {
this(key, direction, type, Float.NaN);
}
private IndexField(String key, Direction direction, Type type, Float weight) {
private IndexField(String key, Direction direction, boolean isGeo) {
Assert.hasText(key);
Assert.isTrue(direction != null ^ (Type.GEO.equals(type) || Type.TEXT.equals(type)));
Assert.isTrue(direction != null ^ isGeo);
this.key = key;
this.direction = direction;
this.type = type == null ? Type.DEFAULT : type;
this.weight = weight == null ? Float.NaN : weight;
this.isGeo = isGeo;
}
/**
@@ -64,12 +53,12 @@ public final class IndexField {
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), Type.DEFAULT);
return new IndexField(key, order.toDirection(), false);
}
public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, Type.DEFAULT);
return new IndexField(key, order, false);
}
/**
@@ -79,16 +68,7 @@ public final class IndexField {
* @return
*/
public static IndexField geo(String key) {
return new IndexField(key, null, Type.GEO);
}
/**
* Creates a text {@link IndexField} for the given key.
*
* @since 1.6
*/
public static IndexField text(String key, Float weight) {
return new IndexField(key, null, Type.TEXT, weight);
return new IndexField(key, null, true);
}
/**
@@ -121,20 +101,10 @@ public final class IndexField {
/**
* Returns whether the {@link IndexField} is a geo index field.
*
* @return true if type is {@link Type#GEO}.
* @return the isGeo
*/
public boolean isGeo() {
return Type.GEO.equals(type);
}
/**
* Returns whether the {@link IndexField} is a text index field.
*
* @return true if type is {@link Type#TEXT}
* @since 1.6
*/
public boolean isText() {
return Type.TEXT.equals(type);
return isGeo;
}
/*
@@ -155,7 +125,7 @@ public final class IndexField {
IndexField that = (IndexField) obj;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.type == that.type;
&& this.isGeo == that.isGeo;
}
/*
@@ -168,8 +138,7 @@ public final class IndexField {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(type);
result += 31 * ObjectUtils.nullSafeHashCode(weight);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
return result;
}
@@ -179,7 +148,6 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, direction: %s, type: %s, weight: %s]", key, direction, type, weight);
return String.format("IndexField [ key: %s, direction: %s, isGeo: %s]", key, direction, isGeo);
}
}
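A quick sketch of the two factory methods present on both sides of this change; the field names are illustrative.

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.IndexField;

public class IndexFieldSketch {
    public static void main(String[] args) {
        // Exactly one of a sort direction or the geo flag applies to a field.
        IndexField age = IndexField.create("age", Direction.ASC);
        IndexField location = IndexField.geo("location");
        System.out.println(age.isGeo() + " / " + location.isGeo()); // false / true
    }
}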

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2014 the original author or authors.
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,11 +23,6 @@ import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class IndexInfo {
private final List<IndexField> indexFields;
@@ -36,30 +31,14 @@ public class IndexInfo {
private final boolean unique;
private final boolean dropDuplicates;
private final boolean sparse;
private final String language;
/**
* @deprecated Will be removed in 1.7. Please use {@link #IndexInfo(List, String, boolean, boolean, boolean, String)}
* @param indexFields
* @param name
* @param unique
* @param dropDuplicates
* @param sparse
*/
@Deprecated
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
this(indexFields, name, unique, dropDuplicates, sparse, "");
}
public IndexInfo(List<IndexField> indexFields, String name, boolean unique, boolean dropDuplicates, boolean sparse,
String language) {
this.indexFields = Collections.unmodifiableList(indexFields);
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
this.language = language;
}
/**
@@ -105,23 +84,14 @@ public class IndexInfo {
return sparse;
}
/**
* @return
* @since 1.6
*/
public String getLanguage() {
return language;
}
@Override
public String toString() {
return "IndexInfo [indexFields=" + indexFields + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + ", language=" + language + "]";
+ dropDuplicates + ", sparse=" + sparse + "]";
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
@@ -129,7 +99,6 @@ public class IndexInfo {
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
result = prime * result + ObjectUtils.nullSafeHashCode(language);
return result;
}
@@ -168,9 +137,6 @@ public class IndexInfo {
if (unique != other.unique) {
return false;
}
if (!ObjectUtils.nullSafeEquals(language, other.language)) {
return false;
}
return true;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,6 @@ import java.lang.annotation.Target;
* @author Philipp Schneider
* @author Johno Crawford
* @author Thomas Darimont
* @author Christoph Strobl
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -89,9 +88,9 @@ public @interface Indexed {
*
* <pre>
* <code>
* db.root.createIndex( { hybrid.h1: 1 } , { name: "hybrid.index" } )
* db.root.createIndex( { nested.n1: 1 } , { name: "nested.index" } )
* db.hybrid.createIndex( { h1: 1} , { name: "index" } )
* db.root.ensureIndex( { hybrid.h1: 1 } , { name: "hybrid.index" } )
* db.root.ensureIndex( { nested.n1: 1 } , { name: "nested.index" } )
* db.hybrid.ensureIndex( { h1: 1} , { name: "index" } )
* </code>
* </pre>
*
@@ -109,13 +108,10 @@ public @interface Indexed {
boolean useGeneratedName() default false;
/**
* Collection name for index to be created on.
* Colleciton name for index to be created on.
*
* @return
* @deprecated The collection name is derived from the domain type. Fixing the collection via this attribute might
* result in broken definitions. Will be removed in 1.7.
*/
@Deprecated
String collection() default "";
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -60,4 +60,10 @@ public class MongoMappingEventPublisher implements ApplicationEventPublisher {
indexCreator.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>) event);
}
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationEventPublisher#publishEvent(java.lang.Object)
*/
public void publishEvent(Object event) {}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,6 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexRes
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
/**
@@ -43,8 +42,7 @@ import org.springframework.util.Assert;
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
public class MongoPersistentEntityIndexCreator implements ApplicationListener<MappingContextEvent<?, ?>> {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoPersistentEntityIndexCreator.class);
@@ -54,7 +52,7 @@ public class MongoPersistentEntityIndexCreator implements
private final IndexResolver indexResolver;
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -65,7 +63,7 @@ public class MongoPersistentEntityIndexCreator implements
}
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -92,7 +90,7 @@ public class MongoPersistentEntityIndexCreator implements
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event) {
public void onApplicationEvent(MappingContextEvent<?, ?> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
@@ -102,7 +100,7 @@ public class MongoPersistentEntityIndexCreator implements
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes(event.getPersistentEntity());
checkForIndexes((MongoPersistentEntity<?>) entity);
}
}
@@ -132,8 +130,8 @@ public class MongoPersistentEntityIndexCreator implements
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection()).createIndex(indexDefinition.getIndexKeys(),
indexDefinition.getIndexOptions());
}
/**

View File

@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.core.index;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
@@ -29,9 +28,6 @@ import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.TextIndexIncludeOptions.IncludeStrategy;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexDefinitionBuilder;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexedFieldSpec;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
@@ -96,7 +92,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", root.getCollection(), root));
indexInformation.addAll(potentiallyCreateTextIndexDefinition(root));
final CycleGuard guard = new CycleGuard();
@@ -192,86 +187,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return createCompoundIndexDefinitions(dotPath, collection, entity);
}
private Collection<? extends IndexDefinitionHolder> potentiallyCreateTextIndexDefinition(MongoPersistentEntity<?> root) {
TextIndexDefinitionBuilder indexDefinitionBuilder = new TextIndexDefinitionBuilder().named(root.getType()
.getSimpleName() + "_TextIndex");
if (StringUtils.hasText(root.getLanguage())) {
indexDefinitionBuilder.withDefaultLanguage(root.getLanguage());
}
try {
appendTextIndexInformation("", indexDefinitionBuilder, root,
new TextIndexIncludeOptions(IncludeStrategy.DEFAULT), new CycleGuard());
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
}
TextIndexDefinition indexDefinition = indexDefinitionBuilder.build();
if (!indexDefinition.hasFieldSpec()) {
return Collections.emptyList();
}
IndexDefinitionHolder holder = new IndexDefinitionHolder("", indexDefinition, root.getCollection());
return Collections.singletonList(holder);
}
private void appendTextIndexInformation(final String dotPath,
final TextIndexDefinitionBuilder indexDefinitionBuilder, final MongoPersistentEntity<?> entity,
final TextIndexIncludeOptions includeOptions, final CycleGuard guard) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
guard.protect(persistentProperty, dotPath);
if (persistentProperty.isExplicitLanguageProperty() && !StringUtils.hasText(dotPath)) {
indexDefinitionBuilder.withLanguageOverride(persistentProperty.getFieldName());
}
TextIndexed indexed = persistentProperty.findAnnotation(TextIndexed.class);
if (includeOptions.isForce() || indexed != null || persistentProperty.isEntity()) {
String propertyDotPath = (StringUtils.hasText(dotPath) ? dotPath + "." : "")
+ persistentProperty.getFieldName();
Float weight = indexed != null ? indexed.weight()
: (includeOptions.getParentFieldSpec() != null ? includeOptions.getParentFieldSpec().getWeight() : 1.0F);
if (persistentProperty.isEntity()) {
TextIndexIncludeOptions optionsForNestedType = includeOptions;
if (!IncludeStrategy.FORCE.equals(includeOptions.getStrategy()) && indexed != null) {
optionsForNestedType = new TextIndexIncludeOptions(IncludeStrategy.FORCE, new TextIndexedFieldSpec(
propertyDotPath, weight));
}
try {
appendTextIndexInformation(propertyDotPath, indexDefinitionBuilder,
mappingContext.getPersistentEntity(persistentProperty.getActualType()), optionsForNestedType, guard);
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage(), e);
} catch (InvalidDataAccessApiUsageException e) {
LOGGER.warn(
String.format("Potentially invald index structure discovered. Breaking operation for %s.",
entity.getName()), e);
}
} else if (includeOptions.isForce() || indexed != null) {
indexDefinitionBuilder.onField(propertyDotPath, weight);
}
}
}
});
}
/**
* Create {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} for {@link CompoundIndexes} of given type.
*
@@ -324,6 +239,16 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexDefinition.background();
}
int ttl = index.expireAfterSeconds();
if (ttl >= 0) {
if (indexDefinition.getIndexKeys().keySet().size() > 1) {
LOGGER.warn("TTL is not supported for compound index with more than one key. TTL={} will be ignored.", ttl);
} else {
indexDefinition.expire(ttl, TimeUnit.SECONDS);
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
@@ -641,41 +566,4 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return indexDefinition.getIndexOptions();
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
static class TextIndexIncludeOptions {
enum IncludeStrategy {
FORCE, DEFAULT;
}
private final IncludeStrategy strategy;
private final TextIndexedFieldSpec parentFieldSpec;
public TextIndexIncludeOptions(IncludeStrategy strategy, TextIndexedFieldSpec parentFieldSpec) {
this.strategy = strategy;
this.parentFieldSpec = parentFieldSpec;
}
public TextIndexIncludeOptions(IncludeStrategy strategy) {
this(strategy, null);
}
public IncludeStrategy getStrategy() {
return strategy;
}
public TextIndexedFieldSpec getParentFieldSpec() {
return parentFieldSpec;
}
public boolean isForce() {
return IncludeStrategy.FORCE.equals(strategy);
}
}
}
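To illustrate the expireAfterSeconds handling in the resolver above: a single-key compound index is turned into a TTL index via expire(ttl, SECONDS), while a definition with more than one key only triggers the warning and the value is ignored. The domain types below are illustrative only.

import java.util.Date;
import org.springframework.data.mongodb.core.index.CompoundIndex;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
// Single key: expireAfterSeconds is honoured and translated into a TTL index.
@CompoundIndex(name = "created_ttl_idx", def = "{ 'createdAt': 1 }", expireAfterSeconds = 3600)
public class AuditEntry {
    Date createdAt;
}

@Document
// More than one key: TTL is not supported, the resolver logs a warning and ignores the value.
@CompoundIndex(name = "created_type_idx", def = "{ 'createdAt': 1, 'type': 1 }", expireAfterSeconds = 3600)
class EventEntry {
    Date createdAt;
    String type;
}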

View File

@@ -1,336 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.Collection;
import java.util.LinkedHashSet;
import java.util.Set;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* {@link IndexDefinition} to span multiple keys for text search.
*
* @author Christoph Strobl
* @since 1.6
*/
public class TextIndexDefinition implements IndexDefinition {
private String name;
private Set<TextIndexedFieldSpec> fieldSpecs;
private String defaultLanguage;
private String languageOverride;
TextIndexDefinition() {
fieldSpecs = new LinkedHashSet<TextIndexedFieldSpec>();
}
/**
* Creates a {@link TextIndexDefinition} for all fields in the document.
*
* @return
*/
public static TextIndexDefinition forAllFields() {
return new TextIndexDefinitionBuilder().onAllFields().build();
}
/**
* Get {@link TextIndexDefinitionBuilder} to create {@link TextIndexDefinition}.
*
* @return
*/
public static TextIndexDefinitionBuilder builder() {
return new TextIndexDefinitionBuilder();
}
/**
* @param fieldSpec
*/
public void addFieldSpec(TextIndexedFieldSpec fieldSpec) {
this.fieldSpecs.add(fieldSpec);
}
/**
* @param fieldSpecs
*/
public void addFieldSpecs(Collection<TextIndexedFieldSpec> fieldSpecs) {
this.fieldSpecs.addAll(fieldSpecs);
}
/**
* Returns if the {@link TextIndexDefinition} has fields assigned.
*
* @return
*/
public boolean hasFieldSpec() {
return !fieldSpecs.isEmpty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
DBObject keys = new BasicDBObject();
for (TextIndexedFieldSpec fieldSpec : fieldSpecs) {
keys.put(fieldSpec.fieldname, "text");
}
return keys;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexOptions()
*/
@Override
public DBObject getIndexOptions() {
DBObject options = new BasicDBObject();
if (StringUtils.hasText(name)) {
options.put("name", name);
}
if (StringUtils.hasText(defaultLanguage)) {
options.put("default_language", defaultLanguage);
}
BasicDBObject weightsDbo = new BasicDBObject();
for (TextIndexedFieldSpec fieldSpec : fieldSpecs) {
if (fieldSpec.isWeighted()) {
weightsDbo.put(fieldSpec.getFieldname(), fieldSpec.getWeight());
}
}
if (!weightsDbo.isEmpty()) {
options.put("weights", weightsDbo);
}
if (StringUtils.hasText(languageOverride)) {
options.put("language_override", languageOverride);
}
return options;
}
/**
* @author Christoph Strobl
* @since 1.6
*/
public static class TextIndexedFieldSpec {
private final String fieldname;
private final Float weight;
/**
* Create new {@link TextIndexedFieldSpec} for given fieldname without any weight.
*
* @param fieldname
*/
public TextIndexedFieldSpec(String fieldname) {
this(fieldname, 1.0F);
}
/**
* Create new {@link TextIndexedFieldSpec} for given fieldname and weight.
*
* @param fieldname
* @param weight
*/
public TextIndexedFieldSpec(String fieldname, Float weight) {
Assert.hasText(fieldname, "Text index field cannot be blank.");
this.fieldname = fieldname;
this.weight = weight != null ? weight : 1.0F;
}
/**
* Get the fieldname associated with the {@link TextIndexedFieldSpec}.
*
* @return
*/
public String getFieldname() {
return fieldname;
}
/**
* Get the weight associated with the {@link TextIndexedFieldSpec}.
*
* @return
*/
public Float getWeight() {
return weight;
}
/**
* @return true if {@link #weight} has a value other than the default of {@literal 1.0}.
*/
public boolean isWeighted() {
return this.weight != null && this.weight.compareTo(1.0F) != 0;
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(fieldname);
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (!(obj instanceof TextIndexedFieldSpec)) {
return false;
}
TextIndexedFieldSpec other = (TextIndexedFieldSpec) obj;
return ObjectUtils.nullSafeEquals(this.fieldname, other.fieldname);
}
}
/**
* {@link TextIndexDefinitionBuilder} helps defining options for creating {@link TextIndexDefinition}.
*
* @author Christoph Strobl
* @since 1.6
*/
public static class TextIndexDefinitionBuilder {
private TextIndexDefinition instance;
private static final TextIndexedFieldSpec ALL_FIELDS = new TextIndexedFieldSpec("$**");
public TextIndexDefinitionBuilder() {
this.instance = new TextIndexDefinition();
}
/**
* Define the name to be used when creating the index in the store.
*
* @param name
* @return
*/
public TextIndexDefinitionBuilder named(String name) {
this.instance.name = name;
return this;
}
/**
* Define the index to span all fields using a wildcard. <br/>
* <strong>NOTE</strong> {@link TextIndexDefinition} cannot contain any other fields when defined with wildcard.
*
* @return
*/
public TextIndexDefinitionBuilder onAllFields() {
if (!instance.fieldSpecs.isEmpty()) {
throw new InvalidDataAccessApiUsageException("Cannot add wildcard fieldspect to non empty.");
}
this.instance.fieldSpecs.add(ALL_FIELDS);
return this;
}
/**
* Include given fields with default weight.
*
* @param fieldnames
* @return
*/
public TextIndexDefinitionBuilder onFields(String... fieldnames) {
for (String fieldname : fieldnames) {
onField(fieldname);
}
return this;
}
/**
* Include given field with default weight.
*
* @param fieldname
* @return
*/
public TextIndexDefinitionBuilder onField(String fieldname) {
return onField(fieldname, 1F);
}
/**
* Include given field with weight.
*
* @param fieldname
* @param weight
* @return
*/
public TextIndexDefinitionBuilder onField(String fieldname, Float weight) {
if (this.instance.fieldSpecs.contains(ALL_FIELDS)) {
throw new InvalidDataAccessApiUsageException(String.format("Cannot add %s to field spec for all fields.",
fieldname));
}
this.instance.fieldSpecs.add(new TextIndexedFieldSpec(fieldname, weight));
return this;
}
/**
* Define the default language to be used when indexing documents.
*
* @param language
* @see http://docs.mongodb.org/manual/tutorial/specify-language-for-text-index/#specify-default-language-text-index
* @return
*/
public TextIndexDefinitionBuilder withDefaultLanguage(String language) {
this.instance.defaultLanguage = language;
return this;
}
/**
* Define field for language override.
*
* @param fieldname
* @return
*/
public TextIndexDefinitionBuilder withLanguageOverride(String fieldname) {
if (StringUtils.hasText(this.instance.languageOverride)) {
throw new InvalidDataAccessApiUsageException(String.format(
"Cannot set language override on %s as it is already defined on %s.", fieldname,
this.instance.languageOverride));
}
this.instance.languageOverride = fieldname;
return this;
}
public TextIndexDefinition build() {
return this.instance;
}
}
}
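For orientation, a hedged usage sketch of the TextIndexDefinitionBuilder defined in this file; the index name and field names are invented, and the comments describe the keys and options produced by getIndexKeys()/getIndexOptions() above:

import org.springframework.data.mongodb.core.index.TextIndexDefinition;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexDefinitionBuilder;

class TextIndexDefinitionSketch {

	static TextIndexDefinition blogPostTextIndex() {
		// Keys: { "title" : "text", "content" : "text" }
		// Options: name, weights { "title" : 3.0 }, default_language and language_override.
		return new TextIndexDefinitionBuilder()
				.named("blogpost_text_idx")   // invented index name
				.onField("title", 3F)         // weighted field, shows up in the weights option
				.onField("content")           // default weight 1.0, omitted from weights
				.withDefaultLanguage("english")
				.withLanguageOverride("lang") // invented override field
				.build();
	}
}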


@@ -1,44 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* {@link TextIndexed} marks a field to be part of the text index. As there can be only one text index per collection,
* all fields marked with {@link TextIndexed} are combined into a single index. <br />
*
* @author Christoph Strobl
* @since 1.6
*/
@Documented
@Target({ ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
public @interface TextIndexed {
/**
* Defines the significance of the field relative to other indexed fields. The value directly influences the document's
* score. <br/>
* Defaulted to {@literal 1.0}.
*
* @return
*/
float weight() default 1.0F;
}
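A minimal sketch of how the annotation could be applied to a domain type; the class and field names are invented:

import org.springframework.data.mongodb.core.index.TextIndexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class BlogPost {

	@TextIndexed(weight = 3F) String title; // matches in the title score higher
	@TextIndexed String content;            // default weight 1.0

	// Both fields are combined into the single text index of the collection.
}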


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -35,11 +35,9 @@ import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
/**
@@ -49,45 +47,36 @@ import org.springframework.util.StringUtils;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private static final SpelExpressionParser PARSER = new SpelExpressionParser();
private final String collection;
private final String language;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
private final Expression expression;
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
* collection name to the entity's simple type name.
*
* @param typeInformation must not be {@literal null}.
* @param typeInformation
*/
public BasicMongoPersistentEntity(TypeInformation<T> typeInformation) {
super(typeInformation, MongoPersistentPropertyComparator.INSTANCE);
this.parser = new SpelExpressionParser();
this.context = new StandardEvaluationContext();
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
Document document = rawType.getAnnotation(Document.class);
this.expression = detectExpression(document);
this.context = new StandardEvaluationContext();
if (document != null) {
this.collection = StringUtils.hasText(document.collection()) ? document.collection() : fallback;
this.language = StringUtils.hasText(document.language()) ? document.language() : "";
if (rawType.isAnnotationPresent(Document.class)) {
Document d = rawType.getAnnotation(Document.class);
this.collection = StringUtils.hasText(d.collection()) ? d.collection() : fallback;
} else {
this.collection = fallback;
this.language = "";
}
}
@@ -107,34 +96,8 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
return expression == null ? collection : expression.getValue(context, String.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getLanguage()
*/
@Override
public String getLanguage() {
return this.language;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getTextScoreProperty()
*/
@Override
public MongoPersistentProperty getTextScoreProperty() {
return getPersistentProperty(TextScore.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasTextScoreProperty()
*/
@Override
public boolean hasTextScoreProperty() {
return getTextScoreProperty() != null;
Expression expression = parser.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
return expression.getValue(context, String.class);
}
/*
@@ -144,22 +107,12 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
@Override
public void verify() {
verifyFieldUniqueness();
verifyFieldTypes();
}
private void verifyFieldUniqueness() {
AssertFieldNameUniquenessHandler handler = new AssertFieldNameUniquenessHandler();
doWithProperties(handler);
doWithAssociations(handler);
}
private void verifyFieldTypes() {
doWithProperties(new PropertyTypeAssertionHandler());
}
/**
* {@link Comparator} implementation inspecting the {@link MongoPersistentProperty}'s order.
*
@@ -241,31 +194,6 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return null;
}
/**
* Returns a SpEL {@link Expression} for the collection String expressed in the given {@link Document} annotation if
* present or {@literal null} otherwise. Will also return {@literal null} if the collection {@link String} evaluates
* to a {@link LiteralExpression} (indicating that no subsequent evaluation is necessary).
*
* @param document can be {@literal null}
* @return
*/
private static Expression detectExpression(Document document) {
if (document == null) {
return null;
}
String collection = document.collection();
if (!StringUtils.hasText(collection)) {
return null;
}
Expression expression = PARSER.parseExpression(document.collection(), ParserContext.TEMPLATE_EXPRESSION);
return expression instanceof LiteralExpression ? null : expression;
}
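To illustrate the distinction drawn by detectExpression(…): a plain collection name parses to a LiteralExpression and is used as-is, while a template expression is kept and re-evaluated when the collection name is resolved. A hedged sketch; the bean/property referenced in the template is invented and only resolves if the evaluation context is set up accordingly:

import org.springframework.data.mongodb.core.mapping.Document;

// Literal value: detectExpression(…) returns null, "persons" is used directly.
@Document(collection = "persons")
class Person {
}

// Template expression: kept and evaluated on collection name resolution.
@Document(collection = "#{tenantProvider.tenantId}_orders") // 'tenantProvider' is hypothetical
class Order {
}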
/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.
@@ -298,46 +226,4 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
properties.put(fieldName, property);
}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
private static class PropertyTypeAssertionHandler implements PropertyHandler<MongoPersistentProperty> {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
potentiallyAssertTextScoreType(persistentProperty);
potentiallyAssertLanguageType(persistentProperty);
}
private void potentiallyAssertLanguageType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isExplicitLanguageProperty()) {
assertPropertyType(persistentProperty, String.class);
}
}
private void potentiallyAssertTextScoreType(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isTextScoreProperty()) {
assertPropertyType(persistentProperty, Float.class, Double.class);
}
}
private void assertPropertyType(MongoPersistentProperty persistentProperty, Class<?>... validMatches) {
for (Class<?> potentialMatch : validMatches) {
if (ClassUtils.isAssignable(potentialMatch, persistentProperty.getActualType())) {
return;
}
}
throw new MappingException(String.format("Missmatching types for %s. Found %s expected one of %s.",
persistentProperty.getField(), persistentProperty.getActualType(),
StringUtils.arrayToCommaDelimitedString(validMatches)));
}
}
}
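For reference, a hedged sketch of a document type that passes the PropertyTypeAssertionHandler checks above; the class and field names are invented:

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Language;
import org.springframework.data.mongodb.core.mapping.TextScore;

@Document
class Article {

	@Language String language; // must be a String, otherwise verify() throws a MappingException
	@TextScore Float score;    // must be Float or Double
}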


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,9 +27,7 @@ import org.slf4j.LoggerFactory;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.StringUtils;
@@ -41,7 +39,6 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class BasicMongoPersistentProperty extends AnnotationBasedPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
@@ -49,7 +46,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
private static final Logger LOG = LoggerFactory.getLogger(BasicMongoPersistentProperty.class);
private static final String ID_FIELD_NAME = "_id";
private static final String LANGUAGE_FIELD_NAME = "language";
private static final Set<Class<?>> SUPPORTED_ID_TYPES = new HashSet<Class<?>>();
private static final Set<String> SUPPORTED_ID_PROPERTY_NAMES = new HashSet<String>();
@@ -100,8 +96,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
}
// We need to support a wider range of ID types than just the ones that can be converted to an ObjectId
// but still we need to check if there happens to be an explicit name set
return SUPPORTED_ID_PROPERTY_NAMES.contains(getName()) && !hasExplicitFieldName();
return SUPPORTED_ID_PROPERTY_NAMES.contains(getName());
}
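With the hasExplicitFieldName() guard in place (one side of the hunk above), a property that merely happens to be named id but carries an explicit @Field mapping is not treated as the _id property. A minimal sketch with invented names:

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

@Document
class LegacyAccount {

	@Field("accountId") String id; // explicit field name -> not mapped to "_id" when the guard is active
}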
/*
@@ -135,8 +130,10 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
}
}
if (hasExplicitFieldName()) {
return getAnnotatedFieldName();
org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);
if (annotation != null && StringUtils.hasText(annotation.value())) {
return annotation.value();
}
String fieldName = fieldNamingStrategy.getFieldName(this);
@@ -149,26 +146,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return fieldName;
}
/**
* @return true if a {@link org.springframework.data.mongodb.core.mapping.Field} annotation with a non-blank
* {@link org.springframework.data.mongodb.core.mapping.Field#value()} is present.
* @since 1.7
*/
protected boolean hasExplicitFieldName() {
return StringUtils.hasText(getAnnotatedFieldName());
}
private String getAnnotatedFieldName() {
org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);
if (annotation != null && StringUtils.hasText(annotation.value())) {
return annotation.value();
}
return null;
}
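Both variants above resolve the stored field name the same way: an explicit, non-blank @Field value wins, otherwise the configured FieldNamingStrategy is consulted. A hedged sketch with invented names:

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

@Document
class Contact {

	@Field("fn") String firstname; // stored as "fn" - the explicit @Field value is used
	String lastname;               // resolved via the FieldNamingStrategy (defaults to "lastname")
}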
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getFieldOrder()
@@ -202,31 +179,4 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
public DBRef getDBRef() {
return findAnnotation(DBRef.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isLanguageProperty()
*/
@Override
public boolean isLanguageProperty() {
return getFieldName().equals(LANGUAGE_FIELD_NAME) || isExplicitLanguageProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isExplicitLanguageProperty()
*/
@Override
public boolean isExplicitLanguageProperty() {
return isAnnotationPresent(Language.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isTextScoreProperty()
*/
@Override
public boolean isTextScoreProperty() {
return isAnnotationPresent(TextScore.class);
}
}


@@ -18,7 +18,6 @@ package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
/**

Some files were not shown because too many files have changed in this diff Show More