Compare commits


103 Commits
1.9.x ... 1.5.x

Author SHA1 Message Date
Oliver Gierke
13823690fc DATAMONGO-1246 - After release cleanups. 2015-07-01 09:17:13 +02:00
Spring Buildmaster
fc0187eb8e DATAMONGO-1246 - Prepare next development iteration. 2015-06-30 23:18:05 -07:00
Spring Buildmaster
07bf5a4f7e DATAMONGO-1246 - Release version 1.5.6.RELEASE. 2015-06-30 23:18:02 -07:00
Oliver Gierke
274adf320d DATAMONGO-1246 - Prepare 1.5.6.RELEASE (Dijkstra SR6). 2015-07-01 08:06:03 +02:00
Oliver Gierke
aed983462f DATAMONGO-1246 - Updated changelog. 2015-07-01 08:06:03 +02:00
Oliver Gierke
0f5179eb26 DATAMONGO-1247 - Updated changelog. 2015-07-01 07:48:40 +02:00
Oliver Gierke
df40adb618 DATAMONGO-1248 - Updated changelog. 2015-06-30 14:03:28 +02:00
Oliver Gierke
339d8e6f14 DATAMONGO-1228 - Updated changelog. 2015-06-30 14:03:21 +02:00
Oliver Gierke
292e645387 DATAMONGO-1189 - Updated changelog. 2015-06-30 14:02:56 +02:00
Oliver Gierke
df7c007e9e DATAMONGO-1173 - Updated changelog. 2015-06-30 14:01:38 +02:00
Oliver Gierke
9f3465f63c DATAMONGO-1144 - Updated changelog. 2015-06-30 14:01:00 +02:00
Christoph Strobl
2e7c166d38 DATAMONGO-1232 - IgnoreCase in criteria now escapes the query.
We now quote the original criteria value before wrapping it in a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like; in that case we quote the part between the leading and trailing wildcards if required.

Original pull request: #301.
2015-06-24 20:19:22 +02:00
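
A minimal sketch of the quoting idea described above, using plain java.util.regex rather than the framework's internal code (class and method names are made up):

    import java.util.regex.Pattern;

    class IgnoreCaseQuotingSketch {

        // Quote the raw criteria value so regex metacharacters are matched literally,
        // then use it as a case-insensitive "starts with" expression.
        static Pattern startingWithIgnoreCase(String criteriaValue) {
            return Pattern.compile("^" + Pattern.quote(criteriaValue), Pattern.CASE_INSENSITIVE);
        }
    }
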
Christoph Strobl
ac7e3a708a DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:47:01 +02:00
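
A hedged usage sketch (database name, collection and aggregation are made up): with a read preference configured on the template, the commands issued by aggregate(…) and geoNear(…) should run against it as described above.

    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.aggregation.Aggregation;
    import org.springframework.data.mongodb.core.aggregation.AggregationResults;

    import com.mongodb.DBObject;
    import com.mongodb.MongoClient;
    import com.mongodb.ReadPreference;

    class ReadPreferenceAggregationSketch {

        public static void main(String[] args) throws Exception {
            MongoTemplate template = new MongoTemplate(new MongoClient(), "database");

            // Commands such as aggregate(…) and geoNear(…) now honour this setting.
            template.setReadPreference(ReadPreference.secondaryPreferred());

            Aggregation aggregation = Aggregation.newAggregation(Aggregation.group("lastname").count().as("total"));
            AggregationResults<DBObject> results = template.aggregate(aggregation, "person", DBObject.class);
            System.out.println(results.getMappedResults());
        }
    }
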
Oliver Gierke
fa50f74729 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:55:07 +02:00
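
A hypothetical repository sketch (Person and Address are assumed domain types): the ignore-case flag of the derived query below now applies to the leaf property city rather than the root property address.

    import java.util.List;

    import org.springframework.data.repository.Repository;

    class Address { String city; }

    class Person { String id; Address address; }

    interface PersonRepository extends Repository<Person, String> {

        // "IgnoreCase" is applied to the leaf property "address.city".
        List<Person> findByAddressCityIgnoreCase(String city);
    }
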
Eddú Meléndez
2aba831389 DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:39:22 +02:00
Oliver Gierke
33c665fb98 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as NoOp.
2015-05-25 13:13:49 +02:00
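
A sketch of the kind of listener the commit refers to, assuming a Person domain type and the 1.5-era callback signature of AbstractMongoEventListener (later versions hand in event objects instead):

    import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;

    import com.mongodb.DBObject;

    class Person { String id; }

    class PersonBeforeSaveListener extends AbstractMongoEventListener<Person> {

        @Override
        public void onBeforeSave(Person source, DBObject dbo) {
            // Adjust the document before it is written, e.g. add an audit field.
            dbo.put("lastTouched", System.currentTimeMillis());
        }
    }
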
Oliver Gierke
55f83fa0f2 DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:08:29 +02:00
Oliver Gierke
13f2472e2f DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
Previously, if a collection containing null values was handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:49:08 +02:00
Oliver Gierke
5de9994093 DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:24:43 +01:00
Thomas Darimont
4d24742181 DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:23:36 +01:00
Oliver Gierke
4af876152c DATAMONGO-1155 - Upgraded mongo-next build profile to driver version 2.13.0.
Related ticket: DATAMONGO-1154.
2015-01-29 21:17:19 +01:00
Oliver Gierke
872e706d2a DATAMONGO-1143 - After release cleanups. 2015-01-27 18:33:15 +01:00
Spring Buildmaster
51c66723e9 DATAMONGO-1143 - Prepare next development iteration. 2015-01-27 06:39:05 -08:00
Spring Buildmaster
ccfab5d91a DATAMONGO-1143 - Release version 1.5.5.RELEASE (Dijkstra SR5). 2015-01-27 06:38:56 -08:00
Oliver Gierke
c94c3f3791 DATAMONGO-1143 - Prepare 1.5.5.RELEASE (Dijkstra SR5). 2015-01-27 15:10:01 +01:00
Oliver Gierke
4924ff54b6 DATAMONGO-1143 - Updated changelog. 2015-01-27 14:35:12 +01:00
Oliver Gierke
edd9b58d1c DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw write targets, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:03:12 +01:00
alex-on-java
dc6cbde842 DATAMONGO-1147 - Remove manual array copy.
Removed manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 18:05:59 +01:00
Christoph Strobl
aaaf7e57de DATAMONGO-1143 - Upgrade dependencies.
mongo-java-driver: 2.12.1 -> 2.12.5
2015-01-21 20:52:35 +01:00
Christoph Strobl
76538aeff1 DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:18:01 +01:00
Oliver Gierke
1181a21e51 DATAMONGO-1106 - Updated changelog. 2015-01-20 07:40:28 +01:00
Oliver Gierke
c7f07d2386 DATAMONGO-1079 - Updated changelog. 2015-01-20 07:40:03 +01:00
Oliver Gierke
054ad6fc12 DATAMONGO-1046 - Updated changelog. 2015-01-20 07:39:35 +01:00
Oliver Gierke
1b4ecac732 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSphere if a non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we checked for null, which is never the case as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:22:16 +01:00
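
A hypothetical derived-query sketch: handing in a Distance with a non-neutral Metric (kilometers here) is what now makes the query creator emit $nearSphere. Geo types are shown from org.springframework.data.geo; in older versions they live under org.springframework.data.mongodb.core.geo.

    import java.util.List;

    import org.springframework.data.geo.Distance;
    import org.springframework.data.geo.Metrics;
    import org.springframework.data.geo.Point;
    import org.springframework.data.repository.Repository;

    class Restaurant { String id; Point location; }

    interface RestaurantRepository extends Repository<Restaurant, String> {

        // A non-neutral Metric on the Distance triggers $nearSphere instead of $near.
        List<Restaurant> findByLocationNear(Point location, Distance maxDistance);
    }

    class GeoQueryUsage {
        static List<Restaurant> nearby(RestaurantRepository repository) {
            return repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(10, Metrics.KILOMETERS));
        }
    }
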
Oliver Gierke
1ef03bf707 DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to override that.
2015-01-07 15:02:49 +01:00
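
A hedged sketch of that hint: the server-side geoNear command caps the number of results by default, so an explicit limit on the NearQuery is needed to get more. Restaurant is the same assumed mapped type as in the sketch above; geo type packages vary across versions.

    import org.springframework.data.geo.Distance;
    import org.springframework.data.geo.GeoResults;
    import org.springframework.data.geo.Metrics;
    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.MongoOperations;
    import org.springframework.data.mongodb.core.query.NearQuery;

    class GeoNearLimitSketch {

        static GeoResults<Restaurant> nearby(MongoOperations operations) {
            NearQuery query = NearQuery.near(new Point(-73.99, 40.73))
                    .maxDistance(new Distance(5, Metrics.KILOMETERS))
                    .num(500); // raise the default result limit explicitly
            return operations.geoNear(query, Restaurant.class);
        }
    }
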
Oliver Gierke
a1febdb541 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:51:53 +01:00
Thomas Darimont
79fdf44d24 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:51:46 +01:00
Christoph Strobl
d8bb644b30 DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:51:38 +01:00
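
A hedged sketch of such a key converter (Currency keys are an arbitrary example): a registered Converter whose target type is String is now also applied to map keys.

    import java.util.Arrays;
    import java.util.Currency;

    import org.springframework.core.convert.converter.Converter;
    import org.springframework.data.convert.WritingConverter;
    import org.springframework.data.mongodb.core.convert.CustomConversions;

    class MapKeyConverterSketch {

        @WritingConverter
        enum CurrencyToStringConverter implements Converter<Currency, String> {

            INSTANCE;

            public String convert(Currency source) {
                // Map<Currency, …> keys are stored as "EUR", "USD", … in the document.
                return source.getCurrencyCode();
            }
        }

        static CustomConversions customConversions() {
            return new CustomConversions(Arrays.asList(CurrencyToStringConverter.INSTANCE));
        }
    }
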
Oliver Gierke
73db7b06b9 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:38:23 +01:00
Oliver Gierke
8483f36f48 DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:45 +01:00
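
A hypothetical sketch of a SpEL-defined collection name (TenantContext is made up); with this change the expression is re-evaluated for every query execution instead of being resolved once.

    import org.springframework.data.mongodb.core.mapping.Document;

    // The collection name is a SpEL expression and is now re-evaluated per query execution.
    @Document(collection = "#{T(com.example.TenantContext).currentTenant() + '_orders'}")
    class Order {
        String id;
    }
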
Christoph Strobl
b426bdd64a DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:59 +01:00
Christoph Strobl
3349454a51 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword: if the target property is of type String we use a regular expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Original pull request: #241.
2014-11-27 17:14:46 +01:00
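
A hypothetical repository sketch contrasting the two cases described above:

    import java.util.List;

    import org.springframework.data.repository.Repository;

    class Person {
        String id;
        String lastname;
        List<String> nicknames;
    }

    interface PersonRepository extends Repository<Person, String> {

        // String property: CONTAINS is translated into a regular expression.
        List<Person> findByLastnameContaining(String part);

        // Collection property: CONTAINS is translated into an exact match via $in.
        List<Person> findByNicknamesContaining(String nickname);
    }
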
Mikhail Mikhaylenko
c04343a764 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely(…) to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 10:05:46 +01:00
Thomas Darimont
05311ea118 DATAMONGO-1094 - Fixed tests.
Due to changed behaviour in Spring Framework 3.2.12, the order in which nested config classes are analysed has changed. Therefore we need to configure the @EnableMongoRepositories annotation on SimpleConfigWithRepositories as well.
2014-11-26 10:03:04 +01:00
Thomas Darimont
a18633f847 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 19:06:44 +01:00
Oliver Gierke
29e2f87ee1 DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:38 +01:00
Christoph Strobl
5c5accd12d DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:38:35 +01:00
Thomas Darimont
08e534adcd DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 14:16:31 +02:00
Christoph Strobl
914acfea16 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 12:09:37 +02:00
Christoph Strobl
a6728f851b DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:51 +02:00
Oliver Gierke
81a7515cb7 DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 16:29:59 +02:00
Christoph Strobl
3f4087dc13 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DbRefs. This assures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:04:14 +02:00
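
A hypothetical mapping sketch: the explicit field name on a DBRef property is now honoured when the reference is written and read.

    import org.springframework.data.mongodb.core.mapping.DBRef;
    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.mongodb.core.mapping.Field;

    @Document
    class Publisher {
        String id;
        String name;
    }

    @Document
    class Book {
        String id;

        @DBRef
        @Field("pub") // the reference is now stored under "pub" instead of "publisher"
        Publisher publisher;
    }
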
Thomas Darimont
0430962861 DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress fails because the supposedly non-existent hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable.

Original pull request: #229.
2014-10-01 09:48:12 +02:00
Christoph Strobl
8bb62a65db DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows fine-grained removal of entities from specific collections using derived delete queries.

Original pull request: #223.
2014-09-04 15:49:28 +02:00
Christoph Strobl
d6defbcb56 DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests

Original pull request: #222.
2014-09-04 11:23:51 +02:00
Oliver Gierke
ed47850f71 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:52:15 +02:00
Oliver Gierke
ca1cfdf659 DATAMONGO-1033 - After release cleanups. 2014-08-27 12:40:22 +02:00
Spring Buildmaster
95d31d5bb7 DATAMONGO-1033 - Prepare next development iteration. 2014-08-27 03:11:38 -07:00
Spring Buildmaster
a7c3ef2aa8 DATAMONGO-1033 - Release version 1.5.4.RELEASE (Dijkstra SR4). 2014-08-27 03:11:35 -07:00
Oliver Gierke
3320e49c0b DATAMONGO-1033 - Prepare 1.5.4.RELEASE (Dijkstra SR4). 2014-08-27 11:57:52 +02:00
Oliver Gierke
286efca52d DATAMONGO-1033 - Updated changelog. 2014-08-27 11:28:08 +02:00
Christoph Strobl
92926befc9 DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Add JUnit rule and RunListener taking care of clean up task.

Original pull request: #221.
2014-08-27 11:12:58 +02:00
Oliver Gierke
703f24ae1c DATAMONGO-1021 - Updated changelog. 2014-08-27 07:18:28 +02:00
Oliver Gierke
febe703954 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at using a custom Converter to perform the conversion manually.
2014-08-26 20:05:58 +02:00
Oliver Gierke
440d16ebc6 DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the domain type collection to the query executing a query for a projection type.
2014-08-26 15:27:39 +02:00
Christoph Strobl
11c2e90736 DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors caused by equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions, where only the property's field name was taken into account when it should have been the full property path.

Original pull request: #219.
2014-08-26 15:23:59 +02:00
Thomas Darimont
816a567f29 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:32:21 +02:00
Oliver Gierke
e35486759b DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check. Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:26:02 +02:00
Christoph Strobl
b80c81f861 DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:25:53 +02:00
Oliver Gierke
005d21c0b6 DATAMONGO-1007 - After release cleanups. 2014-07-28 12:29:37 +02:00
Spring Buildmaster
3c8b7a54d6 DATAMONGO-1007 - Prepare next development iteration. 2014-07-28 02:23:57 -07:00
Spring Buildmaster
b2d59f2539 DATAMONGO-1007 - Release version 1.5.2 (Dijkstra SR2). 2014-07-28 02:14:12 -07:00
Oliver Gierke
1cc2830e95 DATAMONGO-1007 - Prepare 1.5.2.RELEASE (Dijkstra SR2). 2014-07-28 10:43:09 +02:00
Oliver Gierke
9a4d6f6fb7 DATAMONGO-1007 - Updated changelog. 2014-07-28 10:19:43 +02:00
Oliver Gierke
e96db8b69b DATAMONGO-981 - Updated changelog. 2014-07-25 06:20:22 +02:00
Christoph Strobl
13ce75779f DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazily loaded ones before saving. As it turns out, CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:42:05 +02:00
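
A hypothetical sketch of the lazily loaded reference whose proxy can now be saved back without issues (Book is a made-up referenced type):

    import java.util.List;

    import org.springframework.data.mongodb.core.mapping.DBRef;
    import org.springframework.data.mongodb.core.mapping.Document;

    @Document
    class Book {
        String id;
        String title;
    }

    @Document
    class Author {
        String id;

        // The lazy-loading proxy behind this property is resolved and initialized
        // before the owning Author document is saved.
        @DBRef(lazy = true)
        List<Book> books;
    }
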
Oliver Gierke
cc785ecf4f DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:35:39 +02:00
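
A small usage sketch of the helper mentioned above:

    import static org.springframework.data.mongodb.core.query.Criteria.where;
    import static org.springframework.data.mongodb.core.query.Query.query;

    import org.springframework.data.mongodb.core.query.SerializationUtils;
    import org.springframework.data.mongodb.core.query.Update;

    class SafeToStringSketch {

        public static void main(String[] args) {
            // Serializes values that are not native MongoDB types without throwing.
            System.out.println(SerializationUtils.serializeToJsonSafely(
                    new Update().set("lastSeen", new java.util.Date()).getUpdateObject()));
            System.out.println(SerializationUtils.serializeToJsonSafely(
                    query(where("lastname").is("Matthews")).getQueryObject()));
        }
    }
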
Thomas Darimont
41fe1809da DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted, avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:16:53 +02:00
Thomas Darimont
fd5d39f4d9 DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted, avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:53 +02:00
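
A hypothetical repository sketch of the two binding styles the parser now distinguishes:

    import java.util.List;

    import org.springframework.data.mongodb.repository.Query;
    import org.springframework.data.repository.Repository;

    class Person {
        String id;
        String lastname;
    }

    interface PersonRepository extends Repository<Person, String> {

        // Unquoted placeholder: the bound value is quoted by the framework as needed.
        @Query("{ 'lastname' : ?0 }")
        List<Person> findByLastname(String lastname);

        // Placeholder that is already quoted in the query string: detected and not double-quoted.
        @Query("{ 'lastname' : '?0' }")
        List<Person> findQuotedByLastname(String lastname);
    }
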
Oliver Gierke
71d97ff53a DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:55 +02:00
Thomas Darimont
f2f3faef08 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:47 +02:00
Thomas Darimont
b1a488098b DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
Added an additional constructor that accepts CriteriaDefinition to not force clients to extend Criteria. We deprecated the original constructor and delegate to the new one. We keep the old one to remain binary compatible.

Original pull request: #206.
2014-07-17 09:01:48 +02:00
Christoph Strobl
fd18a8b82c DATAMONGO-983 - Remove links to forum.spring.io.
Replaced forum links with links to Stack Overflow.

Original Pull Request: #205
2014-07-10 07:08:43 +02:00
Christoph Strobl
761d2748c5 DATAMONGO-969 - Fixed nested id handling in SpringDataMongodbSerializer.
SpringDataMongodbSerializer now defensively triggers mapping of the DBObject created by the default serializer. This asserts that ids buried in nested structures like { "_id" : { "$in" : ["x", "y"] } } are converted correctly.

Original pull request: #202.
2014-07-09 21:47:03 +02:00
Christoph Strobl
e2f3966763 DATAMONGO-972 - Querydsl integration now handles references correctly.
SpringDataMongodbSerializer now overrides the necessary methods to create the appropriate DBRef objects when serializing data via Querydsl.

We currently disable the test case as the fix taking effect requires Querydsl 3.4.1, which unfortunately breaks Java 6 compatibility. We include the fix nonetheless to allow users on Java 7 to potentially use the latest Querydsl.

Original pull request: #203.
Related tickets: querydsl/querydsl#803.
2014-07-09 13:47:25 +02:00
Oliver Gierke
2b3e5461c2 DATAMONGO-982 - Added build profiles to build against next MongoDB driver versions.
Added build profile for MongoDB Java driver versions 2.12.3-SNAPSHOT and 3.0.0-SNAPSHOT. Added another property to be able to build manifests correctly as the snapshot versions aren't valid OSGi versions.

Adapted MongoExceptionTranslator to convert the new Exceptions being thrown for server timeouts and the deprecated values we currently handle.
2014-07-08 17:34:48 +02:00
Christoph Strobl
ca4db459c4 DATAMONGO-978 - Derived delete query should pass on type information.
We now pass on type information for derived delete queries to the corresponding delete operation. This propagates the information correctly to the corresponding Before and After events.

Before this change the type would have been set to null in case of a non-collection-like method return type.

Original pull request: #199.
2014-07-03 14:19:21 +02:00
Oliver Gierke
e0d0a5dc31 DATAMONGO-971 - After release cleanups. 2014-06-30 16:47:27 +02:00
Spring Buildmaster
18761a39c4 DATAMONGO-971 - Prepare next development iteration. 2014-06-30 07:02:21 -07:00
Spring Buildmaster
ad25751dbb DATAMONGO-971 - Release version 1.5.1.RELEASE (Dijkstra SR1). 2014-06-30 07:02:16 -07:00
Oliver Gierke
879ca6d149 DATAMONGO-971 - Prepare 1.5.1.RELEASE (Dijkstra SR1). 2014-06-30 15:01:20 +02:00
Oliver Gierke
8a40cf421c DATAMONGO-971 - Updated changelog. 2014-06-30 15:01:02 +02:00
Oliver Gierke
fbbb7b6bf7 DATAMONGO-955 - Updated changelog. 2014-06-30 10:56:34 +02:00
Christoph Strobl
0596403081 DATAMONGO-962 - Cycle guard should respect full path.
We now check intersections of the given path with existing ones not only on types and contained property names but also on the property's full path, which must not be present in already traversed paths.

Additionally we now catch any CyclicPropertyReferenceException on the root level to prevent cycle detection from interfering with application startup.

Original pull request: #197.
2014-06-27 19:26:51 +02:00
Oliver Gierke
d9eb18df4d DATAMONGO-970 - MongoTemplate.remove(…) now correctly builds query for DBObjects.
If a DBObject was handed into MongoTemplate.remove(…) we previously failed to look up the id value to create a by-id-query. This commit adds explicit handling of DBObjects by looking up their _id field to obtain the id value.
2014-06-27 16:13:17 +02:00
Christoph Strobl
132e4a9839 DATAMONGO-963 - @CompoundIndex should treat expireAfterSeconds correctly.
We added an additional check on the fields used as the index key, so that TTL is ignored for a CompoundIndex with more than one field (which effectively renders it useless on @CompoundIndex).

Prior to this change, potentially invalid index structures would have been created for e.g. @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds=10), leading to MongoDB not being able to clean up the indexes (logs: "ERROR: key for ttl index can only have 1 field").

This fix is related to https://jira.mongodb.org/browse/SERVER-10075.

Original pull request: #196.
2014-06-25 15:54:24 +02:00
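
The annotation from the commit message, shown in context as a sketch (Event is a made-up type): with more than one key field the expireAfterSeconds attribute is now ignored instead of producing an index MongoDB cannot use for TTL cleanup.

    import org.springframework.data.mongodb.core.index.CompoundIndex;
    import org.springframework.data.mongodb.core.mapping.Document;

    @Document
    // expireAfterSeconds is ignored here because the index key has two fields.
    @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds = 10)
    class Event {
        String foo;
        String bar;
    }
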
Thomas Darimont
4acf8caac1 DATAMONGO-953 - Add equals(…)/hashCode()/toString() to Update.
We now use the underlying updateObject to implement appropriate equals(…)/hashCode() and toString() methods.

Original pull request: #192.
2014-06-25 12:40:44 +02:00
Christoph Strobl
9de3c88e6b DATAMONGO-952 - Derived queries should consider field specification in @Query.
PartTreeMongoQuery now explicitly checks for the presence of a manually defined field spec on the query method and creates a new Query if one is present.

Original pull request: #188.
2014-06-18 12:53:25 +02:00
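
A hypothetical repository sketch of the combination now supported:

    import java.util.List;

    import org.springframework.data.mongodb.repository.Query;
    import org.springframework.data.repository.Repository;

    class Person {
        String id;
        String firstname;
        String lastname;
    }

    interface PersonRepository extends Repository<Person, String> {

        // Derived query combined with a manual field spec: only "lastname"
        // (plus the id) is fetched for each matching document.
        @Query(fields = "{ 'lastname' : 1 }")
        List<Person> findByFirstname(String firstname);
    }
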
Christoph Strobl
6115c9562b DATAMONGO-949 - CycleGuard should only match properties in word boundaries.
We modified the regular expression used for cycle detection to match the exact property name within the inspected path using word boundaries. This fix prevents subsequences of an existing property name (e.g. 'sub' would previously have matched 'substr') from being matched.

Along the way we fixed the (false) assertion in one of the tests, as we create the +1 cycle reference index before actually breaking the operation.
2014-06-18 08:32:40 +02:00
Christoph Strobl
963a222616 DATAMONGO-948 - Sort should be taken as is when no type information available.
Object type mapping for sort is now skipped in case no type information is present when executing a query via the MongoTemplate.
2014-06-18 08:20:55 +02:00
Thomas Darimont
7a2de49ac1 DATAMONGO-938 - Apply QueryMapper in MongoTemplate.mapReduce(…).
Previously MongoTemplate.mapReduce(...) didn't translate nested objects, e.g. GeoCommand, within the given query. That could lead to exceptions during query serialization. We now pass the query and sort object of the given Query through the QueryMapper to avoid such problems.

Original pull request: #184.
2014-06-18 08:20:53 +02:00
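
A hedged sketch of the kind of call that previously failed: a Query carrying a geo shape handed to mapReduce(…). The collection, the classpath-based map/reduce functions and the ValueObject result type are made up; geo type packages vary across versions.

    import org.springframework.data.geo.Circle;
    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.MongoOperations;
    import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
    import org.springframework.data.mongodb.core.query.Criteria;
    import org.springframework.data.mongodb.core.query.Query;

    class MapReduceSketch {

        static MapReduceResults<ValueObject> run(MongoOperations operations) {
            // The query contains a nested geo object; it is now passed through the
            // QueryMapper before the mapReduce command is issued.
            Query query = Query.query(
                    Criteria.where("location").withinSphere(new Circle(new Point(-73.99, 40.73), 0.003)));
            return operations.mapReduce(query, "venues", "classpath:map.js", "classpath:reduce.js", ValueObject.class);
        }
    }

    class ValueObject {
        String id;
        float value;
    }
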
Thomas Darimont
8380806d7a DATAMONGO-745 - Added test cases for custom query with $in and pageable parameter.
Added test cases to verify that this works.

Original pull request: #186.
2014-06-18 07:49:46 +02:00
Oliver Gierke
e60479e47a DATAMONGO-936 - Prepare next development iteration. 2014-05-22 12:22:35 +02:00
609 changed files with 11263 additions and 36897 deletions

View File

@@ -1,9 +0,0 @@
Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request.
Make sure that:
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting-related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).
- [ ] You provide your full name and an email address registered with your GitHub account. If you're a first-time submitter, make sure you have completed the [Contributor's License Agreement form](https://support.springsource.com/spring_committer_signup).

View File

@@ -1,38 +0,0 @@
language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33
- PROFILE=mongo34-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.2-precise
packages:
- mongodb-org-server
- mongodb-org-shell
sudo: false
cache:
directories:
- $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

View File

@@ -1,27 +0,0 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing other's private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

1
CONTRIBUTING.MD Normal file
View File

@@ -0,0 +1 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

View File

@@ -1,3 +0,0 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

View File

@@ -1,2 +0,0 @@
lombok.nonNull.exceptionType = IllegalArgumentException
lombok.log.fieldName = LOG

128
pom.xml
View File

@@ -1,21 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>https://projects.spring.io/spring-data-mongodb</url>
<url>http://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.8.12.BUILD-SNAPSHOT</version>
<version>1.4.7.BUILD-SNAPSHOT</version>
</parent>
<modules>
@@ -28,9 +28,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.12.12.BUILD-SNAPSHOT</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<springdata.commons>1.8.7.BUILD-SNAPSHOT</springdata.commons>
<mongo>2.12.5</mongo>
<mongo.osgi>2.12.5</mongo.osgi>
</properties>
<developers>
@@ -39,7 +39,7 @@
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -50,7 +50,7 @@
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -61,7 +61,7 @@
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -72,7 +72,7 @@
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -83,7 +83,7 @@
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -94,7 +94,7 @@
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -107,7 +107,16 @@
<id>mongo-next</id>
<properties>
<mongo>2.15.0-SNAPSHOT</mongo>
<mongo>2.13.0</mongo>
</properties>
</profile>
<profile>
<id>mongo-3-next</id>
<properties>
<mongo>3.0.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -118,95 +127,6 @@
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.4</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.5-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo34-next</id>
<properties>
<mongo>3.4.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>
@@ -228,7 +148,7 @@
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
<url>http://repo.spring.io/plugins-release</url>
</pluginRepository>
</pluginRepositories>

View File

@@ -1,12 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -14,7 +14,7 @@
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>2.0.0</jpa>
<jpa>1.0.0.Final</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
</dependency>
<dependency>
@@ -59,8 +59,8 @@
<!-- JPA -->
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -37,8 +37,6 @@ import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
@@ -47,7 +45,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
protected final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
@@ -78,25 +76,25 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
throw new DataIntegrityViolationException("Unble to convert property " + key + ": Invalid metadata, "
+ ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
@@ -111,9 +109,9 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
@@ -132,7 +130,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
@@ -154,7 +152,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -166,7 +164,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
@@ -32,10 +32,6 @@
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -30,7 +30,6 @@ import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.WriteConcern;
/**
@@ -38,7 +37,6 @@ import com.mongodb.WriteConcern;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @auhtor Christoph Strobl
*/
public class MongoLog4jAppender extends AppenderSkeleton {
@@ -60,8 +58,8 @@ public class MongoLog4jAppender extends AppenderSkeleton {
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.ACKNOWLEDGED;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.UNACKNOWLEDGED;
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
@@ -130,7 +128,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new MongoClient(host, port);
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
@@ -162,7 +160,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && !props.isEmpty()) {
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -22,44 +22,37 @@ import java.util.Calendar;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.MongoClient;
import com.mongodb.Mongo;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
private static final Logger log = Logger.getLogger(NAME);
MongoClient mongo;
Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;
@Before
public void setUp() throws Exception {
mongo = new MongoClient("localhost", 27017);
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
}
@After
public void tearDown() {
db.getCollection(collection).remove(new BasicDBObject());
db.getCollection(collection).drop();
}
@Test
@@ -71,6 +64,7 @@ public class MongoLog4jAppenderIntegrationTests {
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.2.2.230">
<context version="7.1.9.205">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,15 +32,8 @@
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="CDI">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.cdi.**"/>
</element>
<stereotype name="Unrestricted"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
@@ -82,11 +75,6 @@
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Script">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.script.**"/>
</element>
</element>
<element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.convert.**"/>
@@ -94,7 +82,6 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment">
@@ -117,11 +104,6 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="MapReduce">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.mapreduce.**"/>
</element>
</element>
<element type="Subsystem" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.core.**"/>
@@ -130,10 +112,8 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|MapReduce" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
@@ -188,32 +168,7 @@
</element>
<element type="Subsystem" name="Querydsl">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.querydsl.**"/>
</element>
</element>
<element type="Subsystem" name="Slf4j">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.slf4j.**"/>
</element>
</element>
<element type="Subsystem" name="Jackson">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.fasterxml.jackson.**"/>
</element>
</element>
<element type="Subsystem" name="DOM">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.w3c.dom.**"/>
</element>
</element>
<element type="Subsystem" name="AOP Alliance">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.aopalliance.**"/>
</element>
</element>
<element type="Subsystem" name="Guava">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.google.common.**"/>
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
</element>
</element>
</architecture>

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,19 +11,18 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version>
<version>1.5.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -51,7 +50,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -59,14 +58,14 @@
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
<scope>provided</scope>
@@ -138,20 +137,6 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
@@ -159,19 +144,6 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
@@ -183,7 +155,7 @@
<version>${apt}</version>
<dependencies>
<dependency>
<groupId>com.querydsl</groupId>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.List;
import org.springframework.dao.DataAccessException;
import com.mongodb.BulkWriteError;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public class BulkOperationException extends DataAccessException {
private static final long serialVersionUID = 73929601661154421L;
private final List<BulkWriteError> errors;
private final BulkWriteResult result;
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/
public BulkOperationException(String message, BulkWriteException source) {
super(message, source);
this.errors = source.getWriteErrors();
this.result = source.getWriteResult();
}
public List<BulkWriteError> getErrors() {
return errors;
}
public BulkWriteResult getResult() {
return result;
}
}
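
Note: the exception class above only exists on the 1.9.x side of this comparison. A minimal Java sketch of how its documented contract could be used, assuming a configured MongoOperations instance and a hypothetical "people" collection:

import java.util.Arrays;

import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteError;

class BulkErrorHandlingSketch {

	// operations is assumed to be a fully configured MongoTemplate
	void insertWithErrorReporting(MongoOperations operations) {

		try {
			operations.bulkOps(BulkMode.ORDERED, "people")
					.insert(Arrays.asList(new BasicDBObject("_id", 1), new BasicDBObject("_id", 1)))
					.execute();
		} catch (BulkOperationException e) {
			// the duplicate _id insert above surfaces here as a BulkWriteError
			for (BulkWriteError error : e.getErrors()) {
				System.err.println(error.getIndex() + ": " + error.getMessage());
			}
		}
	}
}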

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -28,9 +28,6 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
@@ -38,15 +35,17 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.FieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.PropertyNameFieldNamingStrategy;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
@@ -55,7 +54,6 @@ import com.mongodb.MongoClient;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -72,10 +70,7 @@ public abstract class AbstractMongoConfiguration {
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
@@ -134,10 +129,7 @@ public abstract class AbstractMongoConfiguration {
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
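
For context, a minimal JavaConfig sketch against the 1.5.x variant shown above, where getUserCredentials() is still an undeprecated hook; database name, host and credentials are placeholder values:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

@Configuration
class ExampleMongoConfig extends AbstractMongoConfiguration {

	@Override
	protected String getDatabaseName() {
		return "example"; // placeholder database name
	}

	@Override
	public Mongo mongo() throws Exception {
		return new MongoClient("localhost"); // placeholder host
	}

	@Override
	protected UserCredentials getUserCredentials() {
		return new UserCredentials("user", "secret"); // placeholder credentials
	}
}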

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -52,10 +52,10 @@ import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
@@ -361,9 +361,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {
Assert.notNull(filters, "TypeFilters must not be null");
Assert.notNull(filters);
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,24 +15,24 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
/**
@@ -71,6 +71,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
@@ -84,11 +85,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
@@ -105,58 +102,29 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
listenerBeanDefinitionBuilder
.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
}
/**
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
*/
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
private final MappingMongoConverter converter;
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
}
}
}
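
The registrar above backs the @EnableMongoAuditing annotation. A small sketch of what it wires up, assuming a hypothetical audited document type:

import java.util.Date;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.config.EnableMongoAuditing;
import org.springframework.data.mongodb.core.mapping.Document;

@Configuration
@EnableMongoAuditing
class AuditingConfig {}

@Document
class AuditedDocument {

	@CreatedDate Date createdAt; // populated by the registered AuditingEventListener on first save
	@LastModifiedDate Date lastModifiedAt; // refreshed on every subsequent save
}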

View File

@@ -1,85 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}
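
The parser above registers a MongoClientFactoryBean for the mongo-client XML element. A hedged Java sketch using the same factory bean directly, with placeholder host and port matching the attributes the parser maps:

import org.springframework.data.mongodb.core.MongoClientFactoryBean;

import com.mongodb.Mongo;

class MongoClientFactoryBeanSketch {

	Mongo createClient() throws Exception {

		MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
		factoryBean.setHost("localhost"); // placeholder host
		factoryBean.setPort(27017); // placeholder port
		factoryBean.afterPropertiesSet(); // normally invoked by the container

		return factoryBean.getObject();
	}
}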

View File

@@ -1,198 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final Pattern GROUP_PATTERN = Pattern.compile("(\\\\?')(.*?)\\1");
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
private static final String OPTIONS_DELIMINATOR = "?";
private static final String OPTION_VALUE_DELIMINATOR = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String text) throws IllegalArgumentException {
if (!StringUtils.hasText(text)) {
return;
}
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : extractCredentialsString(text)) {
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
Properties options = extractOptions(credentialString);
if (!options.isEmpty()) {
if (options.containsKey(AUTH_MECHANISM_KEY)) {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(
MongoCredential.createCredential(userNameAndPassword[0], database, userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private List<String> extractCredentialsString(String source) {
Matcher matcher = GROUP_PATTERN.matcher(source);
List<String> list = new ArrayList<String>();
while (matcher.find()) {
String value = StringUtils.trimLeadingCharacter(matcher.group(), '\'');
list.add(StringUtils.trimTrailingCharacter(value, '\''));
}
if (!list.isEmpty()) {
return list;
}
return Arrays.asList(source.split(","));
}
private static String[] extractUserNameAndPassword(String text) {
int index = text.lastIndexOf(DATABASE_DELIMINATOR);
index = index != -1 ? index : text.lastIndexOf(OPTIONS_DELIMINATOR);
return index == -1 ? new String[] {} : text.substring(0, index).split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (dbSeperationIndex == -1) {
return "";
}
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
return optionsSeperationIndex > -1 ? tmp.substring(0, optionsSeperationIndex) : tmp;
}
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
}
Properties properties = new Properties();
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
properties.put(optionArgs[0], optionArgs[1]);
}
return properties;
}
private static void verifyUsernameAndPasswordPresent(String[] source) {
verifyUserNamePresent(source);
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username!");
}
}
}
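
A short sketch of the credential format the editor above parses, using placeholder values; in practice the editor is registered by the namespace parsers rather than instantiated directly:

import java.util.List;

import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

import com.mongodb.MongoCredential;

class CredentialEditorSketch {

	@SuppressWarnings("unchecked")
	List<MongoCredential> parse() {

		MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();
		// placeholder username:password@database string; without an authMechanism option
		// the editor falls back to MongoCredential.createCredential(...)
		editor.setAsText("user:secret@database");

		return (List<MongoCredential>) editor.getValue();
	}
}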

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 by the original author(s).
* Copyright 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,10 +18,6 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
@@ -38,7 +34,6 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
@@ -47,22 +42,9 @@ import com.mongodb.MongoURI;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Viktor Khoroshko
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
private static final Set<String> MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES;
static {
Set<String> mongoUriAllowedAdditionalAttributes = new HashSet<String>();
mongoUriAllowedAdditionalAttributes.add("id");
mongoUriAllowedAdditionalAttributes.add("write-concern");
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
@@ -82,25 +64,29 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
BeanDefinition mongoUri = getMongoUri(element, parserContext);
if (mongoUri != null) {
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -161,42 +147,14 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes. <br />
* Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
* {@literal write-concern} and/or {@literal id}.
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
*
* @param element must not be {@literal null}.
* @param parserContext
* @return {@literal null} in case no client-/uri defined.
* @param uri
* @return
*/
private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
private BeanDefinition getMongoUri(String uri) {
boolean hasClientUri = element.hasAttribute("client-uri");
if (!hasClientUri && !element.hasAttribute("uri")) {
return null;
}
int allowedAttributesCount = 1;
for (String attribute : MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES) {
if (element.hasAttribute(attribute)) {
allowedAttributesCount++;
}
}
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error(
"Configure either " + (hasClientUri ? "Mongo Client URI" : "Mongo URI") + " or details individually!",
parserContext.extractSource(element));
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
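
On the 1.5.x side the uri attribute simply becomes a MongoURI constructor argument for SimpleMongoDbFactory. A hedged Java equivalent with a placeholder connection string:

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoURI;

class DbFactoryFromUriSketch {

	@SuppressWarnings("deprecation")
	MongoDbFactory create() throws Exception {
		// placeholder connection string, mirroring <mongo:db-factory uri="..."/>
		return new SimpleMongoDbFactory(new MongoURI("mongodb://localhost/database"));
	}
}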

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -22,7 +22,6 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
@@ -34,7 +33,6 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,10 +15,14 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
@@ -32,7 +36,6 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
@@ -61,8 +64,7 @@ public class MongoParser implements BeanDefinitionParser {
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
@@ -73,4 +75,19 @@ public class MongoParser implements BeanDefinitionParser {
return mongoComponent.getBeanDefinition();
}
/**
* Ideally only one bean definition should be registered here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while also registering a 'default' property editor with the
* container as a side effect.
*/
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -24,7 +24,6 @@ import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -34,13 +33,13 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
private MongoParsingUtils() {
}
/**
* Parses the mongo replica-set element.
@@ -55,14 +54,12 @@ abstract class MongoParsingUtils {
}
/**
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occurred, {@literal false} otherwise
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
@@ -83,58 +80,13 @@ abstract class MongoParsingUtils {
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
@@ -151,56 +103,4 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* Ideally only one bean definition should be registered here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while also registering a 'default' property editor with the
* container as a side effect.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,66 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String readPreferenceString) throws IllegalArgumentException {
if (readPreferenceString == null) {
return;
}
ReadPreference preference = null;
try {
preference = ReadPreference.valueOf(readPreferenceString);
} catch (IllegalArgumentException ex) {
// ignore this one and try to map it differently
}
if (preference != null) {
setValue(preference);
} else if ("PRIMARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primary());
} else if ("PRIMARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primaryPreferred());
} else if ("SECONDARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondary());
} else if ("SECONDARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondaryPreferred());
} else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.nearest());
} else {
throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
readPreferenceString));
}
}
}
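
A minimal sketch of the string-to-ReadPreference conversion implemented above; the constant-style spelling is one of the forms the editor accepts:

import org.springframework.data.mongodb.config.ReadPreferencePropertyEditor;

import com.mongodb.ReadPreference;

class ReadPreferenceEditorSketch {

	ReadPreference parse() {

		ReadPreferencePropertyEditor editor = new ReadPreferencePropertyEditor();
		editor.setAsText("SECONDARY_PREFERRED"); // driver-style names such as "secondaryPreferred" work as well

		return (ReadPreference) editor.getValue(); // yields ReadPreference.secondaryPreferred()
	}
}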

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,145 +0,0 @@
/*
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import com.mongodb.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. These bulk operations have been available since MongoDB
* 2.6 and make use of low-level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or lists of similar operations in sequence, which can then be executed by calling
* {@link #execute()}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public interface BulkOperations {
/**
* Mode for bulk operation.
**/
public enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
/** Perform bulk operations in parallel. Processing will continue on errors. */
UNORDERED
};
/**
* Add a single insert to the bulk operation.
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(Object documents);
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(List<? extends Object> documents);
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param updates Updates/insert operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(List<Pair<Query, Update>> updates);
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(Query remove);
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(List<Query> removes);
/**
* Execute all bulk operations using the default write concern.
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
* @throws org.springframework.data.mongodb.BulkOperationException if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
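
To illustrate the fluent API defined above, a hedged sketch using the collection-name variant of bulkOps; the collection name, field names and values are placeholders:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
import static org.springframework.data.mongodb.core.query.Update.update;

import java.util.Arrays;

import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteResult;

class BulkOperationsSketch {

	// operations is assumed to be a configured MongoTemplate
	BulkWriteResult run(MongoOperations operations) {

		return operations.bulkOps(BulkMode.UNORDERED, "people")
				.insert(Arrays.asList(new BasicDBObject("name", "Dave"), new BasicDBObject("name", "Carter")))
				.updateOne(query(where("name").is("Dave")), update("age", 42))
				.remove(query(where("name").is("Carter")))
				.execute();
	}
}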

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,367 +0,0 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.Collections;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Default implementation for {@link BulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
private final BulkOperationContext bulkOperationContext;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
private BulkWriteOperation bulk;
/**
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, collection name and
* {@link BulkOperationContext}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param bulkOperationContext must not be {@literal null}.
* @since 1.9.12
*/
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver == null ? DefaultWriteConcernResolver.INSTANCE
: writeConcernResolver;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null!");
if (document instanceof DBObject) {
bulk.insert((DBObject) document);
return this;
}
DBObject sink = new BasicDBObject();
mongoOperations.getConverter().write(document, sink);
bulk.insert(sink);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null!");
for (Object document : documents) {
insert(document);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
for (Pair<Query, Update> update : updates) {
upsert(update.getFirst(), update.getSecond());
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null!");
bulk.find(getMappedQuery(query.getQueryObject())).remove();
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) {
remove(query);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName,
bulkOperationContext.getEntityType(), null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(getMappedQuery(query.getQueryObject()));
if (upsert) {
if (multi) {
builder.upsert().update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.upsert().updateOne(getMappedUpdate(update.getUpdateObject()));
}
} else {
if (multi) {
builder.update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.updateOne(getMappedUpdate(update.getUpdateObject()));
}
}
return this;
}
private DBObject getMappedUpdate(DBObject update) {
return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
}
private DBObject getMappedQuery(DBObject query) {
return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
}
private BulkWriteOperation getBulkWriteOptions(BulkMode bulkMode) {
DBCollection collection = mongoOperations.getCollection(collectionName);
switch (bulkMode) {
case ORDERED:
return collection.initializeOrderedBulkOperation();
case UNORDERED:
return collection.initializeUnorderedBulkOperation();
}
throw new IllegalStateException("BulkMode was null!");
}
/**
* {@link BulkOperationContext} holds information about the
* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode}, the entity in use, as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
@Value
static class BulkOperationContext {
@NonNull BulkMode bulkMode;
MongoPersistentEntity<?> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
Class<?> getEntityType() {
return entity != null ? entity.getType() : null;
}
}
}
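
A minimal usage sketch of the BulkOperations API implemented above, as reached through MongoOperations.bulkOps(…); the Person entity, its constructor and the template instance are assumptions for illustration, and query(…)/where(…) are statically imported from Query and Criteria:

BulkOperations bulkOps = template.bulkOps(BulkMode.ORDERED, Person.class);
bulkOps.insert(new Person("Alice"));                                             // converted and queued as an insert
bulkOps.updateOne(query(where("name").is("Bob")), new Update().set("age", 30));
bulkOps.upsert(query(where("name").is("Carol")), new Update().set("age", 25));
BulkWriteResult result = bulkOps.execute();                                      // runs the batch and re-initializes the builder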

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,15 +15,17 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
@@ -36,16 +38,15 @@ import com.mongodb.MongoException;
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
*/
public class DefaultIndexOperations implements IndexOperations {
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private final MongoOperations mongoOperations;
private final String collectionName;
private final QueryMapper mapper;
private final Class<?> type;
/**
* Creates a new {@link DefaultIndexOperations}.
@@ -54,26 +55,12 @@ public class DefaultIndexOperations implements IndexOperations {
* @param collectionName must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
this(mongoOperations, collectionName, null);
}
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.mapper = new QueryMapper(mongoOperations.getConverter());
this.type = type;
}
/*
@@ -81,45 +68,16 @@ public class DefaultIndexOperations implements IndexOperations {
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null && indexOptions.containsField(PARTIAL_FILTER_EXPRESSION_KEY)) {
Assert.isInstanceOf(DBObject.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
indexOptions.put(PARTIAL_FILTER_EXPRESSION_KEY,
mapper.getMappedObject((DBObject) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName)));
}
if (indexOptions != null) {
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.createIndex(indexDefinition.getIndexKeys());
collection.ensureIndex(indexDefinition.getIndexKeys());
}
return null;
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
if (entityType != null) {
return mongoOperations.getConverter().getMappingContext().getPersistentEntity(entityType);
}
Collection<? extends MongoPersistentEntity<?>> entities = mongoOperations.getConverter().getMappingContext()
.getPersistentEntities();
for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
}
}
return null;
}
});
}
@@ -149,12 +107,10 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
collection.resetIndexCache();
return null;
}
});
@@ -167,9 +123,7 @@ public class DefaultIndexOperations implements IndexOperations {
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
}
@@ -179,7 +133,37 @@ public class DefaultIndexOperations implements IndexOperations {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
indexInfoList.add(IndexInfo.indexInfoOf(ix));
DBObject keyDbObject = (DBObject) ix.get("key");
int numberOfElements = keyDbObject.keySet().size();
List<IndexField> indexFields = new ArrayList<IndexField>(numberOfElements);
for (String key : keyDbObject.keySet()) {
Object value = keyDbObject.get(key);
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else {
Double keyValue = new Double(value.toString());
if (ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, ASC));
} else if (MINUS_ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, DESC));
}
}
}
String name = ix.get("name").toString();
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
}
return indexInfoList;
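
A minimal sketch of how the index operations above are typically reached from client code; the collection and field names are illustrative, and Index/Sort.Direction come from the Spring Data MongoDB index and domain packages:

IndexOperations indexOps = template.indexOps("person");
indexOps.ensureIndex(new Index().on("lastName", Sort.Direction.ASC).unique()); // creates the index if it is missing
for (IndexInfo info : indexOps.getIndexInfo()) {
    System.out.println(info.getName() + " -> " + info.getIndexFields());       // e.g. lastName_1 -> [lastName: ASC]
}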

View File

@@ -1,188 +0,0 @@
/*
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static java.util.UUID.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.MongoException;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";
private static final String SCRIPT_NAME_PREFIX = "func_";
private final MongoOperations mongoOperations;
/**
* Creates a new {@link DefaultScriptOperations} using the given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}.
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(false, args));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
List<NamedMongoScript> scripts = mongoOperations.findAll(NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
if (CollectionUtils.isEmpty(scripts)) {
return Collections.emptySet();
}
Set<String> scriptNames = new HashSet<String>();
for (NamedMongoScript script : scripts) {
scriptNames.add(script.getName());
}
return scriptNames;
}
private Object[] convertScriptArgs(boolean quote, Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String && quote ? String.format("'%s'", arg)
: this.mongoOperations.getConverter().convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(true, args));
}
/**
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
* @return
*/
private static String generateScriptName() {
return SCRIPT_NAME_PREFIX + randomUUID().toString().replaceAll("-", "");
}
}
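
A minimal sketch of the ScriptOperations API backed by the class above, reached via MongoOperations.scriptOps(); the script body and name are illustrative:

ScriptOperations scriptOps = template.scriptOps();
ExecutableMongoScript echo = new ExecutableMongoScript("function(x) { return x; }");
Object direct = scriptOps.execute(echo, "run without registering");              // eval on the server
NamedMongoScript named = scriptOps.register(new NamedMongoScript("echo", echo)); // stored in system.js
Object byName = scriptOps.call("echo", "run by name");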

View File

@@ -1,32 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* Default {@link WriteConcernResolver} resolving the {@link WriteConcern} from the given {@link MongoAction}.
*
* @author Oliver Gierke
*/
enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
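
A hedged sketch of a custom WriteConcernResolver that could replace the default above, e.g. via MongoTemplate.setWriteConcernResolver(…); the Audit entity is an assumption:

WriteConcernResolver resolver = new WriteConcernResolver() {
    public WriteConcern resolve(MongoAction action) {
        // journal writes for audit records (hypothetical entity), keep the configured default otherwise
        if (Audit.class.equals(action.getEntityType())) {
            return WriteConcern.JOURNALED;
        }
        return action.getDefaultWriteConcern();
    }
};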

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,72 +0,0 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
* @author Oliver Gierke
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
*/
class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject());
private final DBObject source;
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(DBObject source) {
Assert.notNull(source, "Source document must not be null!");
this.source = source;
}
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
* @param commandResult must not be {@literal null}.
* @return
*/
public static GeoCommandStatistics from(DBObject commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats);
}
/**
* Returns the average distance reported by the command result. Mitigates the removal of the field (introduced in
* MongoDB 3.2 RC1) in case the command didn't return any results.
*
* @return
* @see https://jira.mongodb.org/browse/SERVER-21024
*/
public double getAverageDistance() {
Object averageDistance = source.get("avgDistance");
return averageDistance == null ? Double.NaN : (Double) averageDistance;
}
}
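
Since the class is package-private it is only consumed internally; a sketch of the intended call pattern, assuming commandResult holds a geoNear command response:

GeoCommandStatistics stats = GeoCommandStatistics.from(commandResult);
double averageDistance = stats.getAverageDistance(); // NaN when the server omitted the avgDistance field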

View File

@@ -1,34 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @author Oliver Gierke
*/
@SpringDataWebConfigurationMixin
public class GeoJsonConfiguration {
@Bean
public GeoJsonModule geoJsonModule() {
return new GeoJsonModule();
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,12 +20,12 @@ import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface IndexOperations {
@@ -51,11 +51,7 @@ public interface IndexOperations {
/**
* Clears all indices that have not yet been applied to this collection.
*
* @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/**

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -25,5 +25,5 @@ package org.springframework.data.mongodb.core;
*/
public enum MongoActionOperation {
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK;
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -27,8 +27,7 @@ import com.mongodb.Mongo;
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch
* @author Thomas Darimont
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
@@ -36,11 +35,10 @@ public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo, "Mongo must not be null!");
Assert.notNull(mongo);
this.mongo = mongo;
}
@@ -86,16 +84,16 @@ public class MongoAdmin implements MongoAdminOperations {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,189 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Convenient factory for configuring MongoDB.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<MongoCredential> credentials;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
*
* @param mongoClientOptions
*/
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions;
}
/**
* Set the list of credentials to be used when creating {@link MongoClient}.
*
* @param credentials can be {@literal null}.
*/
public void setCredentials(MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials);
}
/**
* Set the list of {@link ServerAddress} to build up a replica set for.
*
* @param replicaSetSeeds can be {@literal null}.
*/
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
}
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
instance.close();
}
private MongoClient createMongoClient() throws UnknownHostException {
if (!CollectionUtils.isEmpty(replicaSetSeeds)) {
return new MongoClient(replicaSetSeeds, credentials, mongoClientOptions);
}
return new MongoClient(createConfiguredOrDefaultServerAddress(), credentials, mongoClientOptions);
}
private ServerAddress createConfiguredOrDefaultServerAddress() throws UnknownHostException {
ServerAddress defaultAddress = new ServerAddress();
return new ServerAddress(StringUtils.hasText(host) ? host : defaultAddress.getHost(),
port != null ? port.intValue() : defaultAddress.getPort());
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter, can be {@literal null}.
* @return a new unmodifiable {@link List} from the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
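
A minimal Java configuration sketch using the factory bean inside a @Configuration class; host, port and credential values are illustrative:

@Bean
public MongoClientFactoryBean mongo() {
    MongoClientFactoryBean factory = new MongoClientFactoryBean();
    factory.setHost("localhost");
    factory.setPort(27017);
    factory.setCredentials(new MongoCredential[] {
        MongoCredential.createCredential("user", "admin", "secret".toCharArray()) });
    return factory; // Spring exposes the created MongoClient as a Mongo bean
}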

View File

@@ -1,295 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int maxConnectionIdleTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionIdleTime();
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
* @param description
*/
public void setDescription(String description) {
this.description = description;
}
/**
* Set the minimum number of connections per host.
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
this.minConnectionsPerHost = minConnectionsPerHost;
}
/**
* Set the number of connections allowed per host. Requests will block once the pool is exhausted. Default is 10. The
* system property {@code MONGO.POOLSIZE} can override this value.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* Set the multiplier for connectionsPerHost that determines the number of threads allowed to block waiting for a
* connection. Default is 5. If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then
* up to 50 threads can block; beyond that an exception is thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* Set the max wait time of a blocking thread for a connection. Default is 120000 ms (2 minutes).
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* The maximum idle time for a pooled connection.
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
this.maxConnectionIdleTime = maxConnectionIdleTime;
}
/**
* Set the maximum life time for a pooled connection.
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
this.maxConnectionLifeTime = maxConnectionLifeTime;
}
/**
* Set the connect timeout in milliseconds. The default is 0, which means an infinite timeout.
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Set the socket timeout. The default is 0, which means an infinite timeout.
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Set the keep-alive flag, which controls whether socket keep-alive is enabled. Defaults to false.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* @param socketFactory
*/
public void setSocketFactory(SocketFactory socketFactory) {
this.socketFactory = socketFactory;
}
/**
* Set the frequency that the driver will attempt to determine the current state of each server in the cluster.
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
this.heartbeatFrequency = heartbeatFrequency;
}
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
this.minHeartbeatFrequency = minHeartbeatFrequency;
}
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
this.heartbeatConnectTimeout = heartbeatConnectTimeout;
}
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
this.heartbeatSocketTimeout = heartbeatSocketTimeout;
}
/**
* Configures the name of the replica set.
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
this.requiredReplicaSetName = requiredReplicaSetName;
}
/**
* Controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory
.getDefault()) : this.socketFactory;
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
.maxConnectionIdleTime(maxConnectionIdleTime) //
.maxConnectionLifeTime(maxConnectionLifeTime) //
.maxWaitTime(maxWaitTime) //
.minConnectionsPerHost(minConnectionsPerHost) //
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoClientOptions.class;
}
}
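
A minimal configuration sketch for the options factory bean inside a @Configuration class; the pool and timeout values are illustrative, and the resulting MongoClientOptions can be passed to MongoClientFactoryBean.setMongoClientOptions(…):

@Bean
public MongoClientOptionsFactoryBean mongoClientOptions() {
    MongoClientOptionsFactoryBean factory = new MongoClientOptionsFactoryBean();
    factory.setConnectionsPerHost(50);
    factory.setConnectTimeout(2000);
    factory.setSsl(true); // falls back to SSLSocketFactory.getDefault() unless a factory is configured
    return factory;
}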

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,13 +18,12 @@ package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
@@ -35,7 +34,6 @@ import com.mongodb.MongoClient;
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -45,7 +43,9 @@ public abstract class MongoDbUtils {
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {}
private MongoDbUtils() {
}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
@@ -65,24 +65,11 @@ public abstract class MongoDbUtils {
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
@@ -122,9 +109,22 @@ public abstract class MongoDbUtils {
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
if (credentialsGiven && !authDb.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
// TX sync active, bind new database to thread
@@ -181,36 +181,16 @@ public abstract class MongoDbUtils {
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
ReflectiveDbInvoker.requestDone(db);
db.requestDone();
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check if credentials are present. When using mongo-java-driver version 3 or above there is no need to authenticate
* here, as the auth data has to be provided within the MongoClient.
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,25 +15,23 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.util.ClassUtils;
import com.mongodb.BulkWriteException;
import com.mongodb.MongoCursorNotFoundException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import com.mongodb.MongoServerSelectionException;
import com.mongodb.MongoSocketException;
import com.mongodb.MongoTimeoutException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -42,23 +40,9 @@ import com.mongodb.MongoException;
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("WriteConcernException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -67,42 +51,41 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// Check for well-known MongoException subclasses.
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
if (ex instanceof DuplicateKey || ex instanceof DuplicateKeyException) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (RESOURCE_FAILURE_EXCEPTIONS.contains(exception)) {
if (ex instanceof Network || ex instanceof MongoSocketException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (RESOURCE_USAGE_EXCEPTIONS.contains(exception)) {
if (ex instanceof CursorNotFound || ex instanceof MongoCursorNotFoundException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoServerSelectionException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoTimeoutException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
return new DataIntegrityViolationException(ex.getMessage(), ex);
}
if (ex instanceof BulkWriteException) {
return new BulkOperationException(ex.getMessage(), (BulkWriteException) ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();
if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
if (code == 11000 || code == 11001) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isDataAccessResourceFailureCode(code)) {
} else if (code == 12000 || code == 13440) {
throw new DataAccessResourceFailureException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isInvalidDataAccessApiUsageCode(code) || code == 10003 || code == 12001
|| code == 12010 || code == 12011 || code == 12012) {
} else if (code == 10003 || code == 12001 || code == 12010 || code == 12011 || code == 12012) {
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isPermissionDeniedCode(code)) {
throw new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
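
A sketch of using the translator around a raw driver call, mirroring the pattern in DefaultBulkOperations.execute() above; the collection and document variables are assumptions:

PersistenceExceptionTranslator translator = new MongoExceptionTranslator();
try {
    collection.insert(document);
} catch (RuntimeException ex) {
    DataAccessException translated = translator.translateExceptionIfPossible(ex);
    throw translated == null ? ex : translated; // fall back to the raw exception when not translatable
}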

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,7 +20,9 @@ import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
@@ -38,14 +40,12 @@ import com.mongodb.WriteConcern;
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
* @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
*/
@Deprecated
public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private Mongo mongo;
private MongoOptions mongoOptions;
private String host;
@@ -53,11 +53,9 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
private WriteConcern writeConcern;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* @param mongoOptions
*/
private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
@@ -68,6 +66,7 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
@@ -76,19 +75,30 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
}
/**
* Configures the host to connect to.
*
* @param host
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
@@ -102,13 +112,12 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
this.writeConcern = writeConcern;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
this.exceptionTranslator = exceptionTranslator;
}
public Mongo getObject() throws Exception {
return mongo;
}
/*
@@ -119,6 +128,14 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -129,10 +146,10 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
protected Mongo createInstance() throws Exception {
@SuppressWarnings("deprecation")
public void afterPropertiesSet() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
@@ -158,42 +175,18 @@ public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements Pers
mongo.setWriteConcern(writeConcern);
}
return mongo;
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
@Override
protected void destroyInstance(Mongo mongo) throws Exception {
mongo.close();
}
private static boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
public void destroy() throws Exception {
this.mongo.close();
}
}
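As a rough illustration (not part of the diff), the factory bean shown above would be used along these lines; host and port are example values and exception handling is omitted. In the 1.5.x variant, afterPropertiesSet() creates and stores the Mongo instance that getObject() later returns.

MongoFactoryBean factory = new MongoFactoryBean();
factory.setHost("localhost");
factory.setPort(27017);
factory.afterPropertiesSet(); // builds the Mongo instance
Mongo mongo = factory.getObject();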


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
*
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -19,12 +19,11 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -34,21 +33,17 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.ReadPreference;
import com.mongodb.WriteResult;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
* proxy).
*
*
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
@@ -57,11 +52,12 @@ import com.mongodb.WriteResult;
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**
* The collection name used for the specified class by this template.
*
*
* @param entityClass must not be {@literal null}.
* @return
*/
@@ -71,7 +67,7 @@ public interface MongoOperations {
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a DBObject. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
*/
CommandResult executeCommand(String jsonCommand);
@@ -79,7 +75,7 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
*
* @param command a MongoDB command
*/
CommandResult executeCommand(DBObject command);
@@ -87,29 +83,15 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
* version 3 no longer supports this operation.
*/
@Deprecated
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return
* @since 1.7
*/
CommandResult executeCommand(DBObject command, ReadPreference readPreference);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
@@ -121,7 +103,7 @@ public interface MongoOperations {
* Executes a {@link DbCallback} translating any exceptions as necessary.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance.
* @return a result object returned by the action or <tt>null</tt>
@@ -132,7 +114,7 @@ public interface MongoOperations {
* Executes the given {@link CollectionCallback} on the entity collection of the specified class.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param entityClass class that determines the collection to use
* @param <T> return type
* @param action callback object that specifies the MongoDB action
@@ -144,7 +126,7 @@ public interface MongoOperations {
* Executes the given {@link CollectionCallback} on the collection of the given name.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param collectionName the name of the collection that specifies which DBCollection instance will be passed into
* @param action callback object that specifies the MongoDB action the callback action.
@@ -158,41 +140,24 @@ public interface MongoOperations {
* href=http://www.mongodb.org/display/DOCS/Java+Driver+Concurrency>Java Driver Concurrency</a>}
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
* @deprecated since 1.7 as the MongoDB Java driver version 3 no longer supports request boundaries via
* {@link DB#requestStart()} and {@link DB#requestDone()}.
*/
@Deprecated
<T> T executeInSession(DbCallback<T> action);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query
* @param entityType
* @return
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
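A minimal sketch of consuming the cursor-backed stream(…) declared above (1.9.x side); ops is an assumed MongoOperations instance and Person an assumed mapped type.

try (CloseableIterator<Person> it = ops.stream(new Query(), Person.class)) {
    while (it.hasNext()) {
        Person person = it.next();
        // process person; the iterator wraps a DB cursor and is closed by try-with-resources
    }
}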
/**
* Create an uncapped collection with a name based on the provided entity class.
*
*
* @param entityClass class that determines the collection to create
* @return the created collection
*/
<T> DBCollection createCollection(Class<T> entityClass);
/**
* Create a collection with a name based on the provided entity class using the options.
*
* Create a collect with a name based on the provided entity class using the options.
*
* @param entityClass class that determines the collection to create
* @param collectionOptions options to use when creating the collection.
* @return the created collection
@@ -201,15 +166,15 @@ public interface MongoOperations {
/**
* Create an uncapped collection with the provided name.
*
*
* @param collectionName name of the collection
* @return the created collection
*/
DBCollection createCollection(String collectionName);
/**
* Create a collection with the provided name and options.
*
* Create a collect with the provided name and options.
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.
* @return the created collection
@@ -218,7 +183,7 @@ public interface MongoOperations {
/**
* A set of collection names.
*
*
* @return list of collection names
*/
Set<String> getCollectionNames();
@@ -227,7 +192,7 @@ public interface MongoOperations {
* Get a collection by name, creating it if it doesn't exist.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection
* @return an existing collection or a newly created one.
*/
@@ -237,7 +202,7 @@ public interface MongoOperations {
* Check to see if a collection with a name indicated by the entity class exists.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param entityClass class that determines the name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
@@ -247,7 +212,7 @@ public interface MongoOperations {
* Check to see if a collection with a given name exists.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
@@ -257,7 +222,7 @@ public interface MongoOperations {
* Drop the collection with the name indicated by the entity class.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param entityClass class that determines the collection to drop/delete.
*/
<T> void dropCollection(Class<T> entityClass);
@@ -266,64 +231,25 @@ public interface MongoOperations {
* Drop the collection with the given name.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection to drop/delete.
*/
void dropCollection(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
*
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
*
* @return index operations on the named collection associated with the given entity class
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
*
* @return
* @since 1.7
*/
ScriptOperations scriptOps();
/**
* Returns a new {@link BulkOperations} for the given collection. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, etc. is not available for {@literal update} or
* {@literal remove} operations in bulk mode due to the lack of domain type information. Use
* {@link #bulkOps(BulkMode, Class, String)} to get full type specific support.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection
*/
BulkOperations bulkOps(BulkMode mode, String collectionName);
/**
* Returns a new {@link BulkOperations} for the given entity type.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the name of the entity class, must not be {@literal null}.
* @return {@link BulkOperations} on the named collection associated of the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType);
/**
* Returns a new {@link BulkOperations} for the given entity type and collection name.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityClass the name of the entity class, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection associated with the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType, String collectionName);
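A short sketch of the BulkOperations API declared above (1.9.x side); ops and the Person constructor are assumptions.

BulkOperations bulk = ops.bulkOps(BulkMode.UNORDERED, Person.class);
bulk.insert(new Person("Dave"));
bulk.insert(new Person("Carter"));
bulk.execute(); // issues a single unordered bulk write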
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -332,7 +258,7 @@ public interface MongoOperations {
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
*
* @param entityClass the parameterized type of the returned list
* @return the converted collection
*/
@@ -346,7 +272,7 @@ public interface MongoOperations {
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
*
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
* @return the converted collection
@@ -356,7 +282,7 @@ public interface MongoOperations {
/**
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes into account the initial document structure as well as any finalize functions.
*
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
@@ -371,7 +297,7 @@ public interface MongoOperations {
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes into account the initial document structure
* as well as any finalize functions.
*
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
@@ -385,7 +311,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
@@ -398,7 +324,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
@@ -409,7 +335,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
@@ -422,7 +348,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
@@ -435,7 +361,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -448,7 +374,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes additional map-reduce options.
*
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -462,7 +388,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of
* INLINE
*
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -476,7 +402,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes a query and additional map-reduce options
*
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -493,7 +419,7 @@ public interface MongoOperations {
* information to determine the collection the query is run against. Note, that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @return
@@ -504,7 +430,7 @@ public interface MongoOperations {
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note, that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @param collectionName the collection to trigger the query against. If no collection name is given the entity class
@@ -522,7 +448,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -539,7 +465,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -549,10 +475,8 @@ public interface MongoOperations {
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
/**
* Determine result of given {@link Query} contains at least one element. <br />
* <strong>NOTE:</strong> Any additional support for query/field mapping, etc. is not available due to the lack of
* domain type information. Use {@link #exists(Query, Class, String)} to get full type specific support.
*
* Determine result of given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects.
* @return
@@ -561,7 +485,7 @@ public interface MongoOperations {
/**
* Determine result of given {@link Query} contains at least one element.
*
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @return
@@ -570,7 +494,7 @@ public interface MongoOperations {
/**
* Determine result of given {@link Query} contains at least one element.
*
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @param collectionName name of the collection to check for objects.
@@ -586,7 +510,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -602,7 +526,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -614,7 +538,7 @@ public interface MongoOperations {
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
*
*
* @param <T>
* @param id the id of the document to return.
* @param entityClass the type the document shall be converted into.
@@ -624,7 +548,7 @@ public interface MongoOperations {
/**
* Returns the document with the given id from the given collection mapped onto the given target class.
*
*
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
@@ -634,9 +558,9 @@ public interface MongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -646,9 +570,9 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -659,10 +583,10 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -673,10 +597,10 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
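A hedged sketch of the findAndModify(…) contract documented above; ops, Person and the field names are assumptions.

Person updated = ops.findAndModify(
    new Query(Criteria.where("firstname").is("Dave")),
    new Update().inc("visits", 1),
    FindAndModifyOptions.options().returnNew(true),
    Person.class); // returnNew(true) returns the document as it looks after the update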
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -697,7 +621,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -714,7 +638,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -725,7 +649,7 @@ public interface MongoOperations {
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
*
*
* @param query
* @param entityClass must not be {@literal null}.
* @return
@@ -733,27 +657,13 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support.
*
* Returns the number of documents for the given {@link Query} querying the given collection.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
long count(Query query, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
@@ -762,13 +672,13 @@ public interface MongoOperations {
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
*
* @param objectToSave the object to store in the collection.
*/
void insert(Object objectToSave);
@@ -780,7 +690,7 @@ public interface MongoOperations {
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
*/
@@ -788,7 +698,7 @@ public interface MongoOperations {
/**
* Insert a Collection of objects into a collection in a single batch write to the database.
*
*
* @param batchToSave the list of objects to save.
* @param entityClass class that determines the collection to use
*/
@@ -796,7 +706,7 @@ public interface MongoOperations {
/**
* Insert a list of objects into the specified collection in a single batch write to the database.
*
*
* @param batchToSave the list of objects to save.
* @param collectionName name of the collection to store the object in
*/
@@ -805,7 +715,7 @@ public interface MongoOperations {
/**
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
*
* @param collectionToSave the list of objects to save.
*/
void insertAll(Collection<? extends Object> objectsToSave);
@@ -819,10 +729,10 @@ public interface MongoOperations {
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
*
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection
*/
void save(Object objectToSave);
@@ -839,7 +749,7 @@ public interface MongoOperations {
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
* Type Conversion"</a> for more details.
*
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
*/
@@ -848,7 +758,7 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class that determines the collection to use
@@ -858,10 +768,8 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
*
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -873,7 +781,7 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class of the pojo to be operated on
@@ -885,7 +793,7 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -896,10 +804,8 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
*
* the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -911,7 +817,7 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -924,7 +830,7 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -935,10 +841,8 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the specified collection that matches the query document criteria with the
* provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateMulti(Query, Update, Class, String)} to get full type specific support.
*
* provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -950,7 +854,7 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -962,14 +866,14 @@ public interface MongoOperations {
/**
* Remove the given object from the collection by id.
*
*
* @param object
*/
WriteResult remove(Object object);
/**
* Removes the given object from the given collection.
*
*
* @param object
* @param collection must not be {@literal null} or empty.
*/
@@ -978,7 +882,7 @@ public interface MongoOperations {
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
*
* @param query
* @param entityClass
*/
@@ -987,7 +891,7 @@ public interface MongoOperations {
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
*
* @param query
* @param entityClass
* @param collectionName
@@ -996,22 +900,18 @@ public interface MongoOperations {
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
* conversion/mapping done for any criteria using the id field. <br />
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support.
*
* conversion/mapping done for any criteria using the id field.
*
* @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will removed
*/
WriteResult remove(Query query, String collectionName);
/**
* Returns and removes all documents from the specified collection that match the provided query. <br />
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
*
* @param query must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* Returns and removes all documents from the specified collection that match the provided query.
*
* @param query
* @param collectionName
* @return
* @since 1.5
*/
@@ -1019,7 +919,7 @@ public interface MongoOperations {
/**
* Returns and removes all documents matching the given query from the collection used to store the entityClass.
*
*
* @param query
* @param entityClass
* @return
@@ -1031,7 +931,7 @@ public interface MongoOperations {
* Returns and removes all documents that match the provided query document criteria from the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
*
*
* @param query
* @param entityClass
* @param collectionName
@@ -1042,9 +942,8 @@ public interface MongoOperations {
/**
* Returns the underlying {@link MongoConverter}.
*
*
* @return
*/
MongoConverter getConverter();
}
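For context, a minimal sketch of typical use of the interface above through its MongoTemplate implementation; mongo, the database name, Person and the field values are assumptions, not part of the diff.

MongoOperations ops = new MongoTemplate(mongo, "database");
ops.insert(new Person("Joe", 34));
Person joe = ops.findOne(new Query(Criteria.where("name").is("Joe")), Person.class);
ops.updateFirst(new Query(Criteria.where("name").is("Joe")), new Update().inc("age", 1), Person.class);
ops.remove(joe);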


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,48 +17,41 @@ package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import com.mongodb.MongoOptions;
/**
* A factory bean for construction of a {@link MongoOptions} instance. In case used with MongoDB Java driver version 3
* properties not supported by the driver will be ignored.
*
* A factory bean for construction of a {@link MongoOptions} instance.
*
* @author Graeme Rocher
* @author Mark Pollack
* @author Mike Saavedra
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
*/
@Deprecated
public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
@SuppressWarnings("deprecation")
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.connectionsPerHost;
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.maxWaitTime;
private int connectTimeout = DEFAULT_MONGO_OPTIONS.connectTimeout;
private int socketTimeout = DEFAULT_MONGO_OPTIONS.socketTimeout;
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.socketKeepAlive;
private boolean autoConnectRetry = DEFAULT_MONGO_OPTIONS.autoConnectRetry;
private long maxAutoConnectRetryTime = DEFAULT_MONGO_OPTIONS.maxAutoConnectRetryTime;
private int writeNumber = DEFAULT_MONGO_OPTIONS.w;
private int writeTimeout = DEFAULT_MONGO_OPTIONS.wtimeout;
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.fsync;
private boolean slaveOk = DEFAULT_MONGO_OPTIONS.slaveOk;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
private MongoOptions options;
/**
* Configures the maximum number of connections allowed per host until we will block.
*
@@ -151,10 +144,7 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
/**
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
*
* @deprecated since 1.7.
*/
@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
@@ -164,9 +154,7 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
* @deprecated since 1.7
*/
@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
@@ -175,9 +163,7 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
*
* @param slaveOk true if the driver should read from secondaries or slaves.
* @deprecated since 1.7
*/
@Deprecated
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
}
@@ -208,41 +194,40 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
this.sslSocketFactory = sslSocketFactory;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
protected MongoOptions createInstance() throws Exception {
if (MongoClientVersion.isMongo3Driver()) {
throw new IllegalArgumentException(
String
.format("Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
}
public void afterPropertiesSet() {
MongoOptions options = new MongoOptions();
options.setConnectionsPerHost(connectionsPerHost);
options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
options.setMaxWaitTime(maxWaitTime);
options.setConnectTimeout(connectTimeout);
options.setSocketTimeout(socketTimeout);
options.setSocketKeepAlive(socketKeepAlive);
options.setW(writeNumber);
options.setWtimeout(writeTimeout);
options.setFsync(writeFsync);
options.connectionsPerHost = connectionsPerHost;
options.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
options.maxWaitTime = maxWaitTime;
options.connectTimeout = connectTimeout;
options.socketTimeout = socketTimeout;
options.socketKeepAlive = socketKeepAlive;
options.autoConnectRetry = autoConnectRetry;
options.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
options.slaveOk = slaveOk;
options.w = writeNumber;
options.wtimeout = writeTimeout;
options.fsync = writeFsync;
if (ssl) {
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
}
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
this.options = options;
}
return options;
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
public MongoOptions getObject() {
return this.options;
}
/*
@@ -252,4 +237,12 @@ public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
public Class<?> getObjectType() {
return MongoOptions.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
}
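A brief sketch (not part of the diff) of producing a MongoOptions instance through the factory bean above; the property values are examples and exception handling is omitted.

MongoOptionsFactoryBean factoryBean = new MongoOptionsFactoryBean();
factoryBean.setConnectionsPerHost(50);
factoryBean.setConnectTimeout(2000);
factoryBean.afterPropertiesSet();
MongoOptions options = factoryBean.getObject();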


@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,109 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CHACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CHACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
* Convenience method to generate an index name from the set of fields it is over. Will fall back to a MongoDB Java
* driver version 2 compatible way of generating index name in case of {@link MongoClientVersion#isMongo3Driver()}.
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2 all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for the MongoDB Java driver version 3 the operation will throw
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The mongo java driver 3 does no loger support resetIndexCache!");
}
invokeMethod(RESET_INDEX_CHACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
}
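A minimal sketch of the reflective pattern the class above relies on: resolve the method once via ReflectionUtils, invoke it only where the driver provides it; collection is an assumed DBCollection instance.

Method reset = ReflectionUtils.findMethod(DBCollection.class, "resetIndexCache");
if (reset != null) {
    ReflectionUtils.invokeMethod(reset, collection); // present on the 2.x driver only
}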


@@ -1,134 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticate against database using provided credentials in case of a MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
* Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}
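A minimal usage sketch for the 'consistent request' helpers above; the calling class is hypothetical and has to live in the same package because the invoker is package-private:

package org.springframework.data.mongodb.core;

import com.mongodb.DB;

// Hypothetical caller: wraps a batch of operations in a 'consistent request' on the 2.x driver.
// Both calls are no-ops when a MongoDB Java driver version 3 is on the classpath.
class ConsistentRequestSketch {

	void runInConsistentRequest(DB db) {
		ReflectiveDbInvoker.requestStart(db);
		try {
			// ... issue several operations against db ...
		} finally {
			ReflectiveDbInvoker.requestDone(db);
		}
	}
}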

View File

@@ -1,62 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.MapReduceCommand;
/**
* {@link ReflectiveMapReduceInvoker} provides reflective access to {@link MapReduceCommand} API that is not
* consistently available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveMapReduceInvoker {
private static final Method ADD_EXTRA_OPTION_METHOD;
static {
ADD_EXTRA_OPTION_METHOD = findMethod(MapReduceCommand.class, "addExtraOption", String.class, Object.class);
}
private ReflectiveMapReduceInvoker() {}
/**
* Sets the extra option for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3.
*
* @param cmd can be {@literal null} for MongoDB Java driver version 3.
* @param key
* @param value
*/
public static void addExtraOption(MapReduceCommand cmd, String key, Object value) {
if (isMongo3Driver()) {
return;
}
Assert.notNull(cmd, "MapReduceCommand must not be null!");
invokeMethod(ADD_EXTRA_OPTION_METHOD, cmd, key, value);
}
}
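A hedged usage sketch for addExtraOption(…); the surrounding class and the chosen option key are made up, and only the 2.x MapReduceCommand constructor is assumed:

package org.springframework.data.mongodb.core;

import com.mongodb.DBCollection;
import com.mongodb.MapReduceCommand;

// Hypothetical caller: sets a driver-2-only extra option, which is silently skipped on a 3.x driver.
class MapReduceOptionSketch {

	MapReduceCommand commandFor(DBCollection collection, String mapFunction, String reduceFunction) {
		MapReduceCommand command = new MapReduceCommand(collection, mapFunction, reduceFunction, null,
				MapReduceCommand.OutputType.INLINE, null);
		ReflectiveMapReduceInvoker.addExtraOption(command, "verbose", true);
		return command;
	}
}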

View File

@@ -1,158 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.util.ReflectionUtils;
import com.mongodb.MongoOptions;
/**
* {@link ReflectiveMongoOptionsInvoker} provides reflective access to {@link MongoOptions} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
@SuppressWarnings("deprecation")
class ReflectiveMongoOptionsInvoker {
private static final Method GET_AUTO_CONNECT_RETRY_METHOD;
private static final Method SET_AUTO_CONNECT_RETRY_METHOD;
private static final Method GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
private static final Method SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
static {
SET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils
.findMethod(MongoOptions.class, "setAutoConnectRetry", boolean.class);
GET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils.findMethod(MongoOptions.class, "isAutoConnectRetry");
SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"setMaxAutoConnectRetryTime", long.class);
GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"getMaxAutoConnectRetryTime");
}
private ReflectiveMongoOptionsInvoker() {}
/**
* Sets the retry connection flag for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param autoConnectRetry
*/
public static void setAutoConnectRetry(MongoOptions options, boolean autoConnectRetry) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_AUTO_CONNECT_RETRY_METHOD, options, autoConnectRetry);
}
/**
* Sets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param maxAutoConnectRetryTime
*/
public static void setMaxAutoConnectRetryTime(MongoOptions options, long maxAutoConnectRetryTime) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options, maxAutoConnectRetryTime);
}
/**
* Sets the slaveOk attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param slaveOk
*/
public static void setSlaveOk(MongoOptions options, boolean slaveOk) {
if (isMongo3Driver()) {
return;
}
new DirectFieldAccessor(options).setPropertyValue("slaveOk", slaveOk);
}
/**
* Gets the slaveOk attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException} for
* MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getSlaveOk(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) new DirectFieldAccessor(options).getPropertyValue("slaveOk")).booleanValue();
}
/**
* Gets the autoConnectRetry attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException}
* for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getAutoConnectRetry(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) invokeMethod(GET_AUTO_CONNECT_RETRY_METHOD, options)).booleanValue();
}
/**
* Gets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Throws
* {@link UnsupportedOperationException} for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static long getMaxAutoConnectRetryTime(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for maxAutoConnectRetryTime which has been removed in MongoDB Java driver version 3.");
}
return ((Long) invokeMethod(GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options)).longValue();
}
}

View File

@@ -1,48 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.springframework.beans.DirectFieldAccessor;
import com.mongodb.WriteConcern;
/**
* {@link ReflectiveWriteConcernInvoker} provides reflective access to {@link WriteConcern} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveWriteConcernInvoker {
private static final WriteConcern NONE_OR_UNACKNOWLEDGED;
static {
NONE_OR_UNACKNOWLEDGED = isMongo3Driver() ? WriteConcern.UNACKNOWLEDGED : (WriteConcern) new DirectFieldAccessor(
new WriteConcern()).getPropertyValue("NONE");
}
/**
* @return {@link WriteConcern#NONE} for MongoDB Java driver version 2, otherwise {@link WriteConcern#UNACKNOWLEDGED}.
*/
public static WriteConcern noneOrUnacknowledged() {
return NONE_OR_UNACKNOWLEDGED;
}
}

View File

@@ -1,67 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* {@link ReflectiveWriteResultInvoker} provides reflective access to {@link WriteResult} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveWriteResultInvoker {
private static final Method GET_ERROR_METHOD;
private static final Method WAS_ACKNOWLEDGED_METHOD;
private ReflectiveWriteResultInvoker() {}
static {
GET_ERROR_METHOD = findMethod(WriteResult.class, "getError");
WAS_ACKNOWLEDGED_METHOD = findMethod(WriteResult.class, "wasAcknowledged");
}
/**
* @param writeResult can be {@literal null} for MongoDB Java driver version 3.
* @return null in case of MongoDB Java driver version 3 since errors are thrown as {@link MongoException}.
*/
public static String getError(WriteResult writeResult) {
if (isMongo3Driver()) {
return null;
}
return (String) invokeMethod(GET_ERROR_METHOD, writeResult);
}
/**
* @param writeResult
* @return always {@literal true} in case of MongoDB Java driver version 2, the actual acknowledgement state for version 3.
*/
public static boolean wasAcknowledged(WriteResult writeResult) {
return isMongo3Driver() ? ((Boolean) invokeMethod(WAS_ACKNOWLEDGED_METHOD, writeResult)).booleanValue() : true;
}
}

View File

@@ -1,84 +0,0 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import com.mongodb.DB;
/**
* Script operations on {@link com.mongodb.DB} level. Allows interaction with server side JavaScript functions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public interface ScriptOperations {
/**
* Stores the given {@link ExecutableMongoScript}, generating a synthetic name so that it can be called by that name
* subsequently.
*
* @param script must not be {@literal null}.
* @return {@link NamedMongoScript} with name under which the {@code JavaScript} function can be called.
*/
NamedMongoScript register(ExecutableMongoScript script);
/**
* Registers the given {@link NamedMongoScript} in the database.
*
* @param script the {@link NamedMongoScript} to be registered.
* @return
*/
NamedMongoScript register(NamedMongoScript script);
/**
* Executes the {@literal script} by either calling it via its {@literal name} or directly sending it.
*
* @param script must not be {@literal null}.
* @param args arguments to pass on for script execution.
* @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException
*/
Object execute(ExecutableMongoScript script, Object... args);
/**
* Call the {@literal JavaScript} by its name.
*
* @param scriptName must not be {@literal null} or empty.
* @param args
* @return
*/
Object call(String scriptName, Object... args);
/**
* Checks {@link DB} for existence of {@link ServerSideJavaScript} with given name.
*
* @param scriptName must not be {@literal null} or empty.
* @return false if no {@link ServerSideJavaScript} with given name exists.
*/
boolean exists(String scriptName);
/**
* Returns names of {@literal JavaScript} functions that can be called.
*
* @return empty {@link Set} if no scripts found.
*/
Set<String> getScriptNames();
}
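A usage sketch of the interface above; obtaining the ScriptOperations instance via MongoTemplate.scriptOps() is an assumption about the surrounding API, and the script body and arguments are made up:

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

public class ScriptOperationsSketch {

	public void run(MongoTemplate template) {
		ScriptOperations scriptOps = template.scriptOps(); // assumed accessor

		// Execute an ad-hoc server-side JavaScript function directly.
		ExecutableMongoScript echo = new ExecutableMongoScript("function(x) { return x; }");
		Object direct = scriptOps.execute(echo, "run directly");

		// Register it under a generated name and call it by that name later on.
		NamedMongoScript named = scriptOps.register(echo);
		Object byName = scriptOps.call(named.getName(), "run via name");
	}
}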

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -19,7 +19,6 @@ import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -28,8 +27,6 @@ import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
@@ -40,7 +37,6 @@ import com.mongodb.WriteConcern;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -58,9 +54,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null);
}
@@ -71,9 +65,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null);
}
@@ -85,9 +77,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
@@ -100,44 +90,16 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@Deprecated
@SuppressWarnings("deprecation")
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
true, uri.getDatabase());
}
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"),
@@ -155,25 +117,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
@@ -195,7 +138,6 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -26,9 +26,7 @@ import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -38,37 +36,21 @@ import com.mongodb.DBObject;
/**
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
* Framework.
*
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Alessio Fachechi
* @since 1.3
*/
public class Aggregation {
/**
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
* stage.
*/
public static final String ROOT = SystemVariable.ROOT.toString();
/**
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
* otherwise, all stages start with CURRENT the same as ROOT.
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
private final AggregationOptions options;
private final List<AggregationOperation> operations;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
@@ -77,30 +59,16 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(AggregationOperation... operations) {
return new Aggregation(operations);
}
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
*/
public Aggregation withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new Aggregation(this.operations, options);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -110,7 +78,7 @@ public class Aggregation {
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -120,52 +88,20 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
this(asAggregationList(aggregationOperations));
}
/**
* @param aggregationOperations must not be {@literal null} or empty.
* @return
*/
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
return Arrays.asList(aggregationOperations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
this(aggregationOperations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(!aggregationOperations.isEmpty(), "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
this.operations = aggregationOperations;
this.options = options;
this.operations = Arrays.asList(aggregationOperations);
}
/**
* A pointer to the previous {@link AggregationOperation}.
*
*
* @return
*/
public static String previousOperation() {
@@ -174,7 +110,7 @@ public class Aggregation {
/**
* Creates a new {@link ProjectionOperation} including the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -184,7 +120,7 @@ public class Aggregation {
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -194,7 +130,7 @@ public class Aggregation {
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
*
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
@@ -204,7 +140,7 @@ public class Aggregation {
/**
* Creates a new {@link GroupOperation} for the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -214,7 +150,7 @@ public class Aggregation {
/**
* Creates a new {@link GroupOperation} for the given {@link Fields}.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -224,7 +160,7 @@ public class Aggregation {
/**
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
*
*
* @param sort must not be {@literal null}.
* @return
*/
@@ -234,7 +170,7 @@ public class Aggregation {
/**
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
*
*
* @param direction must not be {@literal null}.
* @param fields must not be {@literal null}.
* @return
@@ -245,28 +181,17 @@ public class Aggregation {
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
*
* @param elementsToSkip must not be less than zero.
* @return
* @deprecated scheduled to be removed in favor of {@link #skip(long)}.
*/
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return
*/
public static SkipOperation skip(long elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
*
*
* @param maxElements must not be less than zero.
* @return
*/
@@ -276,7 +201,7 @@ public class Aggregation {
/**
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
*
*
* @param criteria must not be {@literal null}.
* @return
*/
@@ -284,40 +209,12 @@ public class Aggregation {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link LookupOperation}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(String from, String localField, String foreignField, String as) {
return lookup(field(from), field(localField), field(foreignField), field(as));
}
/**
* Creates a new {@link LookupOperation} for the given {@link Fields}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(Field from, Field localField, Field foreignField, Field as) {
return new LookupOperation(from, localField, foreignField, as);
}
/**
* Creates a new {@link Fields} instance for the given field names.
*
*
* @see Fields#fields(String...)
* @param fields must not be {@literal null}.
* @return
* @see Fields#fields(String...)
*/
public static Fields fields(String... fields) {
return Fields.fields(fields);
@@ -325,7 +222,7 @@ public class Aggregation {
/**
* Creates a new {@link Fields} instance from the given field name and target reference.
*
*
* @param name must not be {@literal null} or empty.
* @param target must not be {@literal null} or empty.
* @return
@@ -334,32 +231,9 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
/**
* Returns a new {@link AggregationOptions.Builder}.
*
* @return
* @since 1.6
*/
public static AggregationOptions.Builder newAggregationOptions() {
return new AggregationOptions.Builder();
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
*
* @param inputCollectionName the name of the input collection
* @return the {@code DBObject} representing this aggregation
*/
@@ -373,22 +247,14 @@ public class Aggregation {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
if (operation instanceof InheritsFieldsAggregationOperation) {
context = new InheritingExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), context);
} else {
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), context);
}
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
command = options.applyAndReturnPotentiallyChangedCommand(command);
return command;
}
@@ -404,7 +270,7 @@ public class Aggregation {
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
@@ -436,51 +302,4 @@ public class Aggregation {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
* @author Thomas Darimont
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
*/
enum SystemVariable {
ROOT, CURRENT;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
* @param fieldRef may be {@literal null}.
* @return
*/
public static boolean isReferingToSystemVariable(String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
}
}
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());
}
}
}
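A pipeline sketch built from the static factory methods shown above; the collection and field names are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.DBObject;

public class AggregationSketch {

	DBObject ordersPerCustomerCommand() {
		Aggregation aggregation = newAggregation( //
				match(Criteria.where("status").is("ACTIVE")), //
				group("customerId").sum("amount").as("total"), //
				sort(Direction.DESC, "total"), //
				limit(10));

		// Renders the { aggregate: "order", pipeline: [...] } command document.
		return aggregation.toDbObject("order", Aggregation.DEFAULT_CONTEXT);
	}
}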

View File

@@ -1,38 +0,0 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface AggregationExpression {
/**
* Turns the {@link AggregationExpression} into a {@link DBObject} within the given
* {@link AggregationOperationContext}.
*
* @param context
* @return
*/
DBObject toDbObject(AggregationOperationContext context);
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,106 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
public enum AggregationFunctionExpressions {
SIZE;
/**
* Returns an {@link AggregationExpression} built from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.size());
for (Object value : values) {
args.add(unpack(value, context));
}
return new BasicDBObject("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}
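A short sketch of rendering the SIZE expression above against the default context; the "tags" field name is made up:

import static org.springframework.data.mongodb.core.aggregation.Fields.field;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationExpression;
import org.springframework.data.mongodb.core.aggregation.AggregationFunctionExpressions;

import com.mongodb.DBObject;

public class SizeExpressionSketch {

	DBObject sizeOfTags() {
		AggregationExpression size = AggregationFunctionExpressions.SIZE.of(field("tags"));
		// Renders as { "$size" : [ "$tags" ] }.
		return size.toDbObject(Aggregation.DEFAULT_CONTEXT);
	}
}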

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,189 +0,0 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation:
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
private static final String CURSOR = "cursor";
private static final String EXPLAIN = "explain";
private static final String ALLOW_DISK_USE = "allowDiskUse";
private final boolean allowDiskUse;
private final boolean explain;
private final DBObject cursor;
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options to the aggregation.
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = cursor;
}
/**
* Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
* the dbPath directory.
*
* @return
*/
public boolean isAllowDiskUse() {
return allowDiskUse;
}
/**
* Indicates whether to return information on the processing of the pipeline rather than the actual results.
*
* @return
*/
public boolean isExplain() {
return explain;
}
/**
* Specify a document that contains options that control the creation of the cursor object.
*
* @return
*/
public DBObject getCursor() {
return cursor;
}
/**
* Returns a new, potentially adjusted copy of the given aggregation {@code command} with the configuration
* applied.
*
* @param command the aggregation command.
* @return
*/
DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {
DBObject result = new BasicDBObject(command.toMap());
if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
result.put(ALLOW_DISK_USE, allowDiskUse);
}
if (explain && !result.containsField(EXPLAIN)) {
result.put(EXPLAIN, explain);
}
if (cursor != null && !result.containsField(CURSOR)) {
result.put("cursor", cursor);
}
return result;
}
/**
* Returns a {@link DBObject} representation of this {@link AggregationOptions}.
*
* @return
*/
public DBObject toDbObject() {
DBObject dbo = new BasicDBObject();
dbo.put(ALLOW_DISK_USE, allowDiskUse);
dbo.put(EXPLAIN, explain);
dbo.put(CURSOR, cursor);
return dbo;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDbObject().toString();
}
/**
* A Builder for {@link AggregationOptions}.
*
* @author Thomas Darimont
*/
public static class Builder {
private boolean allowDiskUse;
private boolean explain;
private DBObject cursor;
/**
* Defines whether to off-load intensive sort-operations to disk.
*
* @param allowDiskUse
* @return
*/
public Builder allowDiskUse(boolean allowDiskUse) {
this.allowDiskUse = allowDiskUse;
return this;
}
/**
* Defines whether to get the execution plan for the aggregation instead of the actual results.
*
* @param explain
* @return
*/
public Builder explain(boolean explain) {
this.explain = explain;
return this;
}
/**
* Additional options to the aggregation.
*
* @param cursor
* @return
*/
public Builder cursor(DBObject cursor) {
this.cursor = cursor;
return this;
}
/**
* Returns a new {@link AggregationOptions} instance with the given configuration.
*
* @return
*/
public AggregationOptions build() {
return new AggregationOptions(allowDiskUse, explain, cursor);
}
}
}
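A builder usage sketch for the options above, attached to an aggregation via withOptions(…) (options are supported on MongoDB 2.6+); the criteria used is made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.query.Criteria;

public class AggregationOptionsSketch {

	Aggregation activeOrdersWithDiskUse() {
		// allowDiskUse lets memory-intensive stages spill to dbPath/_tmp; explain would return the plan only.
		AggregationOptions options = newAggregationOptions().allowDiskUse(true).explain(false).build();
		return newAggregation(match(Criteria.where("status").is("ACTIVE"))).withOptions(options);
	}
}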

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2017 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -28,8 +28,6 @@ import com.mongodb.DBObject;
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
* @param <T> The class in which the results are mapped onto.
* @since 1.3
*/
@@ -47,8 +45,8 @@ public class AggregationResults<T> implements Iterable<T> {
*/
public AggregationResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults, "List of mapped results must not be null!");
Assert.notNull(rawResults, "Raw results must not be null!");
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = Collections.unmodifiableList(mappedResults);
this.rawResults = rawResults;
@@ -92,16 +90,6 @@ public class AggregationResults<T> implements Iterable<T> {
return serverUsed;
}
/**
* Returns the raw result that was returned by the server.
*
* @return
* @since 1.6
*/
public DBObject getRawResults() {
return rawResults;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -24,10 +24,9 @@ import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} that combines the available field references from a given
* {@code AggregationOperationContext} and an {@link FieldsExposingAggregationOperation}.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.4
*/
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
@@ -38,12 +37,11 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
AggregationOperationContext rootContext) {
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
@@ -81,7 +79,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
@@ -90,22 +88,6 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
Assert.notNull(name, "Name must not be null!");
FieldReference exposedField = resolveExposedField(field, name);
if (exposedField != null) {
return exposedField;
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
/**
* Resolves a {@code field}/{@code name} into a {@link FieldReference} if possible.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return the resolved reference or {@literal null}
*/
protected FieldReference resolveExposedField(Field field, String name) {
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
@@ -129,6 +111,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
return new FieldReference(new ExposedField(name, true));
}
}
return null;
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -30,7 +30,6 @@ import org.springframework.util.StringUtils;
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class Fields implements Iterable<Field> {
@@ -84,15 +83,6 @@ public final class Fields implements Iterable<Field> {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
@@ -196,24 +186,15 @@ public final class Fields implements Iterable<Field> {
private final String target;
/**
* Creates an aggregation field with the given {@code name}.
* Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
* as well.
*
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
* @param key
*/
public AggregationField(String name) {
this(name, null);
public AggregationField(String key) {
this(key, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
String nameToSet = cleanUp(name);
@@ -236,10 +217,6 @@ public final class Fields implements Iterable<Field> {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
}
int dollarIndex = source.lastIndexOf('$');
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -16,28 +16,17 @@
package org.springframework.data.mongodb.core.aggregation;
/**
* {@link AggregationOperation} that exposes {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s. A {@link FieldsExposingAggregationOperation} implementing the
* {@link InheritsFieldsAggregationOperation} will expose fields from its parent operations. Not implementing
* {@link InheritsFieldsAggregationOperation} will replace existing exposed fields.
*
* {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s.
*
* @author Thomas Darimont
* @author Mark Paluch
*/
public interface FieldsExposingAggregationOperation extends AggregationOperation {
/**
* Returns the fields exposed by the {@link AggregationOperation}.
*
*
* @return will never be {@literal null}.
*/
ExposedFields getFields();
/**
* Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
*/
static interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
}
}

Some files were not shown because too many files have changed in this diff.