Compare commits


622 Commits

Author SHA1 Message Date
Spring Buildmaster
77dce53c7a DATAMONGO-1282 - Release version 1.8.0.RELEASE (Gosling GA). 2015-09-01 02:12:26 -07:00
Oliver Gierke
73f268e7c4 DATAMONGO-1282 - Prepare 1.8.0.RELEASE (Gosling GA). 2015-09-01 09:44:21 +02:00
Oliver Gierke
075d7d8131 DATAMONGO-1282 - Updated changelog. 2015-09-01 09:44:11 +02:00
Christoph Strobl
206337044a DATAMONGO-1280 - Updated "What’s new" section in reference documentation.
Original pull request: #319.
2015-08-31 12:55:30 +02:00
Christoph Strobl
55b44ff7aa DATAMONGO-1275 - Fixed broken links in reference documentation.
Original pull request: #318.
2015-08-22 13:16:49 +02:00
Christoph Strobl
ae48639ae9 DATAMONGO-1275 - Added documentation for optimistic locking.
Original pull request: #318.
2015-08-22 13:16:45 +02:00
Oliver Gierke
6b5e78f810 DATAMONGO-1256 - Polishing.
Minor Javadoc polishing.

Original pull request: #316.
2015-08-07 14:04:49 +02:00
Christoph Strobl
3e485e0a88 DATAMONGO-1256 - MongoMappingEvents now expose the collection name they're issued for.
We now directly expose the collection name via MongoMappingEvent.getCollectionName(). Therefore we added new constructors to all the events, deprecating the previous ones. 

Several overloads have been added to MongoEventListener, deprecating the previous API. We’ll call the deprecated methods from the new ones until their removal.

Original pull request: #316.
2015-08-07 14:04:47 +02:00
Oliver Gierke
335c78f908 DATAMONGO-1269 - Polishing.
Original pull request: #314.
2015-08-06 11:00:36 +02:00
Christoph Strobl
b103e4eaf6 DATAMONGO-1269 - Retain position parameter in property path.
We now retain position parameters in paths used in queries when mapping the field name. This allows mapping "list.1.name" to the name property of the element at index 1 in the list.

The change also fixes a glitch in mapping java.util.Map-like structures with numeric keys.

Original pull request: #314.
2015-08-06 11:00:36 +02:00
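For illustration, a minimal sketch of a query relying on the positional path mapping described in the commit above; the Person/Address types and the sample value are hypothetical.

```
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class PositionalPathExample {

    static class Address { String city; }
    static class Person { String id; List<Address> addresses; }

    void findByPositionalPath(MongoOperations operations) {
        // "addresses.1.city" targets the 'city' property of the list element at index 1;
        // the field mapper now keeps the positional segment "1" when translating field names.
        Query query = query(where("addresses.1.city").is("Dresden"));
        operations.find(query, Person.class);
    }
}
```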
Oliver Gierke
c4a6c63d23 DATAMONGO-1268 - After release cleanups. 2015-08-04 14:09:20 +02:00
Spring Buildmaster
4a4f10f97b DATAMONGO-1268 - Prepare next development iteration. 2015-08-04 04:37:14 -07:00
Spring Buildmaster
a5712daab7 DATAMONGO-1268 - Release version 1.8.0.RC1 (Gosling RC1). 2015-08-04 04:37:12 -07:00
Oliver Gierke
28cb1ef106 DATAMONGO-1268 - Prepare 1.8.0.RC1 (Gosling RC1). 2015-08-04 13:08:49 +02:00
Oliver Gierke
0d99a3e527 DATAMONGO-1268 - Updated changelog. 2015-08-04 13:08:49 +02:00
Christoph Strobl
9da43263ce DATAMONGO-1263 - Index resolver considers generic type argument of collection elements.
We now consider the potential generic type argument of collection elements. 
Prior to this change an index within List<GenericWrapper<ConcreteWithIndex>> would not have been resolved.

Original pull request: #312.
2015-08-04 08:48:57 +02:00
Oliver Gierke
784e199068 DATAMONGO-1266 - Fixed domain type lookup for methods returning primitives.
If a repository query method returned a primitive, that primitive was exposed as the domain type, which e.g. caused deleteBy…(…) methods returning void to fail.

We now shortcut the MongoEntityMetadata lookup in MongoQueryMethod to use the repository's domain type if a primitive or wrapper is returned.
2015-08-03 11:53:10 +02:00
Oliver Gierke
1ffee802c0 DATAMONGO-1261 - Updated changelog. 2015-07-28 16:42:58 +02:00
Christoph Strobl
6f0ac7f0c2 DATAMONGO-1254 - Grouping after projection in aggregation now uses correct aliased field name.
We now push the aliased field name down the aggregation pipeline for projections including operations. This allows referencing them in a later stage. Prior to this change the field reference was potentially resolved to the target field of the operation, which did not result in an error but led to incorrect results.

Original pull request: #311.
2015-07-27 14:15:33 +02:00
Christoph Strobl
941d4d8985 DATAMONGO-1260 - Prevent accidental authentication misconfiguration on SimpleMongoDbFactory.
We now reject configuration using MongoClient along with UserCredentials in SimpleMongoDbFactory. This move favors the native authentication mechanism provided via MongoCredential.

<mongo:mongo-client id="mongo-client-with-credentials" credentials="jon:warg@snow?uri.authMechanism=PLAIN" />

Original pull request: #309.
2015-07-27 14:08:42 +02:00
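As a hedged sketch of the MongoCredential-based setup the commit above favors (host, database and user values are placeholders; the SimpleMongoDbFactory constructor choice is an assumption based on the 1.7/1.8 API):

```
import java.util.Collections;

import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;

class NativeCredentialsExample {

    SimpleMongoDbFactory mongoDbFactory() {
        // Credentials are handed to the driver via MongoCredential (PLAIN mechanism,
        // matching the XML example above) instead of Spring Data's UserCredentials.
        MongoCredential credential = MongoCredential.createPlainCredential("jon", "snow", "warg".toCharArray());
        MongoClient client = new MongoClient(new ServerAddress("localhost", 27017),
                Collections.singletonList(credential));
        return new SimpleMongoDbFactory(client, "snow");
    }
}
```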
Oliver Gierke
44c76d8ffb DATAMONGO-1257 - We now hint at credential quoting in the XSD.
The namespace XSD now mentions the capability of quoting more complex credentials in case they validly contain a comma.
2015-07-27 13:47:11 +02:00
Oliver Gierke
df9a9f5fb6 DATAMONGO-1257 - Polishing.
Made internal helper methods in MongoCredentialPropertyEditor static where possible.

Original pull request: #310.
2015-07-24 18:40:57 +02:00
Christoph Strobl
bebd0fa0e6 DATAMONGO-1257 - <mongo:mongo-client /> element now supports usernames with a comma.
We now allow grouping credentials by enclosing them in single quotes like this:

credentials='CN=myName,OU=myOrgUnit,O=myOrg,L=myLocality,ST=myState,C=myCountry?uri.authMechanism=MONGODB-X509'

We also changed the required argument checks to be more authentication mechanism specific which means the pattern is now username[:password@database][?options].

Original pull request: #310.
2015-07-24 18:40:56 +02:00
Oliver Gierke
594e90789d DATAMONGO-1244 - Polishing.
Minor reformattings and extracted a method to improve digestibility.

Original pull request: #306.
2015-07-08 10:18:35 +02:00
Thomas Darimont
f2ab42cb80 DATAMONGO-1244 - Improved handling of expression parameters in StringBasedMongoQuery.
Replaced regex-based parsing of dynamic expression-based parameters with custom parsing to make sure we also support complex nested expression objects.
Previously we only supported simple named or positional expressions. Since MongoDB's JSON-based query language uses deeply nested objects to express queries, we needed to improve the handling here.

Manual parsing is tedious and more verbose than regex-based parsing, but it gives us more control over the whole parsing process.

We also dynamically adjust the quoting so that we only output quoted parameters if necessary.

This enables expressing complex filtering queries that use Spring Security constructs like:
```
@Query("{id: ?#{ hasRole('ROLE_ADMIN') ? {$exists:true} : principal.id}}")
List<User> findAllForCurrentUserById();
```

Original pull request: #306.
2015-07-08 10:18:27 +02:00
Christoph Strobl
3224fa8ce7 DATAMONGO-1251 - Fixed potential NullPointerException in UpdateMapper.
We now explicitly handle the case where the source object for which a type hint needs to be calculated is null.
2015-07-07 09:57:46 +02:00
Oliver Gierke
ce156c1344 DATAMONGO-1250 - Fixed inline code formatting in reference docs. 2015-07-04 19:07:09 +02:00
Oliver Gierke
434e553022 DATAMONGO-1250 - Fixed accidental duplicate invocation of value conversion in UpdateMapper.
UpdateMapper.getMappedObjectForField(…) invokes the very same method of the super class but handed in an already mapped value, so that value conversion was invoked twice.

This was especially problematic in cases where a dedicated converter had been registered for an object that is already a Mongo-storable one (e.g. an enum-to-string converter and back) without indicating which of the two converters is the reading or the writing one. This basically caused the source value to be converted back and forth during the update mapping, creating the impression the value wasn't converted at all.

This is now fixed by removing the superfluous mapping.
2015-07-04 19:00:21 +02:00
Oliver Gierke
de5b5ee4b0 DATAMONGO-1246 - Updated changelog. 2015-07-01 10:00:16 +02:00
Oliver Gierke
60636bf56d DATAMONGO-1247 - Updated changelog. 2015-07-01 07:48:31 +02:00
Oliver Gierke
1ca71f93e9 DATAMONGO-1248 - Updated changelog. 2015-06-30 13:58:37 +02:00
Oliver Gierke
63ff39bed6 DATAMONGO-1236 - Polishing.
Removed the creation of a BasicMongoPersistentEntity in favor of always handing ClassTypeInformation.OBJECT into the converter in case no entity can be found.

This makes sure type information is written for updates on properties of type Object (which essentially leads to no PersistentEntity being available).

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
cb0b9604d4 DATAMONGO-1236 - Updates now include type hints correctly.
We now use property type information when mapping fields affected by an update in case we do not have proper entity information within the context. This allows the more precise type resolution required to determine whether a type hint needs to be written for a given property.

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
1dbe3b62d7 DATAMONGO-1125 - Improve exception message for index creation errors.
We now use MongoExceptionTranslator to potentially convert exceptions during index creation into Spring's DataAccessException hierarchy. In case we encounter an error code indicating a DataIntegrityViolation we try to fetch existing index data and append it to the exception's message.

Original pull request: #302.
2015-06-24 20:28:23 +02:00
Christoph Strobl
5c0707d221 DATAMONGO-1232 - IgnoreCase in criteria now escapes the query.
We now quote the original criteria value before actually wrapping it inside a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like. In the latter case we quote the part between the leading and trailing wildcard if required.

Original pull request: #301.
2015-06-22 12:50:05 +02:00
Christoph Strobl
c4ffc37dd5 DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:21:23 +02:00
Christoph Strobl
aaf93b0f6f DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
We now eagerly check DBRef properties for invalid definitions such as final classes or arrays. In that case we throw a MappingException when verify() is called.
2015-06-19 15:54:19 +02:00
Thomas Darimont
23eab1e84f DATAMONGO-1242 - Update MongoDB Java driver to 3.0.2 in mongo3 profile.
Update mongo driver.

Original pull request: #304.
2015-06-19 15:37:47 +02:00
Oliver Gierke
218f32e552 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:49:03 +02:00
Eddú Meléndez
62fbe4d08c DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:37:22 +02:00
Oliver Gierke
41ffd00619 DATAMONGO-1228 - After release cleanups. 2015-06-02 11:58:11 +02:00
Spring Buildmaster
98b9a604cf DATAMONGO-1228 - Prepare next development iteration. 2015-06-02 01:29:04 -07:00
Spring Buildmaster
01468b640a DATAMONGO-1228 - Release version 1.8.0.M1 (Gosling M1). 2015-06-02 01:29:01 -07:00
Oliver Gierke
4d96b036a2 DATAMONGO-1228 - Prepare 1.8.0.M1 (Gosling M1). 2015-06-02 09:29:53 +02:00
Oliver Gierke
2d1ac15e24 DATAMONGO-1228 - Updated changelog. 2015-06-02 08:24:47 +02:00
Oliver Gierke
2c27e8576f DATAMONGO-990 - Polishing.
Removed EvaluationExpressionContext from all AbstractMongoQuery implementations that don't actually need it and from AbstractMongoQuery itself, too. Cleaned up test cases after that.

Moved SpEL related tests into AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types. JavaDoc and assertion polishes.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
Thomas Darimont
67f638d953 DATAMONGO-990 - Add support for SpEL expressions in @Query.
Ported and adapted support for SpEL expressions in @Query annotations from Spring Data JPA. StringBasedMongoQuery can now evaluate SpEL fragments in queries with the help of the given EvaluationContextProvider. Introduced EvaluationContextProvider to AbstractMongoQuery. Exposed access to actual parameter values in MongoParameterAccessor.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
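A small, hedged example of the SpEL support described above; the Person document and repository are assumed types, and ?#{[0]} refers to the first method argument.

```
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

@Document
class Person {
    @Id String id;
    String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

    // The ?#{[0]} fragment is a SpEL expression evaluated via the
    // EvaluationContextProvider and bound to the first method argument.
    @Query("{ 'lastname': ?#{[0]} }")
    List<Person> findByQueryWithExpression(String lastname);
}
```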
Oliver Gierke
ea5bd5f7d3 DATAMONGO-1210 - Polishing.
Moved getTypeHint(…) method to Field class.

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Christoph Strobl
394f695416 DATAMONGO-1210 - Fixed type hints for usage with findAndModify(…).
We now inspect the actual field type during update mapping and provide a type hint accordingly. Simple, non-interface and non-abstract types will no longer be decorated with the _class attribute. We now honor positional parameters when trying to map paths to properties. This allows more precise type mapping since we now have access to the meta model, which allows us to check whether the presence of a type hint (aka _class) is required.

We now add a special type hint indicating nested types to the converter. This allows more fine-grained removal of the _class property without breaking the contract of MongoWriter.convertToMongoType(…).

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Stefan Ganzer
e4db466ab9 DATAMONGO-1210 - Add breaking test case for findAndModify/addToSet/each.
The problem stems from the inconsistent handling of type hints: MongoTemplate.save(…) does not add a type hint, but findAndModify(…) does. The same values are then treated differently by MongoDB, depending on whether they have a type hint or not. To verify this behavior, you can manually add the (superfluous) type hint to the saved object - findAndModify will then work as expected.

Additional tests demonstrate that findAndModify(…) removes type hints from complex documents in collections that are either nested in another collection or in a document, or doesn't add them in the first place.

Original pull requests: #290, #291.
Related pull request: #292.
CLA: 119820150506013701 (Stefan Ganzer)
2015-06-01 13:21:01 +02:00
Christoph Strobl
ee04c014c9 DATAMONGO-1134 - Add support for $geoIntersects.
We now support $geoIntersects via Criteria.intersects(…) using GeoJSON types.

Original pull request: #295.
2015-06-01 12:36:20 +02:00
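A hedged sketch of Criteria.intersects(…) with a GeoJSON polygon, per the commit above; the Venue type and coordinates are assumptions, and the List-based GeoJsonPolygon constructor is used (the first and last point close the ring).

```
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.Arrays;

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.query.Query;

class GeoIntersectsExample {

    static class Venue { String id; GeoJsonPoint location; }

    void findIntersecting(MongoOperations operations) {
        // Renders a $geoIntersects criterion for the 'location' field.
        GeoJsonPolygon polygon = new GeoJsonPolygon(Arrays.asList(
                new Point(0, 0), new Point(0, 5), new Point(5, 5), new Point(5, 0), new Point(0, 0)));
        Query query = query(where("location").intersects(polygon));
        operations.find(query, Venue.class);
    }
}
```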
Christoph Strobl
ea84f08de8 DATAMONGO-1216 - Skip authentication via AuthDB for MongoClient.
We now skip authentication via an explicit AuthDB when requesting a DB via a MongoClient instance.

Related ticket: DATACMNS-1218
Original pull request: #296.
2015-06-01 12:10:14 +02:00
Christoph Strobl
7d8a2b2d56 DATAMONGO-1218 - Deprecate non-MongoClient related configuration options in XML namespace.
We added deprecation hints to the description sections of elements and attributes within the spring-mongo.xsd of 1.7. Also, for 1.8, we’ve added a client-uri attribute to db-factory, which creates a MongoClientURI instead of a MongoURI to be passed on to MongoDbFactory. Just like 'uri', 'client-uri' does not allow additional configuration options such as username and password next to it.

Original pull request: #296.
2015-06-01 12:10:14 +02:00
Christoph Strobl
995d1e5aac DATAMONGO-1202 - Polishing.
Moved and renamed types into test class.
Added collection cleanup and missing author information.

Original pull request: #293.
2015-06-01 09:23:35 +02:00
Thomas Darimont
3b918492ae DATAMONGO-1202 - More robust type inspection for @Indexed properties.
We now use TypeInformation in IndexResolver to look up the root PersistentEntity for resolving @Indexed properties to ensure that we retrieve the same PersistentEntity that was stored. Previously we used the Class to look up the PersistentEntity, which yielded a partially processed result.

Original pull request: #293.
2015-06-01 09:08:31 +02:00
Christoph Strobl
66b419163c DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
We now check against the used driver version before requesting a DB instance from the factory. Potential improvements on the fetch strategy for MongoDB Java Driver 3 will be handled in DATAMONGO-1194.

Related tickets: DATAMONGO-1194.
Original pull request: #286.
2015-06-01 08:09:50 +02:00
Oliver Gierke
52bff39c22 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as NoOp.
2015-05-25 13:12:47 +02:00
Domenique Tilleuil
d151a13e87 DATAMONGO-1208 - Use QueryCursorPreparer for streaming in MongoTemplate.
We now use the QueryCursorPreparer to honor skip, limit, sort, etc. for streaming.

Original pull request: #297.
Polishing pull request: #298.
2015-05-21 09:00:33 +02:00
Oliver Gierke
5e7e7d3598 DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:07:30 +02:00
Oliver Gierke
356248bd05 DATAMONGO-1213 - Included section on dependency management in reference documentation.
Related ticket: DATACMNS-687.
2015-05-04 14:51:34 +02:00
Oliver Gierke
73a60153f6 DATAMONGO-1211 - Adapt to changes in Spring Data Commons.
Tweaked method signatures in MongoRepositoryFactory after some signature changes in Spring Data Commons. Use the newly introduced getTargetRepositoryViaReflection(…) to obtain the repository instance via the super class.

Added repositoryBaseClass() attribute to @EnableMongoRepositories.

Related tickets: DATACMNS-542.
2015-05-02 14:49:31 +02:00
Oliver Gierke
67cf0e62a7 DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values was handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:49:31 +02:00
Oliver Gierke
21fbcc3e67 DATAMONGO-1196 - Upgraded build profiles after MongoDB 3.0 Java driver GA release. 2015-04-01 17:11:55 +02:00
Oliver Gierke
0d63ff92a0 DATAMONGO-1192 - Switched to Spring 4.1's CollectionFactory. 2015-03-31 17:16:44 +02:00
Oliver Gierke
983645e222 DATAMONGO-1189 - After release cleanups. 2015-03-23 14:00:52 +01:00
Spring Buildmaster
d2805bfa47 DATAMONGO-1189 - Prepare next development iteration. 2015-03-23 13:03:26 +01:00
Spring Buildmaster
3f16b30631 DATAMONGO-1189 - Release version 1.7.0.RELEASE (Fowler GA). 2015-03-23 13:03:07 +01:00
Oliver Gierke
8ebcbe3c5c DATAMONGO-1189 - Prepare 1.7.0.RELEASE (Fowler GA). 2015-03-23 12:34:49 +01:00
Oliver Gierke
363bed5c37 DATAMONGO-1189 - Updated changelog. 2015-03-23 12:03:56 +01:00
Christoph Strobl
1547a646dd DATAMONGO-1189 - DATAJPA-692 - Polish reference docs before release.
Add repository query return types to reference doc.
Fall back to locally available Spring Data Commons reference docs as the remote variant doesn't seem to work currently
2015-03-23 11:17:25 +01:00
Oliver Gierke
1408d51065 DATAMONGO-979 - Polishing.
Minor JavaDoc and code style polishes.

Original pull request: #272.
2015-03-23 09:32:52 +01:00
Thomas Darimont
f5c319f18f DATAMONGO-979 - Add support for $size expression in project and group aggregation pipeline.
Introduced AggregationExpression interface to be able to represent arbitrary MongoDB expressions that can be used in projection and group operations. Supported function expressions are provided via the AggregationFunctionExpressions enum.

Original pull request: #272.
2015-03-23 09:32:26 +01:00
Christoph Strobl
a3c29054d0 DATAMONGO-1124 - Switch log level for cyclic reference index warnings to INFO.
Reduce log level from warn to info to avoid noise during application startup.

Original pull request: #282.
2015-03-23 09:00:24 +01:00
Oliver Gierke
01533ca34c DATAMONGO-1181 - Register GeoJsonModule with @EnableSpringDataWebSupport.
Added the necessary configuration infrastructure to automatically register the GeoJsonModule as Spring bean when @EnableSpringDataWebSupport is used. This is implemented by exposing a configuration class annotated with @SpringDataWebConfigurationMixin.

Added Spring WebMVC as test dependency to be able to write an integration test. Polished GeoJsonModule to hide the actual serializers.

Original pull request: #283.
Related ticket: DATACMNS-660.
2015-03-17 19:40:57 +01:00
Christoph Strobl
a1f6dc6db4 DATAMONGO-1181 - Add Jackson Module for GeoJSON types.
Added GeoJsonModule providing JsonDeserializers for GeoJsonPoint, GeoJsonMultiPoint, GeoJsonLineString, GeoJsonMultiLineString, GeoJsonPolygon and GeoJsonMultiPolygon.

Original pull request: #283.
2015-03-17 19:40:57 +01:00
Oliver Gierke
37d53d936d DATAMONGO-1179 - Polishing. 2015-03-10 14:29:22 +01:00
Christoph Strobl
bc0a2df653 DATAMONGO-1179 - Update reference documentation.
Added new-features section. Updated links and requirements. Added section for GeoJSON support. Updated Script Operations section. Added return type Stream to repositories section. Updated keyword list.

Original pull request: #281.
2015-03-10 14:29:22 +01:00
Oliver Gierke
7e50fd8273 DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Thomas Darimont
ba560ffbad DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Oliver Gierke
50ca32c8b9 DATAMONGO-1173 - After release cleanups. 2015-03-05 19:41:05 +01:00
Spring Buildmaster
bdfe3af505 DATAMONGO-1173 - Prepare next development iteration. 2015-03-05 07:47:13 -08:00
Spring Buildmaster
798b56055d DATAMONGO-1173 - Release version 1.7.0.RC1. 2015-03-05 07:47:11 -08:00
Oliver Gierke
ce68e4a070 DATAMONGO-1173 - Prepare 1.7.0.RC1 (Fowler RC1). 2015-03-05 16:31:00 +01:00
Oliver Gierke
5da3130d26 DATAMONGO-1173 - Updated changelog. 2015-03-05 15:54:23 +01:00
Oliver Gierke
6687cdc101 DATAMONGO-1110 - Polishing.
Moved to newly introduced Range type in Spring Data Commons to more safely bind minimum and maximum distances. Changed internal APIs to always use a Range<Distance> which gets populated based on the method signature's characteristics: if only one Distance parameter is found it's interpreted as a range with upper bound only.

Removed invalid testcase for minDistance on 2D index.

Original pull request: #277.
2015-03-05 15:35:42 +01:00
Christoph Strobl
7e74ec6b62 DATAMONGO-1110 - Add support for $minDistance.
We now support $minDistance for NearQuery and Criteria. Please keep in mind that minDistance is only available for MongoDB 2.6 and above and can only be combined with the $near or $nearSphere operator depending on the defined index type. Usage of $minDistance with NearQuery is only possible when a 2dsphere index is present. We also make sure the $minDistance operator gets correctly nested when using GeoJSON types.

It is now possible to use a Range<Distance> parameter within repository queries. This allows defining near queries like:

findByLocationNear(Point point, Range<Distance> distances);

The lower bound of the range is treated as the minimum distance while the upper one defines the maximum distance from the given point. In case a Distance parameter is provided it will serve as maxDistance.

Original pull request: #277.
2015-03-05 15:34:45 +01:00
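Below is a minimal sketch of the Range<Distance> parameter binding described above; VenueRepository, Venue and the coordinates are assumed names, and the query requires a 2dsphere index on the location field.

```
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Range;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

@Document
class Venue {
    @Id String id;
    GeoJsonPoint location;
}

interface VenueRepository extends MongoRepository<Venue, String> {

    // The lower bound maps to $minDistance, the upper bound to $maxDistance.
    List<Venue> findByLocationNear(Point location, Range<Distance> distances);
}

class MinDistanceExample {

    void lookup(VenueRepository repository) {
        Range<Distance> between = new Range<Distance>(
                new Distance(0.5, Metrics.KILOMETERS), new Distance(2.0, Metrics.KILOMETERS));
        repository.findByLocationNear(new Point(8.68, 50.11), between);
    }
}
```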
Thomas Darimont
b887fa70a5 DATAMONGO-1133 - Fixed broken tests.
AggregationTests.shouldHonorFieldAliasesForFieldReferences() now correctly sets up 3 different instances of MeterData and correctly calculates the aggregated counter values.

Original pull request: #279.
2015-03-05 15:30:35 +01:00
Oliver Gierke
1c6ab25253 DATAMONGO-1135 - Polishing.
A few polishing changes to the GeoConverters.
2015-03-05 14:28:11 +01:00
Christoph Strobl
1c43a3d1ee DATAMONGO-1135 - Add support for GeoJson.
We’ve added special types representing GeoJson structures. This allows using them within both queries and domain types.

GeoJson types should only be used in combination with a 2dsphere index as a 2d index is not able to handle the structure. Though legacy coordinate pairs and GeoJson types can be mixed inside MongoDB, we currently do not support conversion of legacy coordinates to GeoJson types.
2015-03-05 14:28:11 +01:00
Thomas Darimont
60ca1b3509 DATAMONGO-1133 - Assert that field aliasing is honored in aggregation operations.
Added some tests to show that field aliases are honored during object rendering in aggregation operations.

Original pull request: #279.
2015-03-05 12:21:12 +01:00
Oliver Gierke
39d9312005 DATAMONGO-479 - Polishing.
Removed the ServersideJavaScript abstraction as we still had to resort to instanceof checks and it created more ambiguities than it helped (e.g. in a script with a name and code, which of the two gets executed?). We now have an ExecutableMongoScript, which is code only, and a NamedMongoScript, which basically is the former assigned to a name. Execution can be triggered for the former or via a name.

ScriptOperations.exists(…) now returns a primitive boolean to avoid null checks. JavaDoc polishes.

Original pull request: #254.
2015-03-04 15:18:46 +01:00
Christoph Strobl
a0e42f5dfe DATAMONGO-479 - Add support for calling functions.
We added ScriptOperations to MongoTemplate. These allow storing and executing JavaScript functions directly on the MongoDB server instance. Having ScriptOperations in place builds the foundation for annotation-driven support in the repository layer.

Original pull request: #254.
2015-03-04 15:18:40 +01:00
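The snippet below sketches the ScriptOperations usage introduced above, modeled on the reference documentation; the echo function and the script name are illustrative.

```
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptOperationsExample {

    void runScripts(MongoOperations operations) {
        // Execute an anonymous server-side function directly.
        ExecutableMongoScript echoScript = new ExecutableMongoScript("function(x) { return x; }");
        operations.scriptOps().execute(echoScript, "directly execute script");

        // Register the function under a name and call it by that name later.
        operations.scriptOps().register(new NamedMongoScript("echo", echoScript));
        operations.scriptOps().call("echo", "execute script via name");
    }
}
```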
Oliver Gierke
7a3aff12a5 DATAMONGO-1165 - Polishing.
Renamed MongoOperations executeAsStream(…) to stream(…). Make use of Spring Data Commons StreamUtils in AbstractMongoQuery's StreamExecution. Moved test case from PersonRepositoryIntegrationTests to AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types.

Original pull request: #274.
2015-03-03 22:33:33 +01:00
Thomas Darimont
d4f1ef8704 DATAMONGO-1165 - Add support for Java 8 Stream as return type for repository methods.
Added support for a MongoDB Cursor backed Iterator that allows the usage of a Java 8 Stream at the repository level.

Original pull request: #274.
2015-03-03 20:56:47 +01:00
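A short sketch of a stream-returning repository method as described above; Person, the property and the sample value are assumptions, and the stream should be closed to release the underlying cursor.

```
import java.util.stream.Stream;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

@Document
class Person {
    @Id String id;
    String firstname;
}

interface PersonRepository extends MongoRepository<Person, String> {

    // Backed by a MongoDB cursor; results are streamed instead of collected into a List.
    Stream<Person> findByFirstname(String firstname);
}

class StreamExample {

    void printAll(PersonRepository repository) {
        // try-with-resources closes the stream and thus the underlying cursor.
        try (Stream<Person> people = repository.findByFirstname("Oliver")) {
            people.forEach(person -> System.out.println(person.firstname));
        }
    }
}
```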
Oliver Gierke
a86d704bec DATAMONGO-1158 - Polishing.
MongoFactoryBean, MongoOptionsFactoryBean, MongoClientFactoryBean and MongoClientOptionsFactoryBean now extend AbstractFactoryBean to get a lot of the lifecycle callbacks without further code.

Added non-null assertions to newly introduced methods on MongoOperations/MongoTemplate.

Moved MongoClientVersion into util package. Introduced static imports for ReflectionUtils and MongoClientVersion for all references in the newly introduced Invoker types.

Some formatting, JavaDoc polishes, suppress deprecation warnings. Added build profile for MongoDB Java driver 3.0 as well as the following snapshot.

Original pull request: #273.
2015-03-02 21:50:27 +01:00
Christoph Strobl
57ab27aa5b DATAMONGO-1158 - Add Support for MongoDB Java driver 3.0.
We now support mongo-java-driver version 2.x and 3.0 along with MongoDB Server 2.6.7 and 3.0.0.

Please note that some of the configuration options might no longer be valid when used with version 3 of the MongoDB Java driver. Have a look at the table below to see some of the major differences between version 2.x and 3.0:

                        | 2.x                  | 3.0
------------------------+----------------------+-----------------------------------------------
default WriteConcern    | NONE                 | UNACKNOWLEDGED
option for slaveOk      | available            | ignored
option for autoConnect  | available            | ignored
write result checking   | available            | ignored (errors are exceptions anyway)
rest index cache        | available            | throws UnsupportedOperationException
DBRef resolution        | via DBRef.fetch      | via collection.findOne
MapReduce Options       | applied              | ignored
authentication          | via UserCredentials  | via MongoClient
WriteConcernException   | not available        | translated to DataIntegrityViolationException
executeInSession        | available            | requestStart/requestDone commands ignored
index creation          | via createIndex      | via createIndex

We need to soften the exception validation a bit since the message is slightly different when using different storage engines in a MongoDB 3.0 environment.

Added an explicit <mongo-client /> element and <client-options /> to the configuration schema. These elements will replace the existing <mongo /> and <options /> elements in a subsequent release. Added a credentials attribute to <mongo-client /> which allows defining a set of credentials used for setting up the MongoClient correctly using authentication data. We now reject <mongo-options /> configuration when using MongoDB Java driver generation 3.0 and above.

Original pull request: #273.
2015-03-02 20:26:50 +01:00
Thomas Darimont
909cc8b5d3 DATAMONGO-1081 - Improve documentation on field mapping semantics.
Added table with examples to identifier field mapping section.

Original pull request: #276.
2015-03-02 18:15:18 +01:00
Thomas Darimont
b7acbc4347 DATAMONGO-1167 - Added QueryDslPredicateExecutor.findAll(Predicate, Sort).
We now support findAll on QueryDslMongoRepository that accepts a Querydsl Predicate and a Sort and returns a List<T>.

Original pull request: #275.
2015-02-24 09:51:39 +01:00
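For illustration, a hedged sketch of the new findAll(Predicate, Sort) overload; Person is an assumed document type and QPerson is the query type Querydsl's annotation processor would generate for it.

```
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;

interface PersonRepository extends MongoRepository<Person, String>, QueryDslPredicateExecutor<Person> {}

class PredicateSortExample {

    void findSorted(PersonRepository repository) {
        // The Predicate filters on lastname, the Sort orders the result by firstname.
        Iterable<Person> result = repository.findAll(
                QPerson.person.lastname.eq("Matthews"),
                new Sort(Sort.Direction.ASC, "firstname"));
    }
}
```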
Oliver Gierke
d276306ddc DATAMONGO-1162 - Adapt to API changes in Spring Data Commons. 2015-02-06 12:48:26 +01:00
Oliver Gierke
25b98b7ad2 DATAMONGO-1154 - Upgraded to MongoDB Java driver 2.13.0. 2015-01-29 20:54:39 +01:00
Oliver Gierke
819b424142 DATAMONGO-1153 - Fix documentation build.
Moved jconsole.png to the images folder. Extracted MongoDB-specific auditing documentation into a separate file for inclusion after the general auditing docs.
2015-01-29 14:04:16 +01:00
Oliver Gierke
5d0328ba4b DATAMONGO-1144 - Updated changelog. 2015-01-28 20:46:19 +01:00
Oliver Gierke
b219cff29c DATAMONGO-1143 - Updated changelog. 2015-01-28 10:00:59 +01:00
Oliver Gierke
409eeaf962 DATAMONGO-1148 - Favor EclipseLink’s JPA over the Hibernate one. 2015-01-27 21:43:46 +01:00
Oliver Gierke
4e5e8bd026 DATAMONGO-1146 - Polishing.
Added missing @Override annotations to QueryDslMongoRepository methods.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:14 +01:00
Thomas Darimont
b91ec53ae0 DATAMONGO-1146 - Added QueryDslMongoRepository.exists(…) which accepts a Querydsl predicate.
Added explicit test case for QueryDslMongoRepository.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:06 +01:00
Oliver Gierke
ce0624b8b0 DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Also added a cache for raw write targets.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:57:56 +01:00
alex-on-java
b4de2769cf DATAMONGO-1147 - Remove manual array copy.
Removed manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 17:51:44 +01:00
Thomas Darimont
3f7b0f1eb6 DATAMONGO-1082 - Improved documentation of alias usage in aggregation framework.
Added missing JavaDoc and added short note to the reference documentation.

Original pull request: #268.
2015-01-22 08:47:15 +01:00
Thomas Darimont
4055365c57 DATAMONGO-1127 - Add support for geoNear queries with distance information.
Made unit tests more robust to small differences in distance calculations between MongoDB versions.
2015-01-20 19:04:01 +01:00
Thomas Darimont
db7f782ca6 DATAMONGO-1127 - Add support for geoNear queries with distance information.
We now support geoNear queries in Aggregations. Exposed GeoNearOperation factory method in Aggregation. Introduced new distanceField property to NearQuery since it is required for geoNear queries in Aggregations.

Original pull request: #261.
2015-01-20 18:16:12 +01:00
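A minimal sketch of the geoNear aggregation stage described above; the collection name, coordinates and distance field are placeholders.

```
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.query.NearQuery;

import com.mongodb.DBObject;

class GeoNearAggregationExample {

    void aggregate(MongoOperations operations) {
        // "distance" is the distanceField the calculated distance is written to.
        NearQuery near = NearQuery.near(new Point(-73.99, 40.73)).maxDistance(0.01);
        Aggregation aggregation = Aggregation.newAggregation(Aggregation.geoNear(near, "distance"));
        AggregationResults<DBObject> results = operations.aggregate(aggregation, "venues", DBObject.class);
    }
}
```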
Christoph Strobl
cde9d8d23a DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:17:13 +01:00
Oliver Gierke
3dd9b0a2b6 DATAMONGO-1136 - Polishing.
Polished equals(…) / hashCode() methods in GeoCommand.

Original pull request: #263.
2015-01-12 19:42:50 +01:00
Christoph Strobl
59e54cecd2 DATAMONGO-1136 - Use $geoWithin instead of $within for geo queries.
We now use the $geoWithin operator for geospatial criteria, which requires at least MongoDB 2.4.

Original pull request: #263.
2015-01-12 19:42:15 +01:00
Oliver Gierke
5ed7e8efc2 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSphere if a non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which will never result in true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:10:10 +01:00
Oliver Gierke
fa85adfe0b DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to disable that.
2015-01-07 15:02:32 +01:00
Oliver Gierke
a3e4f44a64 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Thomas Darimont
4a7a485e62 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Christoph Strobl
c353e02b3e DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:45:26 +01:00
Christophe Fargette
1c2964cab4 DATAMONGO-1132 - Fixed keyword translation table in the reference documentation.
Original pull request: #262.
2015-01-06 13:29:37 +01:00
Oliver Gierke
47e083280a DATAMONGO-1131 - We now register the ThreeTen back port converters by default.
Related ticket: DATACMNS-628.
2015-01-05 19:11:54 +01:00
Oliver Gierke
7db003100b DATAMONGO-1129 - Upgraded to MongoDB Java driver 2.12.4.
Added Travis build configuration, too.
2014-12-31 14:20:52 +01:00
Oliver Gierke
f814b1ef47 DATAMONGO-1128 - Added test cases to validate Optional mapping.
Added test cases to make sure Optional instances are handled correctly and the converters are actually applied to the nested value.
2014-12-31 13:59:43 +01:00
Oliver Gierke
f3d2ae366e DATAMONGO-1120 - Fix execution of query methods using pagination and field mapping customizations.
Repository queries that used pagination and referred to a field that was customized were failing as the count query executed was not mapped correctly in MongoOperations.

This resulted from the fix for DATAMONGO-1080, which removed the premature field name translation from AbstractMongoQuery and thus led to unmapped field names being used for the count query.

We now expose the previously existing but non-public count(…) method on MongoOperations that takes both an entity type and an explicit collection name, to be able to count-query a dedicated collection but still get the query mapping applied for a certain type.

Related ticket: DATAMONGO-1080.
2014-12-18 15:47:54 +01:00
Oliver Gierke
b6ecce3aa2 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:37:33 +01:00
Oliver Gierke
c5235be9a7 DATAMONGO-1106 - After release cleanups. 2014-12-01 13:44:58 +01:00
Spring Buildmaster
23300de9d4 DATAMONGO-1106 - Prepare next development iteration. 2014-12-01 13:36:36 +01:00
Spring Buildmaster
41dc57c84f DATAMONGO-1106 - Release version 1.7.0.M1 (Fowler M1). 2014-12-01 13:36:32 +01:00
Oliver Gierke
85d1fe1ce6 DATAMONGO-1106 - Prepare 1.7.0.M1 (Fowler M1). 2014-12-01 12:26:52 +01:00
Oliver Gierke
ac6067ad53 DATAMONGO-1106 - Updated changelog. 2014-12-01 12:25:53 +01:00
Thomas Darimont
173a62b5ce DATAMONGO-1085 - Fixed sorting with Querydsl in QueryDslMongoRepository.
We now translate QSort's OrderSpecifiers into appropriate sort criteria.
Previously the OrderSpecifiers were not correctly translated to appropriate property path expressions.

We now override findAll(Pageable) and findAll(Sort) in QueryDslMongoRepository to apply special QSort handling.

Original pull request: #236.
2014-12-01 12:09:14 +01:00
Oliver Gierke
cbbafce73d DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:23 +01:00
Oliver Gierke
2e74c19995 DATAMONGO-1054 - Polishing.
Tweaked JavaDoc of the APIs to be less specific about implementation internals and rather point to the save(…) methods. Changed SimpleMongoRepository.save(…) methods to inspect the given entity/entities and use the optimized insert(All)-calls if all entities are considered new.

Original pull request: #253.
2014-11-28 18:33:19 +01:00
Thomas Darimont
a212b7566c DATAMONGO-1054 - Add support for fast insertion via MongoRepository.insert(..).
Introduced new insert(..) method variants on MongoRepository that delegate to MongoTemplate.insert(..). This bypasses ID population, save event generation and version checking and allows for fast insertion of bulk data.

Original pull request: #253.
2014-11-28 18:33:18 +01:00
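A sketch of the new insert(…) repository methods mentioned above; LogEntry and its repository are hypothetical.

```
import java.util.Arrays;
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

@Document
class LogEntry {
    @Id String id;
    String message;
}

interface LogEntryRepository extends MongoRepository<LogEntry, String> {}

class BulkInsertExample {

    void insertAll(LogEntryRepository repository) {
        // insert(…) delegates to MongoTemplate.insert(…), skipping the extra work
        // that save(…) performs, as described in the commit above.
        List<LogEntry> entries = Arrays.asList(new LogEntry(), new LogEntry());
        repository.insert(entries);
    }
}
```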
Oliver Gierke
08faa52ef4 DATAMONGO-1108 - Performance improvements in BasicMongoPersistentEntity.
BasicMongoPersistentEntity.getCollection() now avoids repeated SpEL-parsing and evaluating in case no SpEL expression is used. Parsing is happening at most once now. Evaluation is skipped entirely if the configured collection String is not or does not contain an expression.
2014-11-28 16:24:25 +01:00
Oliver Gierke
33bc4fffd9 DATAMONGO-1079 - Updated changelog. 2014-11-28 12:06:05 +01:00
Christoph Strobl
eca2108e15 DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for the presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:38 +01:00
Christoph Strobl
dab6034eb9 DATAMONGO-943 - Add support for $position to Update $push $each.
We now support $position on update.push.

Original pull request: #248.
2014-11-28 11:41:51 +01:00
Christoph Strobl
461e7d05d7 DATAMONGO-1092 - Ensure compatibility with MongoDB 2.8.0.rc0 and java driver 2.13.0-rc0.
We updated GroupByResults to allow working with the changed data types returned for count and keys, and fixed the assertion on the error message for duplicate keys.
Using java-driver 2.12.x when connecting to a 2.8.0-rc0 instance is likely to cause trouble with authentication. This is the intended behavior.

2.8.0-rc0 throws an error when removing elements from a collection that does not yet exist, which is different from what 2.6.x does.

The java-driver 2.13.0-rc0 works perfectly fine with a 2.6.x server instance.
We deprecated Index.Duplicates#DROP since it has been removed in MongoDB 2.8.

Original pull request: #246.
2014-11-28 11:33:12 +01:00
Oliver Gierke
10c37b101d DATAMONGO-1105 - Added implementation of QueryDslPredicateExecutor.findAll(OrderSpecifier<?>... orders).
Renamed QuerydslRepositorySupportUnitTests to QuerydslRepositorySupportTests as it's an integration test.
2014-11-28 10:37:21 +01:00
Christoph Strobl
81f2c910f7 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword so that, if the target property is of type String, we use an expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Added support for NotContaining along the way.

Original pull request: #241.
2014-11-27 17:06:39 +01:00
Thomas Darimont
1fd97713c1 DATAMONGO-1093 - Added hashCode() and equals(…) in BasicQuery.
We now have equals(…) and hashCode(…) methods on BasicQuery. Previously we solely relied on Query.hashCode()/equals(…) which didn't consider the fields of BasicQuery.

Introduced equals verifier library to automatically test equals contracts.
Added some additional test cases to BasicQueryUnitTests.

Original pull request: #252.
2014-11-27 16:45:35 +01:00
Oliver Gierke
2d3eeed9ec DATAMONGO-1102 - Added support for Java 8 date/time types.
We're now able to persist and read non-time-zoned JDK 8 date/time types (LocalDate, LocalTime, LocalDateTime) to and from Date instances.
2014-11-27 16:28:36 +01:00
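A brief sketch of persisting the JDK 8 date/time types mentioned above; the Reservation document and its fields are made up for illustration.

```
import java.time.LocalDate;
import java.time.LocalDateTime;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Reservation {
    @Id String id;
    LocalDate checkIn;       // stored as a Date, read back as LocalDate
    LocalDateTime createdAt; // likewise converted to and from Date
}

class Jsr310Example {

    void save(MongoOperations operations) {
        Reservation reservation = new Reservation();
        reservation.checkIn = LocalDate.of(2014, 12, 24);
        reservation.createdAt = LocalDateTime.now();
        operations.save(reservation);
    }
}
```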
Christoph Strobl
b22eb6f12f DATAMONGO-1101 - Add support for $bit to Update.
We now support bitwise and/or/xor operations for Update.
2014-11-26 11:26:56 +01:00
Mikhail Mikhaylenko
dfb0a2a368 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 09:40:30 +01:00
Thomas Darimont
03bcc56429 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 17:32:16 +01:00
Christoph Strobl
457fda3fc3 DATAMONGO-1097 - Add support for $mul to Update.
We now support multiply on Update, allowing the value of the given key to be multiplied by a multiplier.
2014-11-24 20:38:44 +01:00
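A small, hedged example of the $mul support described above; the collection, field and values are placeholders, and the multiply(…) method name is assumed from the commit description.

```
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class MultiplyUpdateExample {

    void applyDiscount(MongoOperations operations) {
        // Issues { $mul: { price: 0.9 } } for the first matching document.
        operations.updateFirst(query(where("sku").is("ABC-123")),
                new Update().multiply("price", 0.9), "products");
    }
}
```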
Oliver Gierke
54cee64610 DATAMONGO-1100 - Upgrade to new PersistentPropertyAccessor API. 2014-11-20 15:12:25 +01:00
Christoph Strobl
477499248a DATAMONGO-1086 - Mapping fails for collection with two embedded types that extend a generic abstract type.
We now use the type information of the raw property type to check if we need to include _class.
2014-11-20 15:12:25 +01:00
Oliver Gierke
3b70b6aeee DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:03 +01:00
Christoph Strobl
163762e99e DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:37:56 +01:00
Oliver Gierke
b99833df75 DATAMONGO-1080 - AbstractMongoQuery now refrains from eagerly post-processing the query execution results.
To properly support general post processing of query execution results (in QueryExecutorMethodInterceptor) we need to remove the eager post-processing of query execution results in AbstractMongoQuery.

Removed the usage of the local ConversionService altogether.
2014-10-30 11:35:51 +01:00
Thomas Darimont
4be6231426 DATAMONGO-1076 - Avoid resolving lazy-loading proxy for DBRefs during finalize.
We now handle intercepted finalize method invocations by not resolving the proxy. Previously the LazyLoadingProxy tried to resolve the proxy during finalization which could lead to unnecessary database accesses.

Original pull request: #234.
2014-10-29 10:16:12 +01:00
Christoph Strobl
4673e3d511 DATAMONGO-1077 - Fix Update removing $ operator for DBRef.
We now retain the positional parameter "$" when mapping field names for associations.

Original pull request: #235.
2014-10-28 14:28:22 +01:00
Christoph Strobl
00e48cc424 DATAMONGO-1050 - Explicitly annotated Field should not be considered Id.
We changed the id resolution to skip properties having an explicit name set via @Field unless they are marked with @Id. This means that

@Field(“id”) String id;

will be stored as “id” within MongoDB. Prior to this change the field name would have been changed to “_id”.
Added tests to ensure proper field mapping for various "id" field variants.

Original pull request: #225.
2014-10-23 11:39:17 +02:00
Christoph Strobl
f8453825fb DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 13:55:50 +02:00
Christoph Strobl
6cda9ab939 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 11:36:15 +02:00
Oliver Gierke
831d667896 DATAMONGO-1070 - Fixed a few glitches in DBRef binding for repository query methods.
Fixed the query mapping for derived repository queries pointing to the identifier of the referenced document. We now reduce the query field's key from reference.id to reference so that the generated DBRef is applied correctly, and also take care that the ids are potentially converted to ObjectIds. This is mainly achieved by using the AssociationConverter pulled up from UpdateMapper in ObjectMapper.getMappedKey().

MongoQueryCreator now refrains from translating the field keys as that would prevent the QueryMapper from correctly detecting id properties.

Fixed DBRef handling for StringBasedMongoQuery which previously didn't parse the DBRef instance created after JSON parsing for placeholders.
2014-10-15 10:13:53 +02:00
Christoph Strobl
17c342895a DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an Exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:45 +02:00
Christoph Strobl
6ef518e6a0 DATAMONGO-1053 - Type check is now only performed on explicit language properties.
We now only perform a type check on language properties explicitly defined via @Language. Prior to this change, non-String properties named language caused errors on entity validation.

Original pull request: #228.
2014-10-10 11:31:30 +02:00
Oliver Gierke
ddee2fbb12 DATAMONGO-1057 - Polishing.
Slightly tweaked the changes in SlicedExecution to simplify the implementation. We now apply the given pageable but tweak the limit the query uses to peek into the next page.

Original pull request: #226.
2014-10-08 07:06:18 +02:00
Christoph Strobl
6512c2cdfb DATAMONGO-1057 - Fix SliceExecution skipping elements.
We now directly set the offset to use instead of reading it from the used pageable. This ensures that every single element is read from the store.
Prior to this change the altered pageSize led to an unintended increase in the number of elements to skip.

Original pull request: #226.
2014-10-08 07:06:14 +02:00
Oliver Gierke
0eee05adaa DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 15:32:18 +02:00
Christoph Strobl
17e0154ff3 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DbRefs. This ensures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:06:22 +02:00
Thomas Darimont
2780f60c65 DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress fails because the supposedly non-existing hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable.

Original pull request: #229.
2014-09-30 12:55:24 +02:00
Christoph Strobl
7dd3450362 DATAMONGO-1049 - Check for explicitly declared language field.
We now check for an explicitly declared language field for setting language_override within a text index. Therefore the attribute (even if named with the reserved keyword language) has to be explicitly marked with @Language. Prior to this change having:

@Language String lang;
String language;

would have caused trouble when trying to resolve index structures as one cannot set language override to more than one property.

Original pull request: #224.
2014-09-25 12:40:26 +02:00
Oliver Gierke
ca4b2a61b8 DATAMONGO-1046 - After release cleanups. 2014-09-15 14:30:23 +02:00
Oliver Gierke
d2ecd65ca5 DATAMONGO-1046 - After release cleanups. 2014-09-05 14:27:21 +02:00
Spring Buildmaster
03bd49f6c8 DATAMONGO-1046 - Prepare next development iteration. 2014-09-05 03:12:04 -07:00
Spring Buildmaster
51607c5ed8 DATAMONGO-1046 - Release version 1.6.0.RELEASE (Evans GA). 2014-09-05 03:12:02 -07:00
Oliver Gierke
e2cbd3ee28 DATAMONGO-1046 - Prepare 1.6.0.RELEASE (Evans GA). 2014-09-05 11:43:58 +02:00
Oliver Gierke
5944e6b57e DATAMONGO-1046 - Updated changelog. 2014-09-05 09:23:31 +02:00
Oliver Gierke
efd46498ef DATAMONGO-1033 - Updated changelog. 2014-09-05 07:31:54 +02:00
Christoph Strobl
3d705a737f DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows fine-grained removal of entities from specific collections using derived delete queries.

Original pull request: #223.
2014-09-04 15:47:13 +02:00
Christoph Strobl
996c57bccf DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests.

Original pull request: #222.
2014-09-04 11:21:55 +02:00
Oliver Gierke
a31e72ff06 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:51:31 +02:00
Mark Paluch
f07d8fca8c DATAMONGO-1036 - Improved detection of custom implementations for CDI repositories.
Adapted to API changes in CDI extension.

Related ticket: DATACMNS-565.
2014-09-01 13:51:20 +02:00
Christoph Strobl
69dbdee01f DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Added a JUnit rule and RunListener taking care of the cleanup task.

Original pull request: #221.
2014-08-27 11:12:39 +02:00
Oliver Gierke
dedb9f3dc0 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at registering a custom Converter to handle the conversion manually.

Improved toString() method on ObjectPath to create more helpful output.
2014-08-26 20:07:46 +02:00
Oliver Gierke
7d69b840fe DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the domain type collection to the query executing a query for a projection type.
2014-08-26 15:16:18 +02:00
Christoph Strobl
4eaef300cb DATAMONGO-1025 - Fix creation of nested named index.
Deprecated collection attribute for @Indexed, @CompoundIndex, @GeoSpatialIndexed. Removed deprecated attribute `expireAfterSeconds` from @CompoundIndex.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Christoph Strobl
ec1a6b5edd DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors caused by equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions, where only the property's field name was taken into account when it should have been the full property path.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Oliver Gierke
adc5485c09 DATAMONGO-1032 - Polished Asciidoctor documentation. 2014-08-26 14:24:51 +02:00
Oliver Gierke
f622b2916d DATAMONGO-1021 - After release cleanups. 2014-08-13 16:32:42 +02:00
Spring Buildmaster
26be0cf948 DATAMONGO-1021 - Prepare next development iteration. 2014-08-13 07:02:43 -07:00
Spring Buildmaster
e27c01fe5b DATAMONGO-1021 - Release version 1.6.0.RC1 (Evans RC1). 2014-08-13 07:02:41 -07:00
Oliver Gierke
d639e58fb9 DATAMONGO-1021 - Prepare 1.6.0.RC1 (Evans RC1). 2014-08-13 15:37:48 +02:00
Oliver Gierke
0195c2cb48 DATAMONGO-1021 - Updated changelog. 2014-08-13 15:37:44 +02:00
Oliver Gierke
068e2ec49b DATAMONGO-1024 - Upgraded to MongoDB Java driver 2.12.3. 2014-08-13 15:36:08 +02:00
Christoph Strobl
a9306b99ec DATAMONGO-957 - Add support for query modifiers.
Using Meta allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a query. When the query is executed we add the meta information to the cursor in use.

We’ve introduced the @Meta annotation that allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a repository finder method.
Added tests to verify proper invocation of template methods.
Use DBCursor.copy() for CursorPreparer.

Original pull request: #216.
2014-08-13 14:56:10 +02:00
Thomas Darimont
3597194742 DATAMONGO-1012 - Improved identifier initialization on DBRef proxies.
Identifier initialization is now only triggered if field access is used. Before that, the id initialization would've resolved the proxy eagerly, as the getter access performed by the BeanWrapper would've been intercepted by the proxy and is indistinguishable from a normal method call. This would've rendered the entire use case of creating proxies ad absurdum.

Added test case to check for non-initialization in the property access scenario.
2014-08-13 14:34:38 +02:00
Oliver Gierke
6f06ccec8e DATAMONGO-1012 - Identifier initialization for lazy DBRef proxies with field access.
We now initialize the ID property for proxies created for lazily initialized DBRefs. This will allow the lookup of ID properties for types that use field access without initializing the entire proxy.
2014-08-13 14:34:15 +02:00
Oliver Gierke
6fe7f220f9 DATAMONGO-1007 - Updated changelog. 2014-08-13 10:56:02 +02:00
Christoph Strobl
45e70d493d DATAMONGO-1016 - Remove deprecations in geospatial area.
Removed:
 - Box
 - Circle
 - CustomMetric
 - Distance
 - GeoPage
 - GeoResult
 - GeoResults
 - Metric
 - Metrics
 - Point
 - Polygon
 - Shape

Updated api doc.
Removed deprecation warnings.
2014-08-13 09:52:02 +02:00
Thomas Darimont
ce71ab83f2 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:30:41 +02:00
Oliver Gierke
bf85d8facd DATAMONGO-1005 - Polishing introduction of ObjectPath.
Simplified implementation of ObjectPath to use a static root instance and hand the path further down until final resolution in MappingMongoConverter.readValue(…). This removes a bit of boxing and unboxing code both in ObjectPath and the converter.

Introduced ObjectPath.getPathItem(…) to internalize the iteration to find a potentially already resolved object.

Renamed parameters and fields of type ObjectPath to path consistently. Removed obsolete method in MappingMongoConverter.

Original pull request: #209.
2014-08-12 08:12:31 +02:00
Thomas Darimont
c5ff7cdb2b DATAMONGO-1005 - Improve cycle-detection for DbRef's.
Introduced ObjectPath that collects the target objects while converting a DBObject to a domain object. If we detect that a potentially nested DBRef points to an object that is already under construction we simply return a reference to that object in order to avoid StackOverFlowErrors.

Original pull request: #209.
2014-08-12 08:10:47 +02:00
Mark Paluch
f9ccf4f532 DATAMONGO-1017 - Add support for custom implementations in CDI repositories.
Original pull request: #215.
2014-08-11 07:47:57 +02:00
Greg Turnquist
ab731f40a7 DATAMONGO-1019 - Corrected examples in reference documentation.
Examples were not properly converted. One table got dropped, so I added it back. Fixed IMPORTANT notes.

Original pull request: #214.
2014-08-10 16:04:45 +02:00
Oliver Gierke
d8434fffa8 DATAMONGO-1015 - Fixed link to Spring Data Commons reference docs. 2014-08-10 15:55:13 +02:00
Christoph Strobl
151b1d4510 DATAMONGO-973 - Add support for deriving full-text queries.
Added support to execute full-text queries on repositories. Query methods can now take a parameter of type TextCriteria, which triggers a text search clause for the property annotated with @TextScore.

Retrieving the document score and sorting by score is only possible if the entity holds a property annotated with @TextScore. If present, any find execution will be enriched so that it asserts loading of the corresponding { $meta : textScore } field. The sort object will only be mapped if the sort property already exists; in that case we replace the existing expression for the property with its $meta representation.

This allows for example the following:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("term");

repository.findAllBy(criteria, new Sort("score"));
repository.findAllBy(criteria, new PageRequest(0, 10, Direction.DESC, "score"));
repository.findByFooOrderByScoreDesc("foo", criteria);

For more details and examples see the "Full text search queries" section in the reference manual.
2014-08-06 22:25:38 +02:00
Greg Turnquist
168cf3e1f6 DATAMONGO-1015 - Migrate reference documentation from Docbook to Asciidoctor. 2014-08-06 21:38:46 +02:00
Oliver Gierke
52dab0fa20 DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check, Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Christoph Strobl
9257bab06e DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Oliver Gierke
27f0a6f27a DATAMONGO-1008 - Added repository type based checks to strict matching algorithm.
Repositories extending MongoRepository are now considered strict matches as well.

Related ticket: DATACMNS-526.
2014-07-31 16:20:26 +02:00
Oliver Gierke
5bedbef2f2 DATAMONGO-1009 - Adapt to new multi-store configuration detection.
We now consider repositories managing domain types annotated with @Document MongoDB specific ones.

Related ticket: DATACMNS-526.
2014-07-28 20:15:40 +02:00
Christoph Strobl
51e7be8aa0 DATAMONGO-1001 - Renamed LazyLoadingProxy.initialize() to getTarget().
Original pull request: #208.
2014-07-24 13:29:27 +02:00
Christoph Strobl
6c85bb39a3 DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazily loaded ones before saving. As it turns out, CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:28:36 +02:00
Oliver Gierke
07f7247707 DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:36:15 +02:00
Thomas Darimont
f669711670 DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:15:18 +02:00
Oliver Gierke
5f3671f349 DATAMONGO-996 - Fixed boundary detection in pagination.
The fix for DATAMONGO-950 introduced a tiny glitch so that retrieving pages after the first one was broken in the repository query execution. We now correctly use the previously detected number of elements to detect whether the Pageable given is out of scope.

Related ticket: DATAMONGO-950.
2014-07-18 19:01:44 +02:00
Thomas Darimont
1335cb699b DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:46 +02:00
Oliver Gierke
84414b87c0 DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:21 +02:00
Thomas Darimont
a1ecd4a501 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:21 +02:00
Thomas Darimont
d7e6f2ee41 DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
We replaced the constructor that accepted a Criteria with one that accepts a CriteriaDefinition so as not to force clients to extend Criteria.
Original pull request: #206.
2014-07-17 09:21:29 +02:00
Oliver Gierke
04870fb8b3 DATAMONGO-991 - Adapted to deprecation removals in Spring Data Commons.
Related ticket: DATACMNS-469.
2014-07-16 12:04:10 +02:00
Oliver Gierke
9d196b78f7 DATAMONGO-981 - After release cleanups. 2014-07-10 20:44:20 +02:00
Spring Buildmaster
4229525928 DATAMONGO-981 - Prepare next development iteration. 2014-07-10 10:38:58 -07:00
Spring Buildmaster
d861fecdb8 DATAMONGO-981 - Release version 1.6.0.M1. 2014-07-10 10:38:55 -07:00
Oliver Gierke
f280e23095 DATAMONGO-981 - Prepare 1.6.0.M1 (Evans M1). 2014-07-10 19:28:34 +02:00
Oliver Gierke
ed0e1d92c0 DATAMONGO-981 - Updated changelog. 2014-07-10 17:14:24 +02:00
Christoph Strobl
d82fc22659 DATAMONGO-944 - Add support for $currentDate to Update.
Added currentDate and currentTimestamp to Update.
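
A brief sketch of the new methods in use; the field names are illustrative:

// Issues { $currentDate : { lastModifiedDate : true, lastModifiedTs : { $type : "timestamp" } } }.
Update update = new Update()
    .currentDate("lastModifiedDate")
    .currentTimestamp("lastModifiedTs");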

Original pull request: #200.
2014-07-10 15:13:59 +02:00
Thomas Darimont
6616d6788c DATAMONGO-975 - Add support for extracting date/time components from a field projection.
We added some extract-methods to ProjectionOperationBuilder to be able to extract date / time components from projected fields.
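
A sketch of such a projection, assuming builder methods along the lines of extractYear()/extractMonth() and a hypothetical dateOfBirth field:

ProjectionOperation projection = Aggregation.project()
    .and("dateOfBirth").extractYear().as("birthYear")
    .and("dateOfBirth").extractMonth().as("birthMonth");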

Original pull request: #204.
2014-07-10 12:45:17 +02:00
Christoph Strobl
322a7cf033 DATAMONGO-969 - Fixed nested id handling in SpringDataMongodbSerializer.
SpringDataMongodbSerializer now defensively triggers mapping of the DBObject created by the default serializer. This asserts that ids buried in nested structures like { "_id" : { "$in" : ["x", "y"] } } are converted correctly.

Original pull request: #202.
2014-07-09 21:48:02 +02:00
Christoph Strobl
0f487c10ba DATAMONGO-983 - Remove links to forum.spring.io.
Replace forum links with those to stackoverflow.

Original Pull Request: #205
2014-07-09 21:21:04 +02:00
Christoph Strobl
11417144bd DATAMONGO-980 - Use meta annotations from commons for @Score.
We now use Spring Data Commons' @ReadOnlyProperty to meta-annotate @Score to mark it as read-only property.

Original pull request: #201.
Related tickets: DATACMNS-534.
2014-07-09 14:51:21 +02:00
Christoph Strobl
dafc59b163 DATAMONGO-972 - Querydsl integration now handles references correctly.
SpringDataMongodbSerializer now overrides the necessary methods to create the appropriate DBRef objects when serializing data via Querydsl.

We currently disable the test case as the fix only takes effect with Querydsl 3.4.1, which unfortunately breaks Java 6 compatibility. We include the fix nonetheless to allow users on Java 7 to potentially use the latest Querydsl.

Original pull request: #203.
Related tickets: querydsl/querydsl#803.
2014-07-09 13:47:04 +02:00
Oliver Gierke
566f9a80c4 DATAMONGO-982 - Added build profiles to build against next MongoDB driver versions.
Added build profile for MongoDB Java driver versions 2.12.3-SNAPSHOT and 3.0.0-SNAPSHOT. Added another property to be able to build manifests correctly as the snapshot versions aren't valid OSGi versions.

Adapted MongoExceptionTranslator to convert the new Exceptions being thrown for server timeouts and the deprecated values we currently handle.
2014-07-08 17:36:01 +02:00
Christoph Strobl
89a42c5648 DATAMONGO-976 - Add support for reading $meta projection on textScore into document.
We introduced @TextScore that can be used to mark a property to hold the { $meta : "textScore" } value. In contrast to @Transient, the value will be considered when reading documents.
The value can and will only get picked up if the score field is retrieved from the store.
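
A sketch of a domain type using the annotation; the field names and the Float type are illustrative:

@Document
class BlogPost {

  @Id String id;
  String title;
  String content;

  @TextScore Float score; // populated from { $meta : "textScore" } when the score field is fetched
}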

Original pull request: #198.
2014-07-07 11:43:43 +02:00
Christoph Strobl
83ffbb00e8 DATAMONGO-850 - Add support for full text search via $text
Using TextQuery and TextCriteria allows the creation of queries using $text / $search.

{ $meta : "textScore" } can be included using TextQuery.includeScore. As the field name used for textScore need not be fixed to "score", one can use the overload taking the field name to override the default.
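
A sketch of how this might be used, assuming a MongoTemplate instance named template and the BlogPost type shown above:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("coffee");

Query query = TextQuery.queryText(criteria)
    .includeScore()
    .sortByScore();

List<BlogPost> posts = template.find(query, BlogPost.class);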

Original pull request: #198.
2014-07-07 11:39:47 +02:00
Christoph Strobl
84913cecab DATAMONGO-937 - Add support for creating text index.
We now support creating text indexes based on information gathered from domain types.

Using @TextIndexed marks properties to be considered for the full-text index. Use the weight attribute to influence document scoring during search operations.

Please note that using @TextIndexed on entity properties forces all properties of any sub document to be considered as part of the text index. Any set weight will in that case be propagated to the siblings taking the most recent weight information into account, which means that the weight attribute can be overridden for properties in sub documents.

Setting the index default language can be done via @Document(language), while @Language can be used to define the language_override field.

As text search is disabled by default for MongoDB 2.4, we use a JUnit ClassRule to restrict integration tests potentially creating a text index (as the entities for testing are found in the classpath) to only be executed when a 2.6+ MongoDB server is present.

For usage hints please see section 6.3.4 (Text Indexes) of the reference manual.
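
A sketch of such a mapping; property names and weights are illustrative:

@Document(language = "spanish")
class Recipe {

  @TextIndexed(weight = 3f) String title;   // weighted higher for scoring
  @TextIndexed String instructions;

  @Language String lang;                    // maps to the language_override field
}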

Original pull request: #198.
2014-07-07 11:30:08 +02:00
Christoph Strobl
998bb09a92 DATAMONGO-978 - Derived delete query should pass on type information.
We now pass on type information for derived delete queries to the corresponding delete operation. This propagates the information correctly to the corresponding Before and After events.

Before this change, the type would have been set to null in case of a non-collection-like method return type.

Original pull request: #199.
2014-07-03 14:16:57 +02:00
Oliver Gierke
cd68a8db54 DATAMONGO-977 - Removed reflective detection of Spring 4 in DBRef proxy creation.
After the Spring 4 upgrade we can now use its Objenesis infrastructure directly.
2014-07-02 09:23:19 +02:00
Oliver Gierke
df8477d180 DATAMONGO-955 - Updated changelog. 2014-06-30 10:57:12 +02:00
Christoph Strobl
244fbae0ce DATAMONGO-962 - Cycle guard should respect full path.
We now check for intersections of the given path with existing ones, considering not only types and contained property names but also the property's full path, which must not be present in already traversed paths.

Additionally, we'll now catch any CyclicPropertyReferenceExceptions on the root level to prevent cycle detection from interfering with application startup.

Original pull request: #197.
2014-06-27 19:25:26 +02:00
Oliver Gierke
19e08a52c0 DATAMONGO-970 - MongoTemplate.remove(…) now correctly builds query for DBObjects.
If a DBObject was handed into MongoTemplate.remove(…) we previously failed to look up the id value to create a by-id-query. This commit adds explicit handling of DBObjects by looking up their _id field to obtain the id value.
2014-06-27 16:12:50 +02:00
Thomas Darimont
6389b1bb73 DATAMONGO-954 - Add support for system variables in aggregation operations.
Add assumes for appropriate MongoDB version.
2014-06-26 14:19:24 +02:00
Thomas Darimont
cadcbf6106 DATAMONGO-954 - Add support for system variables in aggregation operations.
We now support referring to system variables like for instance $$ROOT or $$CURRENT from within aggregation framework pipeline projection and group expressions.

Original pull request: #190.
2014-06-25 15:53:44 +02:00
Christoph Strobl
118f007ca6 DATAMONGO-963 - @CompoundIndex should treat expireAfterSeconds correctly.
We added an additional check on the fields used as keys, so that TTL is ignored for a CompoundIndex with more than one field (which effectively renders it useless on @CompoundIndex altogether).

Prior to this change, potentially invalid index structures would have been created for e.g. @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds=10), leading to MongoDB not being able to clean up the indexes (logs: "ERROR: key for ttl index can only have 1 field").

This fix is related to https://jira.mongodb.org/browse/SERVER-10075.

Original pull request: #196.
2014-06-25 15:12:32 +02:00
Christoph Strobl
cbb32bd29d DATAMONGO-950 - Add support for limiting the query result in the query derivation mechanism.
When deriving the query from its method name, we check for the limit set on the PartTree to pass this on to the created query. PagedExecution now takes the overall limit into account, skips a query execution entirely (if the Pageable is out of scope completely) or alters the query limits accordingly.

Note that there has been significant rework of this compared to the pull request to avoid new API in Query and extensive changes in MongoTemplate's QueryCursorPreparer.
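
A sketch of derived methods using the limiting keywords on a hypothetical Person repository:

// Limits the result to the first three matches.
List<Person> findTop3ByLastnameOrderByFirstnameAsc(String lastname);

// Limits the result to the first ten matches, combined with pagination.
Page<Person> findFirst10ByLastname(String lastname, Pageable pageable);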

Original pull request: #191.
2014-06-25 14:58:59 +02:00
Thomas Darimont
9858dcd740 DATAMONGO-960 - Allow to pass options to the Aggregation Pipeline.
We introduced the AggregationOptions abstraction to conveniently construct option objects that can be handed to an Aggregation via the new Aggregation.withOptions(...) factory method. For more details, see the Builder class' JavaDoc.

Note that we exposed the "rawResults" in AggregationResults and put a null guard in MongoTemplate's aggregate(…) in order to support the "explain" option.
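
A sketch of attaching options to a pipeline; static imports from Aggregation and Criteria are assumed and the field names are illustrative:

Aggregation aggregation = newAggregation(
    match(where("status").is("ACTIVE")),
    group("customerId").count().as("orders"))
  .withOptions(newAggregationOptions().allowDiskUse(true).build());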

Original pull request: #195.
2014-06-25 13:25:57 +02:00
Thomas Darimont
1fb76d135b DATAMONGO-953 - Add equals(…)/hashCode()/toString() to Update.
We now use the underlying updateObject to implement appropriate equals(…)/hashCode() and toString() methods.

Original pull request: #192.
2014-06-25 12:40:30 +02:00
Oliver Gierke
bb62c8b2f1 DATAMONGO-958 - Switch to FieldNamingStrategy SPI in Spring Data Commons. 2014-06-20 21:27:24 +02:00
Christoph Strobl
2cbe7bf885 DATAMONGO-952 - Derived queries should consider field specification in @Query.
PartTreeMongoQuery now explicitly checks for the presence of a manually defined field spec on the query method and creates a new Query if one is present.

Original pull request: #188.
2014-06-18 12:53:14 +02:00
Christoph Strobl
6043f6b74d DATAMONGO-949 - CycleGuard should only match properties in word boundaries.
We modified the regular expression used for cycle detection to match the exact property name within the inspected path using word boundaries. This fix prevents sub-sequences of an existing property (e.g. 'sub' would have matched 'substr') from being matched.

Along the way we fixed the (false) assertion in one of the tests, as we create the +1 cycle reference index before actually breaking the operation.
2014-06-18 08:32:30 +02:00
Christoph Strobl
ef1366592a DATAMONGO-948 - Sort should be taken as is when no type information available.
Object type mapping for the sort is skipped in case no type information is present when executing a query using the MongoTemplate.
2014-06-18 08:27:57 +02:00
Thomas Darimont
01cf9fb8f3 DATAMONGO-938 - Apply QueryMapper in MongoTemplate.mapReduce(…).
Previously MongoTemplate.mapReduce(...) didn't translate nested objects, e.g. GeoCommand, within the given query. That could lead to exceptions during query serialization. We now pass the query and sort object of the given Query through the QueryMapper to avoid such problems.

Original pull request: #184.
2014-06-18 08:27:54 +02:00
Thomas Darimont
285c406d5d DATAMONGO-745 - Added test cases for custom query with $in and pageable parameter.
Added test cases to verify that this works.

Original pull request: #186.
2014-06-18 07:49:06 +02:00
Oliver Gierke
ad29e52a57 DATAMONGO-936 - After release cleanups. 2014-05-20 19:54:03 +02:00
Spring Buildmaster
3cfe207c83 DATAMONGO-936 - Prepare next development iteration. 2014-05-20 09:35:38 -07:00
Spring Buildmaster
c7e65cbc40 DATAMONGO-936 - Release version 1.5.0.RELEASE. 2014-05-20 09:35:35 -07:00
Oliver Gierke
b8e02efb04 DATAMONGO-936 - Prepare 1.5 GA.
Removed obsolete readme.txt.
2014-05-20 17:14:04 +02:00
Oliver Gierke
c7f20fb836 DATAMONGO-936 - Prepare 1.5 GA. 2014-05-20 16:57:34 +02:00
Oliver Gierke
28a0202ef4 DATAMONGO-936 - Updated changelog. 2014-05-20 16:26:20 +02:00
Christoph Strobl
164e947045 DATAMONGO-926 - Avoid StackOverflowError while resolving index structures.
We now guard cyclic non-transient, non-DBRef property references while inspecting domain types for potential index structures. To do so we check the property's path and owning type to determine potential cycles. If found, we log a warning message pointing to the path, entity and property potentially causing problems and skip processing for this path.

Original pull request: #180.
2014-05-19 19:09:40 +02:00
Christoph Strobl
7e65c0c87d DATAMONGO-367 - Nested @Indexed should not trigger creation of separate collection.
The issue has been solved along with DATAMONGO-888 (Pull Request: #162). We have created additional tests to explicitly check it has truly been fixed.
2014-05-19 17:19:28 +02:00
Christoph Strobl
9c1f753f17 DATAMONGO-929 - Use property path for keys of Indexed and CompoundIndex.
Index creation failed for @Indexed and @CompoundIndex as the resolved dotPath was not used for creation. We now not only resolve the dotPath but also use it within the key for the index definition. In case of a nested compound index, the key definition is enhanced by the provided path.

When leaving the key definition empty for a nested compound index, we'll create an index for the whole nested document. Trying to create a compound index on root level without providing key information leads to an InvalidDataAccessApiUsageException.

Original pull request: #179.
2014-05-19 17:14:36 +02:00
Oliver Gierke
0f821eb52d DATAMONGO-925, DATAMONGO-928 - Polishing.
Only reject the attribute setup if abbreviation is activated and a custom strategy is configured. Additional test cases for the rejection case and an over-configuration (explicitly setting abbreviation to false, which is the default anyway).

Related pull request: #177.
2014-05-19 17:07:50 +02:00
Ryan Tenney
e3aadd63ab DATAMONGO-928 - Removed explicit default value for abbreviate-field-names from namespace XSD.
The default for boolean attributes leaks into the evaluation of XML namespace attributes, which prevents us from detecting whether two attributes have been set in a conflicting way.

Fixed the documentation on the field-naming-strategy-ref attribute.

Original pull request: #183.
Related pull request: #177.
Related ticket: DATAMONGO-925.
2014-05-19 16:44:03 +02:00
Christoph Strobl
aa06d520df DATAMONGO-647 - Added test case to show that field names are mapped correctly.
Additional test added to check if the issue has truly been resolved by DATAMONGO-888.

Original pull request: #181.
Related pull Request: #162.
Related ticket: DATAMONGO-888.
2014-05-19 14:38:06 +02:00
Oliver Gierke
4fa1d4ba97 DATAMONGO-919 - After release cleanups. 2014-05-02 15:45:06 +02:00
Spring Buildmaster
ba9f11b345 DATAMONGO-919 - Prepare next development iteration. 2014-05-02 05:58:16 -07:00
Spring Buildmaster
4777dd2e5e DATAMONGO-919 - Release version 1.5.0.RC1. 2014-05-02 05:58:13 -07:00
Oliver Gierke
64b4591b72 DATAMONGO-919 - Prepare 1.5.0.RC1.
Upgraded to Spring Data Parent 1.4.0.RC1 and Spring Data Commons 1.8.0.RC1. Switched to milestone repository.
2014-05-02 14:50:48 +02:00
Oliver Gierke
bac5961fd8 DATAMONGO-919 - Updated changelog. 2014-05-02 14:50:48 +02:00
Thomas Darimont
b3cac862d8 DATAMONGO-924 - Improve aggregation field reference resolving.
Previously we didn't support referring to aliased fields defined in former stages of an aggregation pipeline. We now also consider field aliases during field reference lookup.

Original pull request: #176.
2014-05-02 14:46:13 +02:00
Oliver Gierke
b8ab2ad539 DATAMONGO-919 - Forward port changelog entries from bugfix branch. 2014-05-02 12:48:18 +02:00
Christoph Strobl
618d5bd5e9 DATAMONGO-919 - Prepare Release 1.5.0.RC1
Upgrade AspectJ Maven plugin to a recent version which enables building the module on Java 8 using AspectJ 1.8.
2014-05-01 21:02:01 +02:00
Oliver Gierke
096c3278b3 DATAMONGO-92 - Upgraded to MongoDB Java driver 2.12.1. 2014-04-30 08:51:18 +02:00
Kim Toms
c9d9976c22 DATAMONGO-920 - Improve debug message for delete events in AbstractMongoEventListener.
Adjusted debug message to reflect the actual operation.

Original pull request: #95.
2014-04-29 15:52:12 +02:00
Thomas Darimont
65497f93d4 DATAMONGO-827 - Allow index creation to use MongoDB auto-generated names.
We now support letting MongoDB generate index names by introducing the attribute "useGeneratedName" on the @Indexed, @GeoSpatialIndexed and @CompoundIndex annotations.
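
A brief sketch of the new attribute:

@Indexed(useGeneratedName = true) // let MongoDB generate the index name
String lastname;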
2014-04-28 19:58:56 +02:00
Christoph Strobl
af8a53bef6 DATAMONGO-909 - Compound index on inherited class should be created correctly.
With the overhaul of the index creation done in DATAMONGO-899, the CompoundIndex annotation is no longer just looked up on the concrete type but rather on all its interfaces and superclasses. So we just added an additional test to verify this behaviour.
2014-04-28 19:58:56 +02:00
Oliver Gierke
916b856e97 DATAMONGO-899 - Polished API of new index creation abstractions.
Removed the introduction of the IndexDefinition being collection-aware again. The collection an index is created in is now held in the IndexDefinitionHolder. This is mostly due to the fact that the IndexDefinition implementations can be used with MongoTemplate and the index operations take a collection alongside the index definition.

Made the IndexResolver API package-protected so that we can further change it going forward. We should think about deprecating the collectionName attributes on index annotations, as it doesn't make too much sense to manually configure the collection name for the indexes when the collection is predefined through the domain type setting here. This would allow us to remove the entire collection handling code inside the IndexResolver implementation.

Turned IndexDefinitionHolder into a value object.

Original pull request: #168.
2014-04-28 19:58:50 +02:00
Christoph Strobl
7848da63f2 DATAMONGO-899 - Ensure proper creation of index structures for nested entities.
Index creation did not consider the property's path when creating the index. This led to broken index creation when nesting entities that might require index structures.

From now on, index creation traverses the entity's property path for all top-level entities (namely those holding the @Document annotation) and creates the index using the full property path.

This is a breaking change, as having an entity carry the @Document annotation has not been required until now.

Original Pull Request: #168
2014-04-28 16:50:26 +02:00
Christoph Strobl
f359a1d31a DATAMONGO-847 - Allow usage of Query within an Update clause.
In case we detect a Query within a value used for an Update, we map the query itself to build the expression to use. This allows forming query statements for e.g. $pull using the same API as for the query itself.

Update update = new Update().pull("list", query(where("value").in("foo", "bar")));

Original Pull Request: #172.
2014-04-28 13:30:13 +02:00
Thomas Darimont
72d645feae DATAMONGO-917 - Improve Spring 4.0 framework version detection to avoid NullPointerExceptions.
We now check for the presence of DefaultParameterNameDiscoverer in order to determine if we are running with a Spring version later than 4.0, since this avoids potential NullPointerExceptions in cases where the package version information is not available, e.g. when the application was bundled into an uber-jar via the maven-shade-plugin.

Original pull request: #173.
2014-04-28 13:21:09 +02:00
Thomas Darimont
72fe382bba DATAMONGO-913 - Improve DBRef handling for LazyLoadingProxies.
We now use the captured DBRef of a given LazyLoadingProxy in MappingMongoConverter.toDBRef(..) in order to avoid a new DBRef creation that would fail for the proxy.

Original pull request: #174.
2014-04-28 13:07:57 +02:00
Thomas Darimont
d25e840cf5 DATAMONGO-914 - Improve resolving of lazy-loading proxies for classes that override equals(…)/hashCode().
We now properly resolve lazy-loading proxies for @DBRefs when an overridden equals or hashCode method is called with Spring 4. We fall back to our old Objenesis proxy generation in order to circumvent the default handling for overridden hashCode() and equals(…) methods in CglibAopProxies generated by Spring 4.

If we detect that we run with Spring 4, we use the repacked Objenesis that is included in Spring 4. Previously the generated proxy used some generic hashCode() or equals(…) logic that did not trigger proper lazy loading in such cases.

Original pull request: #171.
2014-04-23 09:29:33 +02:00
Thomas Darimont
df1775572a DATAMONGO-912 - Consider custom conversions in all stages of an aggregation pipeline.
We now consider custom Mongo conversions in all stages of an aggregation pipeline. Previously we did this only for the first stage and returned objects basically unmapped in later stages. We now pass the root AggregationOperationContext on to nested ExposedFieldsAggregationOperationContexts so that those can delegate any Mongo mapping to the root context.

Original pull request: #170.
2014-04-23 09:00:56 +02:00
Christoph Strobl
86670cd49f DATAMONGO-893 - Converter must not write "_class" information for known types.
We now actively pass on property type information to MetadataBackedField to ensure type hints get picked up correctly when converting a value to the corresponding DBObject.

This has to be done as the fix for DATAMONGO-812 enforced proper writing of _class information for Updates, which caused trouble when querying documents by nested (complex) properties using an 'in' clause.

Original pull request: #169.
2014-04-15 17:36:11 +02:00
Christoph Strobl
31a4bf906e DATAMONGO-892 - Reject nested MappingMongoConverter declarations in XML.
Mapping information is potentially required by multiple instances and thus must not be registered as nested bean. We now actively check for such an invalid scenario and explicitly reject it.

Original pull request: #165.
2014-04-15 09:11:12 +02:00
Christoph Strobl
599291e8b7 DATAMONGO-897 - Fixed potential NullPointerException in QueryMapper.
If an association property points to an interface not containing the id property, QueryMapper threw a NullPointerException in isAssociationConversionNecessary(…) as the lookup of the id property failed.

We now check for the presence of an id property on the target type and check for assignability to indicate the need for conversion (usually the case when developers use raw ids in their update clauses rather than the actual target instance).

Original pull request: #164.
2014-04-15 09:11:00 +02:00
Thomas Darimont
f5a04fb9fb DATAMONGO-908 - Support for nested field references in group operations.
We now allow referring to nested field expressions if the root segment of the nested field expression was exposed in earlier stages of the aggregation pipeline.

Original pull request: #167.
2014-04-15 07:55:57 +02:00
Oliver Gierke
88558b67c3 DATAMONGO-866 - Polishing for new field naming strategy configuration support.
Use the camel case split logic from Spring Data Commons (introduced for DATACMNS-486) in a common CamelCaseSplittingFieldNamingStrategy super class.

MappingMongoConverterParser now also rejects the configuration if both abbreviate-field-names and field-naming-strategy-ref are configured.
2014-04-10 18:36:37 +02:00
Ryan Tenney
d52cb255e0 DATAMONGO-866 - Add means to configure field naming strategy. 2014-04-10 18:03:55 +02:00
Oliver Gierke
6c214cbc37 DATAMONGO-910 - Upgraded to latest MongoDB Java driver 2.12.0.
Removed special mongo-osgi property as the driver version is now a valid OSGi version number.
2014-04-10 16:55:22 +02:00
Jeff Yemin
01012c1448 DATAMONGO-895, DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
Upgrade next MongoDB driver version to 2.12.0. Strong upgrade coming in a subsequent commit to make sure we can backport the compatibility checks to the bugfix branch without forcing users into a driver upgrade.

Relaxing error message comparison in the assertion so that it still matches against the message returned by MongoDB 2.6. When comparing the value of the version field, compare against a Long rather than an Integer, since the generated version field is a Long. This allows the test to pass against the upcoming 2.12.0 release of the Java driver, which has a stricter implementation of BasicDBObject.equals(…).

Original pull requests: #159, #160.
2014-04-10 15:57:45 +02:00
Christoph Strobl
4c7befb910 DATAMONGO-888 - Sorting now considers mapping information.
We now pipe the DBObject containing sorting information for queries through the QueryMapper to make sure potential field mappings are applied.

Original Pull Request: #162.
2014-04-10 15:43:51 +02:00
Christoph Strobl
b62669ec8f DATAMONGO-907 - Assert compatibility with mongodb 2.6.
Fixed the test to only check on parts of the expected error message common to both 2.4 and 2.6.

Original Pull Request: #166.
2014-04-10 13:33:34 +02:00
Oliver Gierke
0fb74caf9b DATAMONGO-905 - Removed obsolete dependency to CGLib from cross-store support.
Also, we now optionally depend on the Hibernate JPA API JAR so that using other persistence providers doesn't cause an API JAR duplication.
2014-04-09 12:12:29 +02:00
Oliver Gierke
9623dac01f DATAMONGO-901 - Fixed regression of not registering type predicting post processor.
The changes for DATAMONGO-843 introduced a regression by skipping the registration of the RepositoryInterfaceAwareBeanPostProcessor. This can cause the wiring of repository bean definitions to fail depending on in which order the bean definitions get instantiated.

This change reintroduces the registration and adds an explicit test case for it.
2014-04-02 17:58:36 +02:00
Spring Buildmaster
695b27968c DATAMONGO-859 - Prepare next development iteration. 2014-03-31 17:10:40 +02:00
Spring Buildmaster
22933e4493 DATAMONGO-859 - Release version 1.5.0.M1. 2014-03-31 08:04:03 -07:00
Thomas Darimont
40aa6bbdd5 DATAMONGO-859 - Prepare release 1.5 M1.
Updated readme.md and mongodb.xml to reflect recent version. Updated Spring Data Commons and Spring Data Build versions in pom.xml. Update pom.xml to use release repository. Updated docbkx to use recent Spring Data Commons version. Updated changelog to reflect changes and releases.

Original pull request: #161.
2014-03-31 16:47:15 +02:00
Christoph Strobl
5e43f5846a DATAMONGO-471 - Add support for $each when using $addToSet.
In addition to Update.addToSet(String, Object), the method addToSet(String) has been introduced, returning a builder that allows creating an $addToSet command for either a single value or multiple values using $each.

Using value:
new Update().addToSet("key").value("spring");

Using each:
new Update().addToSet("key").each("spring", "data", "mongodb");

Original Pull Request: #157.
2014-03-31 15:25:14 +02:00
Thomas Darimont
2cfd4781bc DATAMONGO-884 - Improved handling for Object methods in LazyLoadingInterceptor.
We now handle invocations of equals(…)/hashCode()/toString() methods that are not overridden with custom proxy-aware logic. This avoids potential NullPointerExceptions and makes it easier to debug code that deals with proxies (due to a proper toString representation of a proxy).

Original pull request: #158.
2014-03-31 15:19:30 +02:00
Thomas Darimont
031ab0c07b DATACMNS-482 - Fix compiler error due to changes in SD Commons.
Fixed a compiler error that got introduced by making the geospatial types in Spring Data Commons serializable.
2014-03-31 15:14:08 +02:00
Thomas Darimont
10f69f6623 DATAMONGO-884 - Fix potential NullPointerException for lazy DBRefs.
We now initialize the proxy in case an Object method is called that is overridden in the target class. Removed the additional check for initialization and to-DBRef methods as they're repeated in the target method.

Original pull requests: #152, #153.
2014-03-27 17:57:39 +01:00
Thomas Darimont
d7b03915a7 DATAMONGO-858 - Revised rendering of geo spatial structures.
Switched back to the old style of rendering (as in 1.4.x) of DBObjects when they are used as values in persistent domain objects, adjusted the GeoConverters accordingly. In order to render geo structures correctly when they are used within a query we now wrap them in a GeoCommand that triggers a different Shape rendering.

We now render the metric that was used in the Distance definition of the radius of a Circle or Sphere.
2014-03-27 13:18:55 +01:00
Oliver Gierke
ed55d48a53 DATAMONGO-858 - Polishing.
Moved to use the newly introduced geo types from Spring Data Commons. Added deprecation warning suppression everywhere else.

Adapted Sonargraph architecture description file and split up namespace registration into repository specific stuff and everything else.
2014-03-27 13:18:55 +01:00
Thomas Darimont
d5ed4e0ac2 DATAMONGO-858 - Add support for common geospatial structures.
Backed the geospatial structures of SD MongoDB by the new geospatial structures in SD Commons. Deprecated the MongoDB geospatial types to make users aware that we're going to remove them in one of the following development iterations. Added custom conversions for basic geospatial types.

We deliberately chose not to let Circle extend CMNS geo.Circle since it would break clients that use the legacy Circle API (getRadius() returns a Distance in CMNS whereas it returns a plain double in Mongo).
2014-03-27 13:18:55 +01:00
Oliver Gierke
75194730e9 DATAMONGO-887 - Added unit tests to verify TreeMaps can be converted. 2014-03-27 08:58:53 +01:00
Oliver Gierke
a09183d2eb DATAMONGO-880 - Minor polishing in lazy-loading area.
Took the chance to add @since tags to the types introduced for lazy loading. Polished JavaDoc where necessary. Removed methods solely existing for testing purposes and use reflection in tests to minimize the API being published.
2014-03-20 09:27:37 +01:00
Thomas Darimont
45dd3cd988 DATAMONGO-880 - Improved handling of persistence of lazy-loaded DBRefs.
Added a LazyLoadingProxy interface that will be implemented by every lazy-loading proxy that is created by the DefaultDbRefResolver. Clients can now cast those proxies to this interface and call its methods to initialize a proxy explicitly or to get the referenced DBRef if possible.

We now keep a reference to the DBRef that led to the creation of a LazyLoadingProxy in order to be able to reuse it in case one assigns the proxy to a field that should be a DBRef. This avoids unnecessary conversion.

Previously, saving of proxies wasn't possible since the mapping infrastructure did not know how to extract the entity information from the proxy. We now either store the DBRef backed by the proxy directly or we initialize the proxy first and use the result of LazyLoadingProxy.initialize().

Original pull request: #151.
2014-03-20 09:26:08 +01:00
Oliver Gierke
b24e34c360 DATAMONGO-883 - Adapted to changes in auditing config in Spring Data MongoDB. 2014-03-18 20:10:43 +01:00
Oliver Gierke
fa9b5efdab DATAMONGO-882 - Adapted to removal of obsolete generics in BeanWrapper. 2014-03-18 20:08:22 +01:00
Oliver Gierke
8f2ced8ada DATAMONGO-881 - Allow custom conversions to override default conversions.
User provided converters are now registered *after* the default converters to make sure they enjoy precedence over the default ones. 

This is achieved by inverting the order of converters after the conversions have been registered. This is necessary as the registration order for convertible pairs is different from the one of the converters. For the pairs, earlier registered instances take precedence, while for the actual converter instances, instances registered later trump ones registered before.
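
A sketch of registering a user-provided converter, assuming a hypothetical Email value type and a MappingMongoConverter instance; with this change such a converter takes precedence over a default conversion for the same pair:

// Hypothetical writing converter for an Email value type.
class EmailToStringConverter implements Converter<Email, String> {
  public String convert(Email source) {
    return source.toString();
  }
}

CustomConversions conversions = new CustomConversions(Arrays.asList(new EmailToStringConverter()));
mappingMongoConverter.setCustomConversions(conversions);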
2014-03-18 09:32:28 +01:00
Oliver Gierke
ff92cf1429 DATAMONGO-566 - Polishing.
Inlined a few methods to reduce the number of indirections. Added a bit of missing JavaDoc here and there. StringBasedMongoQuery now prevents a manually defined query from being marked as both count and delete query.

Polished test cases a little.

Original pull request: #147.
2014-03-17 17:51:54 +01:00
Christoph Strobl
ba48290a3e DATAMONGO-566 - Add support for derived delete-by queries.
Using the keywords remove or delete in a derived query, or setting @Query(delete=true), removes the documents matching the query. If the return type is assignable to Number, the total number of affected documents is returned. In case the return type is collection-like, the query is executed against the store first; all documents included in the resulting collection are deleted in a subsequent call.

Additionally findAllAndRemove(…) methods have been added to MongoTemplate.
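
A sketch of the supported method signatures on a hypothetical Person repository:

// Returns the number of removed documents.
Long deleteByLastname(String lastname);

// Loads the matching documents first, removes them and returns them.
List<Person> removeByLastname(String lastname);

// Manually defined query marked as a delete query.
@Query(value = "{ 'lastname' : ?0 }", delete = true)
Long removePersonByLastname(String lastname);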

Original pull request: #147.
2014-03-17 17:51:08 +01:00
Oliver Gierke
70e5efd0d9 DATAMONGO-877 - Added guard against null-package in AbstractMappingConfiguration.
AbstractMappingConfiguration.getMappingBasePackage() now guards against a null package returned for the configuration class. This can happen if the class resides in the default package.
2014-03-10 12:47:26 +01:00
Oliver Gierke
4eae229bff DATAMONGO-876 - Adapt to API changes introduced for better property access config.
Adapted usage of BeanWrapper as the property access is now solely defined via the PersistentProperty. Adapted MongoPersistentEntityIndexCreator to lookup annotations via PersistentProperty instead of the backing field. Removed code from BasicMongoPersistentProperty which is now already implemented in the Spring Data Commons types.
2014-03-07 14:38:23 +01:00
Oliver Gierke
47f0607c49 DATAMONGO-809 - Polishing.
Added ticket references to test cases in GridFsTemplateIntegrationTests.
2014-03-07 08:30:43 +01:00
Martin Baumgartner
753e794194 DATAMONGO-809 - Filename is now optional when storing files to GridFS.
Added method overloads to GridFsOperations and GridFsTemplate to store files without a filename given.

Original pull request: #119.
2014-03-07 08:30:43 +01:00
Thomas Darimont
d27bec8ed5 DATAMONGO-773 - Verify that @DBRef fields can be included in query.
Added test cases to verify that projection search with included @DBRef fields works as expected.

Original pull request: #142.
2014-03-06 13:34:17 +01:00
Christoph Strobl
c63f7f75dc DATAMONGO-868 - MongoTemplate.findAndModify(…) increases version if not handled manually.
MongoTemplate.findAndModify(…) increments the version property in case it's not manually set in the Update object given.

Original Pull Request: #141.
2014-03-06 11:51:13 +01:00
Christoph Strobl
84040518cf DATAMONGO-863 - UpdateMapper doesn't convert raw DBObjects anymore.
UpdateMapper now only performs a simple conversion if it encounters a DBObject, instead of deep inspection of the keywords used. This allows using custom clauses nested in Update for operations not directly supported.

Original Pull Request: #138.
2014-03-06 11:45:35 +01:00
Christoph Strobl
c66b9a538c DATAMONGO-821 - Fixed handling of keyword expressions for DBRefs.
QueryMapper skips DBRef conversion in case the given source value is a nested DBObject. This allows directly using MongoDB operators wrapped in a DBObject on association properties.

Original Pull Request: #139.
2014-03-06 11:22:39 +01:00
Oliver Gierke
a0c6b9aa64 DATAMONGO-843 - Register default MongoMappingContext for auditing.
The MongoAuditingRegistrar now also registers a fallback MongoMappingContext in case none is present in the BeanDefinitionRegistry.
2014-03-06 09:10:31 +01:00
Thomas Darimont
9370c1ee01 DATAMONGO-843 - Improvements in auditing configuration.
Repositories now declare a fallback MappingContext in case none is configured explicitly to make sure @EnableMongoAuditing also works without an explicit MappingContext bean defined.

AuditingEntityListener is now referring to the IsNewAwareAuditingHandler via an intermediate ObjectFactory to prevent the downstream dependencies from being instantiated eagerly at listener init time.

This is to prevent circular initialization dependencies, as Spring accesses ApplicationEventListener beans very early in the container lifecycle to check whether they might be interested in a certain event, and they are just dropped immediately afterwards.

Changed BeanNames.MAPPING_CONTEXT constant to mongoMappingContext to let the XML configuration be consistent with AbstractMongoConfiguration.mongoMappingContext().
2014-03-05 19:50:27 +01:00
Oliver Gierke
2839e9491f DATAMONGO-871 - Add support for arrays as query method return types.
Changed AbstractMongoQuery to potentially convert all query execution results using the DefaultConversionService in case the query result doesn't match the expected return value.

This allows arrays to be returned for collection queries as the conversion service can transparently convert between collections and arrays.
2014-03-05 10:03:09 +01:00
Oliver Gierke
6963f9e07a DATAMONGO-870 - Added support for sliced query execution.
Added support for Slice as a return type for query methods. The execution will expand the requested page size by one to read one more element than actually requested. If that additional element is returned, it will be considered an indicator that a next slice is available.
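
A sketch of a sliced query method on a hypothetical Person repository:

// Reads pageSize + 1 elements to determine whether a further slice exists.
Slice<Person> findByLastname(String lastname, Pageable pageable);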

Related issues: DATACMNS-397.
2014-03-04 17:37:26 +01:00
Thomas Darimont
8dd08a36a0 DATAMONGO-865 - Adjust test dependencies to avoid ClassNotFoundException during test runs.
Added jul-to-slf4j dependency to avoid exceptions being logged during test runs.

Original pull request: #135.
2014-03-04 09:43:49 +01:00
Christoph Strobl
a908e89ef7 DATAMONGO-829 - NearQuery should not default 'num' to zero.
NearQuery now ignores a query.getLimit() equal to zero when adding a Query to a NearQuery. This has to be done as the limit defaults to zero within Query, which would otherwise result in unintended propagation of the parameter.

In case 'num' should be explicitly set to zero, one might use 'NearQuery.num(0)' as an alternative to the query approach.

Introduced 'null' checks for 'NearQuery.query(Query)' and 'NearQuery.with(Pageable)' along the way.

Original Pull Request: #133
2014-03-03 15:24:09 +01:00
Christoph Strobl
5ace4032ed DATAMONGO-862 - Fixed handling of unmapped paths for updates.
UpdateMapper uses the key instead of the cleaned property path when the key does not directly point to a property.

Original pull request: #132.
2014-02-27 16:55:54 +01:00
Oliver Gierke
621b299f6f DATAMONGO-833 - Add support for reading EnumSets and EnumMaps.
Switched to use Spring Data Commons' CollectionFactory that is capable of creating EnumSets and EnumMaps. Added unit test inspired by pull request #113 for EnumSets and an additional one for EnumMaps.

Slightly refactored the algorithm for reading maps to prevent repeated type lookups.

Related pull request: #113.
2014-02-26 05:35:55 +01:00
Spring Buildmaster
a2628d1b74 DATAMONGO-854 - Prepare next development iteration. 2014-02-24 15:31:41 +01:00
Spring Buildmaster
294616432d DATAMONGO-854 - Release version 1.4.0 RELEASE. 2014-02-24 06:25:32 -08:00
Christoph Strobl
47dd512f95 DATAMONGO-854 - Prepare 1.4.0.RELEASE.
Update artifact version in readme for release and snapshot.
Use commons 1.7.0 resources in docbkx.
Update changelog.
Update version information in notice and readme.

Original pull request: #130.
2014-02-24 15:11:40 +01:00
Thomas Darimont
f16e8d85e5 DATAMONGO-856 - Documentation update.
Removed outdated why-spring-data-doc. Removed CouchDB reference from requirements document. Fixed some typos. Added missing opening <para>-element. Fixed rendering of author information. Added copyright and product name information.

Original pull request: #128.
2014-02-24 11:43:58 +01:00
Christoph Strobl
eb03ae61f2 DATAMONGO-856 - Documentation updates.
Updated references from springsource.org to spring.io. Updated references to mongodb.org. Update vendor to Pivotal Software, Inc. Update required/recommended versions. Update CustomConversions configuration section. Added missing section id's. Fixed some typos. Added missing JavaDoc.

Original pull request: #128.
2014-02-24 11:42:59 +01:00
Christoph Strobl
5be66a3fee DATAMONGO-853 - Update does not allow null keys anymore.
Added check for blank / null keys when adding key to Update.

Original pull request: #129.
2014-02-24 10:51:48 +01:00
Thomas Darimont
d88e4c0e3e DATAMONGO-468 - Verify that one can use a domain object in DbRef field updates.
Added test case to demonstrate that using a domain object as a value for a DbRef field update is already supported.

Original pull request: #127.
2014-02-21 14:03:35 +01:00
Christoph Strobl
57d1449008 DATAMONGO-852 - Update keeps track of fields to be modified.
Update holds a set of fields that modifications are registered for. This information is used to determine if a modification is registered for the version field of a versioned entity. The change was introduced since the previous solution did not correctly find the version property within the DBObject resulting from the mapped update.

In case the version property is already included in the Update, the automatic version update via $inc will be skipped.

Original pull request: #126
2014-02-21 13:57:58 +01:00
Oliver Gierke
8d00a0d926 DATAMONGO-404 - Polishing of DBRef creation in Update clauses.
Refactored the internals of UpdateMapper to simplify the code a little. Removed the special converter in favor of handling the mapped key generation directly. This can be removed again, once DATACMNS-444 is fixed.

MetadataBackedField.getPath(String) now also rejects PersistentPropertyPaths that refer to anything other than the id property in case the path traverses an association.

Changed MetadataBackedField to return the association property in calls to ….getProperty() as it is the PersistentProperty to hand to the mapping infrastructure for object conversion.

Changed MappingMongoConverter to also check whether the given source object handed into DBRef creation is of the ID type and simply use that for DBRef creation. This allows creating DBRefs from ids as well.
2014-02-19 21:13:01 +01:00
Oliver Gierke
e3fa844488 DATAMONGO-854 - Upgraded to Spring Data Commons snapshots.
Upgraded to snapshots of Spring Data Commons and the build parent.
2014-02-19 20:02:34 +01:00
Thomas Darimont
58bee75a6b DATAMONGO-404 - Fixed Update.pull(…) handling to work with DBRefs.
We now support pointing to DBRef-mapped properties in Update.pull(…) and also allow referring to the id of the DBRef to avoid having to create an instance of the entity.
2014-02-19 18:00:49 +01:00
Oliver Gierke
a402395f5c DATAMONGO-849 - Fixed invalid class reference in readme.
Minor sample code optimizations to use MongoClient instead of Mongo. Fixed repository URL.
2014-02-17 16:38:25 +01:00
Oliver Gierke
9d5f8f3ba0 DATAMONGO-848 - Added tweaks to be compatible with Java driver 2.12.
Added a build profile to be able to build against the next Mongo Java driver version (currently 2.12.0-rc0). Tweaked Bundlor version replacements to allow binding non-OSGi compatible Mongo driver versions.

The exception translator now handles the newly introduced MongoServerSelectionException which the driver throws to indicate it can't connect to a MongoDB instance as of driver version 2.12. GridFsTemplate now uses an empty query object instead of null to indicate that no query should be used.

Adapted test cases to be able to deal with the slightly changed representation of serverUsed in command results (2.12 removed leading slash).
2014-02-17 12:42:19 +01:00
Christoph Strobl
7ebf953063 DATAMONGO-354 - Update.pushAll(…) now supports multiple values.
Update.pushAll(…) is now a multi-field operation, which allows sending values for different fields within one command.

Original Pull Request: #122.
2014-02-17 11:45:15 +01:00
Christoph Strobl
617ebe0ca7 DATAMONGO-828 - Fixed version checks for updates in MongoTemplate.
Added inspection of the query object to check if the update should only apply to a given version. If so and no documents have been updated we still throw an OptimisticLockingException. For all other cases - like UpdateFirst - zero affected documents is fine.

Original Pull Request: #121.
2014-02-17 11:30:38 +01:00
Christoph Strobl
7f76789664 DATAMONGO-410 - Added test case to show that UpdateMapper considers custom converter.
Original pull request: #124.
2014-02-17 11:03:52 +01:00
Oliver Gierke
81e5919ace DATAMONGO-812 - Added assumptions to not break tests on old MongoDB versions. 2014-02-11 18:50:09 +01:00
Oliver Gierke
efd74956dc DATAMONGO-812 - Polishing.
Changed convertToMongoType(…) to forward type hints to recursive calls to make sure type information is written if a TypeInformation was provided initially. Make sure that UpdateMapper hands in an initial type hint to the converter to make sure type information gets written.

Changed the signature of QueryMapper.getMappedObjectForField(…) to allow customizing the entire entry being added to the result. This is in preparation of more advanced mappings that might have to customize the mapped key.

Fixed newly introduced test cases in MongoTemplateTests.

Original pull request: #112.
2014-02-11 17:39:08 +01:00
Christoph Strobl
49eee40f7e DATAMONGO-812 - Add support for $push $each since $pushAll is deprecated.
$pushAll has been deprecated in MongoDB 2.4. Instead of calling pushAll one can use push in combination with each. The abstraction for pushAll will remain in code for now but may be removed in a subsequent version.
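
A brief sketch of the builder; the field name is illustrative:

// Renders { $push : { skills : { $each : ["spring", "data", "mongodb"] } } }.
Update update = new Update().push("skills").each("spring", "data", "mongodb");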

Original pull request: #112.
2014-02-11 17:39:07 +01:00
Thomas Darimont
8e93b844c7 DATAMONGO-830 - Prevent NullPointerException during cache warmup in CustomConversions.
We now use a ConcurrentHashMap to cache the results of custom read target lookups in order to avoid having to traverse the readingPairs for every lookup. The use of ConcurrentHashMap should also prevent potential NullPointerExceptions from being thrown if custom conversions are initialized in heavily threaded environments.

Original pull request: #117.
2014-02-10 18:49:01 +01:00
Thomas Darimont
3e64432f1a DATAMONGO-842 - Improve documentation in GridFS section.
Rephrased wording for better understanding.

Original pull request: #120.
2014-02-10 10:30:57 +01:00
Thomas Darimont
88c968ad36 DATAMONGO-840 - Improve support for nested field references in SpEL expressions within Projections.
We now correctly add a compound expression that represents a field reference to the previous operation arguments if necessary.

Original pull request: #118.
2014-02-10 10:18:14 +01:00
Thomas Darimont
99eefe0773 DATAMONGO-838 - Cannot refer to expression based field in group operation.
Previously we didn't set a proper target value for the generated expression field. As a potential fix we just use the alias as the target field.

Original pull request: #116.
2014-02-10 09:59:15 +01:00
Oliver Gierke
3d4569be14 DATAMONGO-826 - Remove obsolete milestone repository. 2014-02-09 14:33:27 +01:00
Spring Buildmaster
57455c4a26 DATAMONGO-826 - Prepare next development iteration. 2014-01-29 06:04:53 -08:00
Spring Buildmaster
f9e20d12b2 DATAMONGO-826 - Release version 1.4.0.RC1. 2014-01-29 06:04:50 -08:00
Thomas Darimont
4d6152c65e DATAMONGO-826 - Prepare release 1.4.0.RC1.
Updated project metadata (changelog, notice, readme) and bumped versions.
Fixed broken links in the documentation.
2014-01-29 14:42:52 +01:00
Thomas Darimont
d81cc53c12 DATAMONGO-837 - Upgrade MongoDB java driver to 2.11.4.
Upgraded the MongoDB java driver from 2.11.3 (current) to the recommended bug fix version 2.11.4.
2014-01-29 13:27:46 +01:00
Oliver Gierke
af4b84ea43 DATAMONGO-835 - Code cleanups. 2014-01-28 12:46:36 +01:00
Thomas Darimont
f9110828bc DATAMONGO-407 - Fixed query mapping for updates using collection references.
When an update clause contained a collection element reference (….$.…), we failed to write the type information of the target value object as the key was not translated into a correct property path. We now strip the reference literals and re-apply them when the mapped key is generated.
2014-01-27 19:39:00 +01:00
Christoph Strobl
f301837be5 DATAMONGO-807 - findAndModify(…) now retains type information.
Using findAndModify(…) did not retain type information when used to update a whole nested type instead of single fields within the type. We now use the UpdateMapper instead of the QueryMapper in doFindAndModify(…).

Original pull request: #110.
2014-01-27 18:12:44 +01:00
Martin Baumgartner
4d29d937eb DATAMONGO-823 - Add bucket attribute to <mongo:gridFsTemplate />.
Original pull request: #114.
2014-01-25 22:05:20 +01:00
Thomas Darimont
86c11bc614 DATAMONGO-790 - Ensure compatibility with Spring Framework 4.0.
The GenericApplicationContext doesn't refresh automatically in Spring 4.x, thus we have to call refresh manually. Also, the test for the custom application listener registration fails on 4.0, so I adapted it to run on both versions.
2014-01-24 18:53:42 +01:00
Oliver Gierke
be34b4e503 DATAMONGO-822 - Enable CDI repositories to be instantiated eagerly.
From the CDI extension we now use the callback newly introduced in Spring Data Commons to enable it to trigger eager initialization.

See also: DATACMNS-416.
2014-01-24 18:53:35 +01:00
Thomas Darimont
ebfa2c5689 DATAMONGO-686 - Fixed potential race-condition in QueryMapper.
We now create a new DBObject in QueryMapper#getMappedValue() instead of replacing values in the original objects in order to prevent the original query object from being manipulated.

Added test case to verify that the original DBObject is not manipulated.

Original pull request: #111.
2014-01-23 13:05:31 +01:00
Oliver Gierke
b245ef2d9e DATAMONGO-826 - Polished poms.
Removed Spring dependency versions and fixed repository declarations.
2014-01-23 12:57:41 +01:00
Christoph Strobl
5ef40d54bc DATAMONGO-811 - UpdateFirst and updateMulti should increase version.
The version of documents is increased when they are updated via MongoTemplate.updateFirst and MongoTemplate.updateMulti, just as it is done when calling MongoTemplate.save(...).

Original pull request: #109.
2014-01-22 11:03:23 +01:00
Laurent Canet
c679dba438 DATAMONGO-778 - Improved support for geospatial indexing with @GeoSpatialIndexed.
We now support creating geospatial indices of type 2dsphere and geoHaystack using the @GeoSpatialIndexed annotation on fields.
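
A sketch of the annotation in use; the index type enum value and the domain type are assumptions:

@Document
class Venue {

  @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
  Point location;
}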

Original pull requests: #82, #104.
2014-01-14 15:18:36 +01:00
Thomas Darimont
fd6e4000b5 DATAMONGO-816 - Improve query handling in MongoTemplate.executeQuery().
We now process the given query with the queryMapper before passing it on to the executeQueryInternal(…) in order to deal with potentially required query modifications, e.g. enum value conversions.

Original pull request: #108.
2014-01-14 14:56:00 +01:00
Thomas Darimont
c12a27a8f8 DATAMONGO-805 - Excluding DBRef field in a query causes a MappingException.
Previously we tried to convert all DBRef associations into appropriate DBRef structures even if they were to be ignored. We now ignore excluded properties in DBRef associations correctly.

Original pull request: #102.
2014-01-14 14:32:10 +01:00
Oliver Gierke
df2184f204 DATAMONGO-824 - Added contribution guidelines.
Linked to the actual contribution guidelines maintained in the Spring Data Build project.
2014-01-14 13:43:37 +01:00
Oliver Gierke
e9c8644d23 DATACMNS-414 - Adapted to removed generics in AuditingHandler API. 2014-01-12 20:05:45 +01:00
Thomas Darimont
c730b8f479 DATAMONGO-813 - Improve handling for non-existing resources in GridFSTemplate.
We now return null for a non-existing resource instead of throwing an NPE.

Original pull request: #106.
2013-12-13 15:19:01 +01:00
Thomas Darimont
f3b31fc467 DATAMONGO-808 - Improve ServerAddressPropertyEditor to support IPv6 addresses.
Improved parsing of ServerAddress to be able to handle IPv6 addresses correctly. We now use the constructor ServerAddress(InetAddress) to be able to pass an IPv6 address. The constructor that takes a String hostname can't deal with IPv6 addresses directly because it tries to extract the port at the wrong position within such an address.

This should not change the behavior significantly, since the constructor ServerAddress(String, int) already calls InetAddress.getByName(…) internally.

Original pull request: #103.
2013-12-13 14:59:12 +01:00
Oliver Gierke
f778b2554c DATAMONGO-799 - Refer to next version of build parent and Spring Data Commons. 2013-12-09 17:24:42 +01:00
Oliver Gierke
9d292f64b9 DATAMONGO-806 - Fixed invalid rendering of id field references.
Tweaked the rendering of projection operations to always use the field based reference lookup to make sure the reference gets rendered aliased. Moved value calculation logic into FieldReference.

Original pull request: #101.
2013-12-09 16:42:29 +01:00
Thomas Darimont
8ab038f83c DATAMONGO-806 - No property _id found for type com.entity.User.
Added test case to reproduce the issue.
2013-12-09 16:42:29 +01:00
Martin Baumgartner
689552c28e DATAMONGO-726 - Fixed classname references in namespace XSDs.
Original pull request: #100.
2013-12-09 09:59:51 +01:00
Thomas Darimont
9ea9912b23 DATAMONGO-799 - Fix failing test in MongoTemplateTests on MongoDB 2.5.x.
Generalized exception message matching to reflect the changed exception message in MongoDB 2.5.x that also works with previous versions of MongoDB.

Original pull request: #97.
2013-12-04 12:19:12 +01:00
Thomas Darimont
a952ce5d2b DATAMONGO-804 - Fix default annotation attribute value for repositoryImplementationPostfix().
Changed repositoryImplementationPostfix() in EnableMongoRepositories to "Impl" to be consistent with other EnableXXXRepositories annotations. Note that this change is mostly of a documentational nature, as the defaulting of the configuration is applied in DefaultRepositoryConfiguration.getImplementationPostfix() in Spring Data Commons.

Original pull request: #99.
2013-12-04 12:13:14 +01:00
Oliver Gierke
b88d960893 DATAMONGO-802 - Changed return type of AbstractMongoConfiguration.mongoDbFactory().
Changed the return type of said method to MongoDbFactory to allow subclasses to return a completely different implementation class.
2013-12-04 12:13:01 +01:00
Spring Buildmaster
e44d1f5f9a DATAMONGO-799 - Prepare next development iteration. 2013-11-19 04:19:12 -08:00
Spring Buildmaster
2b5e2361a8 DATAMONGO-799 - Release version 1.4.0.M1. 2013-11-19 04:19:10 -08:00
Thomas Darimont
5737f2d19d DATAMONGO-801 - Prepare release 1.4.0 M1.
Updated project metadata and bumped versions.
2013-11-19 13:12:06 +01:00
Oliver Gierke
60494a6904 DATAMONGO-800 - Improved AuditingIntegrationTests.
Added a tiny Thread.sleep(…) to make sure the assertion works on fast machines; if the operations after the first step all happen within a single millisecond, the assertion would otherwise fail.
2013-11-18 13:50:42 +01:00
Oliver Gierke
ceb561e3e4 DATAMONGO-792 - Polishing of JavaConfig support for auditing.
Original pull request: #94.
2013-11-12 11:57:25 +01:00
Thomas Darimont
e2d0220cea DATAMONGO-792 - Add support to configure auditing via JavaConfig.
Introduces the necessary infrastructure in form of MongoAuditingRegistrar to configure auditing with MongoDB via Annotation config. The MongoDB auditing feature can be enabled by annotating a configuration class with the EnableMongoAuditing annotation. Added section to reference documentation.

Original pull request: #94.
2013-11-12 11:50:46 +01:00
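
A minimal JavaConfig sketch of enabling the feature described above; the configuration class name is hypothetical.

    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.mongodb.config.EnableMongoAuditing;

    @Configuration
    @EnableMongoAuditing
    class AuditingConfiguration {
        // Entities with @CreatedDate / @LastModifiedDate properties now get those values populated on persist.
    }
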
Oliver Gierke
ea33e8b8c6 DATAMONGO-348 - Improved interceptor for lazy-loaded DBRefs.
We now route calls to methods declared on Object to the proxy itself so that accessing methods like toString() does not accidentally resolve the lazy-loading proxy.
2013-11-08 19:19:16 +01:00
Oliver Gierke
506b6a2e85 DATAMONGO-795 - More predictable behavior in CustomConversions.
The target type lookup previously was unpredictable in cases where two converters were registered for the same source type. We now use LinkedHashMaps to register the converters and also make sure that we prefer manually registered converters over the default ones.

Related pull request: #96.
2013-11-07 15:11:50 +01:00
Thomas Darimont
7c0eee9e09 DATAMONGO-789 - Support login via different authentication database.
MongoDbUtils now supports performing the authentication against a dedicated authentication database; if no authentication database is given explicitly, the regular database is used. The authentication database can be configured via the authentication-dbname attribute of the db-factory element in XML config or by overriding the getAuthenticationDatabaseName() method of AbstractMongoConfiguration.

Original pull request: #92.
2013-11-06 11:59:48 +01:00
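
A sketch of the JavaConfig variant mentioned above, assuming a subclass of AbstractMongoConfiguration; the database names and client setup are hypothetical.

    import com.mongodb.Mongo;
    import com.mongodb.MongoClient;

    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

    @Configuration
    class MongoConfig extends AbstractMongoConfiguration {

        @Override
        protected String getDatabaseName() {
            return "app";             // database holding the application data
        }

        @Override
        protected String getAuthenticationDatabaseName() {
            return "admin";           // credentials are validated against this database instead
        }

        @Override
        public Mongo mongo() throws Exception {
            return new MongoClient(); // 2.x driver client; credentials omitted for brevity
        }
    }
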
Thomas Darimont
332e5eb715 DATAMONGO-791 - Added newAggregation(…) overloads to accept a List.
Aggregations can now be constructed from a list of AggregationOperations. This simplifies usage in cases where one has to conditionally include or exclude AggregationOperations from an aggregation pipeline.

Original pull request: #93.
2013-11-06 11:00:31 +01:00
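
A sketch of the conditional pipeline composition this enables; the stages, field names and the flag are illustrative only.

    import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.data.mongodb.core.aggregation.Aggregation;
    import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
    import org.springframework.data.mongodb.core.query.Criteria;

    class PipelineAssembly {

        Aggregation buildPipeline(boolean groupByCity) {

            List<AggregationOperation> operations = new ArrayList<AggregationOperation>();
            operations.add(match(Criteria.where("status").is("ACTIVE")));

            if (groupByCity) {
                // The group stage is only part of the pipeline when requested.
                operations.add(group("city").count().as("total"));
            }

            return newAggregation(operations);
        }
    }
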
Oliver Gierke
39ee9b56e2 DATAMONGO-793 - Adapt test cases to new initialization model of repositories.
Moved tests for nested repositories to a separate package to prevent initialization of repositories that are not actually meant to be instantiated.
2013-11-05 17:29:05 +01:00
Thomas Darimont
8fb390ee88 DATAMONGO-348 - Added serialization support for lazy-loading for DBRefs. 2013-11-05 15:22:19 +01:00
Oliver Gierke
df1c4496dc DATAMONGO-348 - Polishing of lazy loading implementation.
Extracted DelegatingDbRefResolver and associates from MappingMongoConverter. Let MongoDbFactory expose PersistenceExceptionTranslator only to prevent invalid dependency to core package. Renamed DbRefResolveCallback to ResolverCallback. Removed AbstractDbRefResolver and moved the functionality implemented there (triggering of exception translation) into the DbRefResolver.

MappingMongoConverter now uses a slightly extended version of DbRefResolver so that we can essentially replace the MongoDbFactory dependency with the DbRefResolver one.

Added support for Objenesis based lazy-loading proxies to support domain classes without a default constructor. Explicitly check for Spring 4 being present as with it the default ProxyFactory already supports that out of the box.

Added missing JavaDoc and assertions. A lot of cleanups and removal of deprecation warnings in test cases.
2013-11-04 20:21:10 +01:00
Thomas Darimont
b808fd3003 DATAMONGO-348 - Add support for lazy loading of DbRefs.
Introduced the DbRefResolver interface in order to be able to abstract how a DBRef used in MappingMongoConverter#doWithAssociations is resolved. The previous behaviour was to resolve a DBRef eagerly; this functionality is now implemented by EagerDbRefResolver. In order to support lazy loading we have to provide some means to define the desired loading behaviour. This can now be done via the "lazy" attribute on @DBRef, which defaults to false.
If the attribute is set to true, the LazyDbRefResolver is used to create a proxy that loads the required data on demand when one of the (non-Object) proxy methods is called. MongoDbFactory now exposes a MongoExceptionTranslator that is used by the MappingMongoConverter and MongoTemplate.

Introduced a DelegatingDbRefResolver that can delegate to a DbRefResolveCallback in order to perform the actual DBRef resolution.
We now use CGLIB proxies where necessary, i.e. if the referenced association is a concrete class.

Added unit tests for lazy loading of interface types, concrete collection types and concrete domain types. Exposed state from LazyLoadingInterceptor for better testability.

Added unit tests for lazy loading of classes with custom PersistenceConstructor.
Moved integration tests for PersonRepository into its own test class.
2013-11-04 20:15:24 +01:00
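
A sketch of the new attribute in use; the domain types are hypothetical.

    import java.util.List;

    import org.springframework.data.mongodb.core.mapping.DBRef;
    import org.springframework.data.mongodb.core.mapping.Document;

    @Document
    class Book {
        String title;
    }

    @Document
    class Author {

        // The referenced documents are only fetched when the proxy is first accessed.
        @DBRef(lazy = true)
        List<Book> books;
    }
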
Oliver Gierke
ed12298271 DATAMONGO-788 - Polishing.
Slightly changed the way the simple reference rendering for projections is implemented. Introduced an isAliased() method on Field to be able to determine whether the field reference has been renamed explicitly.

Original pull request: #90.
2013-10-31 14:40:09 +00:00
Thomas Darimont
682798325b DATAMONGO-788 - Projection operations do not render synthetic fields properly.
Introduced a boolean isSynthetic() attribute on FieldReference to determine whether the given reference points to a synthetic field, as this controls whether we render a simple include (1) or a concrete reference ($fieldName). E.g. isSynthetic() would be true for a field reference to _id.

Original pull request: #90.
2013-10-31 14:39:48 +00:00
Oliver Gierke
0e69021486 DATAMONGO-787 - Upgraded to Spring Framework 3.2.4.
Upgraded to Spring Data Commons 1.3.0.BUILD-SNAPSHOT to benefit from the upgrade to Spring 3.2.4. Added a workaround in the code introduced for DATAMONGO-774 which now runs into SPR-11031.
2013-10-25 20:57:21 +02:00
Komi Serge Innocent
ae7e24f1b6 DATAMONGO-746 - Creating an IndexInfo now also works with Doubles.
DefaultIndexOperations now detects that, in contrast to what's currently documented in the MongoDB reference documentation, MongoDB apparently returns double values for the index direction.

Original pull request: #67.
2013-10-25 10:48:57 +02:00
Thomas Darimont
94d4fa613c DATAMONGO-780 - Add support for nested repositories.
Support for considering nested repository interfaces can now be configured on the EnableMongoRepositories annotation via the considerNestedRepositories property.

Original pull request: #87.
2013-10-24 14:56:02 +02:00
Max Leuthäuser
39c9593b39 DATAMONGO-782 - Fixed typo in documentation.
Original pull request: #86.
2013-10-21 17:33:49 +02:00
Oliver Gierke
6e5f3661a8 DATAMONGO-774 - Some polishing in the reference documentation. 2013-10-17 15:54:05 +02:00
Thomas Darimont
2bd78e0bf0 DATAMONGO-774 - Added usage examples for SpEL expressions in projections to the reference documentation.
Original pull request: #81.
2013-10-17 15:54:05 +02:00
Oliver Gierke
dd59cdc59a DATAMONGO-774 - A round of Jürgenization for SpEL support in aggregation framework support.
Introduced dedicated spel package and extracted value objects to encapsulate and express information about the node transformation in a more semantical way.

Moved a lot of the logic contained in the SpelExpressionTransformer into the value objects for cohesiveness and testability. Updated Sonargraph architecture model to reflect the new packages we've introduced.

Original pull request: #81.
2013-10-17 15:54:01 +02:00
Thomas Darimont
7e471e2301 DATAMONGO-774 - Support SpEL expressions to define projection operations in the aggregation framework.
ProjectionOperations can now be built using SpEL expressions as this significantly shortens the code needed to express the projection, especially for slightly more complex mathematical expressions.

ProjectionOperation now has an ….andExpression(…) method that takes a SpEL expression and optional arguments that can be referred to via their index, i.e. a SpEL expression "5 + [0]" can be expanded using ….andExpression("5 + [0]", 7)… so that the projection can be prepared and dynamically get values bound.

Original pull request: #81.
2013-10-17 15:43:29 +02:00
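
A sketch of the expression support described above; the field names and the bound value are hypothetical.

    import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

    import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

    class SpelProjectionExample {

        ProjectionOperation grossPrice() {
            // "netPrice" is a document field; [0] is replaced with the bound argument 0.19.
            return project("netPrice")
                    .andExpression("netPrice * (1 + [0])", 0.19).as("grossPrice");
        }
    }
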
Oliver Gierke
0871a43831 DATAMONGO-764 - Polished JavaDoc in MongoOptionsFactoryBean. 2013-10-15 14:04:09 +02:00
Mike Saavedra
710e77dabe DATAMONGO-764 - Added SSL support to Mongo configuration options.
Added support for allowing Mongo clients to use secure SSL connections by introducing the "ssl" property in MongoOptionsFactoryBean, which enables the use of the configured SSLSocketFactory to create SSLSockets. If no custom SSLSocketFactory is configured, SSLSocketFactory#getDefault() will be used. We introduce this configuration in a new version of spring-mongo-1.4.xsd.

Applied Mike Saavedra's pull request (#75) with the above mentioned extensions.

Original pull request: #83.
2013-10-15 14:03:36 +02:00
Oliver Gierke
9c996617e8 DATAMONGO-630 - Polishing in UpdateTests. 2013-10-13 15:23:11 +02:00
Becca Gaspard
eebd49ab8d DATAMONGO-630 - Support $setOnInsert update operator.
Original pull request: #70.
2013-10-13 15:13:12 +02:00
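
A brief sketch of the new operator via the Update API; the field name is hypothetical.

    import java.util.Date;

    import org.springframework.data.mongodb.core.query.Update;

    class SetOnInsertExample {

        Update firstSeenOnUpsert() {
            // The value is only written when an upsert results in an insert, not on plain updates.
            return new Update().setOnInsert("firstSeen", new Date());
        }
    }
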
Oliver Gierke
fb979b1734 DATAMONGO-752 - Improved keyword detection in QueryMapper.
The check for keywords in QueryMapper now selectively decides between checks for a nested keyword (DBObject) object and the check for a simple key. This allows the usage of criteria values starting with $ (e.g. { 'myvalue' : '$334' }) without the value being considered a keyword and thus erroneously triggering a potential conversion of the value.

Moved more logic for a keyword into the Keyword value object.
2013-10-13 13:48:00 +02:00
Oliver Gierke
b5c88938e0 DATAMONGO-534 - GridFsTemplate supports queries with sorting.
Upgraded to Mongo Java Driver 2.11.3 to be able to forward the sorting options contained in a Query to eventually be able to return sorted results when querying for files.
2013-10-13 12:41:06 +02:00
Oliver Gierke
4027770701 DATAMONGO-777 - Upgraded to Mongo Java Driver 2.11.3.
Upgraded to the latest MongoDB Java driver release to benefit from newly introduced API (required to fix DATAMONGO-534). Adapted test cases to API changes in DuplicateKey.
2013-10-13 12:41:06 +02:00
Oliver Gierke
a120ce2bb1 DATAMONGO-776 - Improvement in TypeBasedAggregationOperationContext.
TypeBasedAggregationOperationContext's getReference(…) doesn't use a PropertyPath anymore to avoid running the camel case property resolution.
2013-10-10 14:03:25 +02:00
Thomas Darimont
a5d40a049d DATAMONGO-769 - Improve support for arithmetic operators in AggregationFramework.
We now support the usage of field references in arithmetic projection operations.

Original pull request: #80.
2013-10-10 11:09:04 +02:00
Thomas Darimont
f0f12d5296 DATAMONGO-771 - Fix raw JSON string handling in MongoTemplate.insert(…).
We now support insertion of JSON objects as plain strings via MongoTemplate.insert(…).

Original pull request: #79.
2013-10-08 12:46:03 +02:00
Oliver Gierke
24e06cf219 DATAMONGO-761 - Fix path key lookup for non-properties in SpringDataMongoDBSerializer.
In our Querydsl MongodbSerializer implementation we now only inspect the MongoPersistentProperty for a field name if the given path is really a property path. Previously we tried to always resolve a persistent property even if the given path was an array index path, a map key or the like.
2013-10-08 12:37:18 +02:00
Oliver Gierke
1b83ff0382 DATAMONGO-766 - Support for nested field references through @Field.
The MappingMongoConverter now supports reading and writing nested fields from and to documents by using @Field(…) with a path expression, e.g. @Field("a.b"). We now correctly create the nested objects and also reuse the previously created intermediates when populating further properties.

Note that this might cause the need to define field ordering explicitly, as later properties might override values set using a path reference in @Field.
2013-10-08 12:23:24 +02:00
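
A sketch of the nested path mapping described above; the domain type and field names are hypothetical.

    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.mongodb.core.mapping.Field;

    @Document
    class Customer {

        // Persisted as { "address" : { "city" : … } } instead of a top-level field.
        @Field("address.city")
        String city;
    }
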
Thomas Darimont
fe41202f96 DATAMONGO-770 - Add support for IgnoreCase in query derivation.
We now support IgnoreCase and AllIgnoreCase in predicate expressions for derived queries.

Original pull request: #78.
2013-10-07 17:13:33 +02:00
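
A sketch of the keyword in a derived repository query; the domain type and repository are hypothetical.

    import java.util.List;

    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.repository.CrudRepository;

    @Document
    class Person {
        String id;
        String lastname;
    }

    interface PersonRepository extends CrudRepository<Person, String> {

        // Matches the lastname property case-insensitively.
        List<Person> findByLastnameIgnoreCase(String lastname);
    }
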
Thomas Darimont
78235b4799 DATAMONGO-768 - Improve documentation of how to use @PersistenceConstructor.
Added an additional section to chapter 7.3 that describes the parameter value binding when using the @PersistenceConstructor annotation, including a small usage example. Added a concrete example for the @Value annotation that uses SpEL.

Original pull request: #77.
2013-10-01 14:04:34 +02:00
Thomas Darimont
51ece4353b DATAMONGO-759 - Improved rendering of GroupOperation.
GroupOperation now renders the _id field as null if no group fields were added to the operation. Previously it was rendered as an empty document (i.e. { }). While this was technically correct as well, we're now closer to what the MongoDB reference documentation describes.

Original pull request: #73.
2013-09-30 19:20:05 +02:00
Thomas Darimont
51bab838b0 DATAMONGO-757 - Align output of projection operation with MongoDB defaults.
Adjusted FieldProjection to generate an appropriate representation of included / excluded fields (namely :1 for included and :0 for excluded).
Polished guards so that only _id is allowed to be excluded (DATAMONGO-758).

Original pull request: #76.
2013-09-27 12:56:01 +02:00
Thomas Darimont
361f9daa45 DATAMONGO-753 - Add support for nested field references in aggregation operations.
Aggregation pipelines now correctly handle nested field references in aggregation operations. We introduced FieldsExposingAggregationOperation to mark AggregationOperations that change the set of exposed fields available for processing by later AggregationOperations. Extracted context state out of AggregationOperation to ExposedFieldsAggregationContext for better separation of concerns. Modified toDbObject(…) in Aggregation to only replace the aggregation context when the current AggregationOperation is a FieldsExposingAggregationOperation.

Original pull request: #74.
2013-09-26 14:29:04 +02:00
Thomas Darimont
56b23a6dbe DATAMONGO-758 - Current MongoDB versions only support explicitly excluding the _id property in a projection.
Added a guard to FieldProjection.from within ProjectionOperation to prevent users from excluding fields other than "_id".
2013-09-24 16:38:20 +02:00
Thomas Darimont
9e15c17e26 DATAMONGO-760 - Add ability to override CRUD methods using @Query.
Annotated @Query with @QueryAnnotation to mark it as a custom query annotation so that one can redeclare a repository CRUD method and let it execute the annotated query instead of triggering the generic implementation.

Original pull request: #71.
2013-09-24 16:09:28 +02:00
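
A sketch of redeclaring a CRUD method with an annotated query; the domain type and query content are illustrative.

    import java.util.List;

    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.mongodb.repository.Query;
    import org.springframework.data.repository.CrudRepository;

    @Document
    class Person {
        String id;
        boolean active;
    }

    interface PersonRepository extends CrudRepository<Person, String> {

        // Redeclared CRUD method: the annotated query is executed instead of the generic findAll().
        @Query("{ 'active' : true }")
        List<Person> findAll();
    }
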
Thomas Darimont
a3c77a43b6 DATAMONGO-740 - Prepare for next iteration.
Updated pom.xml to refer to the latest SD Commons Version 1.7.0.BUILD-SNAPSHOT.
2013-09-18 13:50:02 +02:00
Oliver Gierke
55169e2e11 DATAMONGO-740 - Updated readme after 1.4.0.RELEASE release. 2013-09-09 16:10:55 -07:00
Spring Buildmaster
24672e6bdd DATAMONGO-740 - Prepare next development iteration. 2013-09-09 15:56:23 -07:00
Spring Buildmaster
1a46abfbb9 DATAMONGO-740 - Release version 1.3.0.RELEASE. 2013-09-09 15:56:20 -07:00
Thomas Darimont
61284228dd DATAMONGO-740 - Prepare 1.3.0 RELEASE. 2013-09-09 15:50:04 -07:00
Thomas Darimont
8cb92de1ee DATAMONGO-445 - Allow to skip unnecessary elements in NearQuery.
Added support for skipping elements for NearQuery in MongoTemplate. As MongoDB currently (2.4.4) doesn't support the skipping of elements in geoNear queries, we skip the unnecessary elements ourselves, using the limit and skip information from the given query or an explicitly passed Pageable.

Original pull request: #64.
2013-08-22 09:20:19 +02:00
Oliver Gierke
5d3cc3fa04 DATAMONGO-743 - Added DBObjectToStringConverter.
Added a reading converter to dump DBObject instances into a String directly, which enables executing queries against MongoDB and returning a String version of the query result without marshaling the DBObject into a Map first.
2013-08-22 08:55:07 +02:00
Thomas Darimont
c0b99740dc DATAMONGO-742 - Document CDI integration in reference documentation.
Added chapter for CDI Integration under the new chapter Miscellaneous.

Original pull request: #63.
2013-08-13 12:20:58 +02:00
Randy Watler
595bbd3aa7 DATAMONGO-737 - Register TransactionSynchronization holder once per Mongo instance.
Original pull request: #62.
2013-08-12 09:51:02 +02:00
Chuong Ngo
5d2fc31164 DATAMONGO-738 - Allow to pass collectionName along with entityClass as parameter to update methods in MongoTemplate.
Added appropriate overloaded methods to MongoOperations and MongoTemplate. Applied pull request from Chuong Ngo <chuong.h.ngo.ctr@mail.mil>.
Original pull request: #57.
2013-08-09 15:30:00 +02:00
Thomas Darimont
a9dc0fae69 DATAMONGO-725 - Improve configurability and documentation of TypeMapper on MappingMongoConverter.
Added new attribute type-mapper-ref to the mapping-converter element in spring-mongo-1.3.xsd in order to support the configuration of custom-type-mappers. Removed the unsupported attributes "mongo-ref" and "mongo-template-ref" from the mapping-converter element in spring-mongo-1.3.xsd because they are not considered anymore.

Updated MappingMongoConverterParser to be aware of the new attribute. Added examples for configuring a custom MongoTypeMapper and for the usage of @TypeAlias to the reference documentation.

Original pull request: #61.
2013-08-09 13:44:53 +02:00
Thomas Darimont
0605c7b753 DATAMONGO-507 - Reject incorrect usage of Criteria#not().
Added a guard to Criteria#(and|or|nor)Operator to prevent wrapping $and, $or or $nor expressions in a $not expression, as MongoDB currently doesn't support this. Added a test case to CriteriaTests to verify that not() works as specified.

Original pull request: #60.
2013-08-09 12:23:25 +02:00
Thomas Darimont
21352a8829 DATAMONGO-602 - Added test case to MongoTemplate tests to verify that querying by BigInteger ids works.
Original pull request: #59.
2013-08-07 18:58:32 +02:00
Oliver Gierke
58e1d2dbd9 DATAMONGO-741 - Fixed check for nested property references in aggregation framework.
Fixed using the actual field reference instead of the field name on resolving. Added equals(…) and hashCode() methods to value objects. Added unit tests for TypeBasedAggregationOperationContext.
2013-08-07 10:21:48 +02:00
Spring Buildmaster
4f7821e3c2 DATAMONGO-732 - Prepare next development iteration. 2013-08-05 08:48:22 -07:00
Spring Buildmaster
9dd866e34a DATAMONGO-732 - Release version 1.3.0.RC1. 2013-08-05 08:48:15 -07:00
Oliver Gierke
def6079795 DATAMONGO-732 - Prepare 1.3.0.RC1.
Updated changelog, notice and readme. Upgraded to Spring Data Build parent 1.1.1.RELEASE. Upgraded to Spring Data Commons RC1. Switched to milestone repository. Updated links to the parts of the Spring Data Commons reference documentation.
2013-08-05 17:33:58 +02:00
Thomas Darimont
f3f537c1a6 DATAMONGO-586 - Adjusted examples in reference documentation.
Modified formatting and moved the detailed descriptions below the example code.
Added example for arithmetic operations in projection operations.
2013-08-05 17:20:37 +02:00
Thomas Darimont
ad44db386b DATAMONGO-586 - Adjusted examples in reference documentation.
Changed examples in reference documentation to reflect the new DSL style.
2013-08-05 14:59:15 +02:00
Thomas Darimont
bcc3bf61b6 DATAMONGO-586 - Add initial support for arithmetic expressions.
Added test cases to ProjectionOperationUnitTests. Adjusted DSL for GroupOperation to be similar to ProjectionOperation. Introduced GroupOperationBuilder to GroupOperation to be able to define an alias for the current GroupOperation. Adjusted test cases in AggregationTests to the new DSL style accordingly. Added test cases to GroupOperationUnitTests for push and addToSet.
2013-08-05 14:58:10 +02:00
Oliver Gierke
1a28a294d1 DATAMONGO-586 - A bit of Jürgenization on ProjectionOperations. 2013-08-05 14:58:09 +02:00
Thomas Darimont
14623a3655 DATAMONGO-586 - Add initial support for arithmetic expressions.
ProjectionOperationBuilder now implements AggregationOperation in order to be able to support aliased as well as non-aliased projection operation expressions. Added test case for arithmetic operations to AggregationTests. Added Product domain class to be able to demonstrate some meaningful arithmetic operations. Applied changes from code review. Added internal private remove method to ProjectionOperation to allow the previous operation to support aliasing.
2013-08-05 14:56:45 +02:00
Thomas Darimont
6dcaa31897 DATAMONGO-586 - Added chapter for Aggregation Framework support to the reference documentation.
Added chapter to mongodb.xml.
Added myself to the authors list in index.xml
2013-08-05 14:56:45 +02:00
Oliver Gierke
e57fe346c0 DATAMONGO-586 - Next round of polishing.
Refined the way the aggregation pipeline gets rendered into a DBObject. More tests. Added $avg and shortcut methods to GroupOperations. Fixed ProjectionOperation to use 1 for implicit references. Made ProjectionOperation publicly visible. Added automatic result unwrapping. API consistency, tests, JavaDoc polish.
2013-08-05 14:55:36 +02:00
Thomas Darimont
7dd94949d5 DATAMONGO-586 - Initial support for automatic field reference resolution.
Added automatic field reference resolution which removes the need to have in-depth knowledge of how aggregation steps structure the output.
Introduced AggregateOperationContext abstraction to hold the information of available fields for an aggregation step.
Introduced ContextConsumingAggregateOperation and ContextProducingAggregateOperation abstractions to be able to distinguish operations.
Updated test cases to reflect the API changes.
2013-08-05 14:55:35 +02:00
Oliver Gierke
966f971bee DATAMONGO-586 - Some refactorings in the integration test. 2013-08-05 14:55:34 +02:00
Thomas Darimont
aa23c579e8 DATAMONGO-586 - Added test case for complex aggregation framework use case.
Added Zipcode sample dataset from 10gen. Allow Projections to be used in conjunction with GroupOperations. Integration & Refactoring of github contribution by Tobias Trelle and Sebastian Herold. Switched from builder-style to static factory based DSL construction of aggregation specifications. Introduced embedded DSL for convenient construction of aggregation specifications. Added test cases based on mongodb aggregation framework examples. Added more test cases, additional java doc. Added test case for unwind operation (returnFiveMostCommonLikes) in AggregationTests. Other test cases should now also run in CI environment, due to deterministic result ordering. Adjusted write concern to ensure persistence of sample data.

Introduced TypedAggregation which holds type information of the input type of an aggregation. Cleaned up aggregate methods on MongoOperations. Removed HasToDBObject interface. Cleaned up constructors for Aggregation and TypedAggregation.
2013-08-05 14:49:48 +02:00
Oliver Gierke
6b634d08ce DATAMONGO-586 - Yet another round of polish.
Added Apache license headers where missing. Removed separate package. Reduced visibility of ReferenceUtil as it's internal use only. Fixed formatting.
2013-08-05 14:49:25 +02:00
Sebastian Herold
b7b61405f9 DATAMONGO-586 - Further enhancements to Aggregation API.
Fixed parameter names in comments. Added a static factory method. Implemented a basic aggregation operation join point. Implemented the match operation. Extracted ReferenceUtil. Created the starting point of the $group operation with _id field definition and $addToSet fields.
2013-08-05 14:49:24 +02:00
Oliver Gierke
4d65aa7207 DATAMONGO-586 - First round of polish.
Fixed or added copyright headers where necessary. Added Tobias as author where necessary. Added @since tags to newly introduced classes and methods. Documented non-nullability of parameters. Polished test cases a bit.
2013-08-05 14:49:24 +02:00
Tobias Trelle
c129c706a3 DATAMONGO-586 - Initial commit for support of the aggregation framework.
Fluent interface for AggregationPipeline, tests. Added type-safe versions for the aggregation operations $match and $sort. Not-null assertions + auto-prefix field in $unwind operation. Type-safe implementation for projections (first version). Support for $add and $subtract in projection.
2013-08-05 14:49:24 +02:00
Johno Crawford
7823385ac7 DATAMONGO-544 - Added support for TTL collection via indexing annotation.
Added the expireAfterSeconds attribute to the @Indexed and @CompoundIndex annotations. Adapted MongoPersistentEntityIndexCreator to evaluate the attribute and configure the index about to be created if the attribute was set to something non-default.

Original pull request: #55.
2013-07-31 09:17:50 +02:00
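
A sketch of a TTL index declared via the new attribute; the entity and the timeout value are hypothetical.

    import java.util.Date;

    import org.springframework.data.mongodb.core.index.Indexed;
    import org.springframework.data.mongodb.core.mapping.Document;

    @Document
    class Session {

        // MongoDB removes documents roughly one hour after the indexed date value.
        @Indexed(expireAfterSeconds = 3600)
        Date lastAccessed;
    }
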
Oliver Gierke
21fcfe11c2 DATAMONGO-731 - Adapted API changes for Parameters in Spring Data Commons.
Adapted API changes introduced for DATACMNS-350.
2013-07-30 15:18:25 +02:00
Thomas Darimont
bfe33a446c DATAMONGO-728 - Added missing package-info files. 2013-07-23 16:32:38 +02:00
Thomas Darimont
9be50316c3 DATAMONGO-709 - Added support for restricting results by document types.
Added restrict(…) method to the Query API that generates appropriate filter criteria to restrict the result to certain types only. 

Type restrictions in query expressions are now applied in QueryMapper via a MongoTypeMapper based on information passed in through the query object in a "special" key. Exposed MongoTypeMapper in MongoConverter and MappingMongoConverter. Merged DefaultTypeMapper and DefaultMongoTypeMapper.

Original pull request: #53.
2013-07-23 15:14:09 +02:00
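
A sketch of the type restriction on the Query API; the domain types are hypothetical and the exact restrict(…) signature is an assumption based on the description above.

    import static org.springframework.data.mongodb.core.query.Criteria.where;

    import org.springframework.data.mongodb.core.query.Query;

    class Contact {
        String id;
    }

    class Person extends Contact {
        boolean active;
    }

    class RestrictionExample {

        Query activePersonsOnly() {
            // Adds filter criteria so that only documents persisted as Person match.
            return new Query(where("active").is(true)).restrict(Person.class);
        }
    }
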
Thomas Darimont
30513267af DATAMONGO-721 - Updates retain type information for collections properties.
Added test case to demonstrate that the fix for DATACMNS-345 will propagate into Spring Data MongoDB.

Original pull request: #52.
2013-07-16 13:39:03 +02:00
Oliver Gierke
d3d480e79b DATAMONGO-724 - Adapted test case to show type information written by converters gets regarded. 2013-07-16 11:51:50 +02:00
Oliver Gierke
c39ad1bbc4 DATAMONGO-724 - Added test case to show registered converters work.
Added test case that shows that if a custom converter doesn't write type information on its own, the managed type can't be used in polymorphic scenarios. Direct type mappings still work as expected.
2013-07-16 10:09:55 +02:00
Oliver Gierke
fcdc29df49 DATAMONGO-723 - Cleaned up a few misnamed test cases.
Reactivated test cases that were named Test instead of Tests by accident.
2013-07-12 18:44:13 +02:00
Oliver Gierke
de7120d8dd DATAMONGO-717 - Forward port of test case.
Added a test case to show that the lifecycle callbacks are invoked in the appropriate order.
2013-07-10 23:07:08 +02:00
Thomas Darimont
84df02ae38 DATAMONGO-685 - ServerInfo should return used hostname reported by MongoDB.
Added test case getHostNameShouldReturnServerNameReportedByMongo() to MongoMonitorIntegrationTests. Modified MongoMonitorIntegrationTests to use common mongo-infrastructure configuration. ServerInfo.getHostName() is now derived from serverStatus.serverUsed.

Original pull request: #51.
2013-07-10 14:30:51 +02:00
Thomas Darimont
d6c5907940 DATAMONGO-702 - Allow usage of property names in field specifications.
MongoTemplate now translates property names used in a Query's field specification into the corresponding field names. Refactored delegation in various doFind(…) methods and polished JavaDoc.

Original pull request: #50.
2013-07-09 11:26:20 +02:00
Thomas Darimont
b2fe54c0a1 DATAMONGO-392 - Updates now retain type information.
Extracted mongo type conversion in QueryMapper into delegateConvertToMongoType(…). Introduced QueryMapper subclass UpdateQueryMapper to retain type information in delegateConvertToMongoType(…). Added test case updatesShouldRetainTypeInformation to MongoTemplateTests.

Original pull request: #49.
2013-07-08 15:07:10 +02:00
Oliver Gierke
47a198c688 DATAMONGO-540 - Fixed test case.
Apparently the test case was not working on MongoDB instances in version 2.2.
2013-07-08 14:16:07 +02:00
Thomas Darimont
5d9dbda03b DATAMONGO-688 - Improve detection of id properties.
Added support for precedence of explicit id property mappings over implicit property mappings. Changed BasicMongoPersistentProperty.getFieldName() to return the Mongo _id field name only for the "effective" id property (considering the owner entity, if already set). Added some test cases for all possible cases to MongoMappingContextUnitTests.

Original pull request: #48.
2013-07-08 14:16:00 +02:00
Thomas Darimont
36d52862bc DATAMONGO-671 - Improve test case to not fail on fast machines. 2013-07-05 11:57:15 +02:00
Thomas Darimont
0afbf6fe19 DATAMONGO-540 - Added test case to show findOne(…) works after upsert. 2013-07-05 11:55:37 +02:00
Oliver Gierke
b0bf8cb718 DATAMONGO-714 - Updated formatter to latest changes. 2013-07-05 11:51:20 +02:00
Thomas Darimont
567a8d9d5b DATAMONGO-713 - Fixed some spelling errors in README.md. 2013-07-05 10:29:35 +02:00
Oliver Gierke
ceef18d7a4 DATAMONGO-671 - Added integration tests to show lookups by date are working. 2013-07-03 17:48:04 +02:00
Thomas Darimont
4f57712f12 DATAMONGO-693 - More robust handling of host and replica set config in MongoFactoryBean.
MongoFactoryBean now considers empty strings for the replicaPair property as not set at all. The ServerAddressPropertyEditor also returns null for empty text strings. Deprecated the setter for the replica pair on MongoFactoryBean.
2013-07-02 16:31:02 +02:00
Oliver Gierke
478396c503 DATAMONGO-675 - MongoTemplate now maps Update in findAndModify(…).
The Update object handed to ….findAndModify(…) is now handed through the QueryMapper before being executed.
2013-07-01 08:28:40 +02:00
Oliver Gierke
aa80d1ad0a DATAMONGO-706 - Fixed DBRef conversion for nested keywords. 2013-06-29 13:07:34 +02:00
Oliver Gierke
fd28ab4d33 DATAMONGO-705 - Fixed handling of DBRefs in QueryMapper.
QueryMapper now converts values to become DBRefs correctly in getMappedKeyword(…). Added an exclusion path for the value handling in case we have an $exists keyword.
2013-06-25 19:25:13 +02:00
Andrew Duncan
187c80dfcc DATAMONGO-701 - Improve performance of starts-with and ends-with queries.
This changes the starts-with regex to the prefixed form using ^ to better make use of any index on the queried field. Also changes ending-with queries to use the $ anchor.
2013-06-25 17:56:07 +02:00
Ivan Sopov
389a3ac066 DATAMONGO-704 - Removed references to SimpleMongoConverter from JavaDoc.
Removed SimpleMongoConverter references from JavaDoc. In commit 2832b524d3 MappingMongoConverter was made the default instead of SimpleMongoConverter, and SimpleMongoConverter was completely removed between the 1.0.0.M3 and 1.0.0.M4 releases. This is an update for the JavaDoc that still referenced SimpleMongoConverter as the default MongoConverter.
2013-06-25 17:31:32 +02:00
Oliver Gierke
297bd3e3dd DATAMONGO-690 - Upgraded to Spring Data Commons snapshots. 2013-06-24 13:39:13 +02:00
Oliver Gierke
b11fba3321 DATAMONGO-694 - Added test case to show overriding accessors works. 2013-06-14 12:44:24 +02:00
Spring Buildmaster
3c68671d86 DATAMONGO-690 - Prepare next development iteration. 2013-06-04 11:37:17 -07:00
Spring Buildmaster
b171f4192d DATAMONGO-690 - Release version 1.3.0.M1. 2013-06-04 11:37:13 -07:00
Oliver Gierke
21a1ce985c DATAMONGO-690 - Prepare 1.3 M1 release.
Upgraded to Spring Data Build 1.1.RELEASE, Spring Data Commons 1.6 M1. Switched to milestone repository. Updated changelog. Refer to latest docs from Spring Data Commons.
2013-06-04 20:07:43 +02:00
Oliver Gierke
97caba50bf DATAMONGO-680, DATAMONGO-681 - Expose MongoOperations.exists(…).
We're now exposing dedicated exists(…) methods on MongoOperations which simply look up a cursor and inspect it for the presence of at least one element. The SimpleMongoRepository implementation now also uses this optimized exists check.
2013-05-24 09:39:25 +02:00
A. B. M. Kowser Patwary
818f739d5a DATAMONGO-676 - SimpleMongoRepository now consistently uses EntityInformation.
SimpleMongoRepository.findOne(…) & ….delete(…) now use entityInformation.getCollectionName() to resolve the collection name to interact with. This allows global customizations of the collection name on repositories.
2013-05-24 09:39:07 +02:00
Oliver Gierke
a44c1fdd2d DATAMONGO-683 - QueryMapper now handles default ID names if mapping metadata is missing.
Changed the implementation so that _id is considered an id field if no metadata is present. Heavily refactored QueryMapper internals so that the conversion code is more readable.
2013-05-24 09:16:57 +02:00
Oliver Gierke
6b35ca80d4 DATAMONGO-679 - Fixed saving plain JSON strings in MongoTemplate.
MongoTemplate.doSave(…) didn't handle plain JSON strings correctly. It now correctly passes the marshaled object to the underlying method.
2013-05-23 20:40:16 +02:00
Patryk Wąsik
23b276745c DATAMONGO-677 - QueryMapper now handles DBRefs in Maps correctly.
QueryMapper now handles Maps with DBRef values, which is needed when processing an update in MongoTemplate.doUpdate(…) to save versioned documents correctly.
2013-05-23 20:10:18 +02:00
Oliver Gierke
be0092d3f5 DATAMONGO-682 - Performance improvement for mapping hotspots.
This commit includes the MongoDB specific parts for the mapping subsystem performance improvements. Reworked PerformanceTest to output more reasonable numbers.

Heavily inspired by Patryk Wasik's contribution at https://github.com/SpringSource/spring-data-mongodb/pull/37.

GitHub PR: #37
2013-05-23 19:45:09 +02:00
Oliver Gierke
f36792d419 DATAMONGO-569 - Improved AbstractMongoConfiguration.
AbstractMongoConfiguration doesn't expose a Mongo instance as a bean anymore until you explicitly make it one by annotating the implementation method in the configuration subclass with @Bean.

Removed the custom call to MongoMappingContext.initialize() as Spring calls the lifecycle method for us anyway.
2013-05-17 19:50:35 +02:00
Oliver Gierke
31393db6ff DATACMNS-379 - Adapt to changes in Spring Data Commons. 2013-05-13 09:29:04 +02:00
Martin Baumgartner
8b50af07ce DATAMONGO-545 - Add BeforeDeleteEvent and AfterDeleteEvent.
Added events for before and after deletion of documents. Polished logging code in AbstractMongoEventListener.
2013-04-29 15:45:14 +02:00
Oliver Gierke
0eb315a758 DATAMONGO-667 - Removed deprecated types and added further deprecations.
Removed the custom Sort type in favor of Sort from Spring Data Commons. Deprecated Order in favor of Direction in Spring Data Commons. Changed the implementation to use the types from Spring Data Commons.
2013-04-29 14:02:10 +02:00
Oliver Gierke
976f5dd0e3 DATAMONGO-666 - Fixed architecture violation caused by exception class.
Moved MongoDataIntegrityViolationException into the core package to break up a package cycle. Updated the Sonargraph architecture description to capture issues more closely.
2013-04-29 13:04:24 +02:00
Philipp Schneider
f614364918 DATAMONGO-554 - Add background attribute to @Indexed and @CompoundIndex.
@Indexed and @CompoundIndex now carry a background flag that can be used to enable background indexing. Updated MongoPersistentEntityIndexCreator to consider the flag and hand it to the ensureIndex(…) call.
2013-04-29 11:18:44 +02:00
Philipp Schneider
38a86033be DATAMONGO-663 - Added equals(…) and hashCode() to Field. 2013-04-26 15:37:51 +02:00
Patryk Wąsik
d11c20d548 DATAMONGO-657 - Support DBRef for maps
Added capability to write DBRef instances for map values. Reading had been supported before but writing wasn't.
2013-04-26 14:44:31 +02:00
Oliver Gierke
3e2387ae6b Fix path to repo to make sure snapshot build sees all repositories. 2013-04-26 14:44:19 +02:00
Oliver Gierke
e8a1caec53 Fixed pom.xml to refer to correct snapshot repo. 2013-04-18 13:33:28 +02:00
Oliver Gierke
44c0b14018 DATAMONGO-658 - Major overhaul of README.md.
Simplified repository usage. Added JavaConfig sample. Use Markdown syntax consistently.
2013-04-18 10:38:17 +02:00
Arthur Loder
3daf3fc95b DATAMONGO-658 - Fixes in README.md. 2013-04-18 10:20:05 +02:00
Oliver Gierke
bd3aac8342 DATAMONGO-658 - Pull forward changes made to README in 1.2.x branch. 2013-04-18 10:11:06 +02:00
Oliver Gierke
94f697da10 DATAMONGO-656 - Fixed potential NPE in MongoTemplate. 2013-04-17 10:25:01 +02:00
Oliver Gierke
0cdec56a27 DATAMONGO-651 - MongoTemplate now throws Mongo-specific exception with WriteResult.
If the WriteResultChecking is set to EXCEPTION on a MongoTemplate, we now throw a Mongo-specific exception that captures both the WriteResult and MongoActionOperation for further evaluation.
2013-04-15 17:03:35 +02:00
Patryk Wąsik
9d83331f9f DATAMONGO-652 - Added support for $elemMatch and $ positional operator. 2013-04-12 16:13:58 +02:00
Oliver Gierke
071cd1647f DATAMONGO-571 - Fixed setting null values during update of versioned entities.
When updating a versioned object, the Update object is now constructed from plain key-value pairs, not using $set anymore. This correctly sets null values in the updated document.
2013-04-11 12:01:38 +02:00
Oliver Gierke
92af5aa345 DATAMONGO-650 - Added snapshot repository to resolve Spring Data Commons. 2013-04-11 10:57:04 +02:00
Oliver Gierke
c07ad0fdf6 DATAMONGO-648 - Replaced XSD ids with XSD strings.
This change allows using Spring Data MongoDB XML namespace elements alongside <bean /> elements within a profile. That scenario can lead to e.g. two <mongo:db-factory /> declarations in the same XML file.
2013-04-10 20:33:31 +02:00
Oliver Gierke
04e0f5c4a7 DATAMONGO-642 - MongoChangeSetPersister now considers mapped collection.
So far the change set persister has used the plain domain type name to persist data. We now consider the collection name defined by the object mapping (through @Document(collection = "…")).
2013-04-02 11:42:36 +02:00
Oliver Gierke
133975fb44 DATAMONGO-641 - Fixed potential NullPointerException in MongoLog4jAppender.
Log4j appender now only closes Mongo instance if available.
2013-04-02 10:56:34 +02:00
Oliver Gierke
9a372a57e0 DATAMONGO-641 - Reformatting to prepare fix. 2013-04-02 10:53:25 +02:00
Oliver Gierke
e67644094a DATAMONGO-628 - Allow dependency of config layer to GridFS.
Updated Sonargraph architecture description to allow a dependency from the configuration layer to the GridFS layer. The dependency was introduced by c5a99b5b5e.
2013-04-02 10:31:08 +02:00
Martin Baumgartner
c5a99b5b5e DATAMONGO-140, DATAMONGO-628 - Namespace elements for MongoTemplate and GridFsTemplate. 2013-04-02 09:50:45 +02:00
Oliver Gierke
9564bcb280 DATAMONGO-603 - Expose abbreviate field names flag in JavaConfig.
We're now exposing an abbreviateFieldNames() flag in AbstractMongoConfiguration to allow easy enablement of field name abbreviation.
2013-03-28 17:16:50 +01:00
Oliver Gierke
f1e961a1ee DATAMONGO-607 - Introduced FieldNamingStrategy SPI interface.
MongoMappingContext can now get a FieldNamingStrategy configured to allow customization of the field names that property values are persisted to in case *no manual mapping is defined* (e.g. through @Field). The default strategy simply uses the property name as before.

We now also expose an abbreviate-field-names attribute on the <mongo:mapping-context /> XML namespace element to transparently register a CamelCaseAbbreviatingFieldNamingStrategy which abbreviates the property's name to the first letters of its camel-case parts. A property fooBar would then be persisted to a field named fb. If you're not using the XML namespace, simply configure the strategy on your MongoMappingContext instance.

To avoid field name mapping ambiguities being introduced through a custom FieldNamingStrategy (imagine the camel-case strategy just mentioned and two properties lastname and level which would both map to l) the PersistentEntity implementation verifies the mapping metadata and throws an exception in case an ambiguity is found.
2013-03-28 16:41:42 +01:00
Oliver Gierke
0c69c87787 DATAMONGO-638 - MappingContext does not create PersistentEntities for AbstractMaps. 2013-03-28 16:28:02 +01:00
Oliver Gierke
48b4a88a6a DATAMONGO-629 - Fixed QueryMapper not to massage queries without type information.
So far the QueryMapper applied the id massaging (especially interpreting the default id keys) even if there was no persistence metadata available to do so. This caused e.g. queries handed into MongoTemplate.count(Query, String) to get keys of "id" massaged into "_id" which shouldn't be the case as we cannot assume anything about the documents and the keys contained in them.

So we now only apply the defaults if there is at least persistence metadata present. This means that for methods on MongoOperations that don't take type information of any kind the queries have to be defined in terms of the document, not the object model as we cannot refer to it.
2013-03-27 12:01:53 +01:00
Andrey Bloschetsov
d0c0866f88 DATAMONGO-637 - Fixed typo in Query.query(…).
Corrected the misspelled "critera". Polished JavaDoc a little.
2013-03-27 11:23:35 +01:00
Oliver Gierke
ab18bb5b96 DATAMONGO-636 - Added support for countBy in derived queries.
We now change the query execution to a count execution in case a derived query has the PartTree.isCountProjection() set to true or a query defined in @Query has the newly introduced count() attribute set to true.

As the native Mongo count() returns a long we use a default ConversionService to potentially massage the query result into other numerical types.
2013-03-26 17:51:46 +01:00
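
A sketch of both count variants mentioned above; the domain type and property names are hypothetical.

    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.mongodb.repository.Query;
    import org.springframework.data.repository.CrudRepository;

    @Document
    class Person {
        String id;
        String lastname;
    }

    interface PersonRepository extends CrudRepository<Person, String> {

        // Derived query executed as a count because of the countBy prefix.
        long countByLastname(String lastname);

        // Annotated query executed as a count via the newly introduced attribute.
        @Query(value = "{ 'lastname' : ?0 }", count = true)
        long countPeopleWithLastname(String lastname);
    }
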
Oliver Gierke
509be3d681 DATAMONGO-633 - Upgraded to Querydsl APT plugin 1.0.8. 2013-03-26 17:26:33 +01:00
Oliver Gierke
8e01f95b29 DATAMONGO-635 - Fixed some Sonar warnings. 2013-03-25 18:13:08 +01:00
Oliver Gierke
4dcec1f6e2 DATAMONGO-634 - Expand scope of CDI bean to application scope.
We simply inherit the application scope from the parent implementation.
2013-03-25 17:16:32 +01:00
Oliver Gierke
d3bf6c0a19 DATAMONGO-632 - Removed schemaLocation attribute from namespace references. 2013-03-25 17:16:26 +01:00
Oliver Gierke
3410a0589c DATAMONGO-631 - Explicitly reject Order instances with ignore-case flag. 2013-03-25 17:13:02 +01:00
Oliver Gierke
7a64766496 DATAMONGO-633 - Upgraded to Querydsl 3.0.0.
Adapted custom APT serializer implementation to the new API.
2013-03-25 17:12:55 +01:00
Oliver Gierke
e56a8597b8 DATAMONGO-622 - Saving unversioned object now uses doInsert(…).
We now use doInsert(…) when a versioned object is saved for the first time, to prevent accidental updates that don't bump the version number.
2013-02-26 11:31:48 +01:00
Oliver Gierke
e13208b4b3 DATAMONGO-621 - Initializing version property now uses ConversionService.
The BeanWrapper used in MongoTemplate.initializeVersionProperty(…) now uses the ConversionService held in the MongoConverter.
2013-02-26 11:29:55 +01:00
Oliver Gierke
6139e83d8d DATAMONGO-620 - Updating versioned object uses explicit collection name.
We're now handing the collection name given to doSaveVersioned(…) to the update method.
2013-02-26 11:29:40 +01:00
Oliver Gierke
f33790013f DATAMONGO-617 - Fixed potential NullPointerException in MongoTemplate.insert(…).
If MongoTemplate.insert(…) was called with a Mongo-simple type (such as a raw DBObject) it caused a NullPointerException during the lookup of a version property. This is now fixed by correcting the guard.
2013-02-20 12:04:24 +01:00
Oliver Gierke
bf81d95d21 DATAMONGO-613 - Re-added JConsole image and upgraded to Spring Data Build 1.1. 2013-02-11 12:40:41 +01:00
Oliver Gierke
158e4f033c DATAMONGO-612 - Fixed reconfiguration of dist.id. 2013-02-11 12:38:40 +01:00
Spring Buildmaster
81097061ad DATAMONGO-609 - Prepare next development iteration. 2013-02-08 04:15:50 -08:00
Spring Buildmaster
2bc6ebc250 DATAMONGO-609 - Release version 1.2.0.RELEASE 2013-02-08 04:15:46 -08:00
Oliver Gierke
909110cf4e DATAMONGO-609 - Upgraded to parent project and Spring Data Commons in release versions. 2013-02-08 13:05:49 +01:00
Oliver Gierke
ba094da5a7 DATAMONGO-611 - Upgraded to MongoDB Java driver 2.10.1. 2013-02-08 13:04:33 +01:00
Oliver Gierke
876b31bc52 DATAMONGO-609 - Updated changelog. 2013-02-08 13:00:20 +01:00
Oliver Gierke
dc8e8281eb DATAMONGO-606 - Add JodaTime specific Converter implementations if JodaTime is present on classpath. 2013-02-06 17:53:41 +01:00
Oliver Gierke
48deb1a150 DATAMONGO-603 - Polishing unit tests. 2013-02-06 17:39:33 +01:00
Michal Vich
48625956b7 DATAMONGO-568 - MongoTemplate.find(…) does not throw NullPointerException anymore.
Trigger findAll(…) if the Query object given to a find(…) method is null.
2013-02-06 15:04:39 +01:00
Michal Vich
782cf6e10d DATAMONGO-81 - Added more unit tests for MongoExceptionTranslator. 2013-02-06 14:41:56 +01:00
Oliver Gierke
c28e51cf86 DATAMONGO-603 - Make sure geo queries and repositories get correct Metric applied.
Heavily refactored NearQuery to simplify implementation and testability. Made sure the metric of the maximum distance is applied if no metric had been defined before. Adapted query preparation code to re-enforce the correct metric being set to not rely on the auto-application behavior of NearQuery to be safe against potential further refactorings.
2013-02-01 15:38:29 +01:00
Oliver Gierke
4b1065cac5 DATAMONGO-598 - Set needed properties and activate distribution artifact upload. 2013-01-29 20:36:41 +01:00
Oliver Gierke
c807b2abcf DATAMONGO-601 - Fixed exposing password by MongoDbUtils.
MongoDbUtils is now not exposing the plain password in the exception message piped into CannotGetMongoDbConnectionException but uses the newly introduced toString() method of UserCredentials (see DATACMNS-275).
2013-01-29 13:16:06 +01:00
Oliver Gierke
19ad2d3aac DATACMNS-274 - Adapt to moved type MappingContextIsNewStrategyFactory. 2013-01-29 10:54:36 +01:00
Oliver Gierke
bd7fe5bfd3 DATAMONGO-600 - Fixed parameter binding for derived queries on properties using polymorphism.
We need to retain the type in the serialized DBObject in case a derived query binds parameters for properties that use polymorphism as we serialize the object as nested document and this only matches in an all-or-nothing way. So if the type information is missing, we won't see any results.

So we now hand the TypeInformation of the property into the conversion logic and transparently skip the type information removal in case the types differ.
2013-01-25 11:29:39 +01:00
Oliver Gierke
a237999037 DATAMONGO-598 - Upgraded to new build infrastructure.
Fixed test ordering issue in Cross-Store module.
2013-01-18 14:45:48 +01:00
Oliver Gierke
d8a9752724 DATAMONGO-592 - Fix handling of SpEL evaluation values.
During constructor parameter value resolution the value to be used can result from evaluating a SpEL expression on the root document. This value has to be converted recursively if it is a nested document. Adapted to the altered API in SD Commons and override potentiallyConvertSpelValue(…) to re-invoke converter.
2012-12-16 15:26:25 +01:00
Philipp Schneider
4c8bf0dec2 DATAMONGO-583 - Using while (…) instead of for (…) for DBCursors to avoid memory leak.
Instead of iterating over the DBCursor using a for-loop we now use a while-loop to avoid the potential memory leak outlined in [0].

[0] https://jira.mongodb.org/browse/JAVA-664
2012-12-13 12:07:11 +05:30
Oliver Gierke
cffc27d83a DATAMONGO-590 - Code cleanups in MongoTemplate and error handling APIs. 2012-12-13 12:06:55 +05:30
Oliver Gierke
42ab4cb7b4 DATAMONGO-588 - Version attribute now gets initialized on insert.
The version attribute of an entity was not correctly initialized to 0 when inserting objects using MongoTemplate.insert(…). We now set it to 0 correctly.
2012-12-10 17:38:16 +09:00
Oliver Gierke
42df25434f DATAMONGO-585 - Fix concurrency issue in database authentication.
So far the authentication of the database was synchronized *after* the check whether the database under consideration had already been authenticated. This can cause threads to concurrently try to authenticate the database which is rejected by the driver. Improved the synchronization block to include both the ….isAuthenticated() check as well as the authentication.
2012-12-03 11:37:06 +01:00
Oliver Gierke
4e16f7ebe2 DATACMNS-378 - Fixed ClassCastException in MapReduceResults.
The parseCount(…) method now safely converts both integer and long values from the source DBObject.
2012-11-29 13:22:38 +01:00
Oliver Gierke
04a87b373d DATAMONGO-577 - Cleaned up obsolete property.
The version property in BasicMongoPersistentEntity became obsolete after the upgrade in Spring Data Commons.
2012-11-28 17:39:20 +01:00
Oliver Gierke
c4eca7eadd DATAMONGO-581 - Expose the PersistentEntity managed in the repository factory bean.
MongoRepositoryFactoryBean now populates the MappingContext of the super class when the MongoOperations instance is set.
2012-11-28 17:35:47 +01:00
Philipp Schneider
3a9dc2b98c DATAMONGO-503 - Added support for content type on GridFsOperations.
Added methods to take the content type of the file to be created. Clarified nullability of arguments in JavaDoc.

Pull request: #19.
2012-11-28 14:30:05 +01:00
Oliver Gierke
c568b7cbc2 DATAMONGO-580 - Improved BeanDefinitionParser for MappingMongoConverter. 2012-11-27 14:44:00 +01:00
Oliver Gierke
9482350062 DATAMONGO-577 - Added support for auditing.
Introduced AuditingEventListener to invoke auditing subsystem available through Spring Data Commons. Added <mongo:auditing /> namespace element to transparently activate it.
2012-11-27 14:00:52 +01:00
Oliver Gierke
772a140def DATAMONGO-576 - Configure java.util.logging to prevent verbose test logging.
Added Slf4j bridge and configured surefire to configure JUL accordingly.
2012-11-23 10:50:29 +01:00
Oliver Gierke
936259a766 DATAMONGO-575 - Improved implementation of entity metadata lookup in MongoQueryMethod.
Got rid of obsolete EntityInformationCreator API and moved to a custom EntityMetadata extension instead.
2012-11-23 10:00:49 +01:00
Oliver Gierke
533d21281e DATAMONGO-570 - Guard against null values in references.
QueryMapper does not try to convert null values into DBRef objects anymore but returns the plain null value as is.
2012-11-13 18:23:02 +01:00
Oliver Gierke
6977fa87e6 DATAMONGO-573 - Moved to Logback for test logging. 2012-11-13 17:54:19 +01:00
Oliver Gierke
41dcebb010 DATAMONGO-559 - Upgraded to Spring Data Commons 1.5.0.BUILD-SNAPSHOT. 2012-11-13 17:54:19 +01:00
Oliver Gierke
ef077182f6 DATAMONGO-563 - Upgraded MongoDB Java driver to 2.9.2. 2012-11-13 17:54:19 +01:00
Dave Syer
21e7c63766 Update README.md 2012-10-26 18:02:05 +02:00
Jan Kronquist
4b018a9d7d DATAMONGO-562 - Saving versioned entity with already set id now works.
Instead of inspecting the id property we now assume the version property unset for initialization and work from there.
2012-10-24 16:07:26 +02:00
Spring Buildmaster
ecf15b93e0 DATAMONGO-559 - Prepare next development iteration. 2012-10-17 15:00:21 -07:00
Spring Buildmaster
6323a86560 DATAMONGO-559 - Release version 1.1.1.RELEASE. 2012-10-17 14:59:59 -07:00
Oliver Gierke
3b0b7315e1 DATAMONGO-559 - Prepare 1.1.1.RELEASE release. 2012-10-17 17:53:39 -04:00
Oliver Gierke
4aeba6f92d DATAMONGO-553 - MappingMongoConverter now uses MongoPersistentProperty.usePropertyAccess() to decide whether to use property access.
Introduced a usePropertyAccess() method on MongoPersistentProperty and added an implementation that forces it to true for Throwable.cause, as using the field directly would cause a cycle (it points to this in the first place), whereas the getter massages the value.
2012-10-17 17:27:15 -04:00
Oliver Gierke
342f9ae837 DATAMONGO-551 - MongoTemplate can now persist plain Strings if they are valid JSON.
Added some code to MongoTemplate that inspects the object to be saved for being a String. If it is we try to parse the given String into a JSON document and continue as if we had been given a DBObject initially. Non-parseable Strings are rejected with a MappingException.
2012-10-15 11:49:29 -04:00
Oliver Gierke
66d98b355e DATAMONGO-550 - Fixed potential NullPointerExceptions in MongoTemplate.
Guarded access to results of mappingContext.getPersistentEntity(…) to prevent NullPointerExceptions in case the template is used with non-entity types like a plain DBObject. Added some custom logic for plain BasicDBObjects to get the _id field populated appropriately.
2012-10-15 11:38:00 -04:00
Oliver Gierke
cabbe747f8 DATAMONGO-549 - Fixed potential NullPointerException in MongoTemplate.
Added assertions and guards against MongoPersistentEntity lookups resulting in null values.
2012-10-11 08:49:49 +02:00
Spring Buildmaster
5ff3064acd DATAMONGO-541 - Prepare next development iteration. 2012-10-10 05:30:21 -07:00
Spring Buildmaster
3001e2941f DATAMONGO-541 - Release 1.1.0.RELEASE. 2012-10-10 05:30:15 -07:00
Oliver Gierke
9cf72fbdd4 DATAMONGO-541 - Polished pom.xml for Maven central propagation. 2012-10-10 14:23:11 +02:00
Oliver Gierke
cb5144de0f DATAMONGO-541 - Updated reference docs and changelog. 2012-10-10 14:04:43 +02:00
Oliver Gierke
4be90e51ae DATAMONGO-541 - Upgrade to Spring Data Commons 1.4.0.RELEASE. 2012-10-10 14:04:27 +02:00
Oliver Gierke
6c368d557b DATAMONGO-548 - Upgrade to Querydsl 2.8.0. 2012-10-09 09:11:27 +02:00
Oliver Gierke
a8432e13a1 DATAMONGO-543 - Polished quick start section.
Updated dependency versions to Spring Data MongoDB and Spring. Removed explicit dependency listing. Removed section on how to migrate between 1.0 milestones. Removed obsolete paragraphs.
2012-09-20 10:38:46 +02:00
Oliver Gierke
05ac139554 DATAMONGO-528 - Documented GridFs support. 2012-09-17 15:04:19 +02:00
Oliver Gierke
3661b2981e DATAMONGO-456 - Fixed <db-factory /> element documentation in XSD. 2012-09-17 12:22:48 +02:00
Oliver Gierke
69bd7acf74 DATAMONGO-457 - Fixed links in reference documentation. 2012-09-17 12:07:52 +02:00
Oliver Gierke
d882af257f DATAMONGO-539 - Fixed MongoTemplate.remove(object, collectionName).
If the entity being removed via MongoTemplate.remove(object, collectionName) contained an id that could be converted into an ObjectId, it currently wasn't removed correctly. This was caused by the intermediate call not handing over the entity type, so the id conversion failed, which in turn caused the query not to match the previously saved object.
2012-09-17 11:57:09 +02:00
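A sketch of the now-fixed call, assuming a hypothetical Person entity whose String id holds an ObjectId-convertible value:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;

class Person {

	@Id String id; // holds a value that is convertible into an ObjectId
	String firstname;
	String lastname;
}

class PersonRemover {

	void removeFromExplicitCollection(MongoOperations operations, Person person) {
		// The entity type is now handed through, so the id is converted into an
		// ObjectId and the removal query matches the previously saved document.
		operations.remove(person, "people");
	}
}
```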
Oliver Gierke
c92058a79a DATAMONGO-539 - Added test case to show removing entity from explicit collection works. 2012-09-13 17:12:50 +02:00
Oliver Gierke
ed2b576261 DATAMONGO-538 - Query API can now work with Sort and Pageable from Spring Data Commons.
Introduced with(Sort sort) and with(Pageable pageable) on Query. Deprecated sort() method and the custom Sort class. Deprecated QueryUtils.applyPagination(…) and ….applySorting(…) and changed internal calls to this to use the Query API directly. Some JavaDoc polishing.
2012-09-13 10:54:23 +02:00
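A minimal sketch of the new API using Sort and Pageable from Spring Data Commons, reusing the Person sketch above; the field names and page size are illustrative:

```java
import java.util.List;

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class PersonQueries {

	List<Person> firstPageByLastname(MongoOperations operations, String lastname) {

		Query query = Query.query(Criteria.where("lastname").is(lastname))
				.with(new Sort(Sort.Direction.ASC, "firstname")) // replaces the deprecated sort() API
				.with(new PageRequest(0, 20));                   // applies skip and limit

		return operations.find(query, Person.class);
	}
}
```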
Oliver Gierke
6744446a48 DATAMONGO-532 - Synchronize DB authentication.
In multithreaded environments Mongo database authentication can be triggered twice if two or more threads refer to the same db instance. This is now prevented by synchronizing calls to db.authenticate(…).
2012-09-12 12:58:31 +02:00
Oliver Gierke
fdecec48b2 DATAMONGO-484 - Upgraded to MongoDB driver 2.9.1. 2012-09-12 11:54:20 +02:00
Oliver Gierke
f1289c46e6 DATAMONGO-535 - Fixed broken MongoDBUtils resource synchronization.
We now make sure we really reuse the database instance bound to the thread, avoiding duplicate lookups via Mongo.getDB(…). This no longer yields a big performance benefit, as the Mongo instance caches the DB instances internally anyway.
2012-09-11 20:44:33 +02:00
Oliver Gierke
aa0b87be57 DATAMONGO-536 - Fixed package cycle introduced by SerializationUtils. 2012-09-11 20:44:32 +02:00
noter
2040f02d07 DATAMONGO-279 - Added support for optimistic locking.
Introduced @Version annotation to demarcate a version property on an entity. MongoTemplate will initialize this property on the first save of an instance if not set already. If it is already set, the template will bump the version number on subsequent saves and actually trigger an update backed by a query including the old version number. A failure to update the entity accordingly will then trigger an OptimisticLockingFailureException.

General JavaDoc polish in mapping package.
2012-09-11 20:44:15 +02:00
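A sketch of how the feature is used, with a hypothetical Invoice entity; the conflict handling is only indicated in comments:

```java
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.MongoOperations;

class Invoice {

	@Id String id;
	@Version Long version; // demarcates the optimistic-locking property
	double amount;
}

class InvoiceService {

	void updateConcurrently(MongoOperations operations, String invoiceId) {

		Invoice first = operations.findById(invoiceId, Invoice.class);
		Invoice second = operations.findById(invoiceId, Invoice.class);

		first.amount = 100.0;
		operations.save(first); // bumps the version; the update is keyed on the old version

		second.amount = 200.0;
		try {
			operations.save(second); // stale version -> the backing update matches nothing
		} catch (OptimisticLockingFailureException e) {
			// resolve the conflict, e.g. reload and retry
		}
	}
}
```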
Oliver Gierke
13a69ecdfd DATAMONGO-533 - Fixed index creator registration.
If an ApplicationContext already contained a MongoPersistentEntityIndexCreator, the default one was not registered, even if the existing creator listened to another MappingContext's events.

Polished iterable classes setup in MongoTemplate along the way. Some JavaDoc polishes as well.
2012-09-07 21:21:17 +02:00
Oliver Gierke
f0051deff0 Formatting. 2012-09-06 16:25:40 +02:00
Oliver Gierke
05a8148084 DATAMONGO-530 - Fixed propagation of setApplicationContext(…) in MappingMongoConverter. 2012-09-06 16:06:47 +02:00
Oliver Gierke
aaa44b3369 DATAMONGO-529 - Update Querydsl setup to use 1.0.4.
Raised Maven compiler plugin version to 2.5.1.
2012-09-06 15:48:41 +02:00
Oliver Gierke
7f35c4430d DATAMONGO-527 - Fixed Criteria.equals(…). 2012-09-03 18:47:25 +02:00
Oliver Gierke
114489d19a DATAMONGO-521 - Added test case to show that repository AND query works. 2012-09-03 16:46:53 +02:00
Oliver Gierke
eda8200d51 DATAMONGO-523 - Added test case to verify type alias detection. 2012-09-03 15:24:21 +02:00
Oliver Gierke
b078ea9ceb DATAMONGO-513 - Update to Spring Data Commons 1.4.0.BUILD-SNAPSHOT. 2012-09-03 15:24:20 +02:00
mpollack
d90b1a0ddd DATAMONGO-526 - Polished README.md.
Updated the README to remove references to the old API and docs links, as well as to CouchDB. Removed the reference to Spring Data Document and copied the initial paragraph introducing the project from http://www.springsource.org/spring-data/mongodb.
2012-09-03 15:23:33 +02:00
Spring Buildmaster
737a42e07a DATAMONGO-513 - Prepare next development iteration. 2012-08-24 02:24:09 -07:00
Spring Buildmaster
22d5d4c019 DATAMONGO-513 - Release 1.1.0.RC1. 2012-08-24 02:24:06 -07:00
Oliver Gierke
7ac1e7b6e1 DATAMONGO-513 - Prepare changelog for 1.1.0.RC1. 2012-08-24 11:16:54 +02:00
Oliver Gierke
e86ab783f3 DATAMONGO-513 - Update to Spring Data Commons 1.4.0.RC1. 2012-08-23 20:12:41 +02:00
Oliver Gierke
6fe3e67ecb DATAMONGO-517 - Fixed complex keyword handling.
Introduced intermediate getMappedKeyword(Keyword keyword, MongoPersistentProperty property) to correctly return a DBObject for keyword plus converted value. A few refactorings and improvements in the implementation of QueryMapper (Keyword value object etc.).
2012-08-21 12:25:30 +02:00
Oliver Gierke
3b78034c55 DATAMONGO-519 - Make Spring 3.1.2.RELEASE default Spring dependency version.
We move away from Maven version ranges as they complicate the build and dependency resolution process and make the build non-reproducible. Users stuck on a 3.0.x version of Spring will now have to manually declare Spring dependencies in the 3.0.x version they need. Note that at least Spring 3.0.7 is currently required.
2012-08-20 17:19:37 +02:00
Oliver Gierke
a06a69797f DATACMNS-214 - Adapted API change in Spring Data Commons.
Plus additional cleanups.
2012-08-16 14:10:02 +02:00
Oliver Gierke
8b1557e38c DATAMONGO-506 - Added test case to show BasicQuery is working for nested properties. 2012-08-15 19:40:02 +02:00
Oliver Gierke
fcdc6d0df2 DATAMONGO-511 - QueryMapper now maps associations correctly.
Complete overhaul of the QueryMapper to better handle complex scenarios like property paths and association references.
2012-08-15 18:30:29 +02:00
Oliver Gierke
83b6cd7f05 DATAMONGO-510 - Criteria now only uses BasicDBList internally. 2012-08-15 18:30:15 +02:00
Oliver Gierke
38a9a6d51d DATAMONGO-509 - SimpleMongoRepository.exists(…) now avoids loading unnecessary data.
We're explicitly excluding the entity's attributes via a field specification to avoid unnecessary object marshaling just to find out whether the entity exists.
2012-08-15 18:16:03 +02:00
Oliver Gierke
5e2f16c678 DATAMONGO-508 - Eagerly return DBRef creation if the given value already is a DBRef. 2012-08-15 16:38:56 +02:00
Oliver Gierke
d7ae95a779 DATAMONGO-505 - Fixed handling of parameter binding of associations and collection values.
Instead of converting given association values as-is, ConvertingParameterAccessor now converts each collection value into a DBRef individually.
2012-08-15 15:16:44 +02:00
Oliver Gierke
8fbdf9afbd DATACMNS-212 - Apply refactorings in Spring Data Commons. 2012-08-13 14:27:19 +02:00
Oliver Gierke
f5a4d78e62 DATAMONGO-476 - @EnableMongoRepositories is now inherited into sub-classes. 2012-08-13 11:37:59 +02:00
Oliver Gierke
1f4264e6a7 DATAMONGO-472 - MongoQueryCreator now correctly translates Not keyword.
We're now translating a negating property reference into a ne(…) call instead of a not().is(…).
2012-08-10 19:58:36 +02:00
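A small sketch of a derived query using the Not keyword, assuming a hypothetical repository for the Person sketch above:

```java
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

// Repository for the Person sketch above.
interface PersonRepository extends MongoRepository<Person, String> {

	// The Not keyword now derives a ne(…) criterion, roughly
	// { "lastname" : { "$ne" : <lastname> } }, instead of not().is(…).
	List<Person> findByLastnameNot(String lastname);
}
```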
Oliver Gierke
05baa851d8 DATAMONGO-502 - QueryMapper now translates property names into field names. 2012-08-08 18:56:18 +02:00
Oliver Gierke
35e8ae1224 DATAMONGO-500 - Index creation is only done for the correct MappingContext.
Index creation now double checks the MappingContext a MappingContextEvent originates from before actually creating indexes. This avoids invalid indexes being created in a multi-database scenario.
2012-07-31 15:34:50 +02:00
Oliver Gierke
ceec0bcc4a DATAMONGO-499 - Fixed namespace reference to repository XSD. 2012-07-31 10:39:03 +02:00
Oliver Gierke
0d87e7fa5f DATAMONGO-496 - AbstractMongoConfiguration now defaults mapping base package.
AbstractMongoConfiguration now uses the package of the class extending it to enable entity scanning for that package. To disable entity scanning entirely, override the method to return null or an empty String.
2012-07-30 18:00:42 +02:00
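As a sketch, a configuration class relying on the new default; it assumes getMappingBasePackage() is the hook to override in order to disable entity scanning, and the class and database names are illustrative:

```java
import com.mongodb.Mongo;
import com.mongodb.MongoClient;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

@Configuration
class ApplicationConfig extends AbstractMongoConfiguration {

	@Override
	public Mongo mongo() throws Exception {
		return new MongoClient();
	}

	@Override
	protected String getDatabaseName() {
		return "springdata";
	}

	// Entity scanning now defaults to the package of this class. To disable it
	// entirely, override the hook and return null or an empty String, e.g.:
	//
	// @Override
	// protected String getMappingBasePackage() {
	//   return null;
	// }
}
```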
Oliver Gierke
a530629d97 DATAMONGO-497 - Fixed reading of empty collections.
Reading an empty collection always returned a HashSet, assuming the returned value would be converted into the assigned property's value later on. However, the method should rather return the correct type right away, which we now do by invoking the potential conversion.
2012-07-30 15:39:22 +02:00
Oliver Gierke
ba0232b187 DATAMONGO-494 - QueryMapper now forwards entity metadata into nested $(n)or criteria.
Introduced helper class to ease assertions on DBObjects as well.
2012-07-27 10:39:31 +02:00
Oliver Gierke
04a17cacb7 DATAMONGO-495 - Fixed debug output in MongoTemplate.doFind(…).
Using SerializationUtils to safely output the query to be executed.
2012-07-26 10:35:30 +02:00
Oliver Gierke
761d725fce DATAMONGO-493 - Fixed broken $ne handling in QueryMapper.
We now only attempt to convert $ne expressions into an ObjectId if they follow an id property. Previously the conversion was attempted in every case, which could lead to Strings that happened to be valid ObjectIds being converted even though they didn't represent an id at all.
2012-07-24 20:43:52 +02:00
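For illustration, a $ne criterion on a plain String property whose value merely looks like an ObjectId; the property and class names are made up:

```java
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class NeQueries {

	Query findByExternalReferenceNot(String reference) {
		// "externalReference" is a plain String property. Even if the given value
		// happens to be a valid 24-character hex String, it is no longer converted
		// into an ObjectId, because the $ne criterion does not follow an id property.
		return Query.query(Criteria.where("externalReference").ne(reference));
	}
}
```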
Oliver Gierke
726b0b1bcc DATAMONGO-493 - Added test case to show the described scenario is working. 2012-07-24 20:13:50 +02:00
Spring Buildmaster
888e031452 DATAMONGO-491 - Prepare next development iteration. 2012-07-24 07:57:13 -07:00
596 changed files with 90046 additions and 12870 deletions

22
.travis.yml Normal file
View File

@@ -0,0 +1,22 @@
language: java
jdk:
- oraclejdk8
services:
- mongodb
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
sudo: false
cache:
directories:
- $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

1
CONTRIBUTING.MD Normal file
View File

@@ -0,0 +1 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

210
README.md
View File

@@ -1,155 +1,147 @@
Spring Data - Document
======================
# Spring Data MongoDB
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
As the name implies, the **Document** modules provides integration with document databases such as [MongoDB](http://www.mongodb.org/) and [CouchDB](http://couchdb.apache.org/).
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
Getting Help
------------
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
At this point your best bet is to look at the Look at the [JavaDocs](http://static.springsource.org/spring-data/data-document/docs/1.0.0.BUILD-SNAPSHOT/spring-data-mongodb/apidocs/) for MongoDB integration and corresponding and source code. For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80). If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
## Getting Help
The [User Guide](http://static.springsource.org/spring-data/data-document/docs/1.0.0.BUILD-SNAPSHOT/reference/html/) (A work in progress).
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
## Quick Start
Quick Start
-----------
### Maven configuration
## MongoDB
Add the Maven dependency:
For those in a hurry:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.RELEASE</version>
</dependency>
```
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
* Download the jar through Maven:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.BUILD-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.0.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-maven-snapshot</id>
<snapshots><enabled>true</enabled></snapshots>
<name>Springframework Maven SNAPSHOT Repository</name>
<url>http://maven.springframework.org/snapshot</url>
</repository>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.spring.io/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides
MongoTemplate is the central support class for Mongo database operations. It provides:
* Basic POJO mapping support to and from BSON
* Connection Affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/dao.html#dao-exceptions).
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
Future plans are to support optional logging and/or exception throwing based on WriteResult return value, common map-reduce operations, GridFS operations. A simple API for partial document updates is also planned.
### Spring Data repositories
### Easy Data Repository generation
To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
To simplify the creation of Data Repositories a generic Repository interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
The Repository interface is
```java
public interface PersonRepository extends CrudRepository<Person, Long> {
public interface Repository<T, ID extends Serializable> {
List<Person> findByLastname(String lastname);
T save(T entity);
List<Person> findByFirstnameLike(String firstname);
}
```
List<T> save(Iterable<? extends T> entities);
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
T findById(ID id);
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
boolean exists(ID id);
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
List<T> findAll();
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
Long count();
@Override
protected String getDatabaseName() {
return "springdata";
}
}
```
void delete(T entity);
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
void delete(Iterable<? extends T> entities);
```xml
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.MongoClient">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
</bean>
void deleteAll();
}
<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
The MongoRepository extends Repository and will in future add more Mongo specific methods.
```java
@Service
public class MyService {
public interface MongoRepository<T, ID extends Serializable> extends
Repository<T, ID> {
}
private final PersonRepository repository;
SimpleMongoRepository is the out of the box implementation of the MongoRepository you can use for basid CRUD operations.
@Autowired
public MyService(PersonRepository repository) {
this.repository = repository;
}
To go beyond basic CRUD, extend the MongoRepository interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.
public void doWork() {
For example, given a Person class with first and last name properties, a PersonRepository interface that can query for Person by last name and when the first name matches a regular expression is shown below
repository.deleteAll();
public interface PersonRepository extends MongoRepository<Person, Long> {
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> findByLastname(String lastname);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
```
List<Person> findByFirstnameLike(String firstname);
}
You can have Spring automatically generate the implemention as shown below
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
<property name="defaultCollectionName" value="springdata" />
</bean>
<bean class="org.springframework.data.document.mongodb.repository.MongoRepositoryFactoryBean">
<property name="template" ref="template" />
<property name="repositoryInterface" value="org.springframework.data.document.mongodb.repository.PersonRepository" />
</bean>
This will register an object in the container named PersonRepository. You can use it as shown below
@Service
public class MyService {
@Autowired
PersonRepository repository;
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
## CouchDB
TBD
Contributing to Spring Data
---------------------------
## Contributing to Spring Data
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.springsource.org/forumdisplay.php?f=80) by responding to questions and joining the debate.
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://www.springsource.org/node/feed) to springframework.org
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

View File

@@ -73,7 +73,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
@@ -102,7 +102,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
@@ -124,7 +124,7 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
@@ -135,9 +135,9 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
@@ -150,12 +150,12 @@
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
@@ -182,11 +182,11 @@
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
@@ -237,7 +237,7 @@
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
@@ -251,12 +251,12 @@
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>

434
pom.xml
View File

@@ -1,291 +1,171 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.1.0.M2</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-parent</module>
</modules>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<developers>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
<modelVersion>4.0.0</modelVersion>
<licenses>
<license>
<name>Apache License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0</url>
<comments>
Copyright 2010 the original author or authors.
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<packaging>pom</packaging>
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://projects.spring.io/spring-data-mongodb</url>
http://www.apache.org/licenses/LICENSE-2.0
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.7.0.RELEASE</version>
</parent>
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied.
See the License for the specific language governing permissions and
limitations under the License.
</comments>
</license>
</licenses>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-distribution</module>
</modules>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- dist.* properties are used by the antrun tasks below -->
<dist.id>spring-data-mongo</dist.id>
<dist.name>Spring Data Mongo</dist.name>
<dist.key>SDMONGO</dist.key>
<dist.version>${project.version}</dist.version>
<dist.releaseType>snapshot</dist.releaseType>
<dist.finalName>${dist.id}-${dist.version}</dist.finalName>
<dist.fileName>${dist.finalName}.zip</dist.fileName>
<dist.filePath>target/${dist.fileName}</dist.filePath>
<dist.bucketName>dist.springframework.org</dist.bucketName>
<!-- these properties should be in ~/.m2/settings.xml
<dist.accessKey>s3 access key</dist.accessKey>
<dist.secretKey>s3 secret key</dist.secretKey>
-->
</properties>
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.11.0.RELEASE</springdata.commons>
<mongo>2.13.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>
<build>
<extensions>
<extension>
<groupId>org.springframework.build.aws</groupId>
<artifactId>org.springframework.build.aws.maven</artifactId>
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<developers>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
<developer>
<id>tdarimont</id>
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>cstrobl</id>
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
</developers>
<plugins>
<plugin>
<groupId>com.agilejava.docbkx</groupId>
<artifactId>docbkx-maven-plugin</artifactId>
<!-- yes it really needs to be this (2.0.7) otherwise pdf generation from a clean build doesn't work -->
<version>2.0.7</version>
<executions>
<execution>
<goals>
<goal>generate-html</goal>
<goal>generate-pdf</goal>
</goals>
<phase>pre-site</phase>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.docbook</groupId>
<artifactId>docbook-xml</artifactId>
<version>4.4</version>
<scope>runtime</scope>
</dependency>
</dependencies>
<configuration>
<includes>index.xml</includes>
<xincludeSupported>true</xincludeSupported>
<foCustomization>${project.basedir}/src/docbkx/resources/xsl/fopdf.xsl</foCustomization>
<htmlStylesheet>css/html.css</htmlStylesheet>
<chunkedOutput>false</chunkedOutput>
<htmlCustomization>${project.basedir}/src/docbkx/resources/xsl/html.xsl</htmlCustomization>
<useExtensions>1</useExtensions>
<highlightSource>1</highlightSource>
<highlightDefaultLanguage />
<!-- callouts -->
<entities>
<entity>
<name>version</name>
<value>${project.version}</value>
</entity>
</entities>
<postProcess>
<copy todir="${project.basedir}/target/site/reference">
<fileset dir="${project.basedir}/target/docbkx">
<include name="**/*.html" />
<include name="**/*.pdf" />
</fileset>
</copy>
<copy todir="${project.basedir}/target/site/reference/html">
<fileset dir="${project.basedir}/src/docbkx/resources">
<include name="**/*.css" />
<include name="**/*.png" />
<include name="**/*.gif" />
<include name="**/*.jpg" />
</fileset>
</copy>
<copy todir="${project.basedir}/target/site/reference/html">
<fileset dir="${project.basedir}/src/docbkx/resources/images">
<include name="*.png" />
</fileset>
</copy>
<move file="${project.basedir}/target/site/reference/pdf/index.pdf" tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf" failonerror="false" />
</postProcess>
</configuration>
</plugin>
<profiles>
<profile>
<plugin>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.5</version>
<configuration>
<aggregate>true</aggregate>
<breakiterator>true</breakiterator>
<header>Spring Data Document</header>
<source>1.5</source>
<quiet>true</quiet>
<javadocDirectory>${project.basedir}/src/main/javadoc</javadocDirectory>
<overview>${project.basedir}/src/main/javadoc/overview.html</overview>
<stylesheetfile>${project.basedir}/src/main/javadoc/spring-javadoc.css</stylesheetfile>
<!-- copies doc-files subdirectory which contains image resources -->
<docfilessubdirs>true</docfilessubdirs>
<links>
<link>http://static.springframework.org/spring/docs/3.0.x/javadoc-api</link>
<link>http://download.oracle.com/javase/1.5.0/docs/api</link>
<link>http://api.mongodb.org/java/2.3</link>
</links>
</configuration>
</plugin>
<plugin><!--
run `mvn package assembly:assembly` to trigger assembly creation.
see http://www.sonatype.com/books/mvnref-book/reference/assemblies-set-dist-assemblies.html -->
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2-beta-5</version>
<inherited>false</inherited>
<executions>
<execution>
<id>distribution</id>
<goals>
<goal>single</goal>
</goals>
<phase>package</phase>
<configuration>
<descriptors>
<descriptor>${project.basedir}/src/assembly/distribution.xml</descriptor>
</descriptors>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<id>upload-dist</id>
<phase>deploy</phase>
<configuration>
<tasks>
<ant antfile="${basedir}/src/ant/upload-dist.xml">
<target name="upload-dist" />
</ant>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.springframework.build</groupId>
<artifactId>org.springframework.build.aws.ant</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
<dependency>
<groupId>net.java.dev.jets3t</groupId>
<artifactId>jets3t</artifactId>
<version>0.7.2</version>
</dependency>
</dependencies>
</plugin>
</plugins>
<!-- the name of this project is 'spring-data-mongo-dist';
make sure the zip file is just 'spring-data-mongo'. -->
<finalName>${dist.finalName}</finalName>
</build>
<id>mongo-next</id>
<properties>
<mongo>2.14.0-SNAPSHOT</mongo>
</properties>
<pluginRepositories>
<!-- Necessary for the build extension -->
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/spring-data</downloadUrl>
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.2</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
</profiles>
<dependencies>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
</pluginRepositories>
</project>

View File

@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<aspects>
<aspect name="org.springframework.beans.factory.aspectj.AnnotationBeanConfigurerAspect" />
<aspect name="org.springframework.data.mongodb.crossstore.MongoDocumentBacking" />
</aspects>
</aspectj>

View File

@@ -1,139 +1,139 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M2</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB Cross-store Persistence Support</name>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<modelVersion>4.0.0</modelVersion>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
<version>${data.commons.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.1.0.M2</version>
</dependency>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Support</name>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>1.0.0.Final</version>
</dependency>
<properties>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>3.5.5-Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<scope>test</scope>
</dependency>
<dependencies>
</dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
</dependency>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.2</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
<!-- <aspectLibrary>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-aspects</artifactId>
</aspectLibrary> -->
</aspectLibraries>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
</plugins>
</build>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.8.0.RELEASE</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${hibernate}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.6</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
<xmlConfigured>aop.xml</xmlConfigured>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,8 +17,8 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
@@ -34,20 +34,20 @@ import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
protected final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
@@ -58,6 +58,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
@@ -100,15 +104,25 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
@@ -169,8 +183,13 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
return mongoTemplate.getCollectionName(entityClass);
}
}

View File

@@ -21,13 +21,12 @@ import javax.persistence.EntityManager;
import javax.persistence.Transient;
import javax.persistence.Entity;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
@@ -44,7 +43,7 @@ import org.springframework.transaction.support.TransactionSynchronizationManager
*/
public aspect MongoDocumentBacking {
private static final Log LOGGER = LogFactory.getLog(MongoDocumentBacking.class);
private static final Logger LOGGER = LoggerFactory.getLogger(MongoDocumentBacking.class);
// Aspect shared config
private ChangeSetPersister<Object> changeSetPersister;

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for Spring Data's MongoDB cross store support.
*/
package org.springframework.data.mongodb.crossstore;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,9 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
@@ -26,7 +28,6 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
@@ -35,55 +36,76 @@ import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
MongoTemplate mongoTemplate;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
EntityManager entityManager;
@Autowired
PlatformTransactionManager transactionManager;
TransactionTemplate txTemplate;
@Before
public void setUp() {
txTemplate = new TransactionTemplate(transactionManager);
clearData(Person.class);
Address address = new Address(12, "Main St.", "Boston", "MA", "02101");
Resume resume = new Resume();
resume.addEducation("Skanstulls High School, 1975");
resume.addEducation("Univ. of Stockholm, 1980");
resume.addJob("DiMark, DBA, 1990-2000");
resume.addJob("VMware, Developer, 2007-");
final Person person = new Person("Thomas", 20);
person.setAddress(address);
person.setResume(resume);
person.setId(1L);
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.persist(person);
return null;
}
});
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
@After
public void tearDown() {
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.remove(entityManager.find(Person.class, 1L));
return null;
}
});
}
private void clearData(Class<?> domainType) {
String collectionName = mongoTemplate.getCollectionName(domainType);
mongoTemplate.dropCollection(collectionName);
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "Main St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
@@ -91,15 +113,18 @@ public class CrossStoreMongoTests {
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; " + "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
found.setAge(44);
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
entityManager.merge(found);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
@@ -111,14 +136,19 @@ public class CrossStoreMongoTests {
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
entityManager.detach(detached);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
Person result = entityManager.merge(detached);
entityManager.flush();
return result;
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
@@ -127,7 +157,7 @@ public class CrossStoreMongoTests {
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
@@ -154,8 +184,10 @@ public class CrossStoreMongoTests {
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
for (DBObject dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;

View File
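The integration tests above persist a JPA entity whose resume field is written to MongoDB through the cross-store support. A rough, hypothetical sketch of such an entity (the actual fixture classes live under org.springframework.data.mongodb.crossstore.test):

@Entity
public class Person {

	@Id
	private Long id;

	private String name;
	private int age;

	// Persisted to MongoDB by the cross-store aspect rather than to the JPA store.
	@RelatedDocument
	private Resume resume;

	// Constructors, getters and setters omitted in this sketch.
}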

@@ -1,12 +0,0 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
log4j.category.org.springframework=INFO
log4j.category.org.springframework.data=TRACE
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG

View File

@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<!--
<logger name="org.springframework" level="debug" />
-->
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>

View File

@@ -0,0 +1,18 @@
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
org.springframework.*;version="${spring:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"

View File

@@ -0,0 +1,42 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
<name>Spring Data MongoDB - Distribution</name>
<description>Distribution build for Spring Data MongoDB</description>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>

View File

@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.MongoLog4jAppenderIntegrationTests",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {

View File

@@ -1,44 +1,30 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M2</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
<name>Spring Data MongoDB Log4J Appender</name>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<mongo.version>2.3</mongo.version>
</properties>
<artifactId>spring-data-mongodb-log4j</artifactId>
<name>Spring Data MongoDB - Log4J Appender</name>
<dependencies>
<properties>
<log4j>1.2.16</log4j>
</properties>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo.version}</version>
</dependency>
<dependencies>
<!-- Logging -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<!-- Logging -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</dependencies>
</project>

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
@@ -21,188 +20,207 @@ import java.util.Arrays;
import java.util.Calendar;
import java.util.Map;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.MDC;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.spi.LoggingEvent;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Log4j appender writing log entries into a MongoDB instance.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppender extends AppenderSkeleton {
public static final String LEVEL = "level";
public static final String NAME = "name";
public static final String APP_ID = "applicationId";
public static final String TIMESTAMP = "timestamp";
public static final String PROPERTIES = "properties";
public static final String TRACEBACK = "traceback";
public static final String MESSAGE = "message";
public static final String YEAR = "year";
public static final String MONTH = "month";
public static final String DAY = "day";
public static final String HOUR = "hour";
public static final String LEVEL = "level";
public static final String NAME = "name";
public static final String APP_ID = "applicationId";
public static final String TIMESTAMP = "timestamp";
public static final String PROPERTIES = "properties";
public static final String TRACEBACK = "traceback";
public static final String MESSAGE = "message";
public static final String YEAR = "year";
public static final String MONTH = "month";
public static final String DAY = "day";
public static final String HOUR = "hour";
protected String host = "localhost";
protected int port = 27017;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
protected String host = "localhost";
protected int port = 27017;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
public MongoLog4jAppender() {
}
public MongoLog4jAppender() {
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
}
public String getHost() {
return host;
}
public String getHost() {
return host;
}
public void setHost(String host) {
this.host = host;
}
public void setHost(String host) {
this.host = host;
}
public int getPort() {
return port;
}
public int getPort() {
return port;
}
public void setPort(int port) {
this.port = port;
}
public void setPort(int port) {
this.port = port;
}
public String getDatabase() {
return database;
}
public String getDatabase() {
return database;
}
public void setDatabase(String database) {
this.database = database;
}
public void setDatabase(String database) {
this.database = database;
}
public String getCollectionPattern() {
return collectionPattern;
}
public String getCollectionPattern() {
return collectionPattern;
}
public void setCollectionPattern(String collectionPattern) {
this.collectionPattern = collectionPattern;
this.collectionLayout = new PatternLayout(collectionPattern);
}
public void setCollectionPattern(String collectionPattern) {
this.collectionPattern = collectionPattern;
this.collectionLayout = new PatternLayout(collectionPattern);
}
public String getApplicationId() {
return applicationId;
}
public String getApplicationId() {
return applicationId;
}
public void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
public void setInfoOrLowerWriteConcern(String wc) {
this.infoOrLowerWriteConcern = WriteConcern.valueOf(wc);
}
public void setInfoOrLowerWriteConcern(String wc) {
this.infoOrLowerWriteConcern = WriteConcern.valueOf(wc);
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
@SuppressWarnings({"unchecked"})
@Override
protected void append(final LoggingEvent event) {
if (null == db) {
try {
connectToMongo();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)
*/
@Override
@SuppressWarnings({ "unchecked" })
protected void append(final LoggingEvent event) {
if (null == db) {
try {
connectToMongo();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
BasicDBObject dbo = new BasicDBObject();
if (null != applicationId) {
dbo.put(APP_ID, applicationId);
MDC.put(APP_ID, applicationId);
}
dbo.put(NAME, event.getLogger().getName());
dbo.put(LEVEL, event.getLevel().toString());
Calendar tstamp = Calendar.getInstance();
tstamp.setTimeInMillis(event.getTimeStamp());
dbo.put(TIMESTAMP, tstamp.getTime());
BasicDBObject dbo = new BasicDBObject();
if (null != applicationId) {
dbo.put(APP_ID, applicationId);
MDC.put(APP_ID, applicationId);
}
dbo.put(NAME, event.getLogger().getName());
dbo.put(LEVEL, event.getLevel().toString());
Calendar tstamp = Calendar.getInstance();
tstamp.setTimeInMillis(event.getTimeStamp());
dbo.put(TIMESTAMP, tstamp.getTime());
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
}
dbo.put(PROPERTIES, propsDbo);
}
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
}
dbo.put(PROPERTIES, propsDbo);
}
// Copy traceback info (if there is any) into the document
String[] traceback = event.getThrowableStrRep();
if (null != traceback && traceback.length > 0) {
BasicDBList tbDbo = new BasicDBList();
tbDbo.addAll(Arrays.asList(traceback));
dbo.put(TRACEBACK, tbDbo);
}
// Copy traceback info (if there is any) into the document
String[] traceback = event.getThrowableStrRep();
if (null != traceback && traceback.length > 0) {
BasicDBList tbDbo = new BasicDBList();
tbDbo.addAll(Arrays.asList(traceback));
dbo.put(TRACEBACK, tbDbo);
}
// Put the rendered message into the document
dbo.put(MESSAGE, event.getRenderedMessage());
// Put the rendered message into the document
dbo.put(MESSAGE, event.getRenderedMessage());
// Insert the document
Calendar now = Calendar.getInstance();
MDC.put(YEAR, now.get(Calendar.YEAR));
MDC.put(MONTH, String.format("%1$02d", now.get(Calendar.MONTH) + 1));
MDC.put(DAY, String.format("%1$02d", now.get(Calendar.DAY_OF_MONTH)));
MDC.put(HOUR, String.format("%1$02d", now.get(Calendar.HOUR_OF_DAY)));
// Insert the document
Calendar now = Calendar.getInstance();
MDC.put(YEAR, now.get(Calendar.YEAR));
MDC.put(MONTH, String.format("%1$02d", now.get(Calendar.MONTH) + 1));
MDC.put(DAY, String.format("%1$02d", now.get(Calendar.DAY_OF_MONTH)));
MDC.put(HOUR, String.format("%1$02d", now.get(Calendar.HOUR_OF_DAY)));
String coll = collectionLayout.format(event);
String coll = collectionLayout.format(event);
MDC.remove(YEAR);
MDC.remove(MONTH);
MDC.remove(DAY);
MDC.remove(HOUR);
if (null != applicationId) {
MDC.remove(APP_ID);
}
MDC.remove(YEAR);
MDC.remove(MONTH);
MDC.remove(DAY);
MDC.remove(HOUR);
if (null != applicationId) {
MDC.remove(APP_ID);
}
WriteConcern wc;
if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
wc = warnOrHigherWriteConcern;
} else {
wc = infoOrLowerWriteConcern;
}
db.getCollection(coll).insert(dbo, wc);
}
WriteConcern wc;
if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
wc = warnOrHigherWriteConcern;
} else {
wc = infoOrLowerWriteConcern;
}
db.getCollection(coll).insert(dbo, wc);
}
public void close() {
mongo.close();
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#close()
*/
public void close() {
public boolean requiresLayout() {
return true;
}
if (mongo != null) {
mongo.close();
}
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#requiresLayout()
*/
public boolean requiresLayout() {
return true;
}
}
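As a usage sketch, the appender can be wired programmatically as well as through log4j.properties (a properties-based example appears later in this change set). The values below are illustrative; the %X{year}/%X{month} keys in the collection pattern are the MDC entries populated by append(...):

// Illustrative programmatic wiring of the appender (example values only).
MongoLog4jAppender appender = new MongoLog4jAppender();
appender.setHost("localhost");
appender.setPort(27017);
appender.setDatabase("logs");
appender.setApplicationId("my.application");
appender.setCollectionPattern("%X{year}%X{month}"); // one collection per month
appender.setWarnOrHigherWriteConcern("FSYNC_SAFE");
appender.setLayout(new PatternLayout());
Logger.getRootLogger().addAppender(appender);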

View File

@@ -0,0 +1,5 @@
/**
* Infrastructure for using MongoDB as a logging sink.
*/
package org.springframework.data.mongodb.log4j;

View File

@@ -1,75 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Calendar;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.Before;
import org.junit.Test;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
*/
public class AppenderTest {
private static final String NAME = AppenderTest.class.getName();
private Logger log = Logger.getLogger(NAME);
private Mongo mongo;
private DB db;
private String collection;
@Before
public void setup() {
try {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -0,0 +1,77 @@
/*
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Calendar;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;
@Before
public void setUp() throws Exception {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import org.junit.Test;
/**
* Unit tests for {@link MongoLog4jAppender}.
*
* @author Oliver Gierke
*/
public class MongoLog4jAppenderUnitTests {
/**
* @see DATAMONGO-641
*/
@Test
public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close();
}
}

View File

@@ -10,11 +10,4 @@ log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.data.document.mongodb=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,10 +1,9 @@
Bundle-SymbolicName: org.springframework.data.mongodb.log4j
Bundle-Name: Spring Data Mongo DB Log4J Appender
Bundle-Vendor: SpringSource
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo.version:[=.=,+1.0.0)}",
org.apache.log4j.*;version="[1.2.15, 2.0.0)",
org.apache.log4j.spi.*;version="[1.2.15, 2.0.0)"
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"

View File

@@ -1,272 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.1.0.M2</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- versions for commonly-used dependencies -->
<junit.version>4.10</junit.version>
<log4j.version>1.2.16</log4j.version>
<org.mockito.version>1.9.0</org.mockito.version>
<org.slf4j.version>1.6.1</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version.30>3.0.7.RELEASE</org.springframework.version.30>
<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
<data.commons.version>1.4.0.M1</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
<bundlor.failOnWarnings>true</bundlor.failOnWarnings>
</properties>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/download/community
</downloadUrl>
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<dependencies>
<!-- Test dependencies -->
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-library</artifactId>
<version>1.2.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit-dep</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>${org.mockito.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>1.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${org.slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${org.slf4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${org.slf4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version.range}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<extensions>
<extension>
<!--
available only in the springframework maven repository. see
<repositories> section below
-->
<groupId>org.springframework.build.aws</groupId>
<artifactId>org.springframework.build.aws.maven</artifactId>
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<resources>
<resource>
<directory>${project.basedir}/src/main/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
<resource>
<directory>${project.basedir}/src/main/resources</directory>
<includes>
<include>**/*</include>
</includes>
</resource>
</resources>
<testResources>
<testResource>
<directory>${project.basedir}/src/test/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
<testResource>
<directory>${project.basedir}/src/test/resources</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.4</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
<compilerArgument>-Xlint:-path</compilerArgument>
<showWarnings>true</showWarnings>
<showDeprecation>false</showDeprecation>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.3.1</version>
<configuration>
<useDefaultManifestFile>true</useDefaultManifestFile>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
<plugin>
<artifactId>maven-source-plugin</artifactId>
<version>2.1.2</version>
<executions>
<execution>
<id>attach-sources</id>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<!--
configures the springsource bundlor plugin, which generates
OSGI-compatible MANIFEST.MF files during the 'compile' phase of
the maven build. this plugin is declared within the
pluginManagement section because not every module that inherits
from this pom needs bundlor's services, e.g.:
spring-integration-samples and all its children. for this reason,
all modules that wish to use bundlor must declare it explicitly.
it is not necessary to specify the <version> or <configuration>
sections, but groupId and artifactId are required. see
http://static.springsource.org/s2-bundlor/1.0.x/user-guide/html/ch04s03.html
for more info
-->
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
<version>1.0.0.RELEASE</version>
<configuration>
<failOnWarnings>${bundlor.failOnWarnings}</failOnWarnings>
</configuration>
<executions>
<execution>
<id>bundlor</id>
<goals>
<goal>bundlor</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>http://repo.springsource.org/libs-snapshot</url>
</repository>
</repositories>
<reporting>
<plugins>
<plugin>
<!--
significantly speeds up the 'Dependencies' report during site
creation see
http://old.nabble.com/Skipping-dependency-report-during-Maven2-site-generation-td20116761.html
-->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-project-info-reports-plugin</artifactId>
<version>2.1</version>
<configuration>
<dependencyLocationsEnabled>false</dependencyLocationsEnabled>
</configuration>
</plugin>
</plugins>
</reporting>
</project>

View File

@@ -1,146 +1,183 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.0.3.1152">
<scope name="spring-data-mongodb" type="Project">
<element name="Filter" type="TypeFilterReferenceOverridden">
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
<context version="7.1.10.209">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
</element>
<architecture>
<element name="Config" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="WeakTypePattern"/>
<element type="Layer" name="Repositories">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.repository.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
<element type="Subsystem" name="API">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.repository.*"/>
</element>
</element>
<element type="Subsystem" name="Query">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.query.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Implementation">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.support.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Config">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
<element name="Repositories" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.**" type="IncludeTypePattern"/>
<element type="Layer" name="Config">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.config.**"/>
</element>
<element name="API" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.*" type="IncludeTypePattern"/>
</element>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
</element>
<element name="Implementation" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.support.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query"/>
</element>
<element name="Config" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|GridFS" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Monitoring" type="AllowedDependency"/>
</element>
<element name="Monitoring" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.monitor.**" type="IncludeTypePattern"/>
<element type="Layer" name="Monitoring">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.monitor.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
<element name="GridFS" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.gridfs.**" type="IncludeTypePattern"/>
<element type="Layer" name="GridFS">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.gridfs.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
<element name="Core" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.core.**" type="IncludeTypePattern"/>
</element>
<element name="Mapping" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.mapping.**" type="IncludeTypePattern"/>
<element type="Layer" name="Core">
<element type="TypeFilter" name="Assignment"/>
<element type="Subsystem" name="Mapping">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.mapping.**"/>
</element>
</element>
<element name="Geospatial" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.geo.**" type="IncludeTypePattern"/>
<element type="Subsystem" name="Geospatial">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.geo.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
<element type="Subsystem" name="Query">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.query.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element>
<element name="Index" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.index.**" type="IncludeTypePattern"/>
<element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.convert.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element name="Core" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.core.**" type="WeakTypePattern"/>
<element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.spel.**"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
<element type="Subsystem" name="Aggregation">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.aggregation.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|SpEL" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Index">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.index.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.core.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Aggregation" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.util.**"/>
</element>
<stereotype name="Unrestricted"/>
<stereotype name="Public"/>
</element>
</element>
<element type="Subsystem" name="API">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.*"/>
</element>
<stereotype name="Public"/>
</element>
</architecture>
<workspace>
<element name="src/main/java" type="JavaRootDirectory">
<element type="JavaRootDirectory" name="src/main/java">
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
</element>
<element name="target/classes" type="JavaRootDirectory">
<element type="JavaRootDirectory" name="target/classes">
<reference name="Project|spring-data-mongodb::BuildUnit|spring-data-mongodb"/>
</element>
</workspace>
<physical>
<element name="spring-data-mongodb" type="BuildUnit"/>
<element type="BuildUnit" name="spring-data-mongodb"/>
</physical>
</scope>
<scope name="External" type="External">
<element name="Filter" type="TypeFilter">
<element name="**" type="IncludeTypePattern"/>
<element name="java.**" type="ExcludeTypePattern"/>
<element name="javax.**" type="ExcludeTypePattern"/>
<scope type="External" name="External">
<element type="TypeFilter" name="Filter">
<element type="IncludeTypePattern" name="**"/>
<element type="ExcludeTypePattern" name="java.**"/>
<element type="ExcludeTypePattern" name="javax.**"/>
</element>
<architecture>
<element name="Spring" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.**" type="IncludeTypePattern"/>
<element name="org.springframework.data.**" type="ExcludeTypePattern"/>
<element type="Subsystem" name="Spring">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.springframework.**"/>
<element type="ExcludeTypePattern" name="org.springframework.data.**"/>
</element>
</element>
<element name="Spring Data Core" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.data.**" type="IncludeTypePattern"/>
<element type="Subsystem" name="Spring Data Core">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.springframework.data.**"/>
</element>
</element>
<element name="Mongo Java Driver" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="com.mongodb.**" type="IncludeTypePattern"/>
<element name="org.bson.**" type="IncludeTypePattern"/>
<element type="Subsystem" name="Mongo Java Driver">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.mongodb.**"/>
<element type="IncludeTypePattern" name="org.bson.**"/>
</element>
</element>
<element name="Querydsl" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="com.mysema.query.**" type="IncludeTypePattern"/>
<element type="Subsystem" name="Querydsl">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
</element>
</element>
</architecture>
</scope>
<scope name="Global" type="Global">
<element name="Configuration" type="Configuration"/>
<element name="Filter" type="TypeFilter">
<element name="**" type="IncludeTypePattern"/>
<scope type="Global" name="Global">
<element type="Configuration" name="Configuration"/>
<element type="TypeFilter" name="Filter">
<element type="IncludeTypePattern" name="**"/>
</element>
</scope>
</context>

View File

@@ -1,95 +1,74 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
<description>MongoDB support for Spring Data</description>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M2</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
<version>1.8.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.7.1</mongo.version>
<querydsl.version>2.6.0</querydsl.version>
<cdi.version>1.0</cdi.version>
<validation.version>1.0.0.GA</validation.version>
<webbeans.version>1.1.3</webbeans.version>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version.range}</version>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
<version>${data.commons.version}</version>
</dependency>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo.version}</version>
<exclusions>
<exclusion>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
</exclusion>
</exclusions>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
<version>${springdata.commons}</version>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl.version}</version>
<version>${querydsl}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>com.google.code.morphia</groupId>
<artifactId>morphia</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
<version>${querydsl}</version>
<scope>provided</scope>
</dependency>
@@ -99,30 +78,30 @@
<version>1.0</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>${cdi.version}</version>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi.version}</version>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans.version}</version>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
@@ -134,7 +113,14 @@
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation.version}</version>
<version>${validation}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.objenesis</groupId>
<artifactId>objenesis</artifactId>
<version>${objenesis}</version>
<optional>true</optional>
</dependency>
@@ -145,43 +131,63 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>performance-tests</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<includes>
<include>**/PerformanceTests.java</include>
</includes>
<excludes>
<exclude>none</exclude>
</excludes>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0.2</version>
<artifactId>apt-maven-plugin</artifactId>
<version>${apt}</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>
</dependencies>
<executions>
<execution>
<phase>generate-test-sources</phase>
@@ -195,7 +201,30 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
<properties>
<property>
<name>listener</name>
<value>org.springframework.data.mongodb.test.util.CleanMongoDBJunitRunListener</value>
</property>
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.UncategorizedDataAccessException;
/**
* @author Oliver Gierke
*/
public class LazyLoadingException extends UncategorizedDataAccessException {
private static final long serialVersionUID = -7089224903873220037L;
/**
* @param msg
* @param cause
*/
public LazyLoadingException(String msg, Throwable cause) {
super(msg, cause);
}
}

View File

@@ -1,6 +1,23 @@
/*
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.DB;
@@ -8,6 +25,7 @@ import com.mongodb.DB;
* Interface for factories creating {@link DB} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
*/
public interface MongoDbFactory {
@@ -27,4 +45,11 @@ public interface MongoDbFactory {
* @throws DataAccessException
*/
DB getDb(String dbName) throws DataAccessException;
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
}
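The newly exposed getExceptionTranslator() lets callers convert raw driver exceptions into Spring's DataAccessException hierarchy without wiring a separate translator bean. A minimal sketch of such a client, assuming the legacy DB/DBCollection driver API; the CollectionCounter class and the collection name are illustrative only:

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.MongoDbFactory;

import com.mongodb.DB;
import com.mongodb.MongoException;

class CollectionCounter {

    private final MongoDbFactory factory;

    CollectionCounter(MongoDbFactory factory) {
        this.factory = factory;
    }

    long count(String collectionName) {

        DB db = factory.getDb();

        try {
            return db.getCollection(collectionName).count();
        } catch (MongoException e) {
            // Translate the driver exception into Spring's DataAccessException hierarchy where possible.
            DataAccessException translated = factory.getExceptionTranslator().translateExceptionIfPossible(e);
            throw translated != null ? translated : e;
        }
    }
}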

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,22 +27,35 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data Mongo configuration using JavaConfig.
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -55,12 +68,25 @@ public abstract class AbstractMongoConfiguration {
protected abstract String getDatabaseName();
/**
* Return the {@link Mongo} instance to connect to.
* Return the name of the authentication database to use. Defaults to {@literal null}; in that case the value
* returned by {@link #getDatabaseName()} will effectively be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
/**
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
* @throws Exception
*/
@Bean
public abstract Mongo mongo() throws Exception;
/**
@@ -84,24 +110,23 @@ public abstract class AbstractMongoConfiguration {
* @throws Exception
*/
@Bean
public SimpleMongoDbFactory mongoDbFactory() throws Exception {
UserCredentials credentials = getUserCredentials();
if (credentials == null) {
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
} else {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), credentials);
}
public MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s.
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class (the concrete class, not this one) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration}, the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behaviour.
*
* @return
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
*/
protected String getMappingBasePackage() {
return null;
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
@@ -109,7 +134,10 @@ public abstract class AbstractMongoConfiguration {
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
@@ -127,11 +155,22 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.initialize();
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
@@ -156,8 +195,11 @@ public abstract class AbstractMongoConfiguration {
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), mongoMappingContext());
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
@@ -187,4 +229,26 @@ public abstract class AbstractMongoConfiguration {
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
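In application code the class above is used by subclassing it and supplying the database name and the Mongo instance; the base class then assembles the mapping context, DbRefResolver and MappingMongoConverter shown in the diff. A minimal sketch, assuming an unauthenticated MongoDB on localhost (the ApplicationConfig class and the "orders" database name are illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

@Configuration
class ApplicationConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "orders";
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        // MongoClient carries its credentials via MongoClient#getCredentialsList(), which is why
        // getUserCredentials() and getAuthenticationDatabaseName() are deprecated above.
        return new MongoClient("localhost", 27017);
    }
}

Because getMappingBasePackage() now defaults to the package of the concrete configuration class, entities in the same package as ApplicationConfig are scanned without any further overrides.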

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,17 +13,25 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Constants to declare bean names used by the namespace configuration.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
*/
public abstract class BeanNames {
static final String MAPPING_CONTEXT = "mappingContext";
static final String INDEX_HELPER = "indexCreationHelper";
static final String MONGO = "mongo";
static final String DB_FACTORY = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
public static final String MAPPING_CONTEXT_BEAN_NAME = "mongoMappingContext";
static final String INDEX_HELPER_BEAN_NAME = "indexCreationHelper";
static final String MONGO_BEAN_NAME = "mongo";
static final String DB_FACTORY_BEAN_NAME = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER_BEAN_NAME = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY_BEAN_NAME = "isNewStrategyFactory";
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
static final String MONGO_TEMPLATE_BEAN_NAME = "mongoTemplate";
static final String GRID_FS_TEMPLATE_BEAN_NAME = "gridFsTemplate";
}

View File

@@ -0,0 +1,70 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.context.annotation.Import;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.domain.AuditorAware;
/**
* Annotation to enable auditing in MongoDB via annotation configuration.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
@Inherited
@Documented
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(MongoAuditingRegistrar.class)
public @interface EnableMongoAuditing {
/**
* Configures the {@link AuditorAware} bean to be used to look up the current principal.
*
* @return
*/
String auditorAwareRef() default "";
/**
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
*
* @return
*/
boolean setDates() default true;
/**
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
*
* @return
*/
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
* used for setting creation and modification dates.
*
* @return
*/
String dateTimeProviderRef() default "";
}
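A sketch of how the annotation could be applied; the auditorProvider bean name and the fixed "system" auditor are purely illustrative. With such a configuration in place, the registered AuditingEventListener populates @CreatedDate, @LastModifiedDate, @CreatedBy and @LastModifiedBy properties on mapped entities:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.AuditorAware;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorProvider")
class AuditingConfig {

    @Bean
    public AuditorAware<String> auditorProvider() {
        return new AuditorAware<String>() {
            @Override
            public String getCurrentAuditor() {
                // A real implementation would look up the currently authenticated principal.
                return "system";
            }
        };
    }
}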

View File

@@ -0,0 +1,83 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.gridfs.GridFsTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String converterRef = element.getAttribute("converter-ref");
String dbFactoryRef = element.getAttribute("db-factory-ref");
String bucket = element.getAttribute("bucket");
BeanDefinitionBuilder gridFsTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(GridFsTemplate.class);
if (StringUtils.hasText(dbFactoryRef)) {
gridFsTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY_BEAN_NAME);
}
if (StringUtils.hasText(converterRef)) {
gridFsTemplateBuilder.addConstructorArgReference(converterRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
}
if (StringUtils.hasText(bucket)) {
gridFsTemplateBuilder.addConstructorArgValue(bucket);
}
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE_BEAN_NAME)
.getBeanDefinition();
}
}
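In plain Java the bean definition produced by this parser resolves to one of the existing GridFsTemplate constructors; roughly the following, with mongoDbFactory and converter standing in for the referenced (or defaulted) beans:

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.gridfs.GridFsTemplate;

class GridFsWiring {

    // Equivalent of <gridFsTemplate/> without a bucket attribute.
    GridFsTemplate defaultBucket(MongoDbFactory mongoDbFactory, MongoConverter converter) {
        return new GridFsTemplate(mongoDbFactory, converter);
    }

    // Equivalent of setting the bucket attribute, here to the illustrative value "photos".
    GridFsTemplate photoBucket(MongoDbFactory mongoDbFactory, MongoConverter converter) {
        return new GridFsTemplate(mongoDbFactory, converter, "photos");
    }
}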

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,12 +24,13 @@ import java.util.List;
import java.util.Set;
import org.springframework.beans.BeanMetadataElement;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.parsing.ReaderContext;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -38,6 +39,7 @@ import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
@@ -48,6 +50,9 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
@@ -66,31 +71,39 @@ import org.w3c.dom.Element;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean jsr303Present = ClassUtils.isPresent("javax.validation.Validator",
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : "mappingConverter";
}
/* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
if (parserContext.isNested()) {
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
}
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mapping Mongo Converter", element));
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition, id);
createIsNewStrategyFactoryBeanDefinition(ctxRef, parserContext, element);
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
dbFactoryRef = DB_FACTORY_BEAN_NAME;
}
// Converter
@@ -98,30 +111,41 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
converterBuilder.addConstructorArgReference(dbFactoryRef);
converterBuilder.addConstructorArgReference(ctxRef);
String typeMapperRef = element.getAttribute("type-mapper-ref");
if (StringUtils.hasText(typeMapperRef)) {
converterBuilder.addPropertyReference("typeMapper", typeMapperRef);
}
if (conversionsDefinition != null) {
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
}
try {
registry.getBeanDefinition(INDEX_HELPER);
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
} catch (NoSuchBeanDefinitionException ignored) {
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
dbFactoryRef = DB_FACTORY_BEAN_NAME;
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(dbFactoryRef));
registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
indexHelperBuilder.addConstructorArgReference(ctxRef);
indexHelperBuilder.addConstructorArgReference(dbFactoryRef);
indexHelperBuilder.addDependsOn(ctxRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
INDEX_HELPER_BEAN_NAME));
}
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
if (validatingMongoEventListener != null) {
registry.registerBeanDefinition(VALIDATING_EVENT_LISTENER, validatingMongoEventListener);
parserContext.registerBeanComponent(new BeanComponentDefinition(validatingMongoEventListener,
VALIDATING_EVENT_LISTENER_BEAN_NAME));
}
return converterBuilder.getBeanDefinition();
parserContext.registerBeanComponent(new BeanComponentDefinition(converterBuilder.getBeanDefinition(), id));
parserContext.popAndRegisterContainingComponent();
return null;
}
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
@@ -135,7 +159,6 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
RuntimeBeanReference validator = getValidator(builder, parserContext);
if (validator != null) {
builder.getRawBeanDefinition().setBeanClass(ValidatingMongoEventListener.class);
builder.addConstructorArgValue(validator);
@@ -148,7 +171,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
private RuntimeBeanReference getValidator(Object source, ParserContext parserContext) {
if (!jsr303Present) {
if (!JSR_303_PRESENT) {
return null;
}
@@ -157,39 +180,77 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
validatorDef.setSource(source);
validatorDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE);
String validatorName = parserContext.getReaderContext().registerWithGeneratedName(validatorDef);
parserContext.registerComponent(new BeanComponentDefinition(validatorDef, validatorName));
parserContext.registerBeanComponent(new BeanComponentDefinition(validatorDef, validatorName));
return new RuntimeBeanReference(validatorName);
}
private String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition) {
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition, String converterId) {
String ctxRef = element.getAttribute("mapping-context-ref");
if (!StringUtils.hasText(ctxRef)) {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parserContext.getRegistry().registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
if (StringUtils.hasText(ctxRef)) {
return ctxRef;
}
BeanComponentDefinitionBuilder componentDefinitionBuilder = new BeanComponentDefinitionBuilder(element,
parserContext);
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
return ctxRef;
}
private static void parseFieldNamingStrategy(Element element, ReaderContext context, BeanDefinitionBuilder builder) {
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
String fieldNamingStrategy = element.getAttribute("field-naming-strategy-ref");
boolean fieldNamingStrategyReferenced = StringUtils.hasText(fieldNamingStrategy);
boolean abbreviationActivated = StringUtils.hasText(abbreviateFieldNames)
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
element);
return;
}
Object value = null;
if ("true".equals(abbreviateFieldNames)) {
value = new RootBeanDefinition(CamelCaseAbbreviatingFieldNamingStrategy.class);
} else if (fieldNamingStrategyReferenced) {
value = new RuntimeBeanReference(fieldNamingStrategy);
}
if (value != null) {
builder.addPropertyValue("fieldNamingStrategy", value);
}
}
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
@@ -224,7 +285,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
AbstractBeanDefinition conversionsBean = conversionsBuilder.getBeanDefinition();
conversionsBean.setSource(parserContext.extractSource(element));
parserContext.getRegistry().registerBeanDefinition("customConversions", conversionsBean);
parserContext.registerBeanComponent(new BeanComponentDefinition(conversionsBean, "customConversions"));
return conversionsBean;
}
@@ -232,7 +293,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
return null;
}
public Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
private static Set<String> getInititalEntityClasses(Element element) {
String basePackage = element.getAttribute(BASE_PACKAGE);
@@ -271,6 +332,20 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
return null;
}
public static String createIsNewStrategyFactoryBeanDefinition(String mappingContextRef, ParserContext context,
Element element) {
BeanDefinitionBuilder mappingContextStrategyFactoryBuilder = BeanDefinitionBuilder
.rootBeanDefinition(MappingContextIsNewStrategyFactory.class);
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder,
IS_NEW_STRATEGY_FACTORY_BEAN_NAME));
return IS_NEW_STRATEGY_FACTORY_BEAN_NAME;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*

View File

@@ -0,0 +1,86 @@
/*
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.auditing.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to register an {@link AuditingEventListener} to transparently set auditing information on
* an entity.
*
* @author Oliver Gierke
*/
public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEventListener.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
String mappingContextRef = element.getAttribute("mapping-context-ref");
if (!StringUtils.hasText(mappingContextRef)) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, new RootBeanDefinition(MongoMappingContext.class));
}
mappingContextRef = MAPPING_CONTEXT_BEAN_NAME;
}
IsNewAwareAuditingHandlerBeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(
mappingContextRef);
parser.parse(element, parserContext);
builder.addConstructorArgValue(getObjectFactoryBeanDefinition(parser.getResolvedBeanName(),
parserContext.extractSource(element)));
}
}

View File

@@ -0,0 +1,130 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
}
/**
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
*/
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
}
}
}

View File

@@ -0,0 +1,85 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}

View File

@@ -0,0 +1,198 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parses a {@link String} into a Collection of {@link MongoCredential}s.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final Pattern GROUP_PATTERN = Pattern.compile("(\\\\?')(.*?)\\1");
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
private static final String OPTIONS_DELIMINATOR = "?";
private static final String OPTION_VALUE_DELIMINATOR = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String text) throws IllegalArgumentException {
if (!StringUtils.hasText(text)) {
return;
}
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : extractCredentialsString(text)) {
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
Properties options = extractOptions(credentialString);
if (!options.isEmpty()) {
if (options.containsKey(AUTH_MECHANISM_KEY)) {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(
MongoCredential.createCredential(userNameAndPassword[0], database, userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private List<String> extractCredentialsString(String source) {
Matcher matcher = GROUP_PATTERN.matcher(source);
List<String> list = new ArrayList<String>();
while (matcher.find()) {
String value = StringUtils.trimLeadingCharacter(matcher.group(), '\'');
list.add(StringUtils.trimTrailingCharacter(value, '\''));
}
if (!list.isEmpty()) {
return list;
}
return Arrays.asList(source.split(","));
}
private static String[] extractUserNameAndPassword(String text) {
int index = text.lastIndexOf(DATABASE_DELIMINATOR);
index = index != -1 ? index : text.lastIndexOf(OPTIONS_DELIMINATOR);
return index == -1 ? new String[] {} : text.substring(0, index).split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (dbSeperationIndex == -1) {
return "";
}
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
return optionsSeperationIndex > -1 ? tmp.substring(0, optionsSeperationIndex) : tmp;
}
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
}
Properties properties = new Properties();
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
properties.put(optionArgs[0], optionArgs[1]);
}
return properties;
}
private static void verifyUsernameAndPasswordPresent(String[] source) {
verifyUserNamePresent(source);
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username!");
}
}
}
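The credential String format the editor accepts is 'username:password@database', optionally followed by '?uri.authMechanism=…', with multiple entries separated by commas (single quotes can group an entry that itself contains commas). A short, purely illustrative usage; in the namespace configuration the editor is registered automatically for the credentials attribute, so it is only invoked directly here for clarity:

import java.util.List;

import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

import com.mongodb.MongoCredential;

class CredentialEditorExample {

    @SuppressWarnings("unchecked")
    static List<MongoCredential> parse(String credentialString) {
        MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();
        editor.setAsText(credentialString);
        return (List<MongoCredential>) editor.getValue();
    }

    public static void main(String[] args) {
        // Default mechanism: username, password and database.
        parse("jon:secret@admin");
        // Explicit mechanism selected via the uri.authMechanism option.
        parse("hank:pwd@users?uri.authMechanism=PLAIN");
    }
}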

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
@@ -41,6 +42,8 @@ import com.mongodb.MongoURI;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@@ -53,7 +56,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY;
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
/*
@@ -63,28 +66,28 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
BeanDefinition mongoUri = getMongoUri(element);
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
if (mongoUri != null) {
if (element.getAttributes().getLength() >= 2 && !element.hasAttribute("write-concern")) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -92,19 +95,16 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
dbFactoryBuilder.addConstructorArgValue(registerMongoBeanDefinition(element, parserContext));
}
dbname = StringUtils.hasText(dbname) ? dbname : "db";
dbFactoryBuilder.addConstructorArgValue(dbname);
if (userCredentials != null) {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
dbFactoryBuilder.addConstructorArgValue(StringUtils.hasText(dbname) ? dbname : "db");
dbFactoryBuilder.addConstructorArgValue(userCredentials);
dbFactoryBuilder.addConstructorArgValue(element.getAttribute("authentication-dbname"));
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY)
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY_BEAN_NAME)
.getBeanDefinition();
}
@@ -148,14 +148,24 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes.
*
* @param uri
* @return
* @param element must not be {@literal null}.
* @return {@literal null} in case neither a {@code client-uri} nor a {@code uri} attribute is defined.
*/
private BeanDefinition getMongoUri(String uri) {
private BeanDefinition getMongoUri(Element element) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
boolean hasClientUri = element.hasAttribute("client-uri");
if (!hasClientUri && !element.hasAttribute("uri")) {
return null;
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,31 +16,29 @@
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryBeanDefinitionParser;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB based repositories.
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
*
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
RepositoryBeanDefinitionParser repositoryBeanDefinitionParser = new RepositoryBeanDefinitionParser(extension);
registerBeanDefinitionParser("repositories", repositoryBeanDefinitionParser);
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());
registerBeanDefinitionParser("template", new MongoTemplateParser());
registerBeanDefinitionParser("gridFsTemplate", new GridFsTemplateParser());
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,10 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
@@ -36,6 +32,7 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
@@ -58,13 +55,14 @@ public class MongoParser implements BeanDefinitionParser {
MongoParsingUtils.parseMongoOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO;
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
@@ -75,19 +73,4 @@ public class MongoParser implements BeanDefinitionParser {
return mongoComponent.getBeanDefinition();
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,7 @@ import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -33,12 +34,13 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {
}
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
@@ -53,12 +55,14 @@ abstract class MongoParsingUtils {
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occurred, false otherwise
* @return true if parsing actually occurred, {@literal false} otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
@@ -79,11 +83,58 @@ abstract class MongoParsingUtils {
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return {@literal true} if the {@code client-options} sub-element was present and parsed, {@literal false} otherwise.
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
@@ -100,4 +151,56 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* Callers should register just a single bean definition (keeping the convenience of using
* AbstractSingleBeanDefinitionParser) but still need the side effect of registering a 'default' property editor with
* the container.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -0,0 +1,87 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
* @author Oliver Gierke
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String converterRef = element.getAttribute("converter-ref");
String dbFactoryRef = element.getAttribute("db-factory-ref");
BeanDefinitionBuilder mongoTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoTemplate.class);
setPropertyValue(mongoTemplateBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(dbFactoryRef)) {
mongoTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY_BEAN_NAME);
}
if (StringUtils.hasText(converterRef)) {
mongoTemplateBuilder.addConstructorArgReference(converterRef);
}
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder,
BeanNames.MONGO_TEMPLATE_BEAN_NAME).getBeanDefinition();
}
}

View File

@@ -0,0 +1,66 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String readPreferenceString) throws IllegalArgumentException {
if (readPreferenceString == null) {
return;
}
ReadPreference preference = null;
try {
preference = ReadPreference.valueOf(readPreferenceString);
} catch (IllegalArgumentException ex) {
// ignore this one and try to map it differently
}
if (preference != null) {
setValue(preference);
} else if ("PRIMARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primary());
} else if ("PRIMARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primaryPreferred());
} else if ("SECONDARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondary());
} else if ("SECONDARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondaryPreferred());
} else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.nearest());
} else {
throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
readPreferenceString));
}
}
}
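
A minimal usage sketch of the editor above; the sample class name is made up, and the alias handling follows the fallback branches shown in the diff:

import com.mongodb.ReadPreference;

import org.springframework.data.mongodb.config.ReadPreferencePropertyEditor;

class ReadPreferenceEditorSample {

	public static void main(String[] args) {

		ReadPreferencePropertyEditor editor = new ReadPreferencePropertyEditor();

		// resolved directly via ReadPreference.valueOf(...)
		editor.setAsText("nearest");
		ReadPreference nearest = (ReadPreference) editor.getValue();

		// underscore alias mapped via the explicit equalsIgnoreCase branches
		editor.setAsText("SECONDARY_PREFERRED");
		ReadPreference secondaryPreferred = (ReadPreference) editor.getValue();

		System.out.println(nearest + " / " + secondaryPreferred);
	}
}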

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,12 +16,14 @@
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.ServerAddress;
@@ -31,9 +33,15 @@ import com.mongodb.ServerAddress;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
/**
* A port is a number without a leading 0 at the end of the address, preceded by just a single ':'.
*/
private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address {} '{}'. Check your replica set configuration!";
private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
/*
@@ -43,6 +51,11 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(String replicaSetString) {
if (!StringUtils.hasText(replicaSetString)) {
setValue(null);
return;
}
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);
@@ -71,22 +84,53 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
*/
private ServerAddress parseServerAddress(String source) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(source.trim(), ":");
if (!StringUtils.hasText(source)) {
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
return null;
}
if (!StringUtils.hasText(source) || hostAndPort.length > 2) {
LOG.warn("Could not parse address source '{}'. Check your replica set configuration!", source);
String[] hostAndPort = extractHostAddressAndPort(source.trim());
if (hostAndPort.length > 2) {
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
return null;
}
try {
return hostAndPort.length == 1 ? new ServerAddress(hostAndPort[0]) : new ServerAddress(hostAndPort[0],
Integer.parseInt(hostAndPort[1]));
InetAddress hostAddress = InetAddress.getByName(hostAndPort[0]);
Integer port = hostAndPort.length == 1 ? null : Integer.parseInt(hostAndPort[1]);
return port == null ? new ServerAddress(hostAddress) : new ServerAddress(hostAddress, port);
} catch (UnknownHostException e) {
LOG.warn("Could not parse host '{}'. Check your replica set configuration!", hostAndPort[0]);
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]);
} catch (NumberFormatException e) {
LOG.warn("Could not parse port '{}'. Check your replica set configuration!", hostAndPort[1]);
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]);
}
return null;
}
/**
* Extract the host and port from the given {@link String}.
*
* @param addressAndPortSource must not be {@literal null}.
* @return
*/
private String[] extractHostAddressAndPort(String addressAndPortSource) {
Assert.notNull(addressAndPortSource, "Address and port source must not be null!");
String[] hostAndPort = addressAndPortSource.split(HOST_PORT_SPLIT_PATTERN);
String hostAddress = hostAndPort[0];
if (isHostAddressInIPv6BracketNotation(hostAddress)) {
hostAndPort[0] = hostAddress.substring(1, hostAddress.length() - 1);
}
return hostAndPort;
}
private boolean isHostAddressInIPv6BracketNotation(String hostAddress) {
return hostAddress.startsWith("[") && hostAddress.endsWith("]");
}
}
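
A hedged sketch of the parsing behaviour added above. It assumes the parsed addresses are exposed as a ServerAddress[] (the editor is registered for com.mongodb.ServerAddress[] elsewhere in this changeset); the sample class name and addresses are illustrative:

import com.mongodb.ServerAddress;

import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;

class ServerAddressEditorSample {

	public static void main(String[] args) {

		ServerAddressPropertyEditor editor = new ServerAddressPropertyEditor();

		// comma-delimited replica set string; ports are split off via HOST_PORT_SPLIT_PATTERN,
		// IPv6 hosts may use the bracket notation handled by extractHostAddressAndPort(...)
		editor.setAsText("127.0.0.1:27017, [::1]:27018");

		ServerAddress[] addresses = (ServerAddress[]) editor.getValue();
		for (ServerAddress address : addresses) {
			System.out.println(address);
		}
	}
}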

View File

@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,17 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
@@ -34,9 +37,15 @@ import com.mongodb.MongoException;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
*/
public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -64,9 +73,9 @@ public class DefaultIndexOperations implements IndexOperations {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
collection.createIndex(indexDefinition.getIndexKeys());
}
return null;
}
@@ -99,10 +108,12 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.resetIndexCache();
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
return null;
}
});
@@ -135,12 +146,24 @@ public class DefaultIndexOperations implements IndexOperations {
Object value = keyDbObject.get(key);
if (Integer.valueOf(1).equals(value)) {
indexFields.add(IndexField.create(key, Order.ASCENDING));
} else if (Integer.valueOf(-1).equals(value)) {
indexFields.add(IndexField.create(key, Order.DESCENDING));
} else if ("2d".equals(value)) {
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {
DBObject weights = (DBObject) ix.get("weights");
for (String fieldName : weights.keySet()) {
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
}
} else {
Double keyValue = new Double(value.toString());
if (ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, ASC));
} else if (MINUS_ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, DESC));
}
}
}
@@ -149,8 +172,8 @@ public class DefaultIndexOperations implements IndexOperations {
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
}
return indexInfoList;
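
For context, a minimal sketch of driving these index operations through MongoTemplate.indexOps(...); the collection and field names are made up, and ensureIndex(...) now delegates to the driver's createIndex(...) as per the change above:

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.IndexOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexInfo;

class IndexOperationsSample {

	static void createLastNameIndex(MongoTemplate template) {

		IndexOperations indexOps = template.indexOps("person");

		// delegates to DBCollection.createIndex(...) since the change above
		indexOps.ensureIndex(new Index().on("lastName", Direction.ASC).unique());

		for (IndexInfo indexInfo : indexOps.getIndexInfo()) {
			System.out.println(indexInfo.getName());
		}
	}
}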

View File

@@ -0,0 +1,188 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static java.util.UUID.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.MongoException;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";
private static final String SCRIPT_NAME_PREFIX = "func_";
private final MongoOperations mongoOperations;
/**
* Creates new {@link DefaultScriptOperations} using given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}.
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(args));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
List<NamedMongoScript> scripts = mongoOperations.findAll(NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
if (CollectionUtils.isEmpty(scripts)) {
return Collections.emptySet();
}
Set<String> scriptNames = new HashSet<String>();
for (NamedMongoScript script : scripts) {
scriptNames.add(script.getName());
}
return scriptNames;
}
private Object[] convertScriptArgs(Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String ? String.format("'%s'", arg) : this.mongoOperations.getConverter()
.convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(args));
}
/**
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as dashes won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
* @return
*/
private static String generateScriptName() {
return SCRIPT_NAME_PREFIX + randomUUID().toString().replaceAll("-", "");
}
}
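
A short usage sketch, assuming the ScriptOperations instance is obtained via MongoOperations.scriptOps(); the script body and argument values are illustrative:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptOperationsSample {

	static void runServerSideScripts(MongoOperations operations) {

		ScriptOperations scriptOps = operations.scriptOps();

		// ad-hoc execution via db.eval(...)
		ExecutableMongoScript echo = new ExecutableMongoScript("function(x) { return x; }");
		Object direct = scriptOps.execute(echo, "directly execute script");

		// register the script under a generated func_... name in system.js, then call it by name
		NamedMongoScript named = scriptOps.register(echo);
		Object called = scriptOps.call(named.getName(), "execute script via name");

		System.out.println(direct + " / " + called);
	}
}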

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @author Oliver Gierke
*/
@SpringDataWebConfigurationMixin
public class GeoJsonConfiguration {
@Bean
public GeoJsonModule geoJsonModule() {
return new GeoJsonModule();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,12 +20,12 @@ import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface IndexOperations {
@@ -51,7 +51,11 @@ public interface IndexOperations {
/**
* Clears all indices that have not yet been applied to this collection.
*
* @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,59 +13,53 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Represents an action taken against the collection. Used by {@link WriteConcernResolver} to determine a custom
* WriteConcern based on this information.
*
* Properties that will always be not-null are collectionName and defaultWriteConcern. The EntityClass is null only for
* the MongoActionOperaton.INSERT_LIST.
*
* {@link WriteConcern} based on this information.
* <ul>
* <li>INSERT, SAVE have null query</li>
* <li>REMOVE has null document</li>
* <li>INSERT_LIST has null entityClass, document, and query</li>
* <li>INSERT_LIST has null entityType, document, and query</li>
* </ul>
*
* @author Mark Pollack
*
* @author Oliver Gierke
*/
public class MongoAction {
private String collectionName;
private WriteConcern defaultWriteConcern;
private Class<?> entityClass;
private MongoActionOperation mongoActionOperation;
private DBObject query;
private DBObject document;
private final String collectionName;
private final WriteConcern defaultWriteConcern;
private final Class<?> entityType;
private final MongoActionOperation mongoActionOperation;
private final DBObject query;
private final DBObject document;
/**
* Create an instance of a MongoAction
* Create an instance of a {@link MongoAction}.
*
* @param defaultWriteConcern the default write concern
* @param defaultWriteConcern the default write concern.
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name
* @param entityClass the POJO that is being operated against
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBOjbect from the Spring Query object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityClass, DBObject document, DBObject query) {
super();
String collectionName, Class<?> entityType, DBObject document, DBObject query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation;
this.collectionName = collectionName;
this.entityClass = entityClass;
this.entityType = entityType;
this.query = query;
this.document = document;
}
@@ -78,8 +72,16 @@ public class MongoAction {
return defaultWriteConcern;
}
/**
* @deprecated use {@link #getEntityType()} instead.
*/
@Deprecated
public Class<?> getEntityClass() {
return entityClass;
return entityType;
}
public Class<?> getEntityType() {
return entityType;
}
public MongoActionOperation getMongoActionOperation() {
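
MongoAction is typically consumed from a WriteConcernResolver; a hedged sketch using the new getEntityType() accessor (the Audit domain type is hypothetical and used only for illustration):

import com.mongodb.WriteConcern;

import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.WriteConcernResolver;

class AuditAwareWriteConcernResolver implements WriteConcernResolver {

	@Override
	public WriteConcern resolve(MongoAction action) {

		// Audit is a hypothetical domain type used only for illustration
		if (Audit.class.equals(action.getEntityType())) {
			return WriteConcern.JOURNALED;
		}

		return action.getDefaultWriteConcern();
	}

	static class Audit {}
}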

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,8 +20,8 @@ package org.springframework.data.mongodb.core;
* for a given mutating operation
*
* @author Mark Pollack
* @author Oliver Gierke
* @see MongoAction
*
*/
public enum MongoActionOperation {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import com.mongodb.Mongo;
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
@@ -34,6 +35,7 @@ public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo);
@@ -82,7 +84,16 @@ public class MongoAdmin implements MongoAdminOperations {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password));
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}

View File

@@ -1,16 +1,34 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.jmx.export.annotation.ManagedOperation;
/**
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface MongoAdminOperations {
@ManagedOperation
public abstract void dropDatabase(String databaseName);
void dropDatabase(String databaseName);
@ManagedOperation
public abstract void createDatabase(String databaseName);
void createDatabase(String databaseName);
@ManagedOperation
public abstract String getDatabaseStats(String databaseName);
}
String getDatabaseStats(String databaseName);
}

View File

@@ -0,0 +1,189 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Convenient factory for configuring MongoDB.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<MongoCredential> credentials;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
*
* @param mongoClientOptions
*/
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions;
}
/**
* Set the list of credentials to be used when creating {@link MongoClient}.
*
* @param credentials can be {@literal null}.
*/
public void setCredentials(MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials);
}
/**
* Set the list of {@link ServerAddress} to build up a replica set for.
*
* @param replicaSetSeeds can be {@literal null}.
*/
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
}
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
instance.close();
}
private MongoClient createMongoClient() throws UnknownHostException {
if (!CollectionUtils.isEmpty(replicaSetSeeds)) {
return new MongoClient(replicaSetSeeds, credentials, mongoClientOptions);
}
return new MongoClient(createConfiguredOrDefaultServerAddress(), credentials, mongoClientOptions);
}
private ServerAddress createConfiguredOrDefaultServerAddress() throws UnknownHostException {
ServerAddress defaultAddress = new ServerAddress();
return new ServerAddress(StringUtils.hasText(host) ? host : defaultAddress.getHost(),
port != null ? port.intValue() : defaultAddress.getPort());
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter, can be {@literal null}.
* @return a new unmodifiable {@link List} from the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
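
A minimal programmatic sketch of the factory bean above; host, port and credentials are illustrative, and MongoCredential.createCredential(...) assumes a driver version that provides it:

import com.mongodb.Mongo;
import com.mongodb.MongoCredential;

import org.springframework.data.mongodb.core.MongoClientFactoryBean;

class MongoClientFactoryBeanSample {

	static Mongo createClient() throws Exception {

		MongoClientFactoryBean factory = new MongoClientFactoryBean();
		factory.setHost("localhost");
		factory.setPort(27017);
		factory.setCredentials(new MongoCredential[] { //
				MongoCredential.createCredential("user", "admin", "secret".toCharArray()) });

		// AbstractFactoryBean lifecycle: initialize first, then obtain the singleton
		factory.afterPropertiesSet();
		return factory.getObject();
	}
}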

View File

@@ -0,0 +1,295 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int maxConnectionIdleTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionIdleTime();
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
* @param description
*/
public void setDescription(String description) {
this.description = description;
}
/**
* Set the minimum number of connections per host.
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
this.minConnectionsPerHost = minConnectionsPerHost;
}
/**
* Set the number of connections allowed per host. Callers will block once the pool is exhausted. Default is 10. Can
* be overridden via the {@code MONGO.POOLSIZE} system property.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* Set the multiplier of connectionsPerHost that caps the number of threads allowed to block waiting for a connection.
* Default is 5. If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50
* threads can block; any more and an exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* Set the max wait time of a blocking thread for a connection. Default is 120000 ms (2 minutes).
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* The maximum idle time for a pooled connection.
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
this.maxConnectionIdleTime = maxConnectionIdleTime;
}
/**
* Set the maximum life time for a pooled connection.
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
this.maxConnectionLifeTime = maxConnectionLifeTime;
}
/**
* Set the connect timeout in milliseconds. 0 is default and infinite.
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Set the socket timeout. 0 is default and infinite.
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Set the keep-alive flag that controls whether or not socket keep-alive is enabled. Defaults to {@literal false}.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* @param socketFactory
*/
public void setSocketFactory(SocketFactory socketFactory) {
this.socketFactory = socketFactory;
}
/**
* Set the frequency at which the driver will attempt to determine the current state of each server in the cluster.
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
this.heartbeatFrequency = heartbeatFrequency;
}
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
this.minHeartbeatFrequency = minHeartbeatFrequency;
}
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
this.heartbeatConnectTimeout = heartbeatConnectTimeout;
}
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
this.heartbeatSocketTimeout = heartbeatSocketTimeout;
}
/**
* Configures the name of the replica set.
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
this.requiredReplicaSetName = requiredReplicaSetName;
}
/**
* This controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory
.getDefault()) : this.socketFactory;
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
.maxConnectionIdleTime(maxConnectionIdleTime) //
.maxConnectionLifeTime(maxConnectionLifeTime) //
.maxWaitTime(maxWaitTime) //
.minConnectionsPerHost(minConnectionsPerHost) //
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoClientOptions.class;
}
}
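
A short sketch of building MongoClientOptions through the factory bean above; all option values are illustrative:

import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;

import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;

class MongoClientOptionsSample {

	static MongoClientOptions createOptions() throws Exception {

		MongoClientOptionsFactoryBean factory = new MongoClientOptionsFactoryBean();
		factory.setConnectionsPerHost(50);
		factory.setSocketTimeout(1500);
		factory.setReadPreference(ReadPreference.secondaryPreferred());
		factory.setSsl(true); // falls back to SSLSocketFactory.getDefault() as no explicit SSL socket factory is set

		factory.afterPropertiesSet();
		return factory.getObject();
	}
}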

View File

@@ -0,0 +1,71 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.util.Assert;
import com.mongodb.WriteResult;
/**
* Mongo-specific {@link DataIntegrityViolationException}.
*
* @author Oliver Gierke
*/
public class MongoDataIntegrityViolationException extends DataIntegrityViolationException {
private static final long serialVersionUID = -186980521176764046L;
private final WriteResult writeResult;
private final MongoActionOperation actionOperation;
/**
* Creates a new {@link MongoDataIntegrityViolationException} using the given message and {@link WriteResult}.
*
* @param message the exception message
* @param writeResult the {@link WriteResult} that causes the exception, must not be {@literal null}.
* @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}.
*/
public MongoDataIntegrityViolationException(String message, WriteResult writeResult,
MongoActionOperation actionOperation) {
super(message);
Assert.notNull(writeResult, "WriteResult must not be null!");
Assert.notNull(actionOperation, "MongoActionOperation must not be null!");
this.writeResult = writeResult;
this.actionOperation = actionOperation;
}
/**
* Returns the {@link WriteResult} that caused the exception.
*
* @return the writeResult
*/
public WriteResult getWriteResult() {
return writeResult;
}
/**
* Returns the {@link MongoActionOperation} in which the current exception occurred.
*
* @return the actionOperation
*/
public MongoActionOperation getActionOperation() {
return actionOperation;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,22 +18,24 @@ package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
* <p>
* Mainly intended for internal use within the framework.
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
* framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -43,9 +45,7 @@ public abstract class MongoDbUtils {
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {
}
private MongoDbUtils() {}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
@@ -55,7 +55,7 @@ public abstract class MongoDbUtils {
* @return the {@link DB} connection
*/
public static DB getDB(Mongo mongo, String databaseName) {
return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true);
return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true, databaseName);
}
/**
@@ -65,35 +65,52 @@ public abstract class MongoDbUtils {
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
Assert.notNull(mongo, "No Mongo instance specified!");
Assert.hasText(databaseName, "Database name must be given!");
Assert.notNull(credentials, "Credentials must not be null, use UserCredentials.NO_CREDENTIALS!");
Assert.hasText(authenticationDatabaseName, "Authentication database name must not be null or empty!");
return doGetDB(mongo, databaseName, credentials, true);
return doGetDB(mongo, databaseName, credentials, true, authenticationDatabaseName);
}
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate) {
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate,
String authenticationDatabaseName) {
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
if (dbHolder != null && !dbHolder.isEmpty()) {
// Do we have a populated holder and TX sync active?
if (dbHolder != null && !dbHolder.isEmpty() && TransactionSynchronizationManager.isSynchronizationActive()) {
DB db = null;
DB db = dbHolder.getDB(databaseName);
if (TransactionSynchronizationManager.isSynchronizationActive() && dbHolder.doesNotHoldNonDefaultDB()) {
// DB found but not yet synchronized
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
db = dbHolder.getDB(databaseName);
LOGGER.debug("Registering Spring transaction synchronization for existing MongoDB {}.", databaseName);
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing MongoDB {}.", databaseName);
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
if (db != null) {
@@ -101,24 +118,16 @@ public abstract class MongoDbUtils {
}
}
// Lookup fresh database instance
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
if (credentialsGiven && !db.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!db.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName
+ "], username = [" + username + "], password = [" + password + "]", databaseName, credentials);
}
if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
}
// Use same Session for further Mongo actions within the transaction.
// Thread object will get removed by synchronization at transaction completion.
// TX sync active, bind new database to thread
if (TransactionSynchronizationManager.isSynchronizationActive()) {
LOGGER.debug("Registering Spring transaction synchronization for MongoDB instance {}.", databaseName);
@@ -131,8 +140,11 @@ public abstract class MongoDbUtils {
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
// synchronize holder only if not yet synchronized
if (!holderToUse.isSynchronizedWithTransaction()) {
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
}
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
@@ -162,23 +174,43 @@ public abstract class MongoDbUtils {
return false;
}
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
return (dbHolder != null && dbHolder.containsDB(db));
return dbHolder != null && dbHolder.containsDB(db);
}
/**
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
db.requestDone();
ReflectiveDbInvoker.requestDone(db);
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check if credentials are present. When using mongo-java-driver version 3 or above there is no need for explicit
* authentication, as the auth data has to be provided within the MongoClient itself.
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}
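For orientation, a minimal usage sketch of the getDB(..) overloads shown above; host, database and credential values are hypothetical, and the credential-taking variants are deprecated on the 3.x driver:

import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.core.MongoDbUtils;

class MongoDbUtilsSketch {

	DB connect() throws Exception {

		Mongo mongo = new MongoClient("localhost", 27017); // hypothetical host/port

		// Preferred: credentials, if any, are carried by the MongoClient itself.
		DB db = MongoDbUtils.getDB(mongo, "orders");

		// Deprecated pre-3.0 path: authenticate explicitly against a separate authentication database.
		DB legacy = MongoDbUtils.getDB(mongo, "orders", new UserCredentials("user", "secret"), "admin");

		return db;
	}
}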

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,23 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Set;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -35,47 +39,193 @@ import org.springframework.data.mongodb.UncategorizedMongoDbException;
* appropriate: any other exception may have resulted from user code, and should not be translated.
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("WriteConcernException"));
/*
* (non-Javadoc)
*
* @see org.springframework.dao.support.PersistenceExceptionTranslator#
* translateExceptionIfPossible(java.lang.RuntimeException)
*/
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
// Check for well-known MongoException subclasses.
// All other MongoExceptions
if (ex instanceof DuplicateKey) {
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
if (RESOURCE_FAILURE_EXCEPTIONS.contains(exception)) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
if (RESOURCE_USAGE_EXCEPTIONS.contains(exception)) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
return new DataIntegrityViolationException(ex.getMessage(), ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();
if (code == 11000 || code == 11001) {
if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (code == 12000 || code == 13440) {
} else if (MongoDbErrorCodes.isDataAccessResourceFailureCode(code)) {
throw new DataAccessResourceFailureException(ex.getMessage(), ex);
} else if (code == 10003 || code == 12001 || code == 12010 || code == 12011 || code == 12012) {
} else if (MongoDbErrorCodes.isInvalidDataAccessApiUsageCode(code) || code == 10003 || code == 12001
|| code == 12010 || code == 12011 || code == 12012) {
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isPermissionDeniedCode(code)) {
throw new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
// If we get here, we have an exception that resulted from user code,
// rather than the persistence provider, so we return null to indicate
// that translation should not occur.
return null;
}
/**
* {@link MongoDbErrorCodes} holds MongoDB specific error codes outlined in {@literal mongo/base/error_codes.err}.
*
* @author Christoph Strobl
* @since 1.8
*/
public static final class MongoDbErrorCodes {
static HashMap<Integer, String> dataAccessResourceFailureCodes;
static HashMap<Integer, String> dataIntegrityViolationCodes;
static HashMap<Integer, String> duplicateKeyCodes;
static HashMap<Integer, String> invalidDataAccessApiUsageExeption;
static HashMap<Integer, String> permissionDeniedCodes;
static HashMap<Integer, String> errorCodes;
static {
dataAccessResourceFailureCodes = new HashMap<Integer, String>(10);
dataAccessResourceFailureCodes.put(6, "HostUnreachable");
dataAccessResourceFailureCodes.put(7, "HostNotFound");
dataAccessResourceFailureCodes.put(89, "NetworkTimeout");
dataAccessResourceFailureCodes.put(91, "ShutdownInProgress");
dataAccessResourceFailureCodes.put(12000, "SlaveDelayDifferential");
dataAccessResourceFailureCodes.put(10084, "CannotFindMapFile64Bit");
dataAccessResourceFailureCodes.put(10085, "CannotFindMapFile");
dataAccessResourceFailureCodes.put(10357, "ShutdownInProgress");
dataAccessResourceFailureCodes.put(10359, "Header==0");
dataAccessResourceFailureCodes.put(13440, "BadOffsetInFile");
dataAccessResourceFailureCodes.put(13441, "BadOffsetInFile");
dataAccessResourceFailureCodes.put(13640, "DataFileHeaderCorrupt");
dataIntegrityViolationCodes = new HashMap<Integer, String>(6);
dataIntegrityViolationCodes.put(67, "CannotCreateIndex");
dataIntegrityViolationCodes.put(68, "IndexAlreadyExists");
dataIntegrityViolationCodes.put(85, "IndexOptionsConflict");
dataIntegrityViolationCodes.put(86, "IndexKeySpecsConflict");
dataIntegrityViolationCodes.put(112, "WriteConflict");
dataIntegrityViolationCodes.put(117, "ConflictingOperationInProgress");
duplicateKeyCodes = new HashMap<Integer, String>(3);
duplicateKeyCodes.put(3, "OBSOLETE_DuplicateKey");
duplicateKeyCodes.put(84, "DuplicateKeyValue");
duplicateKeyCodes.put(11000, "DuplicateKey");
duplicateKeyCodes.put(11001, "DuplicateKey");
invalidDataAccessApiUsageExeption = new HashMap<Integer, String>();
invalidDataAccessApiUsageExeption.put(5, "GraphContainsCycle");
invalidDataAccessApiUsageExeption.put(9, "FailedToParse");
invalidDataAccessApiUsageExeption.put(14, "TypeMismatch");
invalidDataAccessApiUsageExeption.put(15, "Overflow");
invalidDataAccessApiUsageExeption.put(16, "InvalidLength");
invalidDataAccessApiUsageExeption.put(20, "IllegalOperation");
invalidDataAccessApiUsageExeption.put(21, "EmptyArrayOperation");
invalidDataAccessApiUsageExeption.put(22, "InvalidBSON");
invalidDataAccessApiUsageExeption.put(23, "AlreadyInitialized");
invalidDataAccessApiUsageExeption.put(29, "NonExistentPath");
invalidDataAccessApiUsageExeption.put(30, "InvalidPath");
invalidDataAccessApiUsageExeption.put(40, "ConflictingUpdateOperators");
invalidDataAccessApiUsageExeption.put(45, "UserDataInconsistent");
invalidDataAccessApiUsageExeption.put(30, "DollarPrefixedFieldName");
invalidDataAccessApiUsageExeption.put(52, "InvalidPath");
invalidDataAccessApiUsageExeption.put(53, "InvalidIdField");
invalidDataAccessApiUsageExeption.put(54, "NotSingleValueField");
invalidDataAccessApiUsageExeption.put(55, "InvalidDBRef");
invalidDataAccessApiUsageExeption.put(56, "EmptyFieldName");
invalidDataAccessApiUsageExeption.put(57, "DottedFieldName");
invalidDataAccessApiUsageExeption.put(59, "CommandNotFound");
invalidDataAccessApiUsageExeption.put(60, "DatabaseNotFound");
invalidDataAccessApiUsageExeption.put(61, "ShardKeyNotFound");
invalidDataAccessApiUsageExeption.put(62, "OplogOperationUnsupported");
invalidDataAccessApiUsageExeption.put(66, "ImmutableField");
invalidDataAccessApiUsageExeption.put(72, "InvalidOptions");
invalidDataAccessApiUsageExeption.put(115, "CommandNotSupported");
invalidDataAccessApiUsageExeption.put(116, "DocTooLargeForCapped");
invalidDataAccessApiUsageExeption.put(130, "SymbolNotFound");
invalidDataAccessApiUsageExeption.put(17280, "KeyTooLong");
invalidDataAccessApiUsageExeption.put(13334, "ShardKeyTooBig");
permissionDeniedCodes = new HashMap<Integer, String>();
permissionDeniedCodes.put(11, "UserNotFound");
permissionDeniedCodes.put(18, "AuthenticationFailed");
permissionDeniedCodes.put(31, "RoleNotFound");
permissionDeniedCodes.put(32, "RolesNotRelated");
permissionDeniedCodes.put(33, "PrvilegeNotFound");
permissionDeniedCodes.put(15847, "CannotAuthenticate");
permissionDeniedCodes.put(16704, "CannotAuthenticateToAdminDB");
permissionDeniedCodes.put(16705, "CannotAuthenticateToAdminDB");
errorCodes = new HashMap<Integer, String>();
errorCodes.putAll(dataAccessResourceFailureCodes);
errorCodes.putAll(dataIntegrityViolationCodes);
errorCodes.putAll(duplicateKeyCodes);
errorCodes.putAll(invalidDataAccessApiUsageExeption);
errorCodes.putAll(permissionDeniedCodes);
}
public static boolean isDataIntegrityViolationCode(Integer errorCode) {
return errorCode == null ? false : dataIntegrityViolationCodes.containsKey(errorCode);
}
public static boolean isDataAccessResourceFailureCode(Integer errorCode) {
return errorCode == null ? false : dataAccessResourceFailureCodes.containsKey(errorCode);
}
public static boolean isDuplicateKeyCode(Integer errorCode) {
return errorCode == null ? false : duplicateKeyCodes.containsKey(errorCode);
}
public static boolean isPermissionDeniedCode(Integer errorCode) {
return errorCode == null ? false : permissionDeniedCodes.containsKey(errorCode);
}
public static boolean isInvalidDataAccessApiUsageCode(Integer errorCode) {
return errorCode == null ? false : invalidDataAccessApiUsageExeption.containsKey(errorCode);
}
public static String getErrorDescription(Integer errorCode) {
return errorCode == null ? null : errorCodes.get(errorCode);
}
}
}
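A hedged sketch of how the error-code mapping above plays out; the error code and message are illustrative:

import com.mongodb.MongoException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

class TranslatorSketch {

	void demo() {
		MongoExceptionTranslator translator = new MongoExceptionTranslator();
		try {
			// 11000 is registered in duplicateKeyCodes, so translation ends in Spring's DuplicateKeyException.
			translator.translateExceptionIfPossible(new MongoException(11000, "E11000 duplicate key error"));
		} catch (DuplicateKeyException expected) {
			// expected for duplicate key error codes
		}
	}
}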

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,15 +15,16 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
@@ -36,12 +37,15 @@ import com.mongodb.WriteConcern;
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
* @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
PersistenceExceptionTranslator {
@Deprecated
public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private Mongo mongo;
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoOptions mongoOptions;
private String host;
@@ -49,25 +53,42 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
private WriteConcern writeConcern;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
/**
* @param mongoOptions
*/
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = Arrays.asList(replicaSetSeeds);
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
* @param replicaPair
*/
@Deprecated
public void setReplicaPair(ServerAddress[] replicaPair) {
this.replicaPair = Arrays.asList(replicaPair);
this.replicaPair = filterNonNullElementsAsList(replicaPair);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
@@ -81,12 +102,13 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
this.writeConcern = writeConcern;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator;
}
public Mongo getObject() throws Exception {
return mongo;
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
@@ -97,14 +119,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -115,9 +129,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
public void afterPropertiesSet() throws Exception {
@Override
protected Mongo createInstance() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
@@ -126,15 +141,15 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (!isNullOrEmpty(replicaPair)) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
} else if (!isNullOrEmpty(replicaSetSeeds)) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
String mongoHost = StringUtils.hasText(host) ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
@@ -143,14 +158,42 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongo.setWriteConcern(writeConcern);
}
this.mongo = mongo;
return mongo;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
public void destroy() throws Exception {
this.mongo.close();
@Override
protected void destroyInstance(Mongo mongo) throws Exception {
mongo.close();
}
private static boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter, may be {@literal null}.
* @return a new unmodifiable {@link List} from the given elements without {@literal null} values.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
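A minimal configuration sketch for the (now deprecated) factory bean; server addresses are hypothetical, and null seed entries are filtered out as implemented above:

import com.mongodb.Mongo;
import com.mongodb.ServerAddress;
import org.springframework.data.mongodb.core.MongoFactoryBean;

class MongoFactoryBeanSketch {

	Mongo create() throws Exception {
		MongoFactoryBean factory = new MongoFactoryBean();
		// Hypothetical seeds; the trailing null entry is silently dropped before the Mongo instance is created.
		factory.setReplicaSetSeeds(new ServerAddress[] { new ServerAddress("node1", 27017), null });
		factory.afterPropertiesSet();
		return factory.getObject();
	}
}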

View File

@@ -1,6 +1,6 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
@@ -19,9 +19,11 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -31,10 +33,14 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.ReadPreference;
import com.mongodb.WriteResult;
/**
@@ -45,6 +51,10 @@ import com.mongodb.WriteResult;
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
* @author Tobias Trelle
* @author Chuong Ngo
* @author Christoph Strobl
* @author Thomas Darimont
*/
public interface MongoOperations {
@@ -79,9 +89,23 @@ public interface MongoOperations {
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
* version 3 no longer supports this operation.
*/
@Deprecated
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return
* @since 1.7
*/
CommandResult executeCommand(DBObject command, ReadPreference readPreference);
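A hedged usage sketch, assuming an existing MongoOperations instance named template; the dbStats command and read preference are illustrative only:

CommandResult stats = template.executeCommand(new BasicDBObject("dbStats", 1), ReadPreference.secondaryPreferred());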
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
@@ -137,9 +161,26 @@ public interface MongoOperations {
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
* @deprecated since 1.7 as the MongoDB Java driver version 3 no longer supports request boundaries via
* {@link DB#requestStart()} and {@link DB#requestDone()}.
*/
@Deprecated
<T> T executeInSession(DbCallback<T> action);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} and needs to be closed.
*
* @param <T> element return type
* @param query
* @param entityType
* @return
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
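A sketch of consuming the streaming API, assuming a template and a hypothetical Person document class; the returned CloseableIterator wraps a server-side cursor and should be closed when iteration ends:

CloseableIterator<Person> people = template.stream(new Query(), Person.class);
try {
	while (people.hasNext()) {
		handle(people.next()); // handle(..) is a placeholder for application code
	}
} finally {
	people.close();
}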
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -243,11 +284,19 @@ public interface MongoOperations {
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
*
* @return
* @since 1.7
*/
ScriptOperations scriptOps();
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -261,7 +310,7 @@ public interface MongoOperations {
* Query for a list of objects of type T from the specified collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
@@ -301,6 +350,57 @@ public interface MongoOperations {
*/
<T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute an aggregation operation backed by the given collection. The raw results will be mapped to the given
* entity class.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 1.3
*/
<O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType);
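A hedged aggregation sketch, assuming a template and a hypothetical Order class with a customerId field:

TypedAggregation<Order> aggregation = Aggregation.newAggregation(Order.class,
		Aggregation.group("customerId").count().as("orders"),
		Aggregation.sort(Sort.Direction.DESC, "orders"));
AggregationResults<DBObject> results = template.aggregate(aggregation, DBObject.class);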
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
@@ -357,8 +457,10 @@ public interface MongoOperations {
MapReduceOptions mapReduceOptions, Class<T> entityClass);
/**
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is ran against.
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is run against. Note that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -367,7 +469,9 @@ public interface MongoOperations {
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}.
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -382,7 +486,7 @@ public interface MongoOperations {
* specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -399,7 +503,7 @@ public interface MongoOperations {
* type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -408,16 +512,43 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the converted object
*/
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects.
* @return
*/
boolean exists(Query query, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @return
*/
boolean exists(Query query, Class<?> entityClass);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @param collectionName name of the collection to check for objects.
* @return
*/
boolean exists(Query query, Class<?> entityClass, String collectionName);
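A short sketch of the existence check, assuming a template and a hypothetical Person class:

boolean known = template.exists(Query.query(Criteria.where("lastname").is("Matthews")), Person.class);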
/**
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -433,7 +564,7 @@ public interface MongoOperations {
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -442,7 +573,6 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the List of converted objects
*/
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
@@ -464,18 +594,63 @@ public interface MongoOperations {
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
*
* @param <T>
* @return
*/
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param entityClass the parameterized type.
* @return
*/
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param entityClass the parameterized type.
* @param collectionName the collection to query.
* @return
*/
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parameterized type.
* @return
*/
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parameterized type.
* @param collectionName the collection to query.
* @return
*/
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName);
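A hedged findAndModify sketch, assuming a template and a hypothetical Person class; returnNew(true) makes the modified document come back instead of the original:

Person updated = template.findAndModify(
		Query.query(Criteria.where("firstname").is("Dave")),
		new Update().inc("visits", 1),
		FindAndModifyOptions.options().returnNew(true),
		Person.class);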
@@ -501,7 +676,7 @@ public interface MongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
@@ -510,7 +685,6 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the converted object
*/
<T> T findAndRemove(Query query, Class<T> entityClass, String collectionName);
@@ -525,14 +699,28 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
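A sketch of the mapped count variant, assuming a template and a hypothetical User class; passing the entity class lets property references in the query be mapped onto document field names:

long activeUsers = template.count(Query.query(Criteria.where("active").is(true)), User.class, "users");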
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
@@ -540,9 +728,9 @@ public interface MongoOperations {
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
* Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages the Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -555,7 +743,7 @@ public interface MongoOperations {
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
@@ -593,13 +781,13 @@ public interface MongoOperations {
* object is not already present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
* Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages the Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
*/
@@ -610,13 +798,13 @@ public interface MongoOperations {
* is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Cobnversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
* Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages the Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
* Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
@@ -646,6 +834,18 @@ public interface MongoOperations {
*/
WriteResult upsert(Query query, Update update, String collectionName);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
@@ -670,6 +870,19 @@ public interface MongoOperations {
*/
WriteResult updateFirst(Query query, Update update, String collectionName);
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
@@ -694,12 +907,25 @@ public interface MongoOperations {
*/
WriteResult updateMulti(Query query, Update update, String collectionName);
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName);
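A hedged multi-update sketch, assuming a template and a hypothetical Order class mapped onto a custom collection name:

WriteResult result = template.updateMulti(
		Query.query(Criteria.where("status").is("OPEN")),
		Update.update("status", "CLOSED"),
		Order.class, "archivedOrders");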
/**
* Remove the given object from the collection by id.
*
* @param object
*/
void remove(Object object);
WriteResult remove(Object object);
/**
* Removes the given object from the given collection.
@@ -707,17 +933,26 @@ public interface MongoOperations {
* @param object
* @param collection must not be {@literal null} or empty.
*/
void remove(Object object, String collection);
WriteResult remove(Object object, String collection);
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param <T>
* @param query
* @param entityClass
*/
<T> void remove(Query query, Class<T> entityClass);
WriteResult remove(Query query, Class<?> entityClass);
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query
* @param entityClass
* @param collectionName
*/
WriteResult remove(Query query, Class<?> entityClass, String collectionName);
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
@@ -726,7 +961,40 @@ public interface MongoOperations {
* @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will removed
*/
void remove(Query query, String collectionName);
WriteResult remove(Query query, String collectionName);
/**
* Returns and removes all documents from the specified collection that match the provided query.
*
* @param query
* @param collectionName
* @return
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, String collectionName);
/**
* Returns and removes all documents matching the given query from the collection used to store the entityClass.
*
* @param query
* @param entityClass
* @return
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass);
/**
* Returns and removes all documents that match the provided query document criteria from the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
*
* @param query
* @param entityClass
* @param collectionName
* @return
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
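A sketch of findAllAndRemove, assuming a template and a hypothetical Person class; the removed documents are returned to the caller:

List<Person> removed = template.findAllAndRemove(Query.query(Criteria.where("lastname").is("Obsolete")), Person.class);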
/**
* Returns the underlying {@link MongoConverter}.
@@ -734,4 +1002,5 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,129 +15,99 @@
*/
package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.MongoOptions;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
/**
* A factory bean for construction of a MongoOptions instance
*
* A factory bean for construction of a {@link MongoOptions} instance. When used with MongoDB Java driver version 3,
* properties not supported by the driver will be ignored.
*
* @author Graeme Rocher
* @Author Mark Pollack
* @author Mark Pollack
* @author Mike Saavedra
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
*/
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
@Deprecated
public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
private static final MongoOptions MONGO_OPTIONS = new MongoOptions();
/**
* number of connections allowed per host will block if run out
*/
private int connectionsPerHost = MONGO_OPTIONS.connectionsPerHost;
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* multiplier for connectionsPerHost for # of threads that can block if connectionsPerHost is 10, and
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an exception will
* be throw
*/
private int threadsAllowedToBlockForConnectionMultiplier = MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
/**
* max wait time of a blocking thread for a connection
*/
private int maxWaitTime = MONGO_OPTIONS.maxWaitTime;
/**
* connect timeout in milliseconds. 0 is default and infinite
*/
private int connectTimeout = MONGO_OPTIONS.connectTimeout;
/**
* socket timeout. 0 is default and infinite
*/
private int socketTimeout = MONGO_OPTIONS.socketTimeout;
/**
* This controls whether or not to have socket keep alive turned on (SO_KEEPALIVE).
* Configures the maximum number of connections allowed per host. Further connection requests will block once this limit is reached.
*
* defaults to false
*/
public boolean socketKeepAlive = MONGO_OPTIONS.socketKeepAlive;
/**
* this controls whether or not on a connect, the system retries automatically
*/
private boolean autoConnectRetry = MONGO_OPTIONS.autoConnectRetry;
private long maxAutoConnectRetryTime = MONGO_OPTIONS.maxAutoConnectRetryTime;
/**
* This specifies the number of servers to wait for on the write operation, and exception raising behavior.
*
* Defaults to 0.
*/
private int writeNumber;
/**
* This controls timeout for write operations in milliseconds.
*
* Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
*/
private int writeTimeout;
/**
* This controls whether or not to fsync.
*
* Defaults to false.
*/
private boolean writeFsync;
/**
* Specifies if the driver is allowed to read from secondaries or slaves.
*
* Defaults to false
*/
@SuppressWarnings("deprecation")
private boolean slaveOk = MONGO_OPTIONS.slaveOk;
/**
* number of connections allowed per host will block if run out
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* multiplier for connectionsPerHost for # of threads that can block if connectionsPerHost is 10, and
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an exception will
* be throw
* A multiplier for connectionsPerHost for # of threads that can block a connection. If connectionsPerHost is 10, and
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block. If more threads try to block, an
* exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* max wait time of a blocking thread for a connection
* Max wait time of a blocking thread for a connection.
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* connect timeout in milliseconds. 0 is default and infinite
* Configures the connect timeout in milliseconds. Defaults to 0 (infinite time).
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* socket timeout. 0 is default and infinite
* Configures the socket timeout. Defaults to 0 (infinite time).
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* This controls whether or not to have socket keep alive
* Configures whether or not to have socket keep alive turned on (SO_KEEPALIVE). Defaults to {@literal false}.
*
* @param socketKeepAlive
*/
@@ -152,7 +122,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
* <li>-1 = don't even report network errors</li>
* <li>0 = default, don't call getLastError by default</li>
* <li>1 = basic, call getLastError, but don't wait for slaves</li>
* <li>2+= wait for slaves</li>
* <li>2 += wait for slaves</li>
* </ul>
*
* @param writeNumber the number of servers to wait for on the write operation, and exception raising behavior.
@@ -162,75 +132,124 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
}
/**
* This controls timeout for write operations in milliseconds. The 'wtimeout' option to the getlasterror command.
* Configures the timeout for write operations in milliseconds. This defaults to {@literal 0} (indefinite).
*
* @param writeTimeout Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
* @param writeTimeout
*/
public void setWriteTimeout(int writeTimeout) {
this.writeTimeout = writeTimeout;
}
/**
* This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
* Configures whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to {@literal false}.
*
* @param writeFsync to fsync on write (true), otherwise false.
* @param writeFsync to fsync on <code>write (true)</code>, otherwise {@literal false}.
*/
public void setWriteFsync(boolean writeFsync) {
this.writeFsync = writeFsync;
}
/**
* this controls whether or not on a connect, the system retries automatically
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
*
* @deprecated since 1.7.
*/
@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
/**
* The maximum amount of time in millisecons to spend retrying to open connection to the same server. Default is 0,
* which means to use the default 15s if autoConnectRetry is on.
* Configures the maximum amount of time in milliseconds to spend retrying to open a connection to the same server. This
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
* @deprecated since 1.7
*/
@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
/**
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
*
* @param slaveOk true if the driver should read from secondaries or slaves.
* @deprecated since 1.7
*/
@Deprecated
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
}
@SuppressWarnings("deprecation")
public void afterPropertiesSet() {
MONGO_OPTIONS.connectionsPerHost = connectionsPerHost;
MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
MONGO_OPTIONS.maxWaitTime = maxWaitTime;
MONGO_OPTIONS.connectTimeout = connectTimeout;
MONGO_OPTIONS.socketTimeout = socketTimeout;
MONGO_OPTIONS.socketKeepAlive = socketKeepAlive;
MONGO_OPTIONS.autoConnectRetry = autoConnectRetry;
MONGO_OPTIONS.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
MONGO_OPTIONS.slaveOk = slaveOk;
MONGO_OPTIONS.w = writeNumber;
MONGO_OPTIONS.wtimeout = writeTimeout;
MONGO_OPTIONS.fsync = writeFsync;
/**
* Specifies if the driver should use an SSL connection to Mongo. This defaults to {@literal false}. By default
* {@link SSLSocketFactory#getDefault()} will be used. See {@link #setSslSocketFactory(SSLSocketFactory)} if you want
* to configure a custom factory.
*
* @param ssl true if the driver should use an SSL connection.
* @see #setSslSocketFactory(SSLSocketFactory)
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
public MongoOptions getObject() {
return MONGO_OPTIONS;
/**
* Specifies the {@link SSLSocketFactory} to use for creating SSL connections to Mongo. Defaults to
* {@link SSLSocketFactory#getDefault()}. Implicitly activates {@link #setSsl(boolean)} if a non-{@literal null} value
* is given.
*
* @param sslSocketFactory the sslSocketFactory to use.
* @see #setSsl(boolean)
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
setSsl(sslSocketFactory != null);
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoOptions createInstance() throws Exception {
if (MongoClientVersion.isMongo3Driver()) {
throw new IllegalArgumentException(
String
.format("Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
}
MongoOptions options = new MongoOptions();
options.setConnectionsPerHost(connectionsPerHost);
options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
options.setMaxWaitTime(maxWaitTime);
options.setConnectTimeout(connectTimeout);
options.setSocketTimeout(socketTimeout);
options.setSocketKeepAlive(socketKeepAlive);
options.setW(writeNumber);
options.setWtimeout(writeTimeout);
options.setFsync(writeFsync);
if (ssl) {
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
}
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
return options;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoOptions.class;
}
public boolean isSingleton() {
return true;
}
}
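For orientation, a minimal sketch (not part of the diff) of wiring the factory bean above in Java configuration on the 2.x driver; the configuration class, bean name and the concrete values are illustrative assumptions, and the usual org.springframework.context.annotation imports are assumed.
@Configuration
class MongoOptionsConfig {

    @Bean
    public MongoOptionsFactoryBean mongoOptions() {
        MongoOptionsFactoryBean factoryBean = new MongoOptionsFactoryBean();
        factoryBean.setConnectTimeout(2000); // milliseconds, 0 would mean infinite
        factoryBean.setSocketTimeout(5000);  // milliseconds, 0 would mean infinite
        factoryBean.setMaxWaitTime(1500);    // max time a thread blocks waiting for a pooled connection
        factoryBean.setWriteNumber(1);       // 'w' of the write concern, see the value list above
        return factoryBean;
    }
}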

View File

@@ -1,8 +1,26 @@
/*
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.transaction.support.ResourceHolder;
import org.springframework.transaction.support.ResourceHolderSynchronization;
/**
* @author Oliver Gierke
*/
class MongoSynchronization extends ResourceHolderSynchronization<ResourceHolder, Object> {
public MongoSynchronization(ResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CHACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CHACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
* Convenience method to generate an index name from the set of fields it is over. Will fall back to a MongoDB Java
* driver version 2 compatible way of generating index name in case of {@link MongoClientVersion#isMongo3Driver()}.
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2 all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for the MongoDB Java driver version 3 the operation will throw
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The mongo java driver 3 does no loger support resetIndexCache!");
}
invokeMethod(RESET_INDEX_CHACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
}
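A minimal usage sketch for the invoker above (not part of the diff); the key names are illustrative.
DBObject keys = new BasicDBObject("lastname", 1).append("age", -1);
// Resolves to "lastname_1_age_-1" on both driver generations: the driver method is
// invoked reflectively on 2.x, the borrowed implementation is used on 3.x.
String indexName = ReflectiveDBCollectionInvoker.generateIndexName(keys);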

View File

@@ -0,0 +1,134 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticate against database using provided credentials in case of a MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
* Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}
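A sketch of the 'consistent request' bracketing described above (not from the diff); the mongo instance, database and collection names are assumptions.
DB db = mongo.getDB("database");
ReflectiveDbInvoker.requestStart(db); // no-op on the 3.x driver
try {
    db.getCollection("persons").findOne();
} finally {
    ReflectiveDbInvoker.requestDone(db); // no-op on the 3.x driver
}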

View File

@@ -0,0 +1,62 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.MapReduceCommand;
/**
* {@link ReflectiveMapReduceInvoker} provides reflective access to {@link MapReduceCommand} API that is not
* consistently available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveMapReduceInvoker {
private static final Method ADD_EXTRA_OPTION_METHOD;
static {
ADD_EXTRA_OPTION_METHOD = findMethod(MapReduceCommand.class, "addExtraOption", String.class, Object.class);
}
private ReflectiveMapReduceInvoker() {}
/**
* Sets the extra option for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3.
*
* @param cmd can be {@literal null} for MongoDB Java driver version 3.
* @param key
* @param value
*/
public static void addExtraOption(MapReduceCommand cmd, String key, Object value) {
if (isMongo3Driver()) {
return;
}
Assert.notNull(cmd, "MapReduceCommand must not be null!");
invokeMethod(ADD_EXTRA_OPTION_METHOD, cmd, key, value);
}
}
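Illustrative use of the invoker above (not part of the diff); the collection and the map/reduce function strings are assumed to exist.
MapReduceCommand command = new MapReduceCommand(collection, mapFunction, reduceFunction,
        "mapReduceResults", MapReduceCommand.OutputType.REPLACE, new BasicDBObject());
// Applied via reflection on the 2.x driver, silently skipped on 3.x where the method no longer exists.
ReflectiveMapReduceInvoker.addExtraOption(command, "verbose", true);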

View File

@@ -0,0 +1,158 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.util.ReflectionUtils;
import com.mongodb.MongoOptions;
/**
* {@link ReflectiveMongoOptionsInvoker} provides reflective access to {@link MongoOptions} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
@SuppressWarnings("deprecation")
class ReflectiveMongoOptionsInvoker {
private static final Method GET_AUTO_CONNECT_RETRY_METHOD;
private static final Method SET_AUTO_CONNECT_RETRY_METHOD;
private static final Method GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
private static final Method SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
static {
SET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils
.findMethod(MongoOptions.class, "setAutoConnectRetry", boolean.class);
GET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils.findMethod(MongoOptions.class, "isAutoConnectRetry");
SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"setMaxAutoConnectRetryTime", long.class);
GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"getMaxAutoConnectRetryTime");
}
private ReflectiveMongoOptionsInvoker() {}
/**
* Sets the retry connection flag for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param autoConnectRetry
*/
public static void setAutoConnectRetry(MongoOptions options, boolean autoConnectRetry) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_AUTO_CONNECT_RETRY_METHOD, options, autoConnectRetry);
}
/**
* Sets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param maxAutoConnectRetryTime
*/
public static void setMaxAutoConnectRetryTime(MongoOptions options, long maxAutoConnectRetryTime) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options, maxAutoConnectRetryTime);
}
/**
* Sets the slaveOk attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param slaveOk
*/
public static void setSlaveOk(MongoOptions options, boolean slaveOk) {
if (isMongo3Driver()) {
return;
}
new DirectFieldAccessor(options).setPropertyValue("slaveOk", slaveOk);
}
/**
* Gets the slaveOk attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException} for
* MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getSlaveOk(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for slaveOk which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) new DirectFieldAccessor(options).getPropertyValue("slaveOk")).booleanValue();
}
/**
* Gets the autoConnectRetry attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException}
* for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getAutoConnectRetry(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) invokeMethod(GET_AUTO_CONNECT_RETRY_METHOD, options)).booleanValue();
}
/**
* Gets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Throws
* {@link UnsupportedOperationException} for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static long getMaxAutoConnectRetryTime(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for maxAutoConnectRetryTime which has been removed in MongoDB Java driver version 3.");
}
return ((Long) invokeMethod(GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options)).longValue();
}
}
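A short sketch of the tolerant setter behaviour described above (not part of the diff); the option values are illustrative.
MongoOptions options = new MongoOptions();
// Both calls take effect on the 2.x driver and are silently ignored on 3.x,
// where the corresponding methods have been removed.
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, true);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, 10000);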

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.springframework.beans.DirectFieldAccessor;
import com.mongodb.WriteConcern;
/**
* {@link ReflectiveWriteConcernInvoker} provides reflective access to {@link WriteConcern} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveWriteConcernInvoker {
private static final WriteConcern NONE_OR_UNACKNOWLEDGED;
static {
NONE_OR_UNACKNOWLEDGED = isMongo3Driver() ? WriteConcern.UNACKNOWLEDGED : (WriteConcern) new DirectFieldAccessor(
new WriteConcern()).getPropertyValue("NONE");
}
/**
* @return {@link WriteConcern#NONE} for MongoDB Java driver version 2, otherwise {@link WriteConcern#UNACKNOWLEDGED}.
*/
public static WriteConcern noneOrUnacknowledged() {
return NONE_OR_UNACKNOWLEDGED;
}
}

View File

@@ -0,0 +1,67 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* {@link ReflectiveWriteResultInvoker} provides reflective access to {@link WriteResult} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveWriteResultInvoker {
private static final Method GET_ERROR_METHOD;
private static final Method WAS_ACKNOWLEDGED_METHOD;
private ReflectiveWriteResultInvoker() {}
static {
GET_ERROR_METHOD = findMethod(WriteResult.class, "getError");
WAS_ACKNOWLEDGED_METHOD = findMethod(WriteResult.class, "wasAcknowledged");
}
/**
* @param writeResult can be {@literal null} for MongoDB Java driver version 3.
* @return null in case of MongoDB Java driver version 3 since errors are thrown as {@link MongoException}.
*/
public static String getError(WriteResult writeResult) {
if (isMongo3Driver()) {
return null;
}
return (String) invokeMethod(GET_ERROR_METHOD, writeResult);
}
/**
* @param writeResult
* @return {@literal true} in case of MongoDB Java driver version 2, the actual acknowledgement state for version 3.
*/
public static boolean wasAcknowledged(WriteResult writeResult) {
return isMongo3Driver() ? ((Boolean) invokeMethod(WAS_ACKNOWLEDGED_METHOD, writeResult)).booleanValue() : true;
}
}
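A sketch of version-independent error handling using the invoker above; the collection instance and the inserted document are assumptions.
WriteResult writeResult = collection.insert(new BasicDBObject("_id", 42));
if (ReflectiveWriteResultInvoker.wasAcknowledged(writeResult)) {
    // Non-null only on the 2.x driver; the 3.x driver raises MongoExceptions instead.
    String error = ReflectiveWriteResultInvoker.getError(writeResult);
}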

View File

@@ -0,0 +1,84 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import com.mongodb.DB;
/**
* Script operations on {@link com.mongodb.DB} level. Allows interaction with server side JavaScript functions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public interface ScriptOperations {
/**
* Stores the given {@link ExecutableMongoScript}, generating a synthetic name so that it can be called by that name
* subsequently.
*
* @param script must not be {@literal null}.
* @return {@link NamedMongoScript} with name under which the {@code JavaScript} function can be called.
*/
NamedMongoScript register(ExecutableMongoScript script);
/**
* Registers the given {@link NamedMongoScript} in the database.
*
* @param script the {@link NamedMongoScript} to be registered.
* @return
*/
NamedMongoScript register(NamedMongoScript script);
/**
* Executes the {@literal script} by either calling it via its {@literal name} or directly sending it.
*
* @param script must not be {@literal null}.
* @param args arguments to pass on for script execution.
* @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException
*/
Object execute(ExecutableMongoScript script, Object... args);
/**
* Call the {@literal JavaScript} by its name.
*
* @param scriptName must not be {@literal null} or empty.
* @param args
* @return
*/
Object call(String scriptName, Object... args);
/**
* Checks {@link DB} for existence of {@link ServerSideJavaScript} with given name.
*
* @param scriptName must not be {@literal null} or empty.
* @return false if no {@link ServerSideJavaScript} with given name exists.
*/
boolean exists(String scriptName);
/**
* Returns names of {@literal JavaScript} functions that can be called.
*
* @return empty {@link Set} if no scripts found.
*/
Set<String> getScriptNames();
}
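A usage sketch for the new interface, assuming a MongoOperations instance named template that exposes these operations via scriptOps(); the JavaScript function and argument are illustrative.
ScriptOperations scriptOps = template.scriptOps();

NamedMongoScript echo = scriptOps.register(new ExecutableMongoScript("function(x) { return x; }"));
Object result = scriptOps.call(echo.getName(), "directly server-side");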

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,12 +19,17 @@ import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
@@ -34,6 +39,8 @@ import com.mongodb.WriteConcern;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -41,27 +48,49 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private final String databaseName;
private final boolean mongoInstanceCreated;
private final UserCredentials credentials;
private final PersistenceExceptionTranslator exceptionTranslator;
private final String authenticationDatabaseName;
private WriteConcern writeConcern;
/**
* Create an instance of {@link SimpleMongoDbFactory} given the {@link Mongo} instance and database name.
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, must not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, UserCredentials.NO_CREDENTIALS, false);
this(mongo, databaseName, null);
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false);
this(mongo, databaseName, credentials, false, null);
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
}
/**
@@ -71,13 +100,43 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true);
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated) {
boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
@@ -88,6 +147,31 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.credentials = credentials == null ? UserCredentials.NO_CREDENTIALS : credentials;
this.exceptionTranslator = new MongoExceptionTranslator();
this.authenticationDatabaseName = StringUtils.hasText(authenticationDatabaseName) ? authenticationDatabaseName
: databaseName;
Assert.isTrue(this.authenticationDatabaseName.matches("[\\w-]+"),
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
@@ -111,11 +195,12 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
DB db = MongoDbUtils.getDB(mongo, dbName, credentials);
DB db = MongoDbUtils.getDB(mongo, dbName, credentials, authenticationDatabaseName);
if (writeConcern != null) {
db.setWriteConcern(writeConcern);
@@ -138,4 +223,13 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private static String parseChars(char[] chars) {
return chars == null ? null : String.valueOf(chars);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
}
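For orientation, a sketch of the non-deprecated ways to create the factory that this change introduces; host and database names are placeholders and checked UnknownHostException handling is omitted for brevity.
// Directly from a MongoClient ...
MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClient("localhost"), "database");

// ... or from a connection string carrying database name and credentials.
MongoDbFactory uriBasedFactory = new SimpleMongoDbFactory(new MongoClientURI("mongodb://localhost/database"));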

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,27 +13,25 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* A strategy interface to determine the {@link WriteConcern} to use for a given {@link MongoAction}. Return the passed
* in default {@link WriteConcern} (a property on {@link MongoAction}) if no determination can be made.
*
* @author Mark Pollack
*
* @author Oliver Gierke
*/
public interface WriteConcernResolver {
/**
* Resolve the {@link WriteConcern} given the {@link MongoAction}.
*
* @param action describes the context of the Mongo action. Contains a default {@link WriteConcern} to use if one
* should not be resolved.
* @return a {@link WriteConcern} based on the passed in {@link MongoAction} value, maybe {@literal null}.
*/
WriteConcern resolve(MongoAction action);
}
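A sketch of a custom resolver (not from the diff); the collection name and the chosen WriteConcern are illustrative.
WriteConcernResolver resolver = new WriteConcernResolver() {

    public WriteConcern resolve(MongoAction action) {
        // Escalate writes to an assumed 'audit' collection, otherwise keep the action's default.
        return "audit".equals(action.getCollectionName())
                ? WriteConcern.ACKNOWLEDGED
                : action.getDefaultWriteConcern();
    }
};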

View File

@@ -1,5 +1,28 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Enum to represent how strict the check of {@link com.mongodb.WriteResult} shall be. It can either be skipped entirely
* (use {@link #NONE}), or errors can be logged ({@link #LOG}) or cause an exception to be thrown {@link #EXCEPTION}.
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
public enum WriteResultChecking {
NONE, LOG, EXCEPTION
}
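As a usage note, a sketch of how the enum is typically applied, assuming a MongoTemplate named template.
// LOG would only log write problems; NONE (the default) skips the check entirely.
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);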

View File

@@ -0,0 +1,438 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
* Framework.
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class Aggregation {
/**
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
* stage.
*/
public static final String ROOT = SystemVariable.ROOT.toString();
/**
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
* otherwise, all stages start with CURRENT being the same as ROOT.
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
private final AggregationOptions options;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
return newAggregation(operations.toArray(new AggregationOperation[operations.size()]));
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(AggregationOperation... operations) {
return new Aggregation(operations);
}
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
*/
public Aggregation withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new Aggregation(this.operations, options);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public static <T> TypedAggregation<T> newAggregation(Class<T> type, List<? extends AggregationOperation> operations) {
return newAggregation(type, operations.toArray(new AggregationOperation[operations.size()]));
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public static <T> TypedAggregation<T> newAggregation(Class<T> type, AggregationOperation... operations) {
return new TypedAggregation<T>(type, operations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
this(asAggregationList(aggregationOperations));
}
/**
* @param aggregationOperations must not be {@literal null} or empty.
* @return
*/
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
return Arrays.asList(aggregationOperations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
this(aggregationOperations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
this.operations = aggregationOperations;
this.options = options;
}
/**
* A pointer to the previous {@link AggregationOperation}.
*
* @return
*/
public static String previousOperation() {
return "_id";
}
/**
* Creates a new {@link ProjectionOperation} including the given fields.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ProjectionOperation project(String... fields) {
return project(fields(fields));
}
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ProjectionOperation project(Fields fields) {
return new ProjectionOperation(fields);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
*
* @param field must not be {@literal null} or empty.
* @return
*/
public static UnwindOperation unwind(String field) {
return new UnwindOperation(field(field));
}
/**
* Creates a new {@link GroupOperation} for the given fields.
*
* @param fields must not be {@literal null}.
* @return
*/
public static GroupOperation group(String... fields) {
return group(fields(fields));
}
/**
* Creates a new {@link GroupOperation} for the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static GroupOperation group(Fields fields) {
return new GroupOperation(fields);
}
/**
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
*
* @param sort must not be {@literal null}.
* @return
*/
public static SortOperation sort(Sort sort) {
return new SortOperation(sort);
}
/**
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
*
* @param direction must not be {@literal null}.
* @param fields must not be {@literal null}.
* @return
*/
public static SortOperation sort(Direction direction, String... fields) {
return new SortOperation(new Sort(direction, fields));
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return
*/
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
*
* @param maxElements must not be less than zero.
* @return
*/
public static LimitOperation limit(long maxElements) {
return new LimitOperation(maxElements);
}
/**
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
*
* @param criteria must not be {@literal null}.
* @return
*/
public static MatchOperation match(Criteria criteria) {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link Fields} instance for the given field names.
*
* @see Fields#fields(String...)
* @param fields must not be {@literal null}.
* @return
*/
public static Fields fields(String... fields) {
return Fields.fields(fields);
}
/**
* Creates a new {@link Fields} instance from the given field name and target reference.
*
* @param name must not be {@literal null} or empty.
* @param target must not be {@literal null} or empty.
* @return
*/
public static Fields bind(String name, String target) {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
/**
* Returns a new {@link AggregationOptions.Builder}.
*
* @return
* @since 1.6
*/
public static AggregationOptions.Builder newAggregationOptions() {
return new AggregationOptions.Builder();
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
* @param inputCollectionName the name of the input collection
* @return the {@code DBObject} representing this aggregation
*/
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
AggregationOperationContext context = rootContext;
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
command = options.applyAndReturnPotentiallyChangedCommand(command);
return command;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return SerializationUtils
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new FieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
* @author Thomas Darimont
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
*/
enum SystemVariable {
ROOT, CURRENT;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
* @param fieldRef may be {@literal null}.
* @return
*/
public static boolean isReferingToSystemVariable(String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
}
}
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());
}
}
}
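To tie the factory methods above together, a sketch of a complete pipeline (not from the diff); the collection, field names and criteria are illustrative.
Aggregation aggregation = newAggregation(
        match(Criteria.where("state").is("WA")),
        group("city").count().as("population"),
        sort(Direction.DESC, "population"),
        limit(10));

// Renders the { "aggregate" : "zips", "pipeline" : [ ... ] } command built by toDbObject(...).
DBObject command = aggregation.toDbObject("zips", Aggregation.DEFAULT_CONTEXT);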

View File

@@ -0,0 +1,37 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
interface AggregationExpression {
/**
* Turns the {@link AggregationExpression} into a {@link DBObject} within the given
* {@link AggregationOperationContext}.
*
* @param context
* @return
*/
DBObject toDbObject(AggregationOperationContext context);
}

View File

@@ -0,0 +1,82 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationExpressionTransformer.AggregationExpressionTransformationContext;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.spel.ExpressionNode;
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
import org.springframework.data.mongodb.core.spel.ExpressionTransformer;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Interface to type an {@link ExpressionTransformer} to the contained
* {@link AggregationExpressionTransformationContext}.
*
* @author Oliver Gierke
*/
interface AggregationExpressionTransformer extends
ExpressionTransformer<AggregationExpressionTransformationContext<ExpressionNode>> {
/**
* A special {@link ExpressionTransformationContextSupport} to be aware of the {@link AggregationOperationContext}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public static class AggregationExpressionTransformationContext<T extends ExpressionNode> extends
ExpressionTransformationContextSupport<T> {
private final AggregationOperationContext aggregationContext;
/**
* Creates an {@link AggregationExpressionTransformationContext}.
*
* @param currentNode must not be {@literal null}.
* @param parentNode
* @param previousOperationObject
* @param aggregationContext must not be {@literal null}.
*/
public AggregationExpressionTransformationContext(T currentNode, ExpressionNode parentNode,
DBObject previousOperationObject, AggregationOperationContext context) {
super(currentNode, parentNode, previousOperationObject);
Assert.notNull(context, "AggregationOperationContext must not be null!");
this.aggregationContext = context;
}
/**
* Returns the underlying {@link AggregationOperationContext}.
*
* @return
*/
public AggregationOperationContext getAggregationContext() {
return aggregationContext;
}
/**
* Returns the {@link FieldReference} for the current {@link ExpressionNode}.
*
* @return
*/
public FieldReference getFieldReference() {
return aggregationContext.getReference(getCurrentNode().getName());
}
}
}

View File

@@ -0,0 +1,105 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
public enum AggregationFunctionExpressions {
SIZE;
/**
* Returns an {@link AggregationExpression} built from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final Object[] values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = values;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.length);
for (int i = 0; i < values.length; i++) {
args.add(unpack(values[i], context));
}
return new BasicDBObject("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}
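A sketch of what the SIZE expression renders to; the field name is illustrative, and the call is shown as if made from within the aggregation package since the expression types are package-private.
AggregationExpression sizeOfTags = AggregationFunctionExpressions.SIZE.of(Fields.field("tags"));

// With the default (no-op) context this renders { "$size" : [ "$tags" ] }.
DBObject sizeExpression = sizeOfTags.toDbObject(Aggregation.DEFAULT_CONTEXT);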

View File

@@ -0,0 +1,37 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.DBObject;
/**
* Represents one single operation in an aggregation pipeline.
*
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public interface AggregationOperation {
/**
* Turns the {@link AggregationOperation} into a {@link DBObject} by using the given
* {@link AggregationOperationContext}.
*
* @return the DBObject
*/
DBObject toDBObject(AggregationOperationContext context);
}
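Since the interface is public, pipeline stages without a dedicated factory method can be hand-rolled; a sketch using the $sample operator, where the operator choice and sample size are illustrative.
AggregationOperation sampleStage = new AggregationOperation() {

    public DBObject toDBObject(AggregationOperationContext context) {
        return new BasicDBObject("$sample", new BasicDBObject("size", 5));
    }
};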

View File

@@ -0,0 +1,55 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import com.mongodb.DBObject;
/**
* The context for an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @since 1.3
*/
public interface AggregationOperationContext {
/**
* Returns the mapped {@link DBObject}, potentially converting the source considering mapping metadata etc.
*
* @param dbObject will never be {@literal null}.
* @return must not be {@literal null}.
*/
DBObject getMappedObject(DBObject dbObject);
/**
* Returns a {@link FieldReference} for the given field or {@literal null} if the context does not expose the given
* field.
*
* @param field must not be {@literal null}.
* @return
*/
FieldReference getReference(Field field);
/**
* Returns the {@link FieldReference} for the field with the given name or {@literal null} if the context does not
* expose a field with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
FieldReference getReference(String name);
}
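As a minimal sketch of this contract, a pass-through context could look like the following. The class name is hypothetical, and because ExposedField and FieldReference are package-private it would have to live in the org.springframework.data.mongodb.core.aggregation package.

import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import com.mongodb.DBObject;

class PassThroughAggregationOperationContext implements AggregationOperationContext {

	@Override
	public DBObject getMappedObject(DBObject dbObject) {
		return dbObject; // no mapping metadata applied
	}

	@Override
	public FieldReference getReference(Field field) {
		return new FieldReference(new ExposedField(field, true));
	}

	@Override
	public FieldReference getReference(String name) {
		return new FieldReference(new ExposedField(name, true));
	}
}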

View File

@@ -0,0 +1,189 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation:
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
private static final String CURSOR = "cursor";
private static final String EXPLAIN = "explain";
private static final String ALLOW_DISK_USE = "allowDiskUse";
private final boolean allowDiskUse;
private final boolean explain;
private final DBObject cursor;
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options to the aggregation.
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = cursor;
}
/**
* Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
* the dbPath directory.
*
* @return
*/
public boolean isAllowDiskUse() {
return allowDiskUse;
}
/**
* Returns whether to return information on the processing of the pipeline instead of the actual results.
*
* @return
*/
public boolean isExplain() {
return explain;
}
/**
* Specify a document that contains options that control the creation of the cursor object.
*
* @return
*/
public DBObject getCursor() {
return cursor;
}
/**
* Returns a new, potentially adjusted copy of the given aggregation {@code command} with this configuration
* applied.
*
* @param command the aggregation command.
* @return
*/
DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {
DBObject result = new BasicDBObject(command.toMap());
if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
result.put(ALLOW_DISK_USE, allowDiskUse);
}
if (explain && !result.containsField(EXPLAIN)) {
result.put(EXPLAIN, explain);
}
if (cursor != null && !result.containsField(CURSOR)) {
result.put("cursor", cursor);
}
return result;
}
/**
* Returns a {@link DBObject} representation of this {@link AggregationOptions}.
*
* @return
*/
public DBObject toDbObject() {
DBObject dbo = new BasicDBObject();
dbo.put(ALLOW_DISK_USE, allowDiskUse);
dbo.put(EXPLAIN, explain);
dbo.put(CURSOR, cursor);
return dbo;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDbObject().toString();
}
/**
* A Builder for {@link AggregationOptions}.
*
* @author Thomas Darimont
*/
public static class Builder {
private boolean allowDiskUse;
private boolean explain;
private DBObject cursor;
/**
* Defines whether to off-load intensive sort-operations to disk.
*
* @param allowDiskUse
* @return
*/
public Builder allowDiskUse(boolean allowDiskUse) {
this.allowDiskUse = allowDiskUse;
return this;
}
/**
* Defines whether to get the execution plan for the aggregation instead of the actual results.
*
* @param explain
* @return
*/
public Builder explain(boolean explain) {
this.explain = explain;
return this;
}
/**
* Additional options to the aggregation.
*
* @param cursor
* @return
*/
public Builder cursor(DBObject cursor) {
this.cursor = cursor;
return this;
}
/**
* Returns a new {@link AggregationOptions} instance with the given configuration.
*
* @return
*/
public AggregationOptions build() {
return new AggregationOptions(allowDiskUse, explain, cursor);
}
}
}
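A hedged usage sketch of the builder above; the sort field "lastName" is an assumption.

import org.springframework.data.domain.Sort;

AggregationOptions options = new AggregationOptions.Builder()
		.allowDiskUse(true) // permit spilling large sorts to disk
		.build();

Aggregation aggregation = Aggregation.newAggregation(
		Aggregation.sort(Sort.Direction.ASC, "lastName"))
		.withOptions(options);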

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Collects the results of executing an aggregation operation.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Thomas Darimont
* @param <T> the type the results are mapped onto.
* @since 1.3
*/
public class AggregationResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private final DBObject rawResults;
private final String serverUsed;
/**
* Creates a new {@link AggregationResults} instance from the given mapped and raw results.
*
* @param mappedResults must not be {@literal null}.
* @param rawResults must not be {@literal null}.
*/
public AggregationResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults, "List of mapped results must not be null!");
Assert.notNull(rawResults, "Raw results must not be null!");
this.mappedResults = Collections.unmodifiableList(mappedResults);
this.rawResults = rawResults;
this.serverUsed = parseServerUsed();
}
/**
* Returns the aggregation results.
*
* @return
*/
public List<T> getMappedResults() {
return mappedResults;
}
/**
* Returns the unique mapped result. Assumes no result or exactly one.
*
* @return
* @throws IllegalArgumentException in case more than one result is available.
*/
public T getUniqueMappedResult() {
Assert.isTrue(mappedResults.size() < 2, "Expected unique result or null, but got more than one!");
return mappedResults.size() == 1 ? mappedResults.get(0) : null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}
/**
* Returns the server that has been used to perform the aggregation.
*
* @return
*/
public String getServerUsed() {
return serverUsed;
}
/**
* Returns the raw result that was returned by the server.
*
* @return
* @since 1.6
*/
public DBObject getRawResults() {
return rawResults;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");
return object instanceof String ? (String) object : null;
}
}
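A hedged sketch of consuming the results; the mongoTemplate instance, the prepared aggregation, the "orders" collection and the OrderSummary mapping class are assumptions.

AggregationResults<OrderSummary> results =
		mongoTemplate.aggregate(aggregation, "orders", OrderSummary.class);

List<OrderSummary> summaries = results.getMappedResults(); // unmodifiable, never null
OrderSummary single = results.getUniqueMappedResult();     // null or exactly one result
DBObject raw = results.getRawResults();                    // complete server reply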

View File

@@ -0,0 +1,413 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import org.springframework.util.CompositeIterator;
/**
* Value object to capture the fields exposed by an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class ExposedFields implements Iterable<ExposedField> {
private static final List<ExposedField> NO_FIELDS = Collections.emptyList();
private static final ExposedFields EMPTY = new ExposedFields(NO_FIELDS, NO_FIELDS);
private final List<ExposedField> originalFields;
private final List<ExposedField> syntheticFields;
/**
* Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ExposedFields from(ExposedField... fields) {
return from(Arrays.asList(fields));
}
/**
* Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
*
* @param fields must not be {@literal null}.
* @return
*/
private static ExposedFields from(List<ExposedField> fields) {
ExposedFields result = EMPTY;
for (ExposedField field : fields) {
result = result.and(field);
}
return result;
}
/**
* Creates synthetic {@link ExposedFields} from the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ExposedFields synthetic(Fields fields) {
return createFields(fields, true);
}
/**
* Creates non-synthetic {@link ExposedFields} from the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
public static ExposedFields nonSynthetic(Fields fields) {
return createFields(fields, false);
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
* @return
*/
private static ExposedFields createFields(Fields fields, boolean synthetic) {
Assert.notNull(fields, "Fields must not be null!");
List<ExposedField> result = new ArrayList<ExposedField>();
for (Field field : fields) {
result.add(new ExposedField(field, synthetic));
}
return ExposedFields.from(result);
}
/**
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
*/
private ExposedFields(List<ExposedField> originals, List<ExposedField> synthetic) {
this.originalFields = originals;
this.syntheticFields = synthetic;
}
/**
* Creates a new {@link ExposedFields} adding the given {@link ExposedField}.
*
* @param field must not be {@literal null}.
* @return
*/
public ExposedFields and(ExposedField field) {
Assert.notNull(field, "Exposed field must not be null!");
ArrayList<ExposedField> result = new ArrayList<ExposedField>();
result.addAll(field.synthetic ? syntheticFields : originalFields);
result.add(field);
return new ExposedFields(field.synthetic ? originalFields : result, field.synthetic ? result : syntheticFields);
}
/**
* Returns the field with the given name or {@literal null} if no field with the given name is available.
*
* @param name
* @return
*/
public ExposedField getField(String name) {
for (ExposedField field : this) {
if (field.canBeReferredToBy(name)) {
return field;
}
}
return null;
}
/**
* Returns whether the {@link ExposedFields} exposes no non-synthetic fields at all.
*
* @return
*/
boolean exposesNoNonSyntheticFields() {
return originalFields.isEmpty();
}
/**
* Returns whether the {@link ExposedFields} exposes a single non-synthetic field only.
*
* @return
*/
boolean exposesSingleNonSyntheticFieldOnly() {
return originalFields.size() == 1;
}
/**
* Returns whether the {@link ExposedFields} exposes no fields at all.
*
* @return
*/
boolean exposesNoFields() {
return exposedFieldsCount() == 0;
}
/**
* Returns whether the {@link ExposedFields} exposes a single field only.
*
* @return
*/
boolean exposesSingleFieldOnly() {
return exposedFieldsCount() == 1;
}
/**
* @return
*/
private int exposedFieldsCount() {
return originalFields.size() + syntheticFields.size();
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
@Override
public Iterator<ExposedField> iterator() {
CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
iterator.add(syntheticFields.iterator());
iterator.add(originalFields.iterator());
return iterator;
}
/**
* A single exposed field.
*
* @author Oliver Gierke
*/
static class ExposedField implements Field {
private final boolean synthetic;
private final Field field;
/**
* Creates a new {@link ExposedField} with the given key.
*
* @param key must not be {@literal null} or empty.
* @param synthetic whether the exposed field is synthetic.
*/
public ExposedField(String key, boolean synthetic) {
this(Fields.field(key), synthetic);
}
/**
* Creates a new {@link ExposedField} for the given {@link Field}.
*
* @param delegate must not be {@literal null}.
* @param synthetic whether the exposed field is synthetic.
*/
public ExposedField(Field delegate, boolean synthetic) {
this.field = delegate;
this.synthetic = synthetic;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
*/
@Override
public String getName() {
return field.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getTarget()
*/
@Override
public String getTarget() {
return field.getTarget();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#isAliased()
*/
@Override
public boolean isAliased() {
return field.isAliased();
}
/**
* @return the synthetic
*/
public boolean isSynthetic() {
return synthetic;
}
/**
* Returns whether the field can be referred to using the given name.
*
* @param name
* @return
*/
public boolean canBeReferredToBy(String name) {
return getName().equals(name) || getTarget().equals(name);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("AggregationField: %s, synthetic: %s", field, synthetic);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof ExposedField)) {
return false;
}
ExposedField that = (ExposedField) obj;
return this.field.equals(that.field) && this.synthetic == that.synthetic;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * field.hashCode();
result += 31 * (synthetic ? 0 : 1);
return result;
}
}
/**
* A reference to an {@link ExposedField}.
*
* @author Oliver Gierke
*/
static class FieldReference {
private final ExposedField field;
/**
* Creates a new {@link FieldReference} for the given {@link ExposedField}.
*
* @param field must not be {@literal null}.
*/
public FieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}
/**
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
*
* @return
*/
public String getRaw() {
String target = field.getTarget();
return field.synthetic ? target : String.format("%s.%s", Fields.UNDERSCORE_ID, target);
}
/**
* Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return
*/
public Object getReferenceValue() {
return field.synthetic && !field.isAliased() ? 1 : toString();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("$%s", getRaw());
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof FieldReference)) {
return false;
}
FieldReference that = (FieldReference) obj;
return this.field.equals(that.field);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return field.hashCode();
}
}
}
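Since ExposedField and FieldReference are package-private, the reference-rendering rules above are easiest to illustrate with a package-local sketch; the field name "total" is hypothetical.

ExposedFields exposed = ExposedFields.synthetic(Fields.fields("total"));
FieldReference reference = new FieldReference(exposed.getField("total"));

reference.toString();          // "$total" - synthetic fields are referenced by their plain name
reference.getReferenceValue(); // 1, because the field is synthetic and not aliased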

View File

@@ -0,0 +1,117 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} that combines the available field references from a given
* {@code AggregationOperationContext} and a {@link FieldsExposingAggregationOperation}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.4
*/
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
private final ExposedFields exposedFields;
private final AggregationOperationContext rootContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
this.exposedFields = exposedFields;
this.rootContext = rootContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return rootContext.getMappedObject(dbObject);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return getReference(field, field.getTarget());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return getReference(null, name);
}
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
*/
private FieldReference getReference(Field field, String name) {
Assert.notNull(name, "Name must not be null!");
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
if (field != null) {
// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
return new FieldReference(new ExposedField(field, exposedField.isSynthetic()));
}
return new FieldReference(exposedField);
}
if (name.contains(".")) {
// for nested field references we only check that the root field exists.
ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);
if (rootField != null) {
// We have to set synthetic to true in order to render the field name as-is.
return new FieldReference(new ExposedField(name, true));
}
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,29 +13,34 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
/**
* Abstraction for a field.
*
* @author Oliver Gierke
* @since 1.3
*/
public interface Field {
/**
* Returns the name of the field.
*
* @return must not be {@literal null}.
*/
String getName();
/**
* Returns the target of the field. In case no explicit target is available {@link #getName()} should be returned.
*
* @return must not be {@literal null}.
*/
String getTarget();
/**
* Returns whether the Field is aliased, which means it has a name set different from the target.
*
* @return
*/
boolean isAliased();
}

View File

@@ -0,0 +1,316 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class Fields implements Iterable<Field> {
private static final String AMBIGUOUS_EXCEPTION = "Found two fields both using '%s' as name: %s and %s! Please "
+ "customize your field definitions to get to unique field names!";
public static final String UNDERSCORE_ID = "_id";
public static final String UNDERSCORE_ID_REF = "$_id";
private final List<Field> fields;
/**
* Creates a new {@link Fields} instance from the given {@link Field}s.
*
* @param fields must not be {@literal null} or empty.
* @return
*/
public static Fields from(Field... fields) {
Assert.notNull(fields, "Fields must not be null!");
return new Fields(Arrays.asList(fields));
}
/**
* Creates a new {@link Fields} instance for {@link Field}s with the given names.
*
* @param names must not be {@literal null}.
* @return
*/
public static Fields fields(String... names) {
Assert.notNull(names, "Field names must not be null!");
List<Field> fields = new ArrayList<Field>();
for (String name : names) {
fields.add(field(name));
}
return new Fields(fields);
}
/**
* Creates a {@link Field} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public static Field field(String name) {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
}
/**
* Creates a new {@link Fields} instance using the given {@link Field}s.
*
* @param fields must not be {@literal null}.
*/
private Fields(List<Field> fields) {
Assert.notNull(fields, "Fields must not be null!");
this.fields = verify(fields);
}
private static final List<Field> verify(List<Field> fields) {
Map<String, Field> reference = new HashMap<String, Field>();
for (Field field : fields) {
String name = field.getName();
Field found = reference.get(name);
if (found != null) {
throw new IllegalArgumentException(String.format(AMBIGUOUS_EXCEPTION, name, found, field));
}
reference.put(name, field);
}
return fields;
}
private Fields(Fields existing, Field tail) {
this.fields = new ArrayList<Field>(existing.fields.size() + 1);
this.fields.addAll(existing.fields);
this.fields.add(tail);
}
/**
* Creates a new {@link Fields} instance with a new {@link Field} of the given name added.
*
* @param name must not be {@literal null}.
* @return
*/
public Fields and(String name) {
return and(new AggregationField(name));
}
public Fields and(String name, String target) {
return and(new AggregationField(name, target));
}
public Fields and(Field field) {
return new Fields(this, field);
}
public Fields and(Fields fields) {
Fields result = this;
for (Field field : fields) {
result = result.and(field);
}
return result;
}
public Field getField(String name) {
for (Field field : fields) {
if (field.getName().equals(name)) {
return field;
}
}
return null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
@Override
public Iterator<Field> iterator() {
return fields.iterator();
}
/**
* Value object to encapsulate a field in an aggregation operation.
*
* @author Oliver Gierke
*/
static class AggregationField implements Field {
private final String name;
private final String target;
/**
* Creates an aggregation field with the given {@code name}.
*
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
*/
public AggregationField(String name) {
this(name, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
String nameToSet = cleanUp(name);
String targetToSet = cleanUp(target);
Assert.hasText(nameToSet, "AggregationField name must not be null or empty!");
if (target == null && name.contains(".")) {
this.name = nameToSet.substring(nameToSet.indexOf('.') + 1);
this.target = nameToSet;
} else {
this.name = nameToSet;
this.target = targetToSet;
}
}
private static final String cleanUp(String source) {
if (source == null) {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
}
int dollarIndex = source.lastIndexOf('$');
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
*/
public String getName() {
return name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getAlias()
*/
public String getTarget() {
return StringUtils.hasText(this.target) ? this.target : this.name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#isAliased()
*/
@Override
public boolean isAliased() {
return !getName().equals(getTarget());
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("AggregationField - name: %s, target: %s", name, target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof AggregationField)) {
return false;
}
AggregationField that = (AggregationField) obj;
return this.name.equals(that.name) && ObjectUtils.nullSafeEquals(this.target, that.target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * name.hashCode();
result += 31 * ObjectUtils.nullSafeHashCode(target);
return result;
}
}
}
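A hedged sketch of the factory methods above; all field names are illustrative.

Field plain = Fields.field("lastName");          // name and target are both "lastName"
Field alias = Fields.field("name", "lastName");  // isAliased() == true, projects "lastName" as "name"
Field nested = Fields.field("address.city");     // name "city", target "address.city"

Fields fields = Fields.fields("firstName", "lastName").and("name", "customer.name");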

View File

@@ -0,0 +1,32 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
/**
* {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s.
*
* @author Thomas Darimont
*/
public interface FieldsExposingAggregationOperation extends AggregationOperation {
/**
* Returns the fields exposed by the {@link AggregationOperation}.
*
* @return will never be {@literal null}.
*/
ExposedFields getFields();
}

View File

@@ -0,0 +1,66 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend using the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param nearQuery must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
this.nearQuery = nearQuery;
this.distanceField = distanceField;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
command.put("distanceField", distanceField);
return new BasicDBObject("$geoNear", command);
}
}
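A hedged usage sketch; the coordinates, the maximum distance and the "distance" output field are assumptions, and the collection needs a geospatial index. Note that $geoNear has to be the first stage of the pipeline.

NearQuery nearQuery = NearQuery.near(-73.99, 40.73).maxDistance(10);

Aggregation aggregation = Aggregation.newAggregation(
		Aggregation.geoNear(nearQuery, "distance"), // adds the calculated distance as "distance"
		Aggregation.limit(5));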

View File

@@ -0,0 +1,442 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Locale;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend using the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Holds the non-synthetic fields which are the fields of the group-id structure.
*/
private final ExposedFields idFields;
private final List<Operation> operations;
/**
* Creates a new {@link GroupOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
*/
public GroupOperation(Fields fields) {
this.idFields = ExposedFields.nonSynthetic(fields);
this.operations = new ArrayList<Operation>();
}
/**
* Creates a new {@link GroupOperation} from the given {@link GroupOperation}.
*
* @param groupOperation must not be {@literal null}.
*/
protected GroupOperation(GroupOperation groupOperation) {
this(groupOperation, Collections.<Operation> emptyList());
}
/**
* Creates a new {@link GroupOperation} from the given {@link GroupOperation} and the given {@link Operation}s.
*
* @param groupOperation
* @param nextOperations
*/
private GroupOperation(GroupOperation groupOperation, List<Operation> nextOperations) {
Assert.notNull(groupOperation, "GroupOperation must not be null!");
Assert.notNull(nextOperations, "NextOperations must not be null!");
this.idFields = groupOperation.idFields;
this.operations = new ArrayList<Operation>(nextOperations.size() + 1);
this.operations.addAll(groupOperation.operations);
this.operations.addAll(nextOperations);
}
/**
* Creates a new {@link GroupOperation} from the current one adding the given {@link Operation}.
*
* @param operation must not be {@literal null}.
* @return
*/
protected GroupOperation and(Operation operation) {
return new GroupOperation(this, Arrays.asList(operation));
}
/**
* Builder for {@link GroupOperation}s on a field.
*
* @author Thomas Darimont
*/
public static final class GroupOperationBuilder {
private final GroupOperation groupOperation;
private final Operation operation;
/**
* Creates a new {@link GroupOperationBuilder} from the given {@link GroupOperation} and {@link Operation}.
*
* @param groupOperation
* @param operation
*/
private GroupOperationBuilder(GroupOperation groupOperation, Operation operation) {
Assert.notNull(groupOperation, "GroupOperation must not be null!");
Assert.notNull(operation, "Operation must not be null!");
this.groupOperation = groupOperation;
this.operation = operation;
}
/**
* Allows specifying an alias for the new operation.
*
* @param alias
* @return
*/
public GroupOperation as(String alias) {
return this.groupOperation.and(operation.withAlias(alias));
}
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $sum}-expression.
* <p>
* Count expressions are emulated via {@code $sum: 1}.
* <p>
*
* @return
*/
public GroupOperationBuilder count() {
return newBuilder(GroupOps.SUM, null, 1);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $sum}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder sum(String reference) {
return sum(reference, null);
}
private GroupOperationBuilder sum(String reference, Object value) {
return newBuilder(GroupOps.SUM, reference, value);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder addToSet(String reference) {
return addToSet(reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $addToSet}-expression for the given value.
*
* @param value
* @return
*/
public GroupOperationBuilder addToSet(Object value) {
return addToSet(null, value);
}
private GroupOperationBuilder addToSet(String reference, Object value) {
return newBuilder(GroupOps.ADD_TO_SET, reference, value);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder last(String reference) {
return newBuilder(GroupOps.LAST, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder last(AggregationExpression expr) {
return newBuilder(GroupOps.LAST, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder first(String reference) {
return newBuilder(GroupOps.FIRST, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder first(AggregationExpression expr) {
return newBuilder(GroupOps.FIRST, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder avg(String reference) {
return newBuilder(GroupOps.AVG, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder avg(AggregationExpression expr) {
return newBuilder(GroupOps.AVG, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder push(String reference) {
return push(reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given value.
*
* @param value
* @return
*/
public GroupOperationBuilder push(Object value) {
return push(null, value);
}
private GroupOperationBuilder push(String reference, Object value) {
return newBuilder(GroupOps.PUSH, reference, value);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $min}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder min(String reference) {
return newBuilder(GroupOps.MIN, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $min}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder min(AggregationExpression expr) {
return newBuilder(GroupOps.MIN, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $max}-expression for the given field-reference.
*
* @param reference
* @return
*/
public GroupOperationBuilder max(String reference) {
return newBuilder(GroupOps.MAX, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $max}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder max(AggregationExpression expr) {
return newBuilder(GroupOps.MAX, null, expr);
}
private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getFields()
*/
@Override
public ExposedFields getFields() {
ExposedFields fields = this.idFields.and(new ExposedField(Fields.UNDERSCORE_ID, true));
for (Operation operation : operations) {
fields = fields.and(operation.asField());
}
return fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public com.mongodb.DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject operationObject = new BasicDBObject();
if (idFields.exposesNoNonSyntheticFields()) {
operationObject.put(Fields.UNDERSCORE_ID, null);
} else if (idFields.exposesSingleNonSyntheticFieldOnly()) {
FieldReference reference = context.getReference(idFields.iterator().next());
operationObject.put(Fields.UNDERSCORE_ID, reference.toString());
} else {
BasicDBObject inner = new BasicDBObject();
for (ExposedField field : idFields) {
FieldReference reference = context.getReference(field);
inner.put(field.getName(), reference.toString());
}
operationObject.put(Fields.UNDERSCORE_ID, inner);
}
for (Operation operation : operations) {
operationObject.putAll(operation.toDBObject(context));
}
return new BasicDBObject("$group", operationObject);
}
interface Keyword {
String toString();
}
private static enum GroupOps implements Keyword {
SUM, LAST, FIRST, PUSH, AVG, MIN, MAX, ADD_TO_SET, COUNT;
@Override
public String toString() {
String[] parts = name().split("_");
StringBuilder builder = new StringBuilder();
for (String part : parts) {
String lowerCase = part.toLowerCase(Locale.US);
builder.append(builder.length() == 0 ? lowerCase : StringUtils.capitalize(lowerCase));
}
return "$" + builder.toString();
}
}
static class Operation implements AggregationOperation {
private final Keyword op;
private final String key;
private final String reference;
private final Object value;
public Operation(Keyword op, String key, String reference, Object value) {
this.op = op;
this.key = key;
this.reference = reference;
this.value = value;
}
public Operation withAlias(String key) {
return new Operation(op, key, reference, value);
}
public ExposedField asField() {
return new ExposedField(key, true);
}
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(key, new BasicDBObject(op.toString(), getValue(context)));
}
public Object getValue(AggregationOperationContext context) {
if (reference == null) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
return value;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(reference)) {
return reference;
}
return context.getReference(reference).toString();
}
@Override
public String toString() {
return "Operation [op=" + op + ", key=" + key + ", reference=" + reference + ", value=" + value + "]";
}
}
}
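A hedged sketch of the builder API above; "customerId" and "amount" are assumed document fields.

GroupOperation group = Aggregation.group("customerId")
		.sum("amount").as("total") // { total: { $sum: "$amount" } }
		.count().as("orders");     // emulated as { orders: { $sum: 1 } }

// renders roughly as:
// { $group: { _id: "$customerId", total: { $sum: "$amount" }, orders: { $sum: 1 } } }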

View File

@@ -0,0 +1,55 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $limit}-operation.
* <p>
* We recommend using the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class LimitOperation implements AggregationOperation {
private final long maxElements;
/**
* @param maxElements Number of documents to consider.
*/
public LimitOperation(long maxElements) {
Assert.isTrue(maxElements >= 0, "Maximum number of elements must be greater than or equal to zero!");
this.maxElements = maxElements;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$limit", maxElements);
}
}
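In a pipeline this operation is typically created via the factory method mentioned above; the sort field "score" is an assumption.

import org.springframework.data.domain.Sort;

Aggregation aggregation = Aggregation.newAggregation(
		Aggregation.sort(Sort.Direction.DESC, "score"),
		Aggregation.limit(10)); // renders as { $limit: 10 }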

View File

@@ -0,0 +1,60 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the {@code $match}-operation.
* <p>
* We recommend using the static factory method
* {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class MatchOperation implements AggregationOperation {
private final CriteriaDefinition criteriaDefinition;
/**
* Creates a new {@link MatchOperation} for the given {@link CriteriaDefinition}.
*
* @param criteriaDefinition must not be {@literal null}.
*/
public MatchOperation(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteriaDefinition, "Criteria must not be null!");
this.criteriaDefinition = criteriaDefinition;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$match", context.getMappedObject(criteriaDefinition.getCriteriaObject()));
}
}
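A hedged sketch combining the factory method with a Criteria; the "status" field and its value are assumptions.

import org.springframework.data.mongodb.core.query.Criteria;

MatchOperation match = Aggregation.match(Criteria.where("status").is("ACTIVE"));
// renders as { $match: { status: "ACTIVE" } }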

View File

@@ -0,0 +1,998 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $project}-operation.
* <p>
* Projection of fields to be used in an {@link Aggregation}. A projection is similar to a {@link Field}
* inclusion/exclusion, but more powerful: it can generate new fields, change values of given fields, etc.
* <p>
* We recommend using the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.3
*/
public class ProjectionOperation implements FieldsExposingAggregationOperation {
private static final List<Projection> NONE = Collections.emptyList();
private static final String EXCLUSION_ERROR = "Exclusion of field %s not allowed. Projections by the mongodb "
+ "aggregation framework only support the exclusion of the %s field!";
private final List<Projection> projections;
/**
* Creates a new empty {@link ProjectionOperation}.
*/
public ProjectionOperation() {
this(NONE, NONE);
}
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
*/
public ProjectionOperation(Fields fields) {
this(NONE, ProjectionOperationBuilder.FieldProjection.from(fields));
}
/**
* Copy constructor to allow building up {@link ProjectionOperation} instances from already existing
* {@link Projection}s.
*
* @param current must not be {@literal null}.
* @param projections must not be {@literal null}.
*/
private ProjectionOperation(List<? extends Projection> current, List<? extends Projection> projections) {
Assert.notNull(current, "Current projections must not be null!");
Assert.notNull(projections, "Projections must not be null!");
this.projections = new ArrayList<ProjectionOperation.Projection>(current.size() + projections.size());
this.projections.addAll(current);
this.projections.addAll(projections);
}
/**
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s and the given one.
*
* @param projection must not be {@literal null}.
* @return
*/
private ProjectionOperation and(Projection projection) {
return new ProjectionOperation(this.projections, Arrays.asList(projection));
}
/**
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s replacing the last current one with
* the given one.
*
* @param projection must not be {@literal null}.
* @return
*/
private ProjectionOperation andReplaceLastOneWith(Projection projection) {
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList() : this.projections
.subList(0, this.projections.size() - 1);
return new ProjectionOperation(projections, Arrays.asList(projection));
}
/**
* Creates a new {@link ProjectionOperationBuilder} to define a projection for the field with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public ProjectionOperationBuilder and(String name) {
return new ProjectionOperationBuilder(name, this, null);
}
public ExpressionProjectionOperationBuilder andExpression(String expression, Object... params) {
return new ExpressionProjectionOperationBuilder(expression, this, params);
}
public ProjectionOperationBuilder and(AggregationExpression expression) {
return new ProjectionOperationBuilder(expression, this, null);
}
/**
* Excludes the given fields from the projection.
*
* @param fieldNames must not be {@literal null}.
* @return
*/
public ProjectionOperation andExclude(String... fieldNames) {
for (String fieldName : fieldNames) {
Assert.isTrue(Fields.UNDERSCORE_ID.equals(fieldName),
String.format(EXCLUSION_ERROR, fieldName, Fields.UNDERSCORE_ID));
}
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fieldNames), false);
return new ProjectionOperation(this.projections, excludeProjections);
}
/**
* Includes the given fields into the projection.
*
* @param fieldNames must not be {@literal null}.
* @return
*/
public ProjectionOperation andInclude(String... fieldNames) {
List<FieldProjection> projections = FieldProjection.from(Fields.fields(fieldNames), true);
return new ProjectionOperation(this.projections, projections);
}
/**
* Includes the given fields into the projection.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation andInclude(Fields fields) {
return new ProjectionOperation(this.projections, FieldProjection.from(fields, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
ExposedFields fields = null;
for (Projection projection : projections) {
ExposedField field = projection.getExposedField();
fields = fields == null ? ExposedFields.from(field) : fields.and(field);
}
return fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject fieldObject = new BasicDBObject();
for (Projection projection : projections) {
fieldObject.putAll(projection.toDBObject(context));
}
return new BasicDBObject("$project", fieldObject);
}
/**
* Base class for {@link ProjectionOperationBuilder}s.
*
* @author Thomas Darimont
*/
private static abstract class AbstractProjectionOperationBuilder implements AggregationOperation {
protected final Object value;
protected final ProjectionOperation operation;
/**
* Creates a new {@link AbstractProjectionOperationBuilder} for the given value and {@link ProjectionOperation}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
public AbstractProjectionOperationBuilder(Object value, ProjectionOperation operation) {
Assert.notNull(value, "value must not be null or empty!");
Assert.notNull(operation, "ProjectionOperation must not be null!");
this.value = value;
this.operation = operation;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return this.operation.toDBObject(context);
}
/**
* Returns the {@link ProjectionOperation} to be finally applied, exposing the value under the given alias.
*
* @param alias will never be {@literal null} or empty.
* @return
*/
public abstract ProjectionOperation as(String alias);
}
/**
* An {@link ProjectionOperationBuilder} that is used for SpEL expression based projections.
*
* @author Thomas Darimont
*/
public static class ExpressionProjectionOperationBuilder extends ProjectionOperationBuilder {
private final Object[] params;
private final String expression;
/**
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
* parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
public ExpressionProjectionOperationBuilder(String expression, ProjectionOperation operation, Object[] parameters) {
super(expression, operation, null);
this.expression = expression;
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder#project(java.lang.String, java.lang.Object[])
*/
@Override
public ProjectionOperationBuilder project(String operation, final Object... values) {
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values) {
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.length + 1);
result.add(ExpressionProjection.toMongoExpression(context,
ExpressionProjectionOperationBuilder.this.expression, ExpressionProjectionOperationBuilder.this.params));
result.addAll(Arrays.asList(values));
return result;
}
};
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#as(java.lang.String)
*/
@Override
public ProjectionOperation as(String alias) {
Field expressionField = Fields.field(alias, alias);
return this.operation.and(new ExpressionProjection(expressionField, this.value.toString(), params));
}
/**
* A {@link Projection} based on a SpEL expression.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
static class ExpressionProjection extends Projection {
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
private final String expression;
private final Object[] params;
/**
* Creates a new {@link ExpressionProjection} for the given field, SpEL expression and parameters.
*
* @param field must not be {@literal null}.
* @param expression must not be {@literal null} or empty.
* @param parameters must not be {@literal null}.
*/
public ExpressionProjection(Field field, String expression, Object[] parameters) {
super(field);
Assert.hasText(expression, "Expression must not be null or empty!");
Assert.notNull(parameters, "Parameters must not be null!");
this.expression = expression;
this.params = parameters.clone();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(getExposedField().getName(), toMongoExpression(context, expression, params));
}
protected static Object toMongoExpression(AggregationOperationContext context, String expression, Object[] params) {
return TRANSFORMER.transform(expression, context, params);
}
}
}
/**
* Builder for {@link ProjectionOperation}s on a field.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public static class ProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
private static final String NUMBER_NOT_NULL = "Number must not be null!";
private static final String FIELD_REFERENCE_NOT_NULL = "Field reference must not be null!";
private final String name;
private final OperationProjection previousProjection;
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given name on top of the given
* {@link ProjectionOperation}.
*
* @param name must not be {@literal null} or empty.
* @param operation must not be {@literal null}.
* @param previousProjection the previous operation projection, may be {@literal null}.
*/
public ProjectionOperationBuilder(String name, ProjectionOperation operation, OperationProjection previousProjection) {
super(name, operation);
this.name = name;
this.previousProjection = previousProjection;
}
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given value on top of the given
* {@link ProjectionOperation}.
*
* @param value
* @param operation
* @param previousProjection
*/
protected ProjectionOperationBuilder(Object value, ProjectionOperation operation,
OperationProjection previousProjection) {
super(value, operation);
this.name = null;
this.previousProjection = previousProjection;
}
/**
* Projects the result of the previous operation onto the current field. Automatically adds an exclusion for
* {@code _id}, as the value that would be held in it by default now goes into the field just projected into.
*
* @return
*/
public ProjectionOperation previousOperation() {
return this.operation.andExclude(Fields.UNDERSCORE_ID) //
.and(new PreviousOperationProjection(name));
}
/**
* Defines a nested field binding for the current field.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation nested(Fields fields) {
return this.operation.and(new NestedFieldProjection(name, fields));
}
/**
* Defines an alias for the previous projection operation.
*
* @param alias must not be {@literal null} or empty.
* @return
*/
@Override
public ProjectionOperation as(String alias) {
if (this.previousProjection != null) {
return this.operation.andReplaceLastOneWith(this.previousProjection.withAlias(alias));
}
if (value instanceof AggregationExpression) {
return this.operation.and(new ExpressionProjection(Fields.field(alias), (AggregationExpression) value));
}
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
}
/**
* Generates an {@code $add} expression that adds the given number to the previously mentioned field.
*
* @param number
* @return
*/
public ProjectionOperationBuilder plus(Number number) {
Assert.notNull(number, NUMBER_NOT_NULL);
return project("add", number);
}
/**
* Generates an {@code $add} expression that adds the value of the given field to the previously mentioned field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder plus(String fieldReference) {
Assert.notNull(fieldReference, "Field reference must not be null!");
return project("add", Fields.field(fieldReference));
}
/**
* Generates an {@code $subtract} expression that subtracts the given number from the previously mentioned field.
*
* @param number
* @return
*/
public ProjectionOperationBuilder minus(Number number) {
Assert.notNull(number, "Number must not be null!");
return project("subtract", number);
}
/**
* Generates an {@code $subtract} expression that subtracts the value of the given field from the previously
* mentioned field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder minus(String fieldReference) {
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
return project("subtract", Fields.field(fieldReference));
}
/**
* Generates an {@code $multiply} expression that multiplies the given number with the previously mentioned field.
*
* @param number
* @return
*/
public ProjectionOperationBuilder multiply(Number number) {
Assert.notNull(number, NUMBER_NOT_NULL);
return project("multiply", number);
}
/**
* Generates an {@code $multiply} expression that multiplies the value of the given field with the previously
* mentioned field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder multiply(String fieldReference) {
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
return project("multiply", Fields.field(fieldReference));
}
/**
* Generates an {@code $divide} expression that divides the previously mentioned field by the given number.
*
* @param number
* @return
*/
public ProjectionOperationBuilder divide(Number number) {
Assert.notNull(number, NUMBER_NOT_NULL);
Assert.isTrue(Math.abs(number.intValue()) != 0, "Number must not be zero!");
return project("divide", number);
}
/**
* Generates an {@code $divide} expression that divides the previously mentioned field by the value of the given
* field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder divide(String fieldReference) {
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
return project("divide", Fields.field(fieldReference));
}
/**
* Generates an {@code $mod} expression that divides the previously mentioned field by the given number and returns
* the remainder.
*
* @param number
* @return
*/
public ProjectionOperationBuilder mod(Number number) {
Assert.notNull(number, NUMBER_NOT_NULL);
Assert.isTrue(Math.abs(number.intValue()) != 0, "Number must not be zero!");
return project("mod", number);
}
/**
* Generates an {@code $mod} expression that divides the previously mentioned field by the value of the given field
* and returns the remainder.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder mod(String fieldReference) {
Assert.notNull(fieldReference, FIELD_REFERENCE_NOT_NULL);
return project("mod", Fields.field(fieldReference));
}
/**
 * Generates a {@code $size} expression that returns the number of elements held by the previously mentioned array
 * field.
 *
 * @return
 */
public ProjectionOperationBuilder size() {
return project("size");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return this.operation.toDBObject(context);
}
/**
* Adds a generic projection for the current field.
*
* @param operation the operation key without the {@code $} prefix, e.g. {@code add}.
* @param values the values to be set for the projection operation.
* @return
*/
public ProjectionOperationBuilder project(String operation, Object... values) {
OperationProjection operationProjection = new OperationProjection(Fields.field(value.toString()), operation,
values);
return new ProjectionOperationBuilder(value, this.operation.and(operationProjection), operationProjection);
}
/**
* A {@link Projection} to pull in the result of the previous operation.
*
* @author Oliver Gierke
*/
static class PreviousOperationProjection extends Projection {
private final String name;
/**
* Creates a new {@link PreviousOperationProjection} for the field with the given name.
*
* @param name must not be {@literal null} or empty.
*/
public PreviousOperationProjection(String name) {
super(Fields.field(name));
this.name = name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(name, Fields.UNDERSCORE_ID_REF);
}
}
/**
* A {@link Projection} that renders a plain field inclusion, exclusion or value assignment.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
static class FieldProjection extends Projection {
private final Field field;
private final Object value;
/**
* Creates a new {@link FieldProjection} for the field of the given name, assigning the given value.
*
* @param name must not be {@literal null} or empty.
* @param value
*/
public FieldProjection(String name, Object value) {
this(Fields.field(name), value);
}
private FieldProjection(Field field, Object value) {
super(field);
this.field = field;
this.value = value;
}
/**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}. Fields are projected as
* references with their given name. A field {@code foo} will be projected as: {@code foo : 1 } .
*
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @return
*/
public static List<? extends Projection> from(Fields fields) {
return from(fields, null);
}
/**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}.
*
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @param value to use for the given field.
* @return
*/
public static List<FieldProjection> from(Fields fields, Object value) {
Assert.notNull(fields, "Fields must not be null!");
List<FieldProjection> projections = new ArrayList<FieldProjection>();
for (Field field : fields) {
projections.add(new FieldProjection(field, value));
}
return projections;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(field.getName(), renderFieldValue(context));
}
private Object renderFieldValue(AggregationOperationContext context) {
// implicit reference or explicit include?
if (value == null || Boolean.TRUE.equals(value)) {
if (Aggregation.SystemVariable.isReferingToSystemVariable(field.getTarget())) {
return field.getTarget();
}
// check whether referenced field exists in the context
return context.getReference(field).getReferenceValue();
} else if (Boolean.FALSE.equals(value)) {
// render field as excluded
return 0;
}
return value;
}
}
static class OperationProjection extends Projection {
private final Field field;
private final String operation;
private final List<Object> values;
/**
* Creates a new {@link OperationProjection} for the given field.
*
* @param field the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
public OperationProjection(Field field, String operation, Object[] values) {
super(field);
Assert.hasText(operation, "Operation must not be null or empty!");
Assert.notNull(values, "Values must not be null!");
this.field = field;
this.operation = operation;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject inner = new BasicDBObject("$" + operation, getOperationArguments(context));
return new BasicDBObject(getField().getName(), inner);
}
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
result.add(context.getReference(getField().getName()).toString());
for (Object element : values) {
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
}
return result;
}
/**
* Returns the field that holds the {@link OperationProjection}.
*
* @return
*/
protected Field getField() {
return field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#getExposedField()
*/
@Override
public ExposedField getExposedField() {
if (!getField().isAliased()) {
return super.getExposedField();
}
return new ExposedField(new AggregationField(getField().getName()), true);
}
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*
* @param alias the alias to set
* @return
*/
public OperationProjection withAlias(String alias) {
final Field aliasedField = Fields.field(alias, this.field.getName());
return new OperationProjection(aliasedField, operation, values.toArray()) {
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.OperationProjection#getField()
*/
@Override
protected Field getField() {
return aliasedField;
}
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
// We have to make sure that we use the arguments from the "previous" OperationProjection that we replace
// with this new instance.
return OperationProjection.this.getOperationArguments(context);
}
};
}
}
static class NestedFieldProjection extends Projection {
private final String name;
private final Fields fields;
public NestedFieldProjection(String name, Fields fields) {
super(Fields.field(name));
this.name = name;
this.fields = fields;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject nestedObject = new BasicDBObject();
for (Field field : fields) {
nestedObject.put(field.getName(), context.getReference(field.getTarget()).toString());
}
return new BasicDBObject(name, nestedObject);
}
}
/**
* Extracts the minute from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMinute() {
return project("minute");
}
/**
* Extracts the hour from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractHour() {
return project("hour");
}
/**
* Extracts the second from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractSecond() {
return project("second");
}
/**
* Extracts the millisecond from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMillisecond() {
return project("millisecond");
}
/**
* Extracts the year from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractYear() {
return project("year");
}
/**
* Extracts the month from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractMonth() {
return project("month");
}
/**
* Extracts the week from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractWeek() {
return project("week");
}
/**
* Extracts the dayOfYear from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfYear() {
return project("dayOfYear");
}
/**
* Extracts the dayOfMonth from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfMonth() {
return project("dayOfMonth");
}
/**
* Extracts the dayOfWeek from a date expression.
*
* @return
*/
public ProjectionOperationBuilder extractDayOfWeek() {
return project("dayOfWeek");
}
}
/**
* Base class for {@link Projection} implementations.
*
* @author Oliver Gierke
*/
private static abstract class Projection {
private final ExposedField field;
/**
* Creates new {@link Projection} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
public Projection(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/**
* Returns the field exposed by the {@link Projection}.
*
* @return will never be {@literal null}.
*/
public ExposedField getExposedField() {
return field;
}
/**
* Renders the current {@link Projection} into a {@link DBObject} based on the given
* {@link AggregationOperationContext}.
*
* @param context will never be {@literal null}.
* @return
*/
public abstract DBObject toDBObject(AggregationOperationContext context);
}
/**
* @author Thomas Darimont
*/
static class ExpressionProjection extends Projection {
private final AggregationExpression expression;
private final Field field;
/**
* Creates a new {@link ExpressionProjection}.
*
* @param field
* @param expression
*/
public ExpressionProjection(Field field, AggregationExpression expression) {
super(field);
this.field = field;
this.expression = expression;
}
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(field.getName(), expression.toDbObject(context));
}
}
}
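A minimal usage sketch (not part of the file above; the field names "netPrice", "discount" and "total" are made up, while Aggregation.project(..) and Aggregation.DEFAULT_CONTEXT are the existing factory method and rendering context):

ProjectionOperation projection = Aggregation.project("netPrice") //
		.and("netPrice").plus("discount").as("total") //
		.andExclude("_id");

DBObject dbObject = projection.toDBObject(Aggregation.DEFAULT_CONTEXT);
// expected to render roughly as:
// { "$project" : { "netPrice" : 1, "total" : { "$add" : [ "$netPrice", "$discount" ] }, "_id" : 0 } }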

View File

@@ -0,0 +1,57 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $skip}-operation.
* <p>
* We recommend using the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class SkipOperation implements AggregationOperation {
private final long skipCount;
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param skipCount number of documents to skip.
*/
public SkipOperation(long skipCount) {
Assert.isTrue(skipCount >= 0, "Skip count must not be negative!");
this.skipCount = skipCount;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$skip", skipCount);
}
}
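A minimal usage sketch (not part of the file above; the skip count is made up):

SkipOperation skip = new SkipOperation(10);
DBObject dbObject = skip.toDBObject(Aggregation.DEFAULT_CONTEXT);
// renders as { "$skip" : 10 } -- equivalent to the recommended Aggregation.skip(10)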

View File

@@ -0,0 +1,79 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $sort}-operation.
* <p>
* We recommend using the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
* instances of this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class SortOperation implements AggregationOperation {
private final Sort sort;
/**
* Creates a new {@link SortOperation} for the given {@link Sort} instance.
*
* @param sort must not be {@literal null}.
*/
public SortOperation(Sort sort) {
Assert.notNull(sort, "Sort must not be null!");
this.sort = sort;
}
public SortOperation and(Direction direction, String... fields) {
return and(new Sort(direction, fields));
}
public SortOperation and(Sort sort) {
return new SortOperation(this.sort.and(sort));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject object = new BasicDBObject();
for (Order order : sort) {
// Check reference
FieldReference reference = context.getReference(order.getProperty());
object.put(reference.getRaw(), order.isAscending() ? 1 : -1);
}
return new BasicDBObject("$sort", object);
}
}
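A minimal usage sketch (not part of the file above; the property names "age" and "name" are made up):

SortOperation sort = new SortOperation(new Sort(Direction.DESC, "age")) //
		.and(Direction.ASC, "name");
DBObject dbObject = sort.toDBObject(Aggregation.DEFAULT_CONTEXT);
// expected to render roughly as { "$sort" : { "age" : -1, "name" : 1 } }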

View File

@@ -0,0 +1,513 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.springframework.data.mongodb.util.DBObjectUtils.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.core.GenericTypeResolver;
import org.springframework.data.mongodb.core.spel.ExpressionNode;
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
import org.springframework.data.mongodb.core.spel.LiteralNode;
import org.springframework.data.mongodb.core.spel.MethodReferenceNode;
import org.springframework.data.mongodb.core.spel.OperatorNode;
import org.springframework.expression.spel.ExpressionState;
import org.springframework.expression.spel.SpelNode;
import org.springframework.expression.spel.SpelParserConfiguration;
import org.springframework.expression.spel.ast.CompoundExpression;
import org.springframework.expression.spel.ast.Indexer;
import org.springframework.expression.spel.ast.InlineList;
import org.springframework.expression.spel.ast.PropertyOrFieldReference;
import org.springframework.expression.spel.standard.SpelExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Renders the AST of a SpEL expression as a MongoDB Aggregation Framework projection expression.
*
* @author Thomas Darimont
*/
class SpelExpressionTransformer implements AggregationExpressionTransformer {
// TODO: remove explicit usage of a configuration once SPR-11031 gets fixed
private static final SpelParserConfiguration CONFIG = new SpelParserConfiguration(false, false);
private static final SpelExpressionParser PARSER = new SpelExpressionParser(CONFIG);
private final List<ExpressionNodeConversion<? extends ExpressionNode>> conversions;
/**
* Creates a new {@link SpelExpressionTransformer}.
*/
public SpelExpressionTransformer() {
List<ExpressionNodeConversion<? extends ExpressionNode>> conversions = new ArrayList<ExpressionNodeConversion<? extends ExpressionNode>>();
conversions.add(new OperatorNodeConversion(this));
conversions.add(new LiteralNodeConversion(this));
conversions.add(new IndexerNodeConversion(this));
conversions.add(new InlineListNodeConversion(this));
conversions.add(new PropertyOrFieldReferenceNodeConversion(this));
conversions.add(new CompoundExpressionNodeConversion(this));
conversions.add(new MethodReferenceNodeConversion(this));
this.conversions = Collections.unmodifiableList(conversions);
}
/**
* Transforms the given SpEL expression to a corresponding MongoDB expression against the given
* {@link AggregationOperationContext} {@code context}.
* <p>
* Exposes the given {@code params} as <code>[0] ... [n]</code>.
*
* @param expression must not be {@literal null}
* @param context must not be {@literal null}
* @param params must not be {@literal null}
* @return
*/
public Object transform(String expression, AggregationOperationContext context, Object... params) {
Assert.notNull(expression, "Expression must not be null!");
Assert.notNull(context, "AggregationOperationContext must not be null!");
Assert.notNull(params, "Parameters must not be null!");
SpelExpression spelExpression = (SpelExpression) PARSER.parseExpression(expression);
ExpressionState state = new ExpressionState(new StandardEvaluationContext(params), CONFIG);
ExpressionNode node = ExpressionNode.from(spelExpression.getAST(), state);
return transform(new AggregationExpressionTransformationContext<ExpressionNode>(node, null, null, context));
}
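// Illustrative note (not part of the original source): against an untyped context, transforming the
// expression "netPrice * [0]" with params = { 2 } is expected to yield roughly
// { "$multiply" : [ "$netPrice" , 2 ] }; the field name "netPrice" is made up.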
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.spel.ExpressionTransformer#transform(org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport)
*/
public Object transform(AggregationExpressionTransformationContext<ExpressionNode> context) {
return lookupConversionFor(context.getCurrentNode()).convert(context);
}
/**
* Returns an appropriate {@link ExpressionNodeConversion} for the given {@code node}. Throws an
* {@link IllegalArgumentException} if no conversion could be found.
*
* @param node
* @return the appropriate {@link ExpressionNodeConversion} for the given {@link ExpressionNode}.
*/
@SuppressWarnings("unchecked")
private ExpressionNodeConversion<ExpressionNode> lookupConversionFor(ExpressionNode node) {
for (ExpressionNodeConversion<? extends ExpressionNode> candidate : conversions) {
if (candidate.supports(node)) {
return (ExpressionNodeConversion<ExpressionNode>) candidate;
}
}
throw new IllegalArgumentException("Unsupported Element: " + node + " Type: " + node.getClass()
+ " You probably have a syntax error in your SpEL expression!");
}
/**
* Abstract base class for conversions from {@link SpelNode}s to (Db)objects.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static abstract class ExpressionNodeConversion<T extends ExpressionNode> implements
AggregationExpressionTransformer {
private final AggregationExpressionTransformer transformer;
private final Class<? extends ExpressionNode> nodeType;
/**
* Creates a new {@link ExpressionNodeConversion}.
*
* @param transformer must not be {@literal null}.
*/
@SuppressWarnings("unchecked")
public ExpressionNodeConversion(AggregationExpressionTransformer transformer) {
Assert.notNull(transformer, "Transformer must not be null!");
this.nodeType = (Class<? extends ExpressionNode>) GenericTypeResolver.resolveTypeArgument(this.getClass(),
ExpressionNodeConversion.class);
this.transformer = transformer;
}
/**
* Returns whether the current conversion supports the given {@link ExpressionNode}. By default we match the
* node type against the generic type the subclass binds the type parameter to.
*
* @param node will never be {@literal null}.
* @return true if {@literal this} conversion can be applied to the given {@code node}.
*/
protected boolean supports(ExpressionNode node) {
return nodeType.equals(node.getClass());
}
/**
* Triggers the transformation for the given {@link ExpressionNode} and the given current context.
*
* @param node must not be {@literal null}.
* @param context must not be {@literal null}.
* @return
*/
protected Object transform(ExpressionNode node, AggregationExpressionTransformationContext<?> context) {
Assert.notNull(node, "ExpressionNode must not be null!");
Assert.notNull(context, "AggregationExpressionTransformationContext must not be null!");
return transform(node, context.getParentNode(), null, context);
}
/**
* Triggers the transformation with the given new {@link ExpressionNode}, new parent node, the current operation and
* the previous context.
*
* @param node must not be {@literal null}.
* @param parent
* @param operation
* @param context must not be {@literal null}.
* @return
*/
protected Object transform(ExpressionNode node, ExpressionNode parent, DBObject operation,
AggregationExpressionTransformationContext<?> context) {
Assert.notNull(node, "ExpressionNode must not be null!");
Assert.notNull(context, "AggregationExpressionTransformationContext must not be null!");
return transform(new AggregationExpressionTransformationContext<ExpressionNode>(node, parent, operation,
context.getAggregationContext()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#transform(org.springframework.data.mongodb.core.aggregation.AggregationExpressionTransformer.AggregationExpressionTransformationContext)
*/
@Override
public Object transform(AggregationExpressionTransformationContext<ExpressionNode> context) {
return transformer.transform(context);
}
/**
* Performs the actual conversion from {@link SpelNode} to the corresponding representation for MongoDB.
*
* @param context
* @return
*/
protected abstract Object convert(AggregationExpressionTransformationContext<T> context);
}
/**
* A {@link ExpressionNodeConversion} that converts arithmetic operations.
*
* @author Thomas Darimont
*/
private static class OperatorNodeConversion extends ExpressionNodeConversion<OperatorNode> {
public OperatorNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<OperatorNode> context) {
OperatorNode currentNode = context.getCurrentNode();
DBObject operationObject = createOperationObjectAndAddToPreviousArgumentsIfNecessary(context, currentNode);
Object leftResult = transform(currentNode.getLeft(), currentNode, operationObject, context);
if (currentNode.isUnaryMinus()) {
return convertUnaryMinusOp(context, leftResult);
}
// we deliberately ignore the RHS result
transform(currentNode.getRight(), currentNode, operationObject, context);
return operationObject;
}
private DBObject createOperationObjectAndAddToPreviousArgumentsIfNecessary(
AggregationExpressionTransformationContext<OperatorNode> context, OperatorNode currentNode) {
DBObject nextDbObject = new BasicDBObject(currentNode.getMongoOperator(), new BasicDBList());
if (!context.hasPreviousOperation()) {
return nextDbObject;
}
if (context.parentIsSameOperation()) {
// same operator applied in a row, e.g. 1 + 2 + 3 -> carry on with the operation and render as $add: [1, 2, 3]
nextDbObject = context.getPreviousOperationObject();
} else if (!currentNode.isUnaryOperator()) {
// different operator -> add the context object for the next level to the argument list of the previous expression
context.addToPreviousOperation(nextDbObject);
}
return nextDbObject;
}
private Object convertUnaryMinusOp(ExpressionTransformationContextSupport<OperatorNode> context, Object leftResult) {
Object result = leftResult instanceof Number ? leftResult
: new BasicDBObject("$multiply", dbList(-1, leftResult));
if (leftResult != null && context.hasPreviousOperation()) {
context.addToPreviousOperation(result);
}
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#supports(java.lang.Class)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isMathematicalOperation();
}
}
/**
* A {@link ExpressionNodeConversion} that converts indexed expressions.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class IndexerNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
public IndexerNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
return context.addToPreviousOrReturn(context.getCurrentNode().getValue());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isOfType(Indexer.class);
}
}
/**
* A {@link ExpressionNodeConversion} that converts in-line list expressions.
*
* @author Thomas Darimont
*/
private static class InlineListNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
public InlineListNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
ExpressionNode currentNode = context.getCurrentNode();
if (!currentNode.hasChildren()) {
return null;
}
// just take the first item
return transform(currentNode.getChild(0), currentNode, null, context);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isOfType(InlineList.class);
}
}
/**
* A {@link ExpressionNodeConversion} that converts property or field reference expressions.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class PropertyOrFieldReferenceNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
public PropertyOrFieldReferenceNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#convert(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionTransformationContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
String fieldReference = context.getFieldReference().toString();
return context.addToPreviousOrReturn(fieldReference);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isOfType(PropertyOrFieldReference.class);
}
}
/**
* A {@link ExpressionNodeConversion} that converts literal expressions.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class LiteralNodeConversion extends ExpressionNodeConversion<LiteralNode> {
public LiteralNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
@SuppressWarnings("unchecked")
protected Object convert(AggregationExpressionTransformationContext<LiteralNode> context) {
LiteralNode node = context.getCurrentNode();
Object value = node.getValue();
if (context.hasPreviousOperation()) {
if (node.isUnaryMinus(context.getParentNode())) {
// unary minus operator
return NumberUtils.convertNumberToTargetClass(((Number) value).doubleValue() * -1,
(Class<Number>) value.getClass()); // retain type, e.g. int to -int
}
return context.addToPreviousOperation(value);
}
return value;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#supports(org.springframework.expression.spel.SpelNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isLiteral();
}
}
/**
* A {@link ExpressionNodeConversion} that converts method reference expressions.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class MethodReferenceNodeConversion extends ExpressionNodeConversion<MethodReferenceNode> {
public MethodReferenceNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<MethodReferenceNode> context) {
MethodReferenceNode node = context.getCurrentNode();
List<Object> args = new ArrayList<Object>();
for (ExpressionNode childNode : node) {
args.add(transform(childNode, context));
}
return context.addToPreviousOrReturn(new BasicDBObject(node.getMethodName(), dbList(args.toArray())));
}
}
/**
* A {@link ExpressionNodeConversion} that converts method compound expressions.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class CompoundExpressionNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
public CompoundExpressionNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
ExpressionNode currentNode = context.getCurrentNode();
if (currentNode.hasfirstChildNotOfType(Indexer.class)) {
// we have a property path expression like: foo.bar -> render as reference
return context.addToPreviousOrReturn(context.getFieldReference().toString());
}
return context.addToPreviousOrReturn(currentNode.getValue());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isOfType(CompoundExpression.class);
}
}
}
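A rough sketch of how this transformer is typically reached via the projection API (not part of the file above; the field names "netPrice", "taxes" and "totalPrice" are made up):

ProjectionOperation projection = Aggregation.project() //
		.andExpression("netPrice + taxes").as("totalPrice");
// toDBObject(..) is expected to render roughly as
// { "$project" : { "totalPrice" : { "$add" : [ "$netPrice" , "$taxes" ] } } }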

View File

@@ -0,0 +1,103 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} aware of a particular type and a {@link MappingContext} to potentially translate
* property references into document field names.
*
* @author Oliver Gierke
* @since 1.3
*/
public class TypeBasedAggregationOperationContext implements AggregationOperationContext {
private final Class<?> type;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
/**
* Creates a new {@link TypeBasedAggregationOperationContext} for the given type, {@link MappingContext} and
* {@link QueryMapper}.
*
* @param type must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
* @param mapper must not be {@literal null}.
*/
public TypeBasedAggregationOperationContext(Class<?> type,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, QueryMapper mapper) {
Assert.notNull(type, "Type must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
Assert.notNull(mapper, "QueryMapper must not be null!");
this.type = type;
this.mappingContext = mappingContext;
this.mapper = mapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return mapper.getMappedObject(dbObject, mappingContext.getPersistentEntity(type));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.Field)
*/
@Override
public FieldReference getReference(Field field) {
PropertyPath.from(field.getTarget(), type);
return getReferenceFor(field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return getReferenceFor(field(name));
}
private FieldReference getReferenceFor(Field field) {
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(
field.getTarget(), type);
Field mappedField = field(propertyPath.getLeafProperty().getName(),
propertyPath.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE));
return new FieldReference(new ExposedField(mappedField, true));
}
}
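A minimal usage sketch (not part of the file above; Person is a hypothetical domain type and "converter" an already configured MappingMongoConverter):

AggregationOperationContext context = new TypeBasedAggregationOperationContext(Person.class,
		converter.getMappingContext(), new QueryMapper(converter));
FieldReference firstname = context.getReference("firstname");
// the reference is resolved against Person's mapping metadata, so it points at the mapped
// document field name; an unknown property such as "doesNotExist" would be rejected.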

View File

@@ -0,0 +1,86 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.List;
import org.springframework.util.Assert;
/**
* A {@code TypedAggregation} is a special {@link Aggregation} that holds information about the input aggregation type.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
public class TypedAggregation<I> extends Aggregation {
private final Class<I> inputType;
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, AggregationOperation... operations) {
this(inputType, asAggregationList(operations));
}
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations) {
this(inputType, operations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link TypedAggregation} from the given {@link AggregationOperation}s and the given
* {@link AggregationOptions}.
*
* @param inputType must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
* @param options must not be {@literal null}.
*/
public TypedAggregation(Class<I> inputType, List<AggregationOperation> operations, AggregationOptions options) {
super(operations, options);
Assert.notNull(inputType, "Input type must not be null!");
this.inputType = inputType;
}
/**
* Returns the input type for the {@link Aggregation}.
*
* @return the input type; will never be {@literal null}.
*/
public Class<I> getInputType() {
return inputType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Aggregation#withOptions(org.springframework.data.mongodb.core.aggregation.AggregationOptions)
*/
public TypedAggregation<I> withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new TypedAggregation<I>(inputType, operations, options);
}
}
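A minimal usage sketch (not part of the file above; Person, "age" and "lastname" are made up):

TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class, //
		Aggregation.match(Criteria.where("age").gte(18)), //
		Aggregation.group("lastname").count().as("count")) //
		.withOptions(Aggregation.newAggregationOptions().allowDiskUse(true).build());
// aggregation.getInputType() == Person.class; MongoTemplate uses the input type to derive the
// collection and to map property names via a TypeBasedAggregationOperationContext.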

View File

@@ -0,0 +1,58 @@
/*
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $unwind}-operation.
* <p>
* We recommend using the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
*/
public class UnwindOperation implements AggregationOperation {
private final ExposedField field;
/**
* Creates a new {@link UnwindOperation} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
public UnwindOperation(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$unwind", context.getReference(field).toString());
}
}
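A minimal usage sketch (not part of the file above; the array field name "tags" is made up):

UnwindOperation unwind = new UnwindOperation(Fields.field("tags"));
DBObject dbObject = unwind.toDBObject(Aggregation.DEFAULT_CONTEXT);
// renders as { "$unwind" : "$tags" } -- equivalent to the recommended Aggregation.unwind("tags")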

View File

@@ -0,0 +1,5 @@
/**
* Support for the MongoDB aggregation framework.
* @since 1.3
*/
package org.springframework.data.mongodb.core.aggregation;

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.math.BigInteger;
@@ -21,7 +20,7 @@ import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
@@ -33,8 +32,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObj
* Base class for {@link MongoConverter} implementations. Sets up a {@link GenericConversionService} and populates basic
* converters. Allows registering {@link CustomConversions}.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
* @author Jon Brisbin
* @author Oliver Gierke
*/
public abstract class AbstractMongoConverter implements MongoConverter, InitializingBean {
@@ -47,10 +46,8 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
}
/**
@@ -94,6 +91,14 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
conversions.registerConvertersIn(conversionService);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object)
*/
public Object convertToMongoType(Object obj) {
return convertToMongoType(obj, null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoConverter#getConversionService()

Some files were not shown because too many files have changed in this diff.