Compare commits


129 Commits

Author SHA1 Message Date
Spring Buildmaster
2bc6ebc250 DATAMONGO-609 - Release version 1.2.0.RELEASE 2013-02-08 04:15:46 -08:00
Oliver Gierke
909110cf4e DATAMONGO-609 - Upgraded to parent project and Spring Data Commons in release versions. 2013-02-08 13:05:49 +01:00
Oliver Gierke
ba094da5a7 DATAMONGO-611 - Upgraded to MongoDB Java driver 2.10.1. 2013-02-08 13:04:33 +01:00
Oliver Gierke
876b31bc52 DATAMONGO-609 - Updated changelog. 2013-02-08 13:00:20 +01:00
Oliver Gierke
dc8e8281eb DATAMONGO-606 - Add JodaTime specific Converter implementations if JodaTime is present on classpath. 2013-02-06 17:53:41 +01:00
Oliver Gierke
48deb1a150 DATAMONGO-603 - Polishing unit tests. 2013-02-06 17:39:33 +01:00
Michal Vich
48625956b7 DATAMONGO-568 - MongoTemplate.find(…) does not throw NullPointerException anymore.
Trigger findAll(…) if the Query object given to a find(…) method is null.
2013-02-06 15:04:39 +01:00
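For illustration, a minimal Java sketch of the behavior this commit describes (the Person type and surrounding class are hypothetical): a null Query handed to find(…) now behaves like findAll(…) instead of throwing a NullPointerException.

    import java.util.List;

    import org.springframework.data.mongodb.core.MongoOperations;

    class NullQueryExample {

        // A null Query is treated as "no restrictions", i.e. the call behaves like findAll(Person.class).
        List<Person> loadAll(MongoOperations operations) {
            return operations.find(null, Person.class);
        }

        static class Person {
            String id;
            String name;
        }
    }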
Michal Vich
782cf6e10d DATAMONGO-81 - Added more unit tests for MongoExceptionTranslator. 2013-02-06 14:41:56 +01:00
Oliver Gierke
c28e51cf86 DATAMONGO-603 - Make sure geo queries and repositories get correct Metric applied.
Heavily refactored NearQuery to simplify implementation and testability. Made sure the metric of the maximum distance is applied if no metric had been defined before. Adapted the query preparation code to re-apply the correct metric explicitly rather than relying on NearQuery's auto-application behavior, to stay safe against potential further refactorings.
2013-02-01 15:38:29 +01:00
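As a hedged illustration of the metric handling described above (coordinates and distance are made up): the Metric of the maximum distance now carries over to the whole NearQuery if no metric was set before.

    import org.springframework.data.mongodb.core.geo.Distance;
    import org.springframework.data.mongodb.core.geo.Metrics;
    import org.springframework.data.mongodb.core.geo.Point;
    import org.springframework.data.mongodb.core.query.NearQuery;

    class NearQueryExample {

        // The KILOMETERS metric of the max distance is applied to the query as a whole,
        // so the near point and the returned distances are interpreted consistently.
        NearQuery tenKilometersAround(Point location) {
            return NearQuery.near(location).maxDistance(new Distance(10, Metrics.KILOMETERS));
        }
    }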
Oliver Gierke
4b1065cac5 DATAMONGO-598 - Set needed properties and activate distribution artifact upload. 2013-01-29 20:36:41 +01:00
Oliver Gierke
c807b2abcf DATAMONGO-601 - Fixed exposing password by MongoDbUtils.
MongoDbUtils no longer exposes the plain password in the exception message piped into CannotGetMongoDbConnectionException but uses the newly introduced toString() method of UserCredentials (see DATACMNS-275).
2013-01-29 13:16:06 +01:00
Oliver Gierke
19ad2d3aac DATACMNS-274 - Adapt to moved type MappingContextIsNewStrategyFactory. 2013-01-29 10:54:36 +01:00
Oliver Gierke
bd7fe5bfd3 DATAMONGO-600 - Fixed parameter binding for derived queries on properties using polymorphism.
We need to retain the type in the serialized DBObject in case a derived query binds parameters for properties that use polymorphism, as we serialize the object as a nested document and this only matches in an all-or-nothing way. So if the type information is missing, we won't see any results.

So we now hand the TypeInformation of the property into the conversion logic and transparently skip the type information removal in case the types differ.
2013-01-25 11:29:39 +01:00
Oliver Gierke
a237999037 DATAMONGO-598 - Upgraded to new build infrastructure.
Fixed test ordering issue in Cross-Store module.
2013-01-18 14:45:48 +01:00
Oliver Gierke
d8a9752724 DATAMONGO-592 - Fix handling of SpEL evaluation values.
During constructor parameter value resolution the value to be used can result from evaluating a SpEL expression on the root document. This value has to be converted recursively if it is a nested document. Adapted to the altered API in SD Commons and overrode potentiallyConvertSpelValue(…) to re-invoke the converter.
2012-12-16 15:26:25 +01:00
Philipp Schneider
4c8bf0dec2 DATAMONGO-583 - Using while (…) instead of for (…) for DBCursors to avoid memory leak.
Instead of iterating over the DBCursor using a for-loop we now use a while-loop to avoid the potential memory leak outlined in [0].

[0] https://jira.mongodb.org/browse/JAVA-664
2012-12-13 12:07:11 +05:30
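A minimal sketch of the iteration pattern referenced above (collection and processing are placeholders); the point is to iterate the DBCursor directly with a while loop rather than via a for-each loop, and to close it afterwards.

    import com.mongodb.DBCollection;
    import com.mongodb.DBCursor;
    import com.mongodb.DBObject;

    class CursorIterationExample {

        void readAll(DBCollection collection) {
            DBCursor cursor = collection.find();
            try {
                // Iterate the cursor itself instead of using a for-each loop,
                // which would obtain a copy of the cursor (see JAVA-664).
                while (cursor.hasNext()) {
                    DBObject document = cursor.next();
                    // process the document
                }
            } finally {
                cursor.close();
            }
        }
    }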
Oliver Gierke
cffc27d83a DATAMONGO-590 - Code cleanups in MongoTemplate and error handling APIs. 2012-12-13 12:06:55 +05:30
Oliver Gierke
42ab4cb7b4 DATAMONGO-588 - Version attribute now gets initialized on insert.
The version attribute of an entity was not correctly initialized to 0 when inserting objects using MongoTemplate.insert(…). We now set it to 0 correctly.
2012-12-10 17:38:16 +09:00
Oliver Gierke
42df25434f DATAMONGO-585 - Fix concurrency issue in database authentication.
So far the authentication of the database was synchronized *after* the check whether the database under consideration had already been authenticated. This can cause threads to concurrently try to authenticate against the database, which is rejected by the driver. Improved the synchronization block to include both the ….isAuthenticated() check and the authentication itself.
2012-12-03 11:37:06 +01:00
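A hedged sketch of the check-then-act synchronization described above (the monitor object and method shape are illustrative, not the actual MongoDbUtils code): the isAuthenticated() check and the authenticate(…) call must sit in the same synchronized block.

    import com.mongodb.DB;

    class AuthenticationSyncExample {

        void authenticateIfNecessary(DB db, String username, char[] password) {
            synchronized (db) {
                // Check and authenticate under the same lock so that two threads
                // cannot both attempt to authenticate the same DB instance.
                if (!db.isAuthenticated()) {
                    db.authenticate(username, password);
                }
            }
        }
    }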
Oliver Gierke
4e16f7ebe2 DATACMNS-378 - Fixed ClassCastException in MapReduceResults.
The parseCount(…) method now safely converts both integer and long values from the source DBObject.
2012-11-29 13:22:38 +01:00
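A minimal sketch of the tolerant count parsing described above (field key and class name are illustrative): treating the value as a Number covers both the Integer and the Long case.

    import com.mongodb.DBObject;

    class CountParsingExample {

        // MongoDB may report the value as Integer or Long depending on the server version.
        long parseCount(DBObject source, String key) {
            Object value = source.get(key);
            return value instanceof Number ? ((Number) value).longValue() : 0L;
        }
    }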
Oliver Gierke
04a87b373d DATAMONGO-577 - Cleaned up obsolete property.
The version property in BasicMongoPersistentEntity became obsolete after the upgrade in Spring Data Commons.
2012-11-28 17:39:20 +01:00
Oliver Gierke
c4eca7eadd DATAMONGO-581 - Expose the PersistentEntity managed in the repository factory bean.
MongoRepositoryFactoryBean now populates the MappingContext of the super class when the MongoOperations instance is set.
2012-11-28 17:35:47 +01:00
Philipp Schneider
3a9dc2b98c DATAMONGO-503 - Added support for content type on GridFsOperations.
Added methods to take the content type of the file to be created. Clarified nullability of arguments in JavaDoc.

Pull request: #19.
2012-11-28 14:30:05 +01:00
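A hedged usage sketch of the content-type-aware store(…) overload this commit mentions (file name, path and content type are made up):

    import java.io.FileInputStream;
    import java.io.InputStream;

    import org.springframework.data.mongodb.gridfs.GridFsOperations;

    class GridFsContentTypeExample {

        void storeAvatar(GridFsOperations gridFs) throws Exception {
            InputStream content = new FileInputStream("/tmp/avatar.png"); // hypothetical file
            try {
                // Store the file along with its content type.
                gridFs.store(content, "avatar.png", "image/png");
            } finally {
                content.close();
            }
        }
    }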
Oliver Gierke
c568b7cbc2 DATAMONGO-580 - Improved BeanDefinitionParser for MappingMongoConverter. 2012-11-27 14:44:00 +01:00
Oliver Gierke
9482350062 DATAMONGO-577 - Added support for auditing.
Introduced AuditingEventListener to invoke auditing subsystem available through Spring Data Commons. Added <mongo:auditing /> namespace element to transparently activate it.
2012-11-27 14:00:52 +01:00
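For illustration, a minimal entity whose audit fields would be populated by the AuditingEventListener once <mongo:auditing /> is declared (the Customer type is hypothetical, and java.util.Date is assumed to be a supported date type here):

    import java.util.Date;

    import org.springframework.data.annotation.CreatedDate;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.annotation.LastModifiedDate;

    class Customer {

        @Id String id;

        @CreatedDate Date createdDate;           // set on first persist
        @LastModifiedDate Date lastModifiedDate; // updated on every save
    }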
Oliver Gierke
772a140def DATAMONGO-576 - Configure java.util.logging to prevent verbose test logging.
Added the SLF4J bridge and configured Surefire to set up JUL accordingly.
2012-11-23 10:50:29 +01:00
Oliver Gierke
936259a766 DATAMONGO-575 - Improved implementation of entity metadata lookup in MongoQueryMethod.
Got rid of obsolete EntityInformationCreator API and moved to a custom EntityMetadata extension instead.
2012-11-23 10:00:49 +01:00
Oliver Gierke
533d21281e DATAMONGO-570 - Guard against null values in references.
QueryMapper does not try to convert null values into DBRef objects anymore but returns the plain null value as is.
2012-11-13 18:23:02 +01:00
Oliver Gierke
6977fa87e6 DATAMONGO-573 - Moved to Logback for test logging. 2012-11-13 17:54:19 +01:00
Oliver Gierke
41dcebb010 DATAMONGO-559 - Upgraded to Spring Data Commons 1.5.0.BUILD-SNAPSHOT. 2012-11-13 17:54:19 +01:00
Oliver Gierke
ef077182f6 DATAMONGO-563 - Upgraded MongoDB Java driver to 2.9.2. 2012-11-13 17:54:19 +01:00
Dave Syer
21e7c63766 Update README.md 2012-10-26 18:02:05 +02:00
Jan Kronquist
4b018a9d7d DATAMONGO-562 - Saving versioned entity with already set id now works.
Instead of inspecting the id property, we now consider the entity new if the version property is unset and work from there.
2012-10-24 16:07:26 +02:00
Spring Buildmaster
ecf15b93e0 DATAMONGO-559 - Prepare next development iteration. 2012-10-17 15:00:21 -07:00
Spring Buildmaster
6323a86560 DATAMONGO-559 - Release version 1.1.1.RELEASE. 2012-10-17 14:59:59 -07:00
Oliver Gierke
3b0b7315e1 DATAMONGO-559 - Prepare 1.1.1.RELEASE release. 2012-10-17 17:53:39 -04:00
Oliver Gierke
4aeba6f92d DATAMONGO-553 - MappingMongoConverter now uses MongoPersistentProperty.usePropertyAccess() to decide whether to use property access.
Introduced a usePropertyAccess() method on MongoPersistentProperty and added an implementation that forces it to true for Throwable.cause, as using the field would cause a cycle (it points to this in the first place), whereas the getter massages the value appropriately.
2012-10-17 17:27:15 -04:00
Oliver Gierke
342f9ae837 DATAMONGO-551 - MongoTemplate can now persist plain Strings if they are valid JSON.
Added some code to MongoTemplate that inspects the object to be saved for being a String. If it is we try to parse the given String into a JSON document and continue as if we had been given a DBObject initially. Non-parseable Strings are rejected with a MappingException.
2012-10-15 11:49:29 -04:00
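A hedged usage sketch of the behavior described above (collection name and document content are made up): a valid JSON String is parsed and saved like a DBObject, a non-parseable one is rejected with a MappingException.

    import org.springframework.data.mongodb.core.MongoOperations;

    class JsonStringSaveExample {

        void saveRawJson(MongoOperations operations) {
            operations.save("{ 'firstname' : 'Dave', 'lastname' : 'Matthews' }", "person");
        }
    }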
Oliver Gierke
66d98b355e DATAMONGO-550 - Fixed potential NullPointerExceptions in MongoTemplate.
Guarded access to results of mappingContext.getPersistentEntity(…) to prevent NullPointerExceptions in case the template is used with non-entity types like a plain DBObject. Added some custom logic for plain BasicDBObjects to get the _id field populated appropriately.
2012-10-15 11:38:00 -04:00
Oliver Gierke
cabbe747f8 DATAMONGO-549 - Fixed potential NullPointerException in MongoTemplate.
Added assertions and guards against MongoPersistentEntity lookups resulting in null values.
2012-10-11 08:49:49 +02:00
Spring Buildmaster
5ff3064acd DATAMONGO-541 - Prepare next development iteration. 2012-10-10 05:30:21 -07:00
Spring Buildmaster
3001e2941f DATAMONGO-541 - Release 1.1.0.RELEASE. 2012-10-10 05:30:15 -07:00
Oliver Gierke
9cf72fbdd4 DATAMONGO-541 - Polished pom.xml for Maven central propagation. 2012-10-10 14:23:11 +02:00
Oliver Gierke
cb5144de0f DATAMONGO-541 - Updated reference docs and changelog. 2012-10-10 14:04:43 +02:00
Oliver Gierke
4be90e51ae DATAMONGO-541 - Upgrade to Spring Data Commons 1.4.0.RELEASE. 2012-10-10 14:04:27 +02:00
Oliver Gierke
6c368d557b DATAMONGO-548 - Upgrade to Querydsl 2.8.0. 2012-10-09 09:11:27 +02:00
Oliver Gierke
a8432e13a1 DATAMONGO-543 - Polished quick start section.
Updated dependency versions to Spring Data MongoDB and Spring. Removed explicit dependency listing. Removed section on how to migrate between 1.0 milestones. Removed obsolete paragraphs.
2012-09-20 10:38:46 +02:00
Oliver Gierke
05ac139554 DATAMONGO-528 - Documented GridFs support. 2012-09-17 15:04:19 +02:00
Oliver Gierke
3661b2981e DATAMONGO-456 - Fixed <db-factory /> element documentation in XSD. 2012-09-17 12:22:48 +02:00
Oliver Gierke
69bd7acf74 DATAMONGO-457 - Fixed links in reference documentation. 2012-09-17 12:07:52 +02:00
Oliver Gierke
d882af257f DATAMONGO-539 - Fixed MongoTemplate.remove(object, collectionName).
If the entity being removed using MongoTemplate.remove(object, collectionName) contained an id that could be converted into an ObjectId, it currently wasn't removed correctly. This was caused by the fact that the intermediate call didn't hand over the entity type and thus the id conversion failed. This in turn caused the query not to match the previously saved object.
2012-09-17 11:57:09 +02:00
Oliver Gierke
c92058a79a DATAMONGO-539 - Added test case to show removing entity from explicit collection works. 2012-09-13 17:12:50 +02:00
Oliver Gierke
ed2b576261 DATAMONGO-538 - Query API can now work with Sort and Pageable from Spring Data Commons.
Introduced with(Sort sort) and with(Pageable pageable) on Query. Deprecated sort() method and the custom Sort class. Deprecated QueryUtils.applyPagination(…) and ….applySorting(…) and changed internal calls to this to use the Query API directly. Some JavaDoc polishing.
2012-09-13 10:54:23 +02:00
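A minimal sketch of the new with(…) methods introduced here, using the Sort and Pageable abstractions from Spring Data Commons (field names and page size are illustrative):

    import static org.springframework.data.mongodb.core.query.Criteria.where;

    import org.springframework.data.domain.PageRequest;
    import org.springframework.data.domain.Sort;
    import org.springframework.data.mongodb.core.query.Query;

    class QueryWithExample {

        // First page of ten, sorted by firstname, for a given lastname.
        Query firstPageByLastname(String lastname) {
            return new Query(where("lastname").is(lastname))
                    .with(new PageRequest(0, 10))
                    .with(new Sort(Sort.Direction.ASC, "firstname"));
        }
    }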
Oliver Gierke
6744446a48 DATAMONGO-532 - Synchronize DB authentication.
In multithreaded environments Mongo database authentication can be triggered twice if two or more threads refer to the same db instance. This is now prevented by synchronizing calls to db.authenticate(…).
2012-09-12 12:58:31 +02:00
Oliver Gierke
fdecec48b2 DATAMONGO-484 - Upgraded to MongoDB driver 2.9.1. 2012-09-12 11:54:20 +02:00
Oliver Gierke
f1289c46e6 DATAMONGO-535 - Fixed broken MongoDBUtils resource synchronization.
We now make sure we really re-use the database instance bound to the thread avoiding duplicate lookups of Mongo.getDb(…). This doesn't seem to gain a big performance benefit anymore as the Mongo instance caches the DB instances internally anyway.
2012-09-11 20:44:33 +02:00
Oliver Gierke
aa0b87be57 DATAMONGO-536 - Fixed package cycle introduced by SerializationUtils. 2012-09-11 20:44:32 +02:00
noter
2040f02d07 DATAMONGO-279 - Added support for optimistic locking.
Introduced @Version annotation to demarcate a version property on an entity. MongoTemplate will initialize this property on the first save of an instance if not set already. If it is already set, the template will bump the version number on subsequent saves and actually trigger an update backed by a query including the old version number. A failure to update the entity accordingly will then trigger an OptimisticLockingFailureException.

General JavaDoc polish in mapping package.
2012-09-11 20:44:15 +02:00
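A hedged end-to-end sketch of the optimistic locking behavior described above (the Account entity and the surrounding class are hypothetical):

    import org.springframework.dao.OptimisticLockingFailureException;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.annotation.Version;
    import org.springframework.data.mongodb.core.MongoOperations;

    class OptimisticLockingExample {

        static class Account {
            @Id String id;
            @Version Long version; // initialized on the first save, bumped on every update
            double balance;
        }

        void demonstrate(MongoOperations operations) {
            Account account = new Account();
            operations.save(account); // version gets initialized

            Account stale = operations.findById(account.id, Account.class);

            account.balance = 100;
            operations.save(account); // version gets bumped

            try {
                operations.save(stale); // still carries the old version
            } catch (OptimisticLockingFailureException expected) {
                // the update query including the old version matched nothing, so the save is rejected
            }
        }
    }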
Oliver Gierke
13a69ecdfd DATAMONGO-533 - Fixed index creator registration.
In case an ApplicationContext already contains a MongoPersistentEntityIndexCreator, the default one is not registered, even if the one in the ApplicationContext listens to another MappingContext's events.

Polished iterable classes setup in MongoTemplate along the way. Some JavaDoc polishes as well.
2012-09-07 21:21:17 +02:00
Oliver Gierke
f0051deff0 Formatting. 2012-09-06 16:25:40 +02:00
Oliver Gierke
05a8148084 DATAMONGO-530 - Fixed propagation of setApplicationContext(…) in MappingMongoConverter. 2012-09-06 16:06:47 +02:00
Oliver Gierke
aaa44b3369 DATAMONGO-529 - Update Querydsl setup to use 1.0.4.
Raised Maven compiler plugin version to 2.5.1.
2012-09-06 15:48:41 +02:00
Oliver Gierke
7f35c4430d DATAMONGO-527 - Fixed Criteria.equals(…). 2012-09-03 18:47:25 +02:00
Oliver Gierke
114489d19a DATAMONGO-521 - Added test case to show that repository AND query works. 2012-09-03 16:46:53 +02:00
Oliver Gierke
eda8200d51 DATAMONGO-523 - Added test case to verify type alias detection. 2012-09-03 15:24:21 +02:00
Oliver Gierke
b078ea9ceb DATAMONGO-513 - Update to Spring Data Commons 1.4.0.BUILD-SNAPSHOT. 2012-09-03 15:24:20 +02:00
mpollack
d90b1a0ddd DATAMONGO-526 - Polished README.md.
Updated the README to remove references to the old API, outdated docs links as well as CouchDB. Removed the reference to Spring Data Document and copied the initial paragraph introducing the project from http://www.springsource.org/spring-data/mongodb
2012-09-03 15:23:33 +02:00
Spring Buildmaster
737a42e07a DATAMONGO-513 - Prepare next development iteration. 2012-08-24 02:24:09 -07:00
Spring Buildmaster
22d5d4c019 DATAMONGO-513 - Release 1.1.0.RC1. 2012-08-24 02:24:06 -07:00
Oliver Gierke
7ac1e7b6e1 DATAMONGO-513 - Prepare changelog for 1.1.0.RC1. 2012-08-24 11:16:54 +02:00
Oliver Gierke
e86ab783f3 DATAMONGO-513 - Update to Spring Data Commons 1.4.0.RC1. 2012-08-23 20:12:41 +02:00
Oliver Gierke
6fe3e67ecb DATAMONGO-517 - Fixed complex keyword handling.
Introduced intermediate getMappedKeyword(Keyword keyword, MongoPersistentProperty property) to correctly return a DBObject for keyword plus converted value. A few refactorings and improvements in the implementation of QueryMapper (Keyword value object etc.).
2012-08-21 12:25:30 +02:00
Oliver Gierke
3b78034c55 DATAMONGO-519 - Make Spring 3.1.2.RELEASE default Spring dependency version.
We move away from Maven version ranges as they complicate the build and dependency resolution process and make the build non-reproducible. Users stuck with a 3.0.x version of Spring will now have to manually declare the Spring dependencies in their needed 3.0.x version. Note that at least Spring 3.0.7 is currently required.
2012-08-20 17:19:37 +02:00
Oliver Gierke
a06a69797f DATACMNS-214 - Adapted API change in Spring Data Commons.
Plus additional cleanups.
2012-08-16 14:10:02 +02:00
Oliver Gierke
8b1557e38c DATAMONGO-506 - Added test case to show BasicQuery is working for nested properties. 2012-08-15 19:40:02 +02:00
Oliver Gierke
fcdc6d0df2 DATAMONGO-511 - QueryMapper now maps associations correctly.
Complete overhaul of the QueryMapper to better handle complex scenarios like property paths and association references.
2012-08-15 18:30:29 +02:00
Oliver Gierke
83b6cd7f05 DATAMONGO-510 - Criteria now only uses BasicDBList internally. 2012-08-15 18:30:15 +02:00
Oliver Gierke
38a9a6d51d DATAMONGO-509 - SimpleMongoRepository.exists(…) now avoids loading unnecessary data.
We're explicitly ruling out the entity's attributes via a field spec to avoid unnecessary object marshaling just to find out whether it exists or not.
2012-08-15 18:16:03 +02:00
Oliver Gierke
5e2f16c678 DATAMONGO-508 - Eagerly return DBRef creation if the given value already is a DBRef. 2012-08-15 16:38:56 +02:00
Oliver Gierke
d7ae95a779 DATAMONGO-505 - Fixed handling of parameter binding of associations and collection values.
Instead of converting given association values as-is, ConvertingParameterAccessor now converts each collection value into a DBRef individually.
2012-08-15 15:16:44 +02:00
Oliver Gierke
8fbdf9afbd DATACMNS-212 - Apply refactorings in Spring Data Commons. 2012-08-13 14:27:19 +02:00
Oliver Gierke
f5a4d78e62 DATAMONGO-476 - @EnableMongoRepositories is now inherited into sub-classes. 2012-08-13 11:37:59 +02:00
Oliver Gierke
1f4264e6a7 DATAMONGO-472 - MongoQueryCreator now correctly translates Not keyword.
We're now translating a negating property reference into a ne(…) call instead of a not().is(…).
2012-08-10 19:58:36 +02:00
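For illustration, the criteria a negating derived query now boils down to (property name and repository context are hypothetical): $ne instead of $not wrapping an equality.

    import static org.springframework.data.mongodb.core.query.Criteria.where;

    import org.springframework.data.mongodb.core.query.Criteria;

    class NotKeywordExample {

        // What e.g. findByLastnameNot(…) now translates to.
        Criteria lastnameNot(String lastname) {
            return where("lastname").ne(lastname);
        }
    }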
Oliver Gierke
05baa851d8 DATAMONGO-502 - QueryMapper now translates property names into field names. 2012-08-08 18:56:18 +02:00
Oliver Gierke
35e8ae1224 DATAMONGO-500 - Index creation is only done for the correct MappingContext.
Index creation now double checks the MappingContext a MappingContextEvent originates from before actually creating indexes. This avoids invalid indexes being created in a multi-database scenario.
2012-07-31 15:34:50 +02:00
Oliver Gierke
ceec0bcc4a DATAMONGO-499 - Fixed namespace reference to repository XSD. 2012-07-31 10:39:03 +02:00
Oliver Gierke
0d87e7fa5f DATAMONGO-496 - AbstractMongoConfiguration now defaults mapping base package.
AbstractMongoConfiguration now uses the package of the class extending it to enable entity scanning for that package. To disable entity scanning entirely, override the method to return null or an empty String.
2012-07-30 18:00:42 +02:00
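A minimal configuration sketch (database name and connection details are made up): entity scanning now defaults to the package of the configuration class, and overriding the base-package method to return null or an empty String disables it.

    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

    import com.mongodb.Mongo;

    @Configuration
    class ApplicationConfig extends AbstractMongoConfiguration {

        @Override
        public String getDatabaseName() {
            return "example";
        }

        @Override
        public Mongo mongo() throws Exception {
            return new Mongo();
        }

        // Entity scanning defaults to this class' package; override
        // getMappingBasePackage() to return null or "" to disable it entirely.
    }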
Oliver Gierke
a530629d97 DATAMONGO-497 - Fixed reading of empty collections.
Reading an empty collection always returned a HashSet, assuming the returned value would be converted into the assigned property's value later on. However, the method should rather return the correct type already, which we now do by invoking the potential conversion.
2012-07-30 15:39:22 +02:00
Oliver Gierke
ba0232b187 DATAMONGO-494 - QueryMapper now forwards entity metadata into nested $(n)or criteria.
Introduced helper class to ease assertions on DBObjects as well.
2012-07-27 10:39:31 +02:00
Oliver Gierke
04a17cacb7 DATAMONGO-495 - Fixed debug output in MongoTemplate.doFind(…).
Using SerializationUtils to safely output the query to be executed.
2012-07-26 10:35:30 +02:00
Oliver Gierke
761d725fce DATAMONGO-493 - Fixed broken $ne handling in QueryMapper.
$ne expressions are now only converted into an ObjectId in case they follow an id property. Previously the conversion was attempted in every case, which might have led to Strings that accidentally were valid ObjectIds being converted even though they didn't represent an id at all.
2012-07-24 20:43:52 +02:00
Oliver Gierke
726b0b1bcc DATAMONGO-493 - Added test case to show the described scenario is working. 2012-07-24 20:13:50 +02:00
Spring Buildmaster
888e031452 DATAMONGO-491 - Prepare next development iteration. 2012-07-24 07:57:13 -07:00
Spring Buildmaster
91b818b8c1 DATAMONGO-491 - Release 1.1.0.M2. 2012-07-24 07:57:10 -07:00
Oliver Gierke
7646a64770 DATAMONGO-491 - Prepare 1.1.0.M2 release.
Updated changelog and links to reference documentation.
2012-07-24 16:50:53 +02:00
Oliver Gierke
94c057e89c DATAMONGO-491 - Upgrade to Spring Data Commons 1.4.0.M1. 2012-07-24 15:27:55 +02:00
Oliver Gierke
190d7cefb0 DATAMONGO-474 - Populating id's after save now inspects field only.
So far the algorithm to inspect whether an id property has to be set after a save(…) operation has used the plain BeanWrapper.getProperty(PersistentProperty property) method. This caused problems in case the getter of the id field returned something completely different (to be precise: a complex type not convertible out of the box).

We now inspect the id field only to retrieve the value.
2012-07-24 13:23:40 +02:00
Oliver Gierke
669bc071b1 DATAMONGO-490 - Fixed typos. 2012-07-23 17:01:12 +02:00
Oliver Gierke
31b9b6b5c0 DATAMONGO-489 - Ensure read collections get converted to appropriate target type.
When reading BasicDBLists we now make sure the resulting collection is converted into the actual target type eventually. It might be an array and thus need an additional round of massaging before being returned as value.
2012-07-23 16:33:24 +02:00
Oliver Gierke
914ecd34fe DATAMONGO-486 - Polished namespace implementation.
Be a better citizen regarding namespace support in STS. Moved ParsingUtils to Spring Data Commons.
2012-07-19 01:24:02 +02:00
Oliver Gierke
74532ff199 DATAMONGO-485 - Added test case to show complex id's are working. 2012-07-17 12:40:27 +02:00
Oliver Gierke
c7bcb55bda DATAMONGO-476 - JavaConfig support for repositories. 2012-07-17 11:58:49 +02:00
Amol Nayak
afe560ca7d DATAMONGO-480 - Consider WriteResult for insert(…) and save(…) methods. 2012-07-16 18:51:05 +02:00
Oliver Gierke
1380c49fd0 DATAMONGO-483 - Indexes now use the field name even if index name is defined. 2012-07-16 18:30:54 +02:00
Oliver Gierke
90240a8da5 DATAMONGO-482 - Fixed typo in reference documentation. 2012-07-16 17:42:37 +02:00
Oliver Gierke
ad587f63ed DATAMONGO-474 - Fixed criteria mapping for MongoTemplate.group(…).
The criteria object handed to the group object needs to be mapped correctly to map complex values. Improved error handling on the way.
2012-07-16 16:33:18 +02:00
Oliver Gierke
3c7cb592b3 DATAMONGO-475 - Fixed debug output in map-reduce operations.
Using SerializationUtils.serializeToJsonSafely(…) instead of plain toString() as this might cause SerializationExceptions for complex objects.
2012-07-16 14:56:03 +02:00
Raman Gupta
5b15c9500a DATAMONGO-477 - Change upper bound of Guava dependency to 14. 2012-07-15 17:55:51 +02:00
Oliver Gierke
2d25f0d6e4 DATAMONGO-469 - Polishing in MongoQueryCreator. 2012-06-26 17:36:33 +02:00
Oliver Gierke
557fc869eb DATAMONGO-469 - Fixed parsing of And keyword in derived queries. 2012-06-26 13:32:54 +02:00
Oliver Gierke
033f44e802 DATAMONGO-470 - Implemented equals(…) and hashCode() for Query and Criteria. 2012-06-26 13:30:00 +02:00
Oliver Gierke
ee3c1bc007 DATAMONGO-467 - Fix identifier handling for Querydsl.
As we try to massage the value of the id property into an ObjectId if possible we need to do so as well when mapping the Querydsl query. Adapted SpringDataMongoDbSerializer accordingly.
2012-06-25 13:09:18 +02:00
Oliver Gierke
f507fe2e4d DATAMONGO-464 - Fixed resource synchronization in MongoDbUtils.
MongoDbUtils now correctly returns DB instances for others than the first one bound. So far the lookup for an alternate database resulted in the first one bound to be returned. Polished log statements a bit.
2012-06-25 11:12:54 +02:00
Oliver Gierke
4a27ba0a3f DATAMONGO-466 - QueryMapper now only tries id conversion for top level document.
So far the QueryMapper tried to map id properties of nested documents to ObjectIds, which it actually shouldn't do.
2012-06-22 14:46:35 +02:00
Oliver Gierke
e2e5fd8b31 DATACMNS-186, DATACMNS-185 - Upgrade to Spring Data Commons 1.3.2.BUILD-SNAPSHOT.
Use the new initialize() method in test cases for MappingContext. Benefit from rejection of multiple id properties.
2012-06-22 14:43:23 +02:00
Oliver Gierke
ba9abd1dd0 DATAMONGO-455 - Documentation mentions BasicQuery. 2012-06-20 12:24:59 +02:00
Oliver Gierke
ab66614843 DATAMONGO-454 - Improvements to ServerAddressPropertyEditor.
ServerAddressPropertyEditor now only fails if none of the configured addresses can be parsed correctly. Strengthened the parsing implementation so it does not fail for host-only addresses or accidental double commas.
Cleaned up integration tests for replica set configuration.
2012-06-20 10:57:32 +02:00
Oliver Gierke
caa2f5f030 DATAMONGO-378 - Fixed potential ClassCastException for MapReduceResults and upcoming MongoDB release.
The type of the value returned for the total field of the timing map in map-reduce results has changed from Integer to Long as of MongoDB version 2.1.0 apparently. Changed MapReduceResults to accommodate either Integer or Long types.
2012-06-20 10:05:16 +02:00
Oliver Gierke
6e7c8f5771 DATAMONGO-462 - Added custom converters for URL.
So far URL instances were treated as entities and serialized as nested documents. As there was no custom converter registered to re-instantiate the objects and URL does not have a no-arg constructor, reading the instances back in resulted in an ugly exception in ReflectionEntityInstantiator. We now register a custom Converter to serialize URL instances as their plain toString() representation. This makes reading work out of the box, as the StringToObjectConverter registered by default happens to use URL's constructor taking a String. To make sure this still works we added an explicit StringToURLConverter to implement symmetric conversions.
2012-06-19 18:44:42 +02:00
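A hedged sketch of the symmetric URL conversion described above; the framework now registers equivalent converters by default, so these classes are purely illustrative.

    import java.net.MalformedURLException;
    import java.net.URL;

    import org.springframework.core.convert.converter.Converter;

    class UrlToStringConverter implements Converter<URL, String> {

        public String convert(URL source) {
            return source.toString(); // store the URL as its plain String representation
        }
    }

    class StringToUrlConverter implements Converter<String, URL> {

        public URL convert(String source) {
            try {
                return new URL(source);
            } catch (MalformedURLException e) {
                throw new IllegalArgumentException("Cannot convert '" + source + "' into a URL!", e);
            }
        }
    }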
Oliver Gierke
44c4566c9d DATAMONGO-428 - Fixed parsing of output collection for complex MapReduce result.
The raw result for a map-reduce operation might contain a complex element containing the output collection in case the original request configured an output database as option. Adapted the parsing of the output collection to accommodate both scenarios (plain String value as well as DBObject wrapper).
2012-06-19 09:30:35 +02:00
Oliver Gierke
9a1e6226b1 DATAMONGO-424 - Don't resolve DBRefs eagerly if property type is DBRef.
So far we have resolved DBRef values eagerly without inspecting the actual property type they would have to be assigned to eventually. Now we simply skip the recursive resolution process if the property type (or component type, map value type) is actually DBRef.
2012-06-19 09:01:28 +02:00
Oliver Gierke
cdb6d54d6a DATAMONGO-460 - Improvements to Querydsl repository implementation.
Internalized setup of MongodbSerializer which makes the constructors of SpringDataMongodbQuery much simpler.
2012-06-18 20:18:48 +02:00
Oliver Gierke
1fbfd3f0cb DATAMONGO-450 - Log output uses mapped query for debug logging. 2012-06-14 12:37:14 +02:00
Oliver Gierke
848e6f59c2 DATAMONGO-447 - Fixed broken log output in debug level.
The debug output now uses the already mapped query object when concatenating the log string. Improved applying the id after save operations by inspecting whether the object already has the id set before trying to set it. This could have caused problems in case you use a complex id and don't provide a custom converter as it can be serialized out of the box. Fixed minor glitch in MappingMongoConverter which was not really a bug as another path through the code has covered the scenario later on. Introduced SerializationUtils class that provides a method to safely serialize objects to pseudo JSON. Pseudo in the sense that it simply renders a complex object as { $java : object.toString() }. This is useful for debug output before the DBObject was mapped into Mongo-native types.
2012-06-14 12:26:55 +02:00
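A small usage sketch of the SerializationUtils helper introduced here (key and value are placeholders): values that are not Mongo-native are rendered as { $java : object.toString() } instead of blowing up the log statement.

    import org.springframework.data.mongodb.core.query.SerializationUtils;

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBObject;

    class SafeSerializationExample {

        // Safe for debug output even before the value has been mapped into Mongo-native types.
        String describe(Object complexValue) {
            DBObject dbObject = new BasicDBObject("key", complexValue);
            return SerializationUtils.serializeToJsonSafely(dbObject);
        }
    }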
Oliver Gierke
b3a891c69b DATAMONGO-458 - Read empty collections are modifiable now.
So far we've read empty collections and populated the property value of the Java object being created with Collections.emptySet(). This returns an unmodifiable Set so that further modifications fail with an UnsupportedOperationException. We now simply use new HashSet<Object>().
2012-06-08 12:59:10 +02:00
Oliver Gierke
54e105d19d General dependency and plugin version upgrades.
Upgraded to JUnit 4.10. Moved to the junit-dep dependency to allow cleaning up the Hamcrest dependencies. Use hamcrest-library instead of hamcrest-all. Upgraded to Mockito 1.9.0, using mockito-core instead of mockito-all. Upgraded to Spring Data Core 1.3.1.BUILD-SNAPSHOT. Upgraded to Querydsl 2.6.0. Upgraded the compiler and Surefire plugins.
2012-05-29 14:20:53 +02:00
Oliver Gierke
4cfb62a413 DATAMONGO-451 - Tweaked pom.xml to allow build run without Bundlor.
Expose the failOnWarnings property to be able to override it through a command line argument. This is necessary as the Sonar build uses Clover, which instruments source code and thus creates type dependencies that Bundlor complains about as not being declared in template.mf.
2012-05-15 13:43:59 +02:00
Maciej Walkowiak
7cdf9cedf3 DATAMONGO-446 - Fixed bug in paging query methods returning Lists.
Using List as the return type for paginating query methods didn't work so far. Fixed by inspecting the Pageable parameter potentially handed into them and restricting the result set accordingly.
2012-05-15 13:17:56 +02:00
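For illustration, the repository style this fix targets (Person and its properties are hypothetical): a paginating query method that returns a plain List, restricted by the Pageable handed in.

    import java.util.List;

    import org.springframework.data.domain.Pageable;
    import org.springframework.data.mongodb.repository.MongoRepository;

    interface PersonRepository extends MongoRepository<Person, String> {

        // Only the page described by the given Pageable is returned.
        List<Person> findByLastname(String lastname, Pageable pageable);
    }

    class Person {
        String id;
        String lastname;
    }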
Spring Buildmaster
fd198c172b DATAMONGO-396 - Released 1.1.0.M1. 2012-05-07 06:11:26 -07:00
227 changed files with 7355 additions and 4236 deletions


@@ -1,16 +1,22 @@
Spring Data - Document
Spring Data MongoDB
======================
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
As the name implies, the **Document** modules provides integration with document databases such as [MongoDB](http://www.mongodb.org/) and [CouchDB](http://couchdb.apache.org/).
Spring Data MongoDB aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a Repository style data access layer.
Getting Help
------------
At this point your best bet is to look at the [JavaDocs](http://static.springsource.org/spring-data/data-document/docs/1.0.0.BUILD-SNAPSHOT/spring-data-mongodb/apidocs/) for the MongoDB integration and the corresponding source code. For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80). If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
The [User Guide](http://static.springsource.org/spring-data/data-document/docs/1.0.0.BUILD-SNAPSHOT/reference/html/) (A work in progress).
The [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
The home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
Quick Start
@@ -23,19 +29,19 @@ For those in a hurry:
* Download the jar through Maven:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.0.0.BUILD-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-maven-snapshot</id>
<snapshots><enabled>true</enabled></snapshots>
<name>Springframework Maven SNAPSHOT Repository</name>
<url>http://maven.springframework.org/snapshot</url>
</repository>
<repository>
<id>spring-maven-snapshot</id>
<snapshots><enabled>true</enabled></snapshots>
<name>Springframework Maven SNAPSHOT Repository</name>
<url>http://maven.springframework.org/snapshot</url>
</repository>
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides
@@ -137,11 +143,6 @@ This will register an object in the container named PersonRepository. You can u
}
## CouchDB
TBD
Contributing to Spring Data
---------------------------

366
pom.xml

@@ -1,291 +1,93 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.1.0.M1</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-parent</module>
</modules>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<developers>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<packaging>pom</packaging>
<licenses>
<license>
<name>Apache License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0</url>
<comments>
Copyright 2010 the original author or authors.
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.0.0.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-distribution</module>
</modules>
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied.
See the License for the specific language governing permissions and
limitations under the License.
</comments>
</license>
</licenses>
<properties>
<project.type>multi</project.type>
<dist.name>spring-data-mongodb</dist.name>
<springdata.commons>1.5.0.RELEASE</springdata.commons>
<mongo>2.10.1</mongo>
</properties>
<developers>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- dist.* properties are used by the antrun tasks below -->
<dist.id>spring-data-mongo</dist.id>
<dist.name>Spring Data Mongo</dist.name>
<dist.key>SDMONGO</dist.key>
<dist.version>${project.version}</dist.version>
<dist.releaseType>snapshot</dist.releaseType>
<dist.finalName>${dist.id}-${dist.version}</dist.finalName>
<dist.fileName>${dist.finalName}.zip</dist.fileName>
<dist.filePath>target/${dist.fileName}</dist.filePath>
<dist.bucketName>dist.springframework.org</dist.bucketName>
<!-- these properties should be in ~/.m2/settings.xml
<dist.accessKey>s3 access key</dist.accessKey>
<dist.secretKey>s3 secret key</dist.secretKey>
-->
</properties>
<build>
<extensions>
<extension>
<groupId>org.springframework.build.aws</groupId>
<artifactId>org.springframework.build.aws.maven</artifactId>
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<plugins>
<plugin>
<groupId>com.agilejava.docbkx</groupId>
<artifactId>docbkx-maven-plugin</artifactId>
<!-- yes it really needs to be this (2.0.7) otherwise pdf generation from a clean build doesn't work -->
<version>2.0.7</version>
<executions>
<execution>
<goals>
<goal>generate-html</goal>
<goal>generate-pdf</goal>
</goals>
<phase>pre-site</phase>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.docbook</groupId>
<artifactId>docbook-xml</artifactId>
<version>4.4</version>
<scope>runtime</scope>
</dependency>
</dependencies>
<configuration>
<includes>index.xml</includes>
<xincludeSupported>true</xincludeSupported>
<foCustomization>${project.basedir}/src/docbkx/resources/xsl/fopdf.xsl</foCustomization>
<htmlStylesheet>css/html.css</htmlStylesheet>
<chunkedOutput>false</chunkedOutput>
<htmlCustomization>${project.basedir}/src/docbkx/resources/xsl/html.xsl</htmlCustomization>
<useExtensions>1</useExtensions>
<highlightSource>1</highlightSource>
<highlightDefaultLanguage />
<!-- callouts -->
<entities>
<entity>
<name>version</name>
<value>${project.version}</value>
</entity>
</entities>
<postProcess>
<copy todir="${project.basedir}/target/site/reference">
<fileset dir="${project.basedir}/target/docbkx">
<include name="**/*.html" />
<include name="**/*.pdf" />
</fileset>
</copy>
<copy todir="${project.basedir}/target/site/reference/html">
<fileset dir="${project.basedir}/src/docbkx/resources">
<include name="**/*.css" />
<include name="**/*.png" />
<include name="**/*.gif" />
<include name="**/*.jpg" />
</fileset>
</copy>
<copy todir="${project.basedir}/target/site/reference/html">
<fileset dir="${project.basedir}/src/docbkx/resources/images">
<include name="*.png" />
</fileset>
</copy>
<move file="${project.basedir}/target/site/reference/pdf/index.pdf" tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf" failonerror="false" />
</postProcess>
</configuration>
</plugin>
<plugin>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.5</version>
<configuration>
<aggregate>true</aggregate>
<breakiterator>true</breakiterator>
<header>Spring Data Document</header>
<source>1.5</source>
<quiet>true</quiet>
<javadocDirectory>${project.basedir}/src/main/javadoc</javadocDirectory>
<overview>${project.basedir}/src/main/javadoc/overview.html</overview>
<stylesheetfile>${project.basedir}/src/main/javadoc/spring-javadoc.css</stylesheetfile>
<!-- copies doc-files subdirectory which contains image resources -->
<docfilessubdirs>true</docfilessubdirs>
<links>
<link>http://static.springframework.org/spring/docs/3.0.x/javadoc-api</link>
<link>http://download.oracle.com/javase/1.5.0/docs/api</link>
<link>http://api.mongodb.org/java/2.3</link>
</links>
</configuration>
</plugin>
<plugin><!--
run `mvn package assembly:assembly` to trigger assembly creation.
see http://www.sonatype.com/books/mvnref-book/reference/assemblies-set-dist-assemblies.html -->
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2-beta-5</version>
<inherited>false</inherited>
<executions>
<execution>
<id>distribution</id>
<goals>
<goal>single</goal>
</goals>
<phase>package</phase>
<configuration>
<descriptors>
<descriptor>${project.basedir}/src/assembly/distribution.xml</descriptor>
</descriptors>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<id>upload-dist</id>
<phase>deploy</phase>
<configuration>
<tasks>
<ant antfile="${basedir}/src/ant/upload-dist.xml">
<target name="upload-dist" />
</ant>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.springframework.build</groupId>
<artifactId>org.springframework.build.aws.ant</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
<dependency>
<groupId>net.java.dev.jets3t</groupId>
<artifactId>jets3t</artifactId>
<version>0.7.2</version>
</dependency>
</dependencies>
</plugin>
</plugins>
<!-- the name of this project is 'spring-data-mongo-dist';
make sure the zip file is just 'spring-data-mongo'. -->
<finalName>${dist.finalName}</finalName>
</build>
<pluginRepositories>
<!-- Necessary for the build extension -->
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
</pluginRepositories>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/spring-data</downloadUrl>
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<dependencies>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo}</version>
</dependency>
</dependencies>
</project>


@@ -1,139 +1,145 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB Cross-store Persistence Support</name>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Persistence Support</name>
<properties>
<jpa>1.0.0.Final</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
<dependencies>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
<version>${data.commons.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.1.0.M1</version>
</dependency>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${spring}</version>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.0.RELEASE</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>1.0.0.Final</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>3.5.5-Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<scope>test</scope>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
</dependency>
</dependencies>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${hibernate}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<scope>test</scope>
</dependency>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.2</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
<!-- <aspectLibrary>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-aspects</artifactId>
</aspectLibrary> -->
</aspectLibraries>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
</plugins>
</build>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.4</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
<source>${source.level}</source>
<target>${source.level}</target>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -17,8 +17,8 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
@@ -44,7 +44,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
protected final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;


@@ -21,13 +21,12 @@ import javax.persistence.EntityManager;
import javax.persistence.Transient;
import javax.persistence.Entity;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
@@ -44,7 +43,7 @@ import org.springframework.transaction.support.TransactionSynchronizationManager
*/
public aspect MongoDocumentBacking {
private static final Log LOGGER = LogFactory.getLog(MongoDocumentBacking.class);
private static final Logger LOGGER = LoggerFactory.getLogger(MongoDocumentBacking.class);
// Aspect shared config
private ChangeSetPersister<Object> changeSetPersister;


@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,9 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
@@ -26,7 +28,6 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
@@ -39,20 +40,55 @@ import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
MongoTemplate mongoTemplate;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
EntityManager entityManager;
@Autowired
PlatformTransactionManager transactionManager;
TransactionTemplate txTemplate;
@Before
public void setUp() {
txTemplate = new TransactionTemplate(transactionManager);
clearData(Person.class.getName());
Address address = new Address(12, "MAin St.", "Boston", "MA", "02101");
Resume resume = new Resume();
resume.addEducation("Skanstulls High School, 1975");
resume.addEducation("Univ. of Stockholm, 1980");
resume.addJob("DiMark, DBA, 1990-2000");
resume.addJob("VMware, Developer, 2007-");
final Person person = new Person("Thomas", 20);
person.setAddress(address);
person.setResume(resume);
person.setId(1L);
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.persist(person);
return null;
}
});
}
@After
public void tearDown() {
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.remove(entityManager.find(Person.class, 1L));
return null;
}
});
}
private void clearData(String collectionName) {
@@ -64,26 +100,8 @@ public class CrossStoreMongoTests {
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "MAin St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
@@ -91,15 +109,18 @@ public class CrossStoreMongoTests {
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; " + "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
found.setAge(44);
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
entityManager.merge(found);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
@@ -111,14 +132,19 @@ public class CrossStoreMongoTests {
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
entityManager.detach(detached);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
Person result = entityManager.merge(detached);
entityManager.flush();
return result;
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
@@ -127,7 +153,7 @@ public class CrossStoreMongoTests {
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
@@ -154,7 +180,9 @@ public class CrossStoreMongoTests {
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
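The tests above drive JPA persistence through Spring's programmatic transaction support and let the cross-store support take care of the document-backed parts of the entity. As a point of reference, here is a minimal, hedged sketch of that TransactionTemplate pattern; the PersonPersister class is made up for illustration (only Person is taken from the tests) and is not part of the change set.

import javax.persistence.EntityManager;

import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;

// Hypothetical helper mirroring the callback pattern used in the tests: all JPA work happens
// inside doInTransaction(..); the cross-store support is expected to write the document-backed
// fields (e.g. Resume) to MongoDB when the transaction completes.
class PersonPersister {

  private final EntityManager entityManager;
  private final TransactionTemplate txTemplate;

  PersonPersister(EntityManager entityManager, PlatformTransactionManager transactionManager) {
    this.entityManager = entityManager;
    this.txTemplate = new TransactionTemplate(transactionManager);
  }

  void persist(final Person person) {
    txTemplate.execute(new TransactionCallback<Void>() {
      public Void doInTransaction(TransactionStatus status) {
        entityManager.persist(person);
        return null;
      }
    });
  }
}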


@@ -1,12 +0,0 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
log4j.category.org.springframework=INFO
log4j.category.org.springframework.data=TRACE
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG


@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<!--
<logger name="org.springframework" level="debug" />
-->
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>
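The Logback configuration above replaces the removed log4j.properties. Code in these modules logs through the SLF4J API (see the org.slf4j imports further down), so the appender and levels defined here decide what reaches the console. A minimal sketch, with a made-up class name:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Illustrative only: logging goes through SLF4J, and the logback.xml above controls the output.
class LoggingSample {

  private static final Logger LOG = LoggerFactory.getLogger(LoggingSample.class);

  void run() {
    LOG.debug("dropped unless a <logger> element raises the level for this package");
    LOG.error("written by the console appender defined above");
  }
}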


@@ -0,0 +1,18 @@
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: SpringSource
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="0",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
org.springframework.*;version="${spring30:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"


@@ -0,0 +1,39 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
<name>Spring Data MongoDB - Distribution</name>
<description>Distribution build for Spring Data MongoDB</description>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.id>spring-data-mongodb</dist.id>
<dist.key>SDMONGO</dist.key>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>


@@ -1,44 +1,30 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
<name>Spring Data MongoDB Log4J Appender</name>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.2.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<mongo.version>2.3</mongo.version>
</properties>
<artifactId>spring-data-mongodb-log4j</artifactId>
<name>Spring Data MongoDB - Log4J Appender</name>
<dependencies>
<properties>
<log4j>1.2.16</log4j>
</properties>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo.version}</version>
</dependency>
<dependencies>
<!-- Logging -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<!-- Logging -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</dependencies>
</project>


@@ -5,6 +5,5 @@ Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo.version:[=.=,+1.0.0)}",
org.apache.log4j.*;version="[1.2.15, 2.0.0)",
org.apache.log4j.spi.*;version="[1.2.15, 2.0.0)"
com.mongodb.*;version="${mongo:[=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"


@@ -1,271 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.1.0.M1</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- versions for commonly-used dependencies -->
<junit.version>4.8.1</junit.version>
<log4j.version>1.2.16</log4j.version>
<org.mockito.version>1.8.4</org.mockito.version>
<org.slf4j.version>1.6.1</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version.30>3.0.7.RELEASE</org.springframework.version.30>
<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
<data.commons.version>1.3.0.RC2</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/download/community
</downloadUrl>
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<dependencies>
<!-- Test dependencies -->
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>${org.mockito.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>1.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${org.slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${org.slf4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${org.slf4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version.range}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<extensions>
<extension>
<!--
available only in the springframework maven repository. see
<repositories> section below
-->
<groupId>org.springframework.build.aws</groupId>
<artifactId>org.springframework.build.aws.maven</artifactId>
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<resources>
<resource>
<directory>${project.basedir}/src/main/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
<resource>
<directory>${project.basedir}/src/main/resources</directory>
<includes>
<include>**/*</include>
</includes>
</resource>
</resources>
<testResources>
<testResource>
<directory>${project.basedir}/src/test/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
<testResource>
<directory>${project.basedir}/src/test/resources</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
<compilerArgument>-Xlint:-path</compilerArgument>
<showWarnings>true</showWarnings>
<showDeprecation>false</showDeprecation>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.3.1</version>
<configuration>
<useDefaultManifestFile>true</useDefaultManifestFile>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<junitArtifactName>junit:junit</junitArtifactName>
</configuration>
</plugin>
<plugin>
<artifactId>maven-source-plugin</artifactId>
<version>2.1.2</version>
<executions>
<execution>
<id>attach-sources</id>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<!--
configures the springsource bundlor plugin, which generates
OSGI-compatible MANIFEST.MF files during the 'compile' phase of
the maven build. this plugin is declared within the
pluginManagement section because not every module that inherits
from this pom needs bundlor's services, e.g.:
spring-integration-samples and all its children. for this reason,
all modules that wish to use bundlor must declare it explicitly.
it is not necessary to specify the <version> or <configuration>
sections, but groupId and artifactId are required. see
http://static.springsource.org/s2-bundlor/1.0.x/user-guide/html/ch04s03.html
for more info
-->
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
<version>1.0.0.RELEASE</version>
<configuration>
<failOnWarnings>true</failOnWarnings>
</configuration>
<executions>
<execution>
<id>bundlor</id>
<goals>
<goal>bundlor</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>http://repo.springsource.org/libs-snapshot</url>
</repository>
</repositories>
<reporting>
<plugins>
<plugin>
<!--
significantly speeds up the 'Dependencies' report during site
creation see
http://old.nabble.com/Skipping-dependency-report-during-Maven2-site-generation-td20116761.html
-->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-project-info-reports-plugin</artifactId>
<version>2.1</version>
<configuration>
<dependencyLocationsEnabled>false</dependencyLocationsEnabled>
</configuration>
</plugin>
</plugins>
</reporting>
</project>


@@ -1,95 +1,77 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
<description>MongoDB support for Spring Data</description>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
<version>1.2.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.7.1</mongo.version>
<querydsl.version>2.5.0</querydsl.version>
<cdi.version>1.0</cdi.version>
<validation.version>1.0.0.GA</validation.version>
<webbeans.version>1.1.3</webbeans.version>
<validation>1.0.0.GA</validation>
</properties>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
<version>${data.commons.version}</version>
</dependency>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo.version}</version>
<exclusions>
<exclusion>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
</exclusion>
</exclusions>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
<version>${springdata.commons}</version>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl.version}</version>
<version>${querydsl}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>com.google.code.morphia</groupId>
<artifactId>morphia</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
<version>${querydsl}</version>
<scope>provided</scope>
</dependency>
@@ -104,7 +86,7 @@
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>${cdi.version}</version>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
@@ -112,14 +94,14 @@
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi.version}</version>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans.version}</version>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
@@ -134,7 +116,7 @@
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation.version}</version>
<version>${validation}</version>
<optional>true</optional>
</dependency>
@@ -144,43 +126,30 @@
<version>4.2.0.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>performance-tests</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<includes>
<include>**/PerformanceTests.java</include>
</includes>
<excludes>
<exclude>none</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0.2</version>
<version>1.0.4</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>
</dependencies>
<executions>
<execution>
<phase>generate-test-sources</phase>
@@ -194,7 +163,25 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</build>
</project>
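For reference, the Surefire configuration above selects tests by naming convention. The class below is a made-up illustration of that convention, not part of the change set:

import org.junit.Test;

// Illustrative only: Surefire as configured above picks this class up because its name ends in
// "Tests", while classes matching **/PerformanceTests.java only run under the performance profile.
public class NamingConventionTests {

  @Test
  public void runsInTheDefaultBuild() {
    // a regular unit test
  }
}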


@@ -27,19 +27,22 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
/**
* Base class for Spring Data Mongo configuration using JavaConfig.
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
@@ -86,22 +89,26 @@ public abstract class AbstractMongoConfiguration {
@Bean
public SimpleMongoDbFactory mongoDbFactory() throws Exception {
UserCredentials creadentials = getUserCredentials();
UserCredentials credentials = getUserCredentials();
if (creadentials == null) {
if (credentials == null) {
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
} else {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), creadentials);
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), credentials);
}
}
/**
* Return the base package to scan for mapped {@link Document}s.
* Return the base package to scan for mapped {@link Document}s. By default this returns the package of the concrete
* configuration class (not of this base class). So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration}, the base package is considered to be {@code com.acme} unless this method is
* overridden to implement alternate behaviour.
*
* @return
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
*/
protected String getMappingBasePackage() {
return null;
return getClass().getPackage().getName();
}
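To illustrate the default described in the Javadoc above: a configuration class that does not override getMappingBasePackage() has its entities scanned in its own package. The class below is a hypothetical example (package, class and database names are made up) and assumes a MongoDB instance reachable on localhost:

package com.acme;

import com.mongodb.Mongo;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

// Hypothetical configuration: because getMappingBasePackage() is not overridden,
// mapped @Document classes are scanned in this class' package, i.e. "com.acme".
@Configuration
public class AppConfig extends AbstractMongoConfiguration {

  @Override
  protected String getDatabaseName() {
    return "acme";
  }

  @Override
  public Mongo mongo() throws Exception {
    return new Mongo("localhost");
  }
}

Overriding getMappingBasePackage() in such a class would simply widen or narrow the scanned package accordingly.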
/**
@@ -127,11 +134,22 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
mappingContext.initialize();
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and


@@ -26,4 +26,6 @@ public abstract class BeanNames {
static final String MONGO = "mongo";
static final String DB_FACTORY = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY = "isNewStrategyFactory";
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
}


@@ -24,12 +24,12 @@ import java.util.List;
import java.util.Set;
import org.springframework.beans.BeanMetadataElement;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -38,6 +38,7 @@ import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
@@ -48,6 +49,8 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
@@ -67,25 +70,28 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
*/
public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean jsr303Present = ClassUtils.isPresent("javax.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : "mappingConverter";
}
/* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : "mappingConverter";
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mapping Mongo Converter", element));
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition, id);
createIsNewStrategyFactoryBeanDefinition(ctxRef, parserContext, element);
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
@@ -110,18 +116,23 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(dbFactoryRef));
registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
indexHelperBuilder.addConstructorArgReference(ctxRef);
indexHelperBuilder.addConstructorArgReference(dbFactoryRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
INDEX_HELPER));
}
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
if (validatingMongoEventListener != null) {
registry.registerBeanDefinition(VALIDATING_EVENT_LISTENER, validatingMongoEventListener);
parserContext.registerBeanComponent(new BeanComponentDefinition(validatingMongoEventListener,
VALIDATING_EVENT_LISTENER));
}
return converterBuilder.getBeanDefinition();
parserContext.registerBeanComponent(new BeanComponentDefinition(converterBuilder.getBeanDefinition(), id));
parserContext.popAndRegisterContainingComponent();
return null;
}
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
@@ -135,7 +146,6 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
RuntimeBeanReference validator = getValidator(builder, parserContext);
if (validator != null) {
builder.getRawBeanDefinition().setBeanClass(ValidatingMongoEventListener.class);
builder.addConstructorArgValue(validator);
@@ -157,36 +167,42 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
validatorDef.setSource(source);
validatorDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE);
String validatorName = parserContext.getReaderContext().registerWithGeneratedName(validatorDef);
parserContext.registerComponent(new BeanComponentDefinition(validatorDef, validatorName));
parserContext.registerBeanComponent(new BeanComponentDefinition(validatorDef, validatorName));
return new RuntimeBeanReference(validatorName);
}
private String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition) {
static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition, String converterId) {
String ctxRef = element.getAttribute("mapping-context-ref");
if (!StringUtils.hasText(ctxRef)) {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parserContext.getRegistry().registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
if (StringUtils.hasText(ctxRef)) {
return ctxRef;
}
BeanComponentDefinitionBuilder componentDefinitionBuilder = new BeanComponentDefinitionBuilder(element,
parserContext);
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
ctxRef = converterId + "." + MAPPING_CONTEXT;
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
return ctxRef;
}
@@ -224,7 +240,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
AbstractBeanDefinition conversionsBean = conversionsBuilder.getBeanDefinition();
conversionsBean.setSource(parserContext.extractSource(element));
parserContext.getRegistry().registerBeanDefinition("customConversions", conversionsBean);
parserContext.registerBeanComponent(new BeanComponentDefinition(conversionsBean, "customConversions"));
return conversionsBean;
}
@@ -232,7 +248,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
return null;
}
public Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
private static Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
String basePackage = element.getAttribute(BASE_PACKAGE);
@@ -271,6 +287,19 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
return null;
}
public static String createIsNewStrategyFactoryBeanDefinition(String mappingContextRef, ParserContext context,
Element element) {
BeanDefinitionBuilder mappingContextStrategyFactoryBuilder = BeanDefinitionBuilder
.rootBeanDefinition(MappingContextIsNewStrategyFactory.class);
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder, IS_NEW_STRATEGY_FACTORY));
return IS_NEW_STRATEGY_FACTORY;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*


@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to register an {@link AuditingEventListener} that transparently sets auditing information on
* an entity.
*
* @author Oliver Gierke
*/
public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEventListener.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
String mappingContextName = BeanNames.MAPPING_CONTEXT;
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT)) {
mappingContextName = MappingMongoConverterParser.potentiallyCreateMappingContext(element, parserContext, null,
BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
}
MappingMongoConverterParser.createIsNewStrategyFactoryBeanDefinition(mappingContextName, parserContext, element);
}
BeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(BeanNames.IS_NEW_STRATEGY_FACTORY);
BeanDefinition handlerBeanDefinition = parser.parse(element, parserContext);
builder.addConstructorArgValue(handlerBeanDefinition);
}
}
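The parser above wires up an AuditingEventListener backed by the IsNewAwareAuditingHandler from Spring Data Commons. Assuming the Commons auditing annotations are used, an audited entity could look like the following hypothetical sketch (class and field names are made up):

import java.util.Date;

import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical audited entity; the AuditingEventListener registered by the parser above is
// expected to populate the annotated fields when the document is persisted.
@Document
public class AuditedAccount {

  @Id
  private String id;

  @CreatedDate
  private Date createdAt;

  @LastModifiedDate
  private Date lastModifiedAt;
}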


@@ -1,5 +1,5 @@
/*
* Copyright 2011 by the original author(s).
* Copyright 2011-2012 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,19 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import static org.springframework.data.mongodb.config.ParsingUtils.*;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.util.StringUtils;
@@ -44,19 +44,29 @@ import com.mongodb.MongoURI;
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = element.getAttribute("id");
if (!StringUtils.hasText(id)) {
id = DB_FACTORY;
}
return id;
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
@@ -64,12 +74,11 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
ParsingUtils.setPropertyValue(element, dbFactoryBuilder, "write-concern", "writeConcern");
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
@@ -77,19 +86,26 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
// Defaulting
mongoRef = StringUtils.hasText(mongoRef) ? mongoRef : registerMongoBeanDefinition(element, parserContext);
dbname = StringUtils.hasText(dbname) ? dbname : "db";
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
} else {
dbFactoryBuilder.addConstructorArgValue(registerMongoBeanDefinition(element, parserContext));
}
dbFactoryBuilder.addConstructorArgValue(new RuntimeBeanReference(mongoRef));
dbname = StringUtils.hasText(dbname) ? dbname : "db";
dbFactoryBuilder.addConstructorArgValue(dbname);
if (userCredentials != null) {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
ParsingUtils.registerWriteConcernPropertyEditor(parserContext.getRegistry());
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY)
.getBeanDefinition();
}
/**
@@ -100,14 +116,13 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
* @param parserContext must not be {@literal null}.
* @return
*/
private String registerMongoBeanDefinition(Element element, ParserContext parserContext) {
private BeanDefinition registerMongoBeanDefinition(Element element, ParserContext parserContext) {
BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(element, mongoBuilder, "host");
ParsingUtils.setPropertyValue(element, mongoBuilder, "port");
setPropertyValue(mongoBuilder, element, "host");
setPropertyValue(mongoBuilder, element, "port");
return BeanDefinitionReaderUtils.registerWithGeneratedName(mongoBuilder.getBeanDefinition(),
parserContext.getRegistry());
return getSourceBeanDefinition(mongoBuilder, parserContext, element);
}
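For reference, the defaulting above (the database name falls back to "db"; a Mongo bean definition is created when no mongo-ref is given) corresponds roughly to the programmatic setup below. This is a hedged sketch, not code from the change set; the XML path additionally registers the WriteConcern property editor:

import com.mongodb.Mongo;

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

// Rough programmatic counterpart of the defaults applied by the XML parser.
class DbFactorySketch {

  MongoDbFactory defaultDbFactory() throws Exception {
    return new SimpleMongoDbFactory(new Mongo(), "db");
  }
}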
/**


@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,26 +16,31 @@
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigParser;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryBeanDefinitionParser;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB based repositories.
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
*
* @author Oliver Gierke
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
*
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
registerBeanDefinitionParser("repositories", new MongoRepositoryConfigParser());
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
RepositoryBeanDefinitionParser repositoryBeanDefinitionParser = new RepositoryBeanDefinitionParser(extension);
registerBeanDefinitionParser("repositories", repositoryBeanDefinitionParser);
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,20 +13,20 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
@@ -35,54 +35,59 @@ import org.w3c.dom.Element;
* Parser for &lt;mongo&gt; definitions.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class MongoParser extends AbstractSingleBeanDefinitionParser {
public class MongoParser implements BeanDefinitionParser {
@Override
protected Class<?> getBeanClass(Element element) {
return MongoFactoryBean.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
ParsingUtils.setPropertyValue(element, builder, "port", "port");
ParsingUtils.setPropertyValue(element, builder, "host", "host");
ParsingUtils.setPropertyValue(element, builder, "write-concern", "writeConcern");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
ParsingUtils.parseMongoOptions(element, builder);
ParsingUtils.parseReplicaSet(element, builder);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "write-concern", "writeConcern");
registerServerAddressPropertyEditor(parserContext.getRegistry());
ParsingUtils.registerWriteConcernPropertyEditor(parserContext.getRegistry());
MongoParsingUtils.parseMongoOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernPropertyEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
/**
* One would normally register only a single bean definition per parser, but we want the convenience of
* AbstractSingleBeanDefinitionParser while also registering a 'default' property editor with the
* container as a side effect.
*/
private void registerServerAddressPropertyEditor(BeanDefinitionRegistry registry) {
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder
.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String name = super.resolveId(element, definition, parserContext);
if (!StringUtils.hasText(name)) {
name = "mongo";
}
return name;
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
}


@@ -0,0 +1,103 @@
/*
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
abstract class MongoParsingUtils {
private MongoParsingUtils() {
}
/**
* Parses the mongo replica-set element.
*
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the mongo:options sub-element and populates the given {@link BeanDefinitionBuilder} with the parsed options.
*
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsDefBuilder, optionsElement, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsDefBuilder, optionsElement, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsDefBuilder, optionsElement, "write-number", "writeNumber");
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
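The builder above only registers WriteConcernPropertyEditor for com.mongodb.WriteConcern; the conversion it enables looks roughly like the sketch below. The constant name "SAFE" is an assumption based on the driver's well-known WriteConcern names:

import java.beans.PropertyEditor;

import com.mongodb.WriteConcern;

import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;

// Illustrative only: converts the string value of a write-concern attribute into a WriteConcern,
// which is what the CustomEditorConfigurer set up above enables for bean definitions.
class WriteConcernConversionSketch {

  WriteConcern toWriteConcern(String text) {
    PropertyEditor editor = new WriteConcernPropertyEditor();
    editor.setAsText(text); // e.g. "SAFE" (assumed to match a WriteConcern constant name)
    return (WriteConcern) editor.getValue();
  }
}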


@@ -1,141 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
abstract class ParsingUtils {
/**
* Parses the mongo replica-set element.
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
String replicaSetString = element.getAttribute("replica-set");
if (StringUtils.hasText(replicaSetString)) {
mongoBuilder.addPropertyValue("replicaSetSeeds", replicaSetString);
}
return true;
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsElement, optionsDefBuilder, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsElement, optionsDefBuilder, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsElement, optionsDefBuilder, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsElement, optionsDefBuilder, "connect-timeout", "connectTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-timeout", "socketTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsElement, optionsDefBuilder, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsElement, optionsDefBuilder, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsElement, optionsDefBuilder, "write-number", "writeNumber");
setPropertyValue(optionsElement, optionsDefBuilder, "write-timeout", "writeTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "write-fsync", "writeFsync");
setPropertyValue(optionsElement, optionsDefBuilder, "slave-ok", "slaveOk");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName, String propertyName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(propertyName, attr);
}
}
/**
* Sets the property with the given attribute name on the given {@link BeanDefinitionBuilder} to the value of the
* attribute with the given name.
*
* @param element must not be {@literal null}.
* @param builder must not be {@literal null}.
* @param attrName must not be {@literal null} or empty.
*/
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(attrName, attr);
}
}
/**
* Returns the {@link BeanDefinition} built by the given {@link BeanDefinitionBuilder} enriched with source
* information derived from the given {@link Element}.
*
* @param builder must not be {@literal null}.
* @param context must not be {@literal null}.
* @param element must not be {@literal null}.
* @return
*/
static AbstractBeanDefinition getSourceBeanDefinition(BeanDefinitionBuilder builder, ParserContext context,
Element element) {
AbstractBeanDefinition definition = builder.getBeanDefinition();
definition.setSource(context.extractSource(element));
return definition;
}
/**
* Registers a {@link WriteConcernPropertyEditor} in the given {@link BeanDefinitionRegistry}.
*
* @param registry must not be {@literal null}.
*/
static void registerWriteConcernPropertyEditor(BeanDefinitionRegistry registry) {
Assert.notNull(registry);
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder
.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
}

View File

@@ -17,7 +17,11 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.StringUtils;
import com.mongodb.ServerAddress;
@@ -30,6 +34,8 @@ import com.mongodb.ServerAddress;
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
@@ -38,21 +44,49 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
public void setAsText(String replicaSetString) {
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
ServerAddress[] serverAddresses = new ServerAddress[replicaSetStringArray.length];
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);
for (int i = 0; i < replicaSetStringArray.length; i++) {
for (String element : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(replicaSetStringArray[i], ":");
ServerAddress address = parseServerAddress(element);
try {
serverAddresses[i] = new ServerAddress(hostAndPort[0], Integer.parseInt(hostAndPort[1]));
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Could not parse port " + hostAndPort[1], e);
} catch (UnknownHostException e) {
throw new IllegalArgumentException("Could not parse host " + hostAndPort[0], e);
if (address != null) {
serverAddresses.add(address);
}
}
setValue(serverAddresses);
if (serverAddresses.isEmpty()) {
throw new IllegalArgumentException(
"Could not resolve at least one server of the replica set configuration! Validate your config!");
}
setValue(serverAddresses.toArray(new ServerAddress[serverAddresses.size()]));
}
/**
* Parses the given source into a {@link ServerAddress}.
*
* @param source the host/port source string to parse, may be {@literal null} or empty.
* @return the parsed {@link ServerAddress} or {@literal null} if the source could not be parsed.
*/
private ServerAddress parseServerAddress(String source) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(source.trim(), ":");
if (!StringUtils.hasText(source) || hostAndPort.length > 2) {
LOG.warn("Could not parse address source '{}'. Check your replica set configuration!", source);
return null;
}
try {
return hostAndPort.length == 1 ? new ServerAddress(hostAndPort[0]) : new ServerAddress(hostAndPort[0],
Integer.parseInt(hostAndPort[1]));
} catch (UnknownHostException e) {
LOG.warn("Could not parse host '{}'. Check your replica set configuration!", hostAndPort[0]);
} catch (NumberFormatException e) {
LOG.warn("Could not parse port '{}'. Check your replica set configuration!", hostAndPort[1]);
}
return null;
}
}
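For illustration, a rough sketch of how the reworked editor behaves for a comma-separated replica-set string: unparsable entries are only warned about and skipped, and an exception is raised only if no address at all could be resolved. The host and port values below are placeholders.

import java.beans.PropertyEditor;
import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;
import com.mongodb.ServerAddress;

public class ServerAddressEditorSketch {

	public static void main(String[] args) {
		PropertyEditor editor = new ServerAddressPropertyEditor();
		// The malformed second entry is logged and dropped from the result.
		editor.setAsText("127.0.0.1:27017, this:is:not:valid");
		ServerAddress[] seeds = (ServerAddress[]) editor.getValue();
		System.out.println(seeds.length); // 1
	}
}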

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,59 +13,53 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Represents an action taken against the collection. Used by {@link WriteConcernResolver} to determine a custom
* WriteConcern based on this information.
*
* Properties that will always be non-null are collectionName and defaultWriteConcern. The entityClass is null only for
* the MongoActionOperation.INSERT_LIST.
*
* {@link WriteConcern} based on this information.
* <ul>
* <li>INSERT, SAVE have null query</li>
* <li>REMOVE has null document</li>
* <li>INSERT_LIST has null entityClass, document, and query</li>
* <li>INSERT_LIST has null entityType, document, and query</li>
* </ul>
*
* @author Mark Pollack
*
* @author Oliver Gierke
*/
public class MongoAction {
private String collectionName;
private WriteConcern defaultWriteConcern;
private Class<?> entityClass;
private MongoActionOperation mongoActionOperation;
private DBObject query;
private DBObject document;
private final String collectionName;
private final WriteConcern defaultWriteConcern;
private final Class<?> entityType;
private final MongoActionOperation mongoActionOperation;
private final DBObject query;
private final DBObject document;
/**
* Create an instance of a MongoAction
* Create an instance of a {@link MongoAction}.
*
* @param defaultWriteConcern the default write concern
* @param defaultWriteConcern the default write concern.
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name
* @param entityClass the POJO that is being operated against
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityClass, DBObject document, DBObject query) {
super();
String collectionName, Class<?> entityType, DBObject document, DBObject query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation;
this.collectionName = collectionName;
this.entityClass = entityClass;
this.entityType = entityType;
this.query = query;
this.document = document;
}
@@ -78,8 +72,16 @@ public class MongoAction {
return defaultWriteConcern;
}
/**
* @deprecated use {@link #getEntityType()} instead.
*/
@Deprecated
public Class<?> getEntityClass() {
return entityClass;
return entityType;
}
public Class<?> getEntityType() {
return entityType;
}
public MongoActionOperation getMongoActionOperation() {

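The nullability rules spelled out in the javadoc above are exactly what a custom WriteConcernResolver has to cope with: for INSERT_LIST there is no entity type to branch on, so a resolver typically falls back to the collection name or the default concern. A rough sketch under that assumption; the "audit" collection rule is a made-up example.

import com.mongodb.WriteConcern;
import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.WriteConcernResolver;

public class CollectionAwareWriteConcernResolver implements WriteConcernResolver {

	public WriteConcern resolve(MongoAction action) {
		// entityType is null for INSERT_LIST, so decide on the collection name instead.
		if ("audit".equals(action.getCollectionName())) {
			return WriteConcern.JOURNAL_SAFE;
		}
		// Otherwise keep whatever default the action carries (may be null).
		return action.getDefaultWriteConcern();
	}
}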
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,8 +20,8 @@ package org.springframework.data.mongodb.core;
* for a given mutating operation
*
* @author Mark Pollack
* @author Oliver Gierke
* @see MongoAction
*
*/
public enum MongoActionOperation {

View File

@@ -78,51 +78,62 @@ public abstract class MongoDbUtils {
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate) {
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
if (dbHolder != null && !dbHolder.isEmpty()) {
// pre-bound Mongo DB
DB db = null;
if (TransactionSynchronizationManager.isSynchronizationActive() && dbHolder.doesNotHoldNonDefaultDB()) {
// Spring transaction management is active ->
db = dbHolder.getDB();
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing Mongo DB");
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
// Do we have a populated holder and TX sync active?
if (dbHolder != null && !dbHolder.isEmpty() && TransactionSynchronizationManager.isSynchronizationActive()) {
DB db = dbHolder.getDB(databaseName);
// DB found but not yet synchronized
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing MongoDB {}.", databaseName);
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
if (db != null) {
return db;
}
}
LOGGER.trace("Getting Mongo Database name=[" + databaseName + "]");
// Lookup fresh database instance
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
if (credentialsGiven && !db.isAuthenticated()) {
// Note, can only authenticate once against the same com.mongodb.DB object.
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!db.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName
+ "], username = [" + username + "], password = [" + password + "]", databaseName, credentials);
synchronized (db) {
if (credentialsGiven && !db.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!db.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
// Use same Session for further Mongo actions within the transaction.
// Thread object will get removed by synchronization at transaction completion.
// TX sync active, bind new database to thread
if (TransactionSynchronizationManager.isSynchronizationActive()) {
// We're within a Spring-managed transaction, possibly from JtaTransactionManager.
LOGGER.debug("Registering Spring transaction synchronization for new Hibernate Session");
LOGGER.debug("Registering Spring transaction synchronization for MongoDB instance {}.", databaseName);
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(db);
holderToUse = new DbHolder(databaseName, db);
} else {
holderToUse.addDB(db);
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
}
@@ -146,11 +157,12 @@ public abstract class MongoDbUtils {
* @return whether the DB is transactional
*/
public static boolean isDBTransactional(DB db, Mongo mongo) {
if (mongo == null) {
return false;
}
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
return (dbHolder != null && dbHolder.containsDB(db));
return dbHolder != null && dbHolder.containsDB(db);
}
/**
@@ -159,6 +171,7 @@ public abstract class MongoDbUtils {
* @param db the DB to close (may be <code>null</code>)
*/
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,12 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DuplicateKeyException;
@@ -29,21 +23,26 @@ import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
* exception from the {@code org.springframework.dao} hierarchy. Return {@literal null} if no translation is
* appropriate: any other exception may have resulted from user code, and should not be translated.
*
* @author Oliver Gierke
* @author Michal Vich
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
/*
* (non-Javadoc)
*
* @see org.springframework.dao.support.PersistenceExceptionTranslator#
* translateExceptionIfPossible(java.lang.RuntimeException)
*/
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
// Check for well-known MongoException subclasses.
@@ -52,14 +51,23 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
if (ex instanceof DuplicateKey) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();
if (code == 11000 || code == 11001) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (code == 12000 || code == 13440) {
@@ -69,9 +77,6 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
// If we get here, we have an exception that resulted from user code,
// rather than the persistence provider, so we return null to indicate

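A small usage sketch of the translator as reorganized above; MongoInternalException is one of the cases that is returned (rather than thrown) as a Spring DAO exception, and the message text here is made up.

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.MongoInternalException;

public class TranslationSketch {

	public static void main(String[] args) {
		MongoExceptionTranslator translator = new MongoExceptionTranslator();
		// MongoInternalException is mapped to InvalidDataAccessResourceUsageException.
		DataAccessException translated = translator
				.translateExceptionIfPossible(new MongoInternalException("unexpected driver state"));
		System.out.println(translated.getClass().getSimpleName());
	}
}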
View File

@@ -16,13 +16,14 @@
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
@@ -37,6 +38,7 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.ApplicationEventPublisherAware;
import org.springframework.context.ApplicationListener;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.io.Resource;
@@ -44,8 +46,10 @@ import org.springframework.core.io.ResourceLoader;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
@@ -97,6 +101,7 @@ import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
@@ -105,40 +110,25 @@ import com.mongodb.util.JSON;
* @author Graeme Rocher
* @author Mark Pollack
* @author Oliver Gierke
* @author Amol Nayak
* @author Patryk Wasik
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoTemplate.class);
private static final String ID = "_id";
private static final String ID_FIELD = "_id";
private static final WriteResultChecking DEFAULT_WRITE_RESULT_CHECKING = WriteResultChecking.NONE;
@SuppressWarnings("serial")
private static final List<String> ITERABLE_CLASSES = new ArrayList<String>() {
{
add(List.class.getName());
add(Collection.class.getName());
add(Iterator.class.getName());
}
};
private static final Collection<String> ITERABLE_CLASSES;
/*
* WriteConcern to be used for write operations if it has been specified.
* Otherwise we should not use a WriteConcern defaulting to the one set for
* the DB or Collection.
*/
private WriteConcern writeConcern = null;
static {
private WriteConcernResolver writeConcernResolver = new DefaultWriteConcernResolver();
Set<String> iterableClasses = new HashSet<String>();
iterableClasses.add(List.class.getName());
iterableClasses.add(Collection.class.getName());
iterableClasses.add(Iterator.class.getName());
/*
* WriteResultChecking to be used for write operations if it has been
* specified. Otherwise we should not do any checking.
*/
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
/**
* Set the ReadPreference when operating on a collection. See {@link #prepareCollection(DBCollection)}
*/
private ReadPreference readPreference = null;
ITERABLE_CLASSES = Collections.unmodifiableCollection(iterableClasses);
}
private final MongoConverter mongoConverter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -146,6 +136,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private final MongoExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
private final QueryMapper mapper;
private WriteConcern writeConcern;
private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
private ReadPreference readPreference;
private ApplicationEventPublisher eventPublisher;
private ResourceLoader resourceLoader;
private MongoPersistentEntityIndexCreator indexCreator;
@@ -153,8 +147,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration
*
* @param mongo
* @param databaseName
* @param mongo must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
*/
public MongoTemplate(Mongo mongo, String databaseName) {
this(new SimpleMongoDbFactory(mongo, databaseName), null);
@@ -164,8 +158,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Constructor used for a template configuration with user credentials in the form of
* {@link org.springframework.data.authentication.UserCredentials}
*
* @param mongo
* @param databaseName
* @param mongo must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param userCredentials
*/
public MongoTemplate(Mongo mongo, String databaseName, UserCredentials userCredentials) {
@@ -173,9 +167,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Constructor used for a basic template configuration
* Constructor used for a basic template configuration.
*
* @param mongoDbFactory
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoTemplate(MongoDbFactory mongoDbFactory) {
this(mongoDbFactory, null);
@@ -184,7 +178,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration.
*
* @param mongoDbFactory
* @param mongoDbFactory must not be {@literal null}.
* @param mongoConverter
*/
public MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter) {
@@ -205,7 +199,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
}
}
}
/**
@@ -219,7 +212,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Configures the {@link WriteConcern} to be used with the template.
* Configures the {@link WriteConcern} to be used with the template. If none is configured the {@link WriteConcern}
* configured on the {@link MongoDbFactory} will apply. If you configured a {@link Mongo} instance directly, no
* {@link WriteConcern} will be used.
*
* @param writeConcern
*/
@@ -246,11 +241,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
this.readPreference = readPreference;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
String[] beans = applicationContext.getBeanNamesForType(MongoPersistentEntityIndexCreator.class);
if ((null == beans || beans.length == 0) && applicationContext instanceof ConfigurableApplicationContext) {
((ConfigurableApplicationContext) applicationContext).addApplicationListener(indexCreator);
}
prepareIndexCreator(applicationContext);
eventPublisher = applicationContext;
if (mappingContext instanceof ApplicationEventPublisherAware) {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
@@ -258,6 +256,30 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
resourceLoader = applicationContext;
}
/**
* Inspects the given {@link ApplicationContext} for {@link MongoPersistentEntityIndexCreator} and those in turn if
* they were registered for the current {@link MappingContext}. If no creator for the current {@link MappingContext}
* can be found we manually add the internally created one as {@link ApplicationListener} to make sure indexes get
* created appropriately for entity types persisted through this {@link MongoTemplate} instance.
*
* @param context must not be {@literal null}.
*/
private void prepareIndexCreator(ApplicationContext context) {
String[] indexCreators = context.getBeanNamesForType(MongoPersistentEntityIndexCreator.class);
for (String creator : indexCreators) {
MongoPersistentEntityIndexCreator creatorBean = context.getBean(creator, MongoPersistentEntityIndexCreator.class);
if (creatorBean.isIndexCreatorFor(mappingContext)) {
return;
}
}
if (context instanceof ConfigurableApplicationContext) {
((ConfigurableApplicationContext) context).addApplicationListener(indexCreator);
}
}
/**
* Returns the default {@link org.springframework.data.mongodb.core.convert.MongoConverter}.
*
@@ -330,11 +352,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(query);
DBObject queryObject = query.getQueryObject();
DBObject sortObject = query.getSortObject();
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + queryObject + " fields: " + fieldsObject + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: %s",
serializeToJsonSafely(queryObject), sortObject, fieldsObject, collectionName));
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
@@ -453,7 +476,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
} else {
query.limit(1);
List<T> results = find(query, entityClass, collectionName);
return (results.isEmpty() ? null : results.get(0));
return results.isEmpty() ? null : results.get(0);
}
}
@@ -464,19 +487,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> List<T> find(final Query query, Class<T> entityClass, String collectionName) {
CursorPreparer cursorPreparer = query == null ? null : new QueryCursorPreparer(query);
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass, cursorPreparer);
if (query == null) {
return findAll(entityClass, collectionName);
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query));
}
public <T> T findById(Object id, Class<T> entityClass) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityClass);
return findById(id, entityClass, persistentEntity.getCollection());
return findById(id, entityClass, determineCollectionName(entityClass));
}
public <T> T findById(Object id, Class<T> entityClass, String collectionName) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityClass);
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
String idKey = idProperty == null ? ID : idProperty.getName();
MongoPersistentProperty idProperty = persistentEntity == null ? null : persistentEntity.getIdProperty();
String idKey = idProperty == null ? ID_FIELD : idProperty.getName();
return doFindOne(collectionName, new BasicDBObject(idKey, id), null, entityClass);
}
@@ -621,6 +648,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
assertUpdateableIdIfNotSet(objectToSave);
initializeVersionProperty(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
@@ -633,6 +662,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
}
private void initializeVersionProperty(Object entity) {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(entity, null);
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
}
}
public void insert(Collection<? extends Object> batchToSave, Class<?> entityClass) {
doInsertBatch(determineCollectionName(entityClass), batchToSave, this.mongoConverter);
}
@@ -677,6 +716,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
List<DBObject> dbObjectList = new ArrayList<DBObject>();
for (T o : batchToSave) {
initializeVersionProperty(o);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(o));
@@ -697,21 +738,83 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void save(Object objectToSave) {
Assert.notNull(objectToSave);
save(objectToSave, determineEntityCollectionName(objectToSave));
}
public void save(Object objectToSave, String collectionName) {
doSave(collectionName, objectToSave, this.mongoConverter);
Assert.notNull(objectToSave);
Assert.hasText(collectionName);
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(objectToSave.getClass());
// No optimistic locking -> simple save
if (mongoPersistentEntity == null || !mongoPersistentEntity.hasVersionProperty()) {
doSave(collectionName, objectToSave, this.mongoConverter);
return;
}
doSaveVersioned(objectToSave, mongoPersistentEntity, collectionName);
}
private <T> void doSaveVersioned(T objectToSave, MongoPersistentEntity<?> entity, String collectionName) {
BeanWrapper<PersistentEntity<T, ?>, T> beanWrapper = BeanWrapper.create(objectToSave,
this.mongoConverter.getConversionService());
MongoPersistentProperty idProperty = entity.getIdProperty();
MongoPersistentProperty versionProperty = entity.getVersionProperty();
Number version = beanWrapper.getProperty(versionProperty, Number.class, !versionProperty.usePropertyAccess());
// Fresh instance -> initialize version property
if (version == null) {
beanWrapper.setProperty(versionProperty, 0);
doSave(collectionName, objectToSave, this.mongoConverter);
} else {
assertUpdateableIdIfNotSet(objectToSave);
// Create query for entity with the id and old version
Object id = beanWrapper.getProperty(idProperty);
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
// Bump version number
Number number = beanWrapper.getProperty(versionProperty, Number.class, false);
beanWrapper.setProperty(versionProperty, number.longValue() + 1);
BasicDBObject dbObject = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
this.mongoConverter.write(objectToSave, dbObject);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbObject));
Update update = Update.fromDBObject(dbObject, ID_FIELD);
updateFirst(query, update, objectToSave.getClass());
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbObject));
}
}
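The versioned save path above only applies to entities that declare a version property. A hedged sketch of what that looks like from the caller's side; Account is a made-up domain class and the annotations come from Spring Data Commons.

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.MongoOperations;

public class Account {

	@Id private String id;
	@Version private Long version; // initialized to 0 on the first save, bumped by 1 afterwards

	static void demo(MongoOperations operations) {
		Account account = new Account();
		operations.save(account); // fresh instance -> version property initialized to 0
		operations.save(account); // update is issued against the stored id and the previous version
		// Saving a stale copy that still carries an old version value fails the version-checked
		// update and surfaces as an OptimisticLockingFailureException.
	}
}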
@SuppressWarnings("unchecked")
protected <T> void doSave(String collectionName, T objectToSave, MongoWriter<T> writer) {
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
DBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
if (!(objectToSave instanceof String)) {
writer.write(objectToSave, dbDoc);
} else {
try {
objectToSave = (T) JSON.parse((String) objectToSave);
} catch (JSONParseException e) {
throw new MappingException("Could not parse given String to save into a JSON document!", e);
}
}
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());
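doSave(…) now also accepts a plain JSON String and parses it via JSON.parse(…), rejecting malformed input with a MappingException. A minimal usage sketch; the collection name is a made-up example.

import org.springframework.data.mongodb.core.MongoOperations;

public class JsonSaveSketch {

	static void saveRawJson(MongoOperations operations) {
		// The String is parsed into a DBObject and saved as-is into the given collection.
		operations.save("{ \"firstName\" : \"Dave\" }", "people");
	}
}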
@@ -722,19 +825,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected Object insertDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("insert DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
LOGGER.debug("Inserting DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insert(dbDoc);
} else {
collection.insert(dbDoc, writeConcernToUse);
}
return dbDoc.get(ID);
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDoc) : collection.insert(dbDoc,
writeConcernToUse);
handleAnyWriteResultErrors(writeResult, dbDoc, MongoActionOperation.INSERT);
return dbDoc.get(ID_FIELD);
}
});
}
@@ -745,25 +846,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("insert list of DBObjects containing " + dbDocList.size() + " items");
LOGGER.debug("Inserting list of DBObjects containing " + dbDocList.size() + " items");
}
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insert(dbDocList);
} else {
collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
}
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDocList) : collection.insert(
dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
handleAnyWriteResultErrors(writeResult, null, MongoActionOperation.INSERT_LIST);
return null;
}
});
List<ObjectId> ids = new ArrayList<ObjectId>();
for (DBObject dbo : dbDocList) {
Object id = dbo.get(ID);
Object id = dbo.get(ID_FIELD);
if (id instanceof ObjectId) {
ids.add((ObjectId) id);
} else {
@@ -776,19 +875,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected Object saveDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("save DBObject containing fields: " + dbDoc.keySet());
LOGGER.debug("Saving DBObject containing fields: " + dbDoc.keySet());
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.save(dbDoc);
} else {
collection.save(dbDoc, writeConcernToUse);
}
return dbDoc.get(ID);
WriteResult writeResult = writeConcernToUse == null ? collection.save(dbDoc) : collection.save(dbDoc,
writeConcernToUse);
handleAnyWriteResultErrors(writeResult, dbDoc, MongoActionOperation.SAVE);
return dbDoc.get(ID_FIELD);
}
});
}
@@ -831,21 +928,25 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
+ collectionName);
}
WriteResult wr;
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
entityClass, updateObj, queryObj);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
wr = collection.update(queryObj, updateObj, upsert, multi);
} else {
wr = collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
WriteResult writeResult = writeConcernToUse == null ? collection.update(queryObj, updateObj, upsert, multi)
: collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
if (entity != null && entity.hasVersionProperty() && !multi) {
if (writeResult.getN() == 0) {
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
+ updateObj.toMap().toString());
}
}
handleAnyWriteResultErrors(wr, queryObj, "update with '" + updateObj + "'");
return wr;
handleAnyWriteResultErrors(writeResult, queryObj, MongoActionOperation.UPDATE);
return writeResult;
}
});
}
@@ -867,7 +968,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
remove(getIdQueryFor(object), collection);
doRemove(collection, getIdQueryFor(object), object.getClass());
}
/**
@@ -881,7 +982,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(object);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProp = entity.getIdProperty();
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
if (idProp == null) {
throw new MappingException("No id property found for object of type " + entity.getType().getName());
@@ -897,7 +998,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private void assertUpdateableIdIfNotSet(Object entity) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entity.getClass());
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
MongoPersistentProperty idProperty = persistentEntity == null ? null : persistentEntity.getIdProperty();
if (idProperty == null) {
return;
@@ -919,27 +1020,30 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
}
final DBObject queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject dboq = mapper.getMappedObject(queryObject, entity);
WriteResult wr = null;
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
entityClass, null, queryObject);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("remove using query: " + queryObject + " in collection: " + collection.getName());
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { dboq, collection.getName() });
}
if (writeConcernToUse == null) {
wr = collection.remove(dboq);
} else {
wr = collection.remove(dboq, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dboq, "remove");
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
writeConcernToUse);
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
return null;
}
});
@@ -990,26 +1094,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
}
CommandResult commandResult = null;
try {
if (command.getOutputType() == MapReduceCommand.OutputType.INLINE) {
commandResult = executeCommand(commandObject, getDb().getOptions());
} else {
commandResult = executeCommand(commandObject);
}
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
CommandResult commandResult = command.getOutputType() == MapReduceCommand.OutputType.INLINE ? executeCommand(
commandObject, getDb().getOptions()) : executeCommand(commandObject);
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [" + commandResult + "]");
LOGGER.debug("MapReduce command result = [{}]", serializeToJsonSafely(commandResult));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
@@ -1034,7 +1127,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", criteria.getCriteriaObject());
dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
@@ -1060,23 +1153,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing Group with DBObject [" + commandObject.toString() + "]");
}
CommandResult commandResult = null;
try {
commandResult = executeCommand(commandObject, getDb().getOptions());
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
LOGGER.debug("Executing Group with DBObject [{}]", serializeToJsonSafely(commandObject));
}
CommandResult commandResult = executeCommand(commandObject, getDb().getOptions());
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
LOGGER.debug("Group command result = [{}]", commandResult);
}
@SuppressWarnings("unchecked")
@@ -1192,7 +1276,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBCollection coll = db.createCollection(collectionName, collectionOptions);
// TODO: Emit a collection created event
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created collection [" + coll.getFullName() + "]");
LOGGER.debug("Created collection [{}]", coll.getFullName());
}
return coll;
}
@@ -1220,14 +1304,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
* <p/>
* Can be overridden by subclasses.
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
* converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless configured
* otherwise, an instance of MappingMongoConverter will be used. The query document is specified as a standard DBObject
* and so is the fields specification. Can be overridden by subclasses.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
@@ -1245,19 +1325,21 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + query + " fields: " + fields + " for class: " + entityClass
+ " in collection: " + collectionName);
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), preparer,
objectCallback, collectionName);
}
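These doFind(…) variants back the public find(…) methods; for orientation, a typical call looks like the sketch below, where Person and the criteria values are made-up examples.

import java.util.List;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class FindSketch {

	static class Person {
		String firstName;
	}

	static List<Person> findDaves(MongoOperations operations) {
		// Routed through doFind(collectionName, queryObject, fieldsObject, entityClass, preparer).
		return operations.find(new Query(Criteria.where("firstName").is("Dave")), Person.class);
	}
}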
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter. The
* query document is specified as a standard DBObject and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
@@ -1331,13 +1413,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
DBObject mappedQuery = mapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " and update: " + updateObj + " in collection: " + collectionName);
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + updateObj + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mapper.getMappedObject(query, entity), fields, sort,
updateObj, options), new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, updateObj, options),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
/**
@@ -1352,20 +1436,28 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
if (savedObject instanceof BasicDBObject) {
DBObject dbObject = (DBObject) savedObject;
dbObject.put(ID_FIELD, id);
return;
}
MongoPersistentProperty idProp = getIdPropertyFor(savedObject.getClass());
if (idProp == null) {
return;
}
try {
BeanWrapper.create(savedObject, mongoConverter.getConversionService()).setProperty(idProp, id);
ConversionService conversionService = mongoConverter.getConversionService();
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(savedObject, conversionService);
Object idValue = wrapper.getProperty(idProp, idProp.getType(), true);
if (idValue != null) {
return;
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
wrapper.setProperty(idProp, id);
}
private DBCollection getAndPrepareCollection(DB db, String collectionName) {
@@ -1426,19 +1518,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CursorPreparer preparer, DbObjectCallback<T> objectCallback, String collectionName) {
try {
DBCursor cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
DBCursor cursor = null;
try {
cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
}
List<T> result = new ArrayList<T>();
while (cursor.hasNext()) {
DBObject object = cursor.next();
result.add(objectCallback.doWith(object));
}
return result;
} finally {
if (cursor != null) {
cursor.close();
}
}
List<T> result = new ArrayList<T>();
for (DBObject object : cursor) {
result.add(objectCallback.doWith(object));
}
return result;
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
@@ -1448,15 +1553,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DocumentCallbackHandler callbackHandler, String collectionName) {
try {
DBCursor cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
DBCursor cursor = null;
try {
cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
}
while (cursor.hasNext()) {
DBObject dbobject = cursor.next();
callbackHandler.processDocument(dbobject);
}
} finally {
if (cursor != null) {
cursor.close();
}
}
for (DBObject dbobject : cursor) {
callbackHandler.processDocument(dbobject);
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
@@ -1467,7 +1584,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
private MongoPersistentProperty getIdPropertyFor(Class<?> type) {
return mappingContext.getPersistentEntity(type).getIdProperty();
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(type);
return persistentEntity == null ? null : persistentEntity.getIdProperty();
}
private <T> String determineEntityCollectionName(T obj) {
@@ -1494,30 +1612,45 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Checks and handles any errors.
* <p/>
* Current implementation logs errors. Future version may make this configurable to log warning, errors or throw
* exception.
* Handles {@link WriteResult} errors based on the configured {@link WriteResultChecking}.
*
* @param writeResult
* @param query
* @param operation
*/
protected void handleAnyWriteResultErrors(WriteResult wr, DBObject query, String operation) {
protected void handleAnyWriteResultErrors(WriteResult writeResult, DBObject query, MongoActionOperation operation) {
if (WriteResultChecking.NONE == this.writeResultChecking) {
if (writeResultChecking == WriteResultChecking.NONE) {
return;
}
String error = wr.getError();
String error = writeResult.getError();
if (error != null) {
if (error == null) {
return;
}
String message = String.format("Execution of %s%s failed: %s", operation, query == null ? "" : "' using '"
+ query.toString() + "' query", error);
String message;
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.error(message);
return;
}
switch (operation) {
case INSERT:
case SAVE:
message = String.format("Insert/Save for %s failed: %s", query, error);
break;
case INSERT_LIST:
message = String.format("Insert list failed: %s", error);
break;
default:
message = String.format("Execution of %s%s failed: %s", operation,
query == null ? "" : " using query " + query.toString(), error);
}
if (writeResultChecking == WriteResultChecking.EXCEPTION) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.error(message);
return;
}
}
@@ -1533,6 +1666,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return resolved == null ? ex : resolved;
}
/**
* Inspects the given {@link CommandResult} for errors and potentially throws an
* {@link InvalidDataAccessApiUsageException} for that error.
*
* @param result must not be {@literal null}.
* @param source must not be {@literal null}.
*/
private void handleCommandError(CommandResult result, DBObject source) {
try {
result.throwOnError();
} catch (MongoException ex) {
String error = result.getErrorMessage();
error = error == null ? "NO MESSAGE" : error;
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ source, ex);
}
}
private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
converter.afterPropertiesSet();
@@ -1584,7 +1738,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private static class FindCallback implements CollectionCallback<DBCursor> {
private final DBObject query;
private final DBObject fields;
public FindCallback(DBObject query) {
@@ -1692,12 +1845,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private class DefaultWriteConcernResolver implements WriteConcernResolver {
private enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
class QueryCursorPreparer implements CursorPreparer {
@@ -1761,7 +1915,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Creates a new {@link GeoNearResultDbObjectCallback} using the given {@link DbObjectCallback} delegate for
* {@link GeoResult} content unmarshalling.
*
* @param delegate
* @param delegate must not be {@literal null}.
*/
public GeoNearResultDbObjectCallback(DbObjectCallback<T> delegate, Metric metric) {
Assert.notNull(delegate);

View File

@@ -47,7 +47,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* Create an instance of {@link SimpleMongoDbFactory} given the {@link Mongo} instance and database name.
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, UserCredentials.NO_CREDENTIALS, false);
@@ -57,7 +57,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {

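For orientation, a rough sketch of wiring the factory into a template; the database name and credentials are placeholders.

import com.mongodb.Mongo;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class FactoryWiringSketch {

	static MongoTemplate template() throws Exception {
		// Database name must not be null or empty, as the javadoc above now states.
		SimpleMongoDbFactory factory = new SimpleMongoDbFactory(new Mongo(), "database",
				new UserCredentials("user", "secret"));
		return new MongoTemplate(factory);
	}
}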
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,27 +13,25 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* A strategy interface to determine the WriteConcern to use for a given MongoDbAction.
*
* Return the passed in default WriteConcern (a property on MongoAction) if no determination can be made.
* A strategy interface to determine the {@link WriteConcern} to use for a given {@link MongoAction}. Return the passed
* in default {@link WriteConcern} (a property on {@link MongoAction}) if no determination can be made.
*
* @author Mark Pollack
*
* @author Oliver Gierke
*/
public interface WriteConcernResolver {
/**
* Resolve the WriteConcern given the MongoAction
* Resolve the {@link WriteConcern} given the {@link MongoAction}.
*
* @param action describes the context of the Mongo action. Contains a default WriteConcern to use if one should not
* be resolved.
* @return a WriteConcern based on the passed in MongoAction value, maybe null
* @param action describes the context of the Mongo action. Contains a default {@link WriteConcern} to use if one
* should not be resolved.
* @return a {@link WriteConcern} based on the passed in {@link MongoAction} value, maybe {@literal null}.
*/
WriteConcern resolve(MongoAction action);
}
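For illustration only, a custom resolver might apply a stricter WriteConcern to inserts and fall back to the action's default for everything else. The class below is a hypothetical sketch; it assumes MongoAction exposes the operation via getMongoActionOperation() and that the resolver is registered through MongoTemplate.setWriteConcernResolver(…).

public class InsertAwareWriteConcernResolver implements WriteConcernResolver {

    public WriteConcern resolve(MongoAction action) {
        // be strict about plain inserts, otherwise keep whatever default the template configured
        if (MongoActionOperation.INSERT.equals(action.getMongoActionOperation())) {
            return WriteConcern.SAFE;
        }
        return action.getDefaultWriteConcern();
    }
}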

View File

@@ -1,5 +1,28 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Enum to represent how strict the check of {@link com.mongodb.WriteResult} shall be. It can either be skipped entirely
* (use {@link #NONE}), or errors can be logged ({@link #LOG}) or cause an exception to be thrown {@link #EXCEPTION}.
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
public enum WriteResultChecking {
NONE, LOG, EXCEPTION
}
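As a minimal usage sketch (assuming a MongoDbFactory and MongoConverter have been set up elsewhere), the checking mode is simply switched on the template:

MongoTemplate template = new MongoTemplate(mongoDbFactory, converter);
// fail loudly instead of silently ignoring unsuccessful writes
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);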

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.math.BigInteger;
@@ -33,8 +32,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObj
* Base class for {@link MongoConverter} implementations. Sets up a {@link GenericConversionService} and populates basic
* converters. Allows registering {@link CustomConversions}.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
* @author Jon Brisbin
* @author Oliver Gierke
*/
public abstract class AbstractMongoConverter implements MongoConverter, InitializingBean {
@@ -94,6 +93,14 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
conversions.registerConvertersIn(conversionService);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object)
*/
public Object convertToMongoType(Object obj) {
return convertToMongoType(obj, null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.convert.MongoConverter#getConversionService()

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -31,6 +31,7 @@ import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.JodaTimeConverters;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
@@ -38,6 +39,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalT
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
@@ -89,6 +92,9 @@ public class CustomConversions {
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
this.converters.addAll(converters);
for (Object c : this.converters) {
@@ -219,7 +225,7 @@ public class CustomConversions {
/**
* Returns the target type we can write an object of the given source type to. The returned type might be a subclass
* oth the given expected type though. If {@code expexctedTargetType} is {@literal null} we will simply return the
* of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
* first target type matching or {@literal null} if no conversion can be found.
*
* @param source must not be {@literal null}
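A sketch of registering an additional converter on top of the defaults; the Locale converter is purely illustrative, and URL as well as Joda-Time converters no longer need manual registration:

List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
converters.add(new Converter<Locale, String>() {
    public String convert(Locale source) {
        return source.toString();
    }
});
CustomConversions conversions = new CustomConversions(converters);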

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 by the original author(s).
* Copyright 2011-2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,11 +15,11 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
@@ -31,11 +31,13 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
@@ -46,6 +48,7 @@ import org.springframework.data.mapping.model.PersistentEntityParameterValueProv
import org.springframework.data.mapping.model.PropertyValueProvider;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELExpressionParameterValueProvider;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -215,9 +218,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, parent);
parameterProvider.setSpELEvaluator(evaluator);
return parameterProvider;
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
parent);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, Object parent) {
@@ -252,13 +255,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Object obj = getValueInternal(inverseProp, dbo, evaluator, result);
try {
wrapper.setProperty(inverseProp, obj);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
wrapper.setProperty(inverseProp, obj);
}
});
@@ -350,13 +349,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
final BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(obj, conversionService);
// Write the ID
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
boolean fieldAccessOnly = idProperty.usePropertyAccess() ? false : useFieldAccessOnly;
try {
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
Object id = wrapper.getProperty(idProperty, Object.class, fieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {
}
@@ -370,7 +370,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
Object propertyObj = wrapper.getProperty(prop, prop.getType(), useFieldAccessOnly);
boolean fieldAccessOnly = prop.usePropertyAccess() ? false : useFieldAccessOnly;
Object propertyObj = wrapper.getProperty(prop, prop.getType(), fieldAccessOnly);
if (null != propertyObj) {
if (!conversions.isSimpleType(propertyObj.getClass())) {
@@ -470,7 +472,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param collection must not be {@literal null}.
* @param property must not be {@literal null}.
*
* @return
*/
protected DBObject createCollection(Collection<?> collection, MongoPersistentProperty property) {
@@ -676,6 +677,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(target);
if (target instanceof DBRef) {
return (DBRef) target;
}
MongoPersistentEntity<?> targetEntity = mappingContext.getPersistentEntity(target.getClass());
if (null == targetEntity) {
@@ -713,36 +718,41 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @return the converted {@link Collections}, will never be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
private Collection<?> readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
Assert.notNull(targetType);
Class<?> collectionType = targetType.getType();
if (sourceValue.isEmpty()) {
return Collections.emptySet();
return getPotentiallyConvertedSimpleRead(new HashSet<Object>(), collectionType);
}
Class<?> collectionType = targetType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, sourceValue.size());
TypeInformation<?> componentType = targetType.getComponentType();
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
if (dbObjItem instanceof DBRef) {
items.add(read(componentType, ((DBRef) dbObjItem).fetch(), parent));
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, ((DBRef) dbObjItem).fetch(),
parent));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, componentType == null ? null : componentType.getType()));
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
}
}
return items;
return getPotentiallyConvertedSimpleRead(items, targetType.getType());
}
/**
@@ -776,9 +786,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object value = entry.getValue();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, ((DBRef) value).fetch()));
} else {
Class<?> valueClass = valueType == null ? null : valueType.getType();
map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
@@ -803,14 +816,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return rootList;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
*/
@SuppressWarnings("unchecked")
public Object convertToMongoType(Object obj) {
public Object convertToMongoType(Object obj, TypeInformation<?> typeInformation) {
if (obj == null) {
return null;
}
Class<?> target = conversions.getCustomWriteTarget(getClass());
Class<?> target = conversions.getCustomWriteTarget(obj.getClass());
if (target != null) {
return conversionService.convert(obj, target);
}
@@ -851,7 +868,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject newDbo = new BasicDBObject();
this.write(obj, newDbo);
return removeTypeInfoRecursively(newDbo);
if (typeInformation == null) {
return removeTypeInfoRecursively(newDbo);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
}
public BasicDBList maybeConvertList(Iterable<?> source) {
@@ -923,7 +945,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* (non-Javadoc)
* @see org.springframework.data.convert.PropertyValueProvider#getPropertyValue(org.springframework.data.mapping.PersistentProperty)
*/
@SuppressWarnings("unchecked")
public <T> T getPropertyValue(MongoPersistentProperty property) {
String expression = property.getSpelExpression();
@@ -933,20 +954,60 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
TypeInformation<?> type = property.getTypeInformation();
Class<?> rawType = type.getType();
return readValue(value, property.getTypeInformation(), parent);
}
}
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) read(type, ((DBRef) value).fetch(), parent);
} else if (value instanceof BasicDBList) {
return (T) getPotentiallyConvertedSimpleRead(readCollectionOrArray(type, (BasicDBList) value, parent), rawType);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
/**
* Extension of {@link SpELExpressionParameterValueProvider} to recursively trigger value conversion on the raw
* resolved SpEL value.
*
* @author Oliver Gierke
*/
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final Object parent;
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
*
* @param evaluator must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, Object parent) {
super(evaluator, conversionService, delegate);
this.parent = parent;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.SpELExpressionParameterValueProvider#potentiallyConvertSpelValue(java.lang.Object, org.springframework.data.mapping.PreferredConstructor.Parameter)
*/
@Override
protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
return readValue(object, parameter.getType(), parent);
}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
Class<?> rawType = type.getType();
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, ((DBRef) value).fetch(), parent));
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}
}
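To illustrate where the recursive SpEL conversion kicks in, a hypothetical domain type whose constructor argument is resolved from an expression against the root document; the type, property and the exact expression syntax are assumptions made for the sake of the example:

public class Person {

    private final String lastname;

    @PersistenceConstructor
    public Person(@Value("#root.lastname") String lastname) {
        // the raw SpEL result is routed through readValue(…) before it arrives here
        this.lastname = lastname;
    }
}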

View File

@@ -17,8 +17,12 @@ package org.springframework.data.mongodb.core.convert;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.util.StringUtils;
@@ -119,4 +123,28 @@ abstract class MongoConverters {
return StringUtils.hasText(source) ? new BigInteger(source) : null;
}
}
public static enum URLToStringConverter implements Converter<URL, String> {
INSTANCE;
public String convert(URL source) {
return source == null ? null : source.toString();
}
}
public static enum StringToURLConverter implements Converter<String, URL> {
INSTANCE;
private static final TypeDescriptor SOURCE = TypeDescriptor.valueOf(String.class);
private static final TypeDescriptor TARGET = TypeDescriptor.valueOf(URL.class);
public URL convert(String source) {
try {
return source == null ? null : new URL(source);
} catch (MalformedURLException e) {
throw new ConversionFailedException(SOURCE, TARGET, source, e);
}
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.EntityWriter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -35,11 +36,21 @@ public interface MongoWriter<T> extends EntityWriter<T, DBObject> {
* Converts the given object into one Mongo will be able to store natively. If the given object can already be stored
* as is, no conversion will happen.
*
* @param obj
* @param obj can be {@literal null}.
* @return
*/
Object convertToMongoType(Object obj);
/**
* Converts the given object into one Mongo will be able to store natively but retains the type information in case
* the given {@link TypeInformation} differs from the given object type.
*
* @param obj can be {@literal null}.
* @param typeInformation can be {@literal null}.
* @return
*/
Object convertToMongoType(Object obj, TypeInformation<?> typeInformation);
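A brief sketch of the difference between the two methods; Person and Contact are hypothetical types with Person extending Contact, and writer stands for a configured MappingMongoConverter:

Object stripped = writer.convertToMongoType(person); // type information removed

// passing a more general TypeInformation keeps the type attribute, because the
// actual type (Person) differs from the declared one (Contact)
Object typed = writer.convertToMongoType(person, ClassTypeInformation.from(Contact.class));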
/**
* Creates a {@link DBRef} to refer to the given object.
*

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,15 +17,16 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import org.bson.types.BasicBSONList;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
@@ -33,11 +34,12 @@ import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* A helper class to encapsulate any modifications of a Query object before it gets submitted to the database.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class QueryMapper {
@@ -47,6 +49,7 @@ public class QueryMapper {
private final ConversionService conversionService;
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link QueryMapper} with the given {@link MongoConverter}.
@@ -54,9 +57,12 @@ public class QueryMapper {
* @param converter must not be {@literal null}.
*/
public QueryMapper(MongoConverter converter) {
Assert.notNull(converter);
this.conversionService = converter.getConversionService();
this.converter = converter;
this.mappingContext = converter.getMappingContext();
}
/**
@@ -69,48 +75,151 @@ public class QueryMapper {
*/
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
DBObject newDbo = new BasicDBObject();
if (Keyword.isKeyword(query)) {
return getMappedKeyword(new Keyword(query), entity);
}
DBObject result = new BasicDBObject();
for (String key : query.keySet()) {
MongoPersistentEntity<?> nestedEntity = getNestedEntity(entity, key);
String newKey = key;
MongoPersistentProperty targetProperty = getTargetProperty(key, entity);
String newKey = determineKey(key, entity);
Object value = query.get(key);
if (isIdKey(key, entity)) {
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id));
}
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
} else {
value = getMappedObject((DBObject) value, nestedEntity);
}
} else {
value = convertId(value);
}
newKey = "_id";
} else if (key.matches(N_OR_PATTERN)) {
// $or/$nor
Iterable<?> conditions = (Iterable<?>) value;
BasicBSONList newConditions = new BasicBSONList();
Iterator<?> iter = conditions.iterator();
while (iter.hasNext()) {
newConditions.add(getMappedObject((DBObject) iter.next(), nestedEntity));
}
value = newConditions;
} else if (key.equals("$ne")) {
value = convertId(value);
}
newDbo.put(newKey, convertSimpleOrDBObject(value, nestedEntity));
result.put(newKey, getMappedValue(value, targetProperty, newKey));
}
return newDbo;
return result;
}
/**
* Returns the given {@link DBObject} representing a keyword by mapping the keyword's value.
*
* @param query the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param entity
* @return
*/
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {
// $or/$nor
if (query.key.matches(N_OR_PATTERN)) {
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(getMappedObject((DBObject) condition, entity));
}
return new BasicDBObject(query.key, newConditions);
}
return new BasicDBObject(query.key, convertSimpleOrDBObject(query.value, entity));
}
/**
* Returns the mapped keyword, which is considered to define a criterion for the given property.
*
* @param keyword
* @param property
* @return
*/
public DBObject getMappedKeyword(Keyword keyword, MongoPersistentProperty property) {
if (property.isAssociation()) {
convertAssociation(keyword.value, property);
}
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property, keyword.key));
}
/**
* Returns the mapped value for the given source object assuming it's a value for the given
* {@link MongoPersistentProperty}.
*
* @param source the source object to be mapped
* @param property the property the value is a value for
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Object source, MongoPersistentProperty property, String newKey) {
if (property == null) {
return convertSimpleOrDBObject(source, null);
}
if (property.isIdProperty() || "_id".equals(newKey)) {
if (source instanceof DBObject) {
DBObject valueDbo = (DBObject) source;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id));
}
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
} else if (valueDbo.containsField("$ne")) {
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject((DBObject) source, null);
}
return valueDbo;
} else {
return convertId(source);
}
}
if (property.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), property) : convertAssociation(source,
property);
}
return convertSimpleOrDBObject(source, mappingContext.getPersistentEntity(property));
}
private MongoPersistentProperty getTargetProperty(String key, MongoPersistentEntity<?> entity) {
if (isIdKey(key, entity)) {
return entity.getIdProperty();
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? null : path.getLeafProperty();
}
private PersistentPropertyPath<MongoPersistentProperty> getPath(String key, MongoPersistentEntity<?> entity) {
if (entity == null) {
return null;
}
try {
PropertyPath path = PropertyPath.from(key, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
} catch (PropertyReferenceException e) {
return null;
}
}
/**
* Returns the translated key assuming the given one is a property (path) reference.
*
* @param key the source key
* @param entity the base entity
* @return the translated key
*/
private String determineKey(String key, MongoPersistentEntity<?> entity) {
if (entity == null && DEFAULT_ID_NAMES.contains(key)) {
return "_id";
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? key : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
}
/**
@@ -133,6 +242,30 @@ public class QueryMapper {
return converter.convertToMongoType(source);
}
/**
* Converts the given source assuming it's actually an association to another object.
*
* @param source
* @param property
* @return
*/
private Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || !property.isAssociation()) {
return source;
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
result.add(element instanceof DBRef ? element : converter.toDBRef(element, property));
}
return result;
}
return source == null || source instanceof DBRef ? source : converter.toDBRef(source, property);
}
/**
* Returns whether the given key will be considered an id key.
*
@@ -142,26 +275,19 @@ public class QueryMapper {
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (null != entity && entity.getIdProperty() != null) {
MongoPersistentProperty idProperty = entity.getIdProperty();
if (entity == null) {
return false;
}
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}
return DEFAULT_ID_NAMES.contains(key);
}
private MongoPersistentEntity<?> getNestedEntity(MongoPersistentEntity<?> entity, String key) {
MongoPersistentProperty property = entity == null ? null : entity.getPersistentProperty(key);
if (property == null || !property.isEntity()) {
return null;
}
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context = converter.getMappingContext();
return context.getPersistentEntity(property);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link String}.
*
@@ -178,4 +304,44 @@ public class QueryMapper {
return converter.convertToMongoType(id);
}
/**
* Value object to capture a query keyword representation.
*
* @author Oliver Gierke
*/
private static class Keyword {
String key;
Object value;
Keyword(Object source) {
Assert.isInstanceOf(DBObject.class, source);
DBObject value = (DBObject) source;
Assert.isTrue(value.keySet().size() == 1, "Keyword must have a single key only!");
this.key = value.keySet().iterator().next();
this.value = value.get(key);
}
/**
* Returns whether the given value actually represents a keyword. If this returns {@literal true} it's safe to call
* the constructor.
*
* @param value
* @return
*/
static boolean isKeyword(Object value) {
if (!(value instanceof DBObject)) {
return false;
}
DBObject dbObject = (DBObject) value;
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
}
}
}
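To make the key translation concrete, a minimal sketch; it assumes a configured MappingMongoConverter, a mapping context and a hypothetical Person class whose id property is called id:

QueryMapper mapper = new QueryMapper(converter);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(Person.class);

// "id" is recognised as the id property and rewritten to "_id"; the value is
// converted to an ObjectId where possible, otherwise passed on as-is
DBObject mapped = mapper.getMappedObject(new BasicDBObject("id", "4711"), entity);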

View File

@@ -0,0 +1,43 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @author Oliver Gierke
*/
public class CustomMetric implements Metric {
private final double multiplier;
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
*/
public CustomMetric(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
}
}
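A short usage sketch, pairing the metric with a Distance inside a NearQuery; the multiplier and coordinates are illustrative only, and Point, Distance and NearQuery are assumed to come from the core.geo and core.query packages:

// roughly the earth radius expressed in nautical miles
Metric nauticalMiles = new CustomMetric(3443.9);
NearQuery query = NearQuery.near(new Point(-73.99, 40.73)).maxDistance(new Distance(5, nauticalMiles));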

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,9 +13,9 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
@@ -24,14 +24,29 @@ import java.lang.annotation.Target;
/**
* Mark a class to use compound indexes.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
@Target({ ElementType.TYPE })
@Documented
@Retention(RetentionPolicy.RUNTIME)
public @interface CompoundIndex {
/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending).
*
* @return
*/
String def();
/**
* It does not actually make sense to use this attribute, as the direction has to be defined in the {@link #def()}
* attribute.
*
* @return
*/
@Deprecated
IndexDirection direction() default IndexDirection.ASCENDING;
boolean unique() default false;
@@ -40,8 +55,18 @@ public @interface CompoundIndex {
boolean dropDups() default false;
/**
* The name of the index to be created.
*
* @return
*/
String name() default "";
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
*
* @return
*/
String collection() default "";
}
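An illustrative example of the def() format described above; the domain type, fields and index name are made up:

@Document
@CompoundIndexes({
    @CompoundIndex(name = "age_lastname_idx", def = "{ 'age' : 1, 'lastname' : -1 }")
})
public class Person {

    String lastname;
    int age;
}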

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,37 +13,51 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.data.mapping.event.MappingContextEvent;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.util.Assert;
/**
* An implementation of ApplicationEventPublisher that will only fire MappingContextEvents for use by the index creator
* when MongoTemplate is used 'stand-alone', that is not declared inside a Spring ApplicationContext.
* An implementation of ApplicationEventPublisher that will only fire {@link MappingContextEvent}s for use by the index
* creator when MongoTemplate is used 'stand-alone', that is, not declared inside a Spring {@link ApplicationContext}.
* Declare {@link MongoTemplate} inside an {@link ApplicationContext} to enable the publishing of all persistence events
* such as {@link AfterLoadEvent}, {@link AfterSaveEvent}, etc.
*
* Declare MongoTemplate inside an ApplicationContext to enable the publishing of all persistence events such as
* {@link AfterLoadEvent}, {@link AfterSaveEvent}, etc.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoMappingEventPublisher implements ApplicationEventPublisher {
private MongoPersistentEntityIndexCreator indexCreator;
private final MongoPersistentEntityIndexCreator indexCreator;
/**
* Creates a new {@link MongoMappingEventPublisher} for the given {@link MongoPersistentEntityIndexCreator}.
*
* @param indexCreator must not be {@literal null}.
*/
public MongoMappingEventPublisher(MongoPersistentEntityIndexCreator indexCreator) {
Assert.notNull(indexCreator);
this.indexCreator = indexCreator;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationEventPublisher#publishEvent(org.springframework.context.ApplicationEvent)
*/
@SuppressWarnings("unchecked")
public void publishEvent(ApplicationEvent event) {
if (event instanceof MappingContextEvent) {
indexCreator
.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty>) event);
indexCreator.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>) event);
}
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.reflect.Field;
@@ -25,9 +24,9 @@ import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.event.MappingContextEvent;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -39,32 +38,35 @@ import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link BasicMongoPersistentEntity} instances contained in the given
* {@link MongoMappingContext} for indexing metadata and ensures the indexes to be available.
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
* for indexing metadata and ensures that the indexes are available.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty>> {
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
private static final Logger log = LoggerFactory.getLogger(MongoPersistentEntityIndexCreator.class);
private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<Class<?>, Boolean>();
private final MongoDbFactory mongoDbFactory;
private final MongoMappingContext mappingContext;
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@@iteral null}
* @param mongoDbFactory must not be {@@iteral null}
* @param mappingContext must not be {@literal null}
* @param mongoDbFactory must not be {@literal null}
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
for (MongoPersistentEntity<?> entity : mappingContext.getPersistentEntities()) {
checkForIndexes(entity);
@@ -75,8 +77,11 @@ public class MongoPersistentEntityIndexCreator implements
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(
MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty> event) {
public void onApplicationEvent(MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
}
PersistentEntity<?, ?> entity = event.getPersistentEntity();
@@ -97,12 +102,12 @@ public class MongoPersistentEntityIndexCreator implements
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = index.collection();
if ("".equals(indexColl)) {
indexColl = entity.getCollection();
}
ensureIndex(indexColl, index.name(), index.def(), index.direction(), index.unique(), index.dropDups(),
index.sparse());
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
}
@@ -111,10 +116,14 @@ public class MongoPersistentEntityIndexCreator implements
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
Field field = persistentProperty.getField();
if (field.isAnnotationPresent(Indexed.class)) {
Indexed index = field.getAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = persistentProperty.getFieldName();
} else {
@@ -126,11 +135,17 @@ public class MongoPersistentEntityIndexCreator implements
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
ensureIndex(collection, name, null, index.direction(), index.unique(), index.dropDups(), index.sparse());
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
}
} else if (field.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = field.getAnnotation(GeoSpatialIndexed.class);
@@ -155,21 +170,35 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void ensureIndex(String collection, final String name, final String def, final IndexDirection direction,
final boolean unique, final boolean dropDups, final boolean sparse) {
DBObject defObj;
if (null != def) {
defObj = (DBObject) JSON.parse(def);
} else {
defObj = new BasicDBObject();
defObj.put(name, (direction == IndexDirection.ASCENDING ? 1 : -1));
}
/**
* Returns whether the current index creator was registered for the given {@link MappingContext}.
*
* @param context
* @return
*/
public boolean isIndexCreatorFor(MappingContext<?, ?> context) {
return this.mappingContext.equals(context);
}
/**
* Triggers the actual index creation.
*
* @param collection the collection to create the index in
* @param name the name of the index about to be created
* @param indexDefinition the index definition
* @param unique whether it shall be a unique index
* @param dropDups whether to drop duplicates
* @param sparse sparse or not
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
mongoDbFactory.getDb().getCollection(collection).ensureIndex(defObj, opts);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}
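For orientation, this is roughly the driver call the refactored ensureIndex(…) ends up issuing for a field named lastname annotated with @Indexed(unique = true); collection and field names are hypothetical, and db stands for an already obtained com.mongodb.DB instance:

DBObject definition = new BasicDBObject("lastname", 1); // 1 = ascending, -1 = descending
DBObject opts = new BasicDBObject();
opts.put("name", "lastname");
opts.put("dropDups", false);
opts.put("sparse", false);
opts.put("unique", true);
db.getCollection("person").ensureIndex(definition, opts);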

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.util.Comparator;
@@ -23,7 +22,6 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
@@ -34,10 +32,10 @@ import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.StringUtils;
/**
* Mongo specific {@link PersistentEntity} implementation that adds Mongo specific meta-data such as the collection name
* and the like.
* MongoDB specific {@link MongoPersistentEntity} implementation that adds Mongo specific meta-data such as the
* collection name and the like.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
@@ -76,18 +74,17 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
context.addPropertyAccessor(new BeanFactoryAccessor());
context.setBeanResolver(new BeanFactoryResolver(applicationContext));
context.setRootObject(applicationContext);
}
/**
* Returns the collection the entity should be stored in.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
Expression expression = parser.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
return expression.getValue(context, String.class);
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -27,14 +27,16 @@ import org.slf4j.LoggerFactory;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
/**
* Mongo specific {@link org.springframework.data.mapping.PersistentProperty} implementation.
* MongoDB specific {@link org.springframework.data.mapping.MongoPersistentProperty} implementation.
*
* @author Oliver Gierke
* @author Patryk Wasik
*/
public class BasicMongoPersistentProperty extends AnnotationBasedPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
@@ -45,6 +47,8 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
private static final Set<Class<?>> SUPPORTED_ID_TYPES = new HashSet<Class<?>>();
private static final Set<String> SUPPORTED_ID_PROPERTY_NAMES = new HashSet<String>();
private static final Field CAUSE_FIELD;
static {
SUPPORTED_ID_TYPES.add(ObjectId.class);
SUPPORTED_ID_TYPES.add(String.class);
@@ -52,6 +56,8 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
SUPPORTED_ID_PROPERTY_NAMES.add("id");
SUPPORTED_ID_PROPERTY_NAMES.add("_id");
CAUSE_FIELD = ReflectionUtils.findField(Throwable.class, "cause");
}
/**
@@ -87,6 +93,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
*/
@Override
public boolean isIdProperty() {
if (super.isIdProperty()) {
return true;
}
@@ -111,8 +118,9 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return annotation != null && StringUtils.hasText(annotation.value()) ? annotation.value() : field.getName();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getFieldOrder()
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getFieldOrder()
*/
public int getFieldOrder() {
org.springframework.data.mongodb.core.mapping.Field annotation = getField().getAnnotation(
@@ -120,25 +128,36 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return annotation != null ? annotation.order() : Integer.MAX_VALUE;
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.AbstractPersistentProperty#createAssociation()
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AbstractPersistentProperty#createAssociation()
*/
@Override
protected Association<MongoPersistentProperty> createAssociation() {
return new Association<MongoPersistentProperty>(this, null);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isDbReference()
*/
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isDbReference()
*/
public boolean isDbReference() {
return getField().isAnnotationPresent(DBRef.class);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getDBRef()
*/
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getDBRef()
*/
public DBRef getDBRef() {
return getField().getAnnotation(DBRef.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#usePropertyAccess()
*/
public boolean usePropertyAccess() {
return CAUSE_FIELD.equals(getField());
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
@@ -23,12 +22,16 @@ import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.data.mapping.context.AbstractMappingContext;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.util.TypeInformation;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
* Default implementation of a {@link MappingContext} for MongoDB using {@link BasicMongoPersistentEntity} and
* {@link BasicMongoPersistentProperty} as primary abstractions.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty>
implements ApplicationContextAware {
@@ -42,6 +45,15 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#shouldCreatePersistentEntityFor(org.springframework.data.util.TypeInformation)
*/
@Override
protected boolean shouldCreatePersistentEntityFor(TypeInformation<?> type) {
return !MongoSimpleTypes.HOLDER.isSimpleType(type.getType());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.AbstractMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.mapping.MutablePersistentEntity, org.springframework.data.mapping.SimpleTypeHolder)
@@ -72,7 +84,10 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.context = applicationContext;
super.setApplicationContext(applicationContext);
}
}

View File

@@ -1,12 +1,33 @@
/*
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import org.springframework.data.mapping.PersistentEntity;
/**
* MongoDB specific {@link PersistentEntity} abstraction.
*
* @author Oliver Gierke
*/
public interface MongoPersistentEntity<T> extends PersistentEntity<T, MongoPersistentProperty> {
/**
* Returns the collection the entity shall be persisted to.
*
* @return
*/
String getCollection();
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -19,9 +19,10 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.PersistentProperty;
/**
* Mongo specific {@link org.springframework.data.mapping.PersistentProperty} implementation.
* MongoDB specific {@link org.springframework.data.mapping.PersistentProperty} extension.
*
* @author Oliver Gierke
* @author Patryk Wasik
*/
public interface MongoPersistentProperty extends PersistentProperty<MongoPersistentProperty> {
@@ -72,4 +73,12 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
return source.getFieldName();
}
}
/**
* Returns whether property access shall be used for reading the property value. This means it will use the getter
* instead of field access.
*
* @return
*/
boolean usePropertyAccess();
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2012-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -29,22 +29,25 @@ import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
/**
*
* @deprecated use {@link MongoMappingContext} instead.
* @author Oliver Gierke
*/
@Deprecated
public class SimpleMongoMappingContext extends
AbstractMappingContext<SimpleMongoMappingContext.SimpleMongoPersistentEntity<?>, MongoPersistentProperty> {
/* (non-Javadoc)
* @see org.springframework.data.mapping.BasicMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation)
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation)
*/
@Override
protected <T> SimpleMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
return new SimpleMongoPersistentEntity<T>(typeInformation);
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.BasicMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.util.TypeInformation, org.springframework.data.mapping.BasicPersistentEntity)
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.mapping.model.MutablePersistentEntity, org.springframework.data.mapping.model.SimpleTypeHolder)
*/
@Override
protected SimplePersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
@@ -112,6 +115,21 @@ public class SimpleMongoMappingContext extends
public DBRef getDBRef() {
return null;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isVersionProperty()
*/
public boolean isVersionProperty() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#usePropertyAccess()
*/
public boolean usePropertyAccess() {
return false;
}
}
static class SimpleMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
@@ -130,5 +148,20 @@ public class SimpleMongoMappingContext extends
public String getCollection() {
return MongoCollectionUtils.getPreferredCollectionName(getType());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getVersionProperty()
*/
public MongoPersistentProperty getVersionProperty() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasVersionProperty()
*/
public boolean hasVersionProperty() {
return false;
}
}
}

View File

@@ -0,0 +1,39 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import static java.lang.annotation.ElementType.*;
import static java.lang.annotation.RetentionPolicy.*;
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
/**
* Demarcates a property to be used as version field to implement optimistic locking on entities.
*
* @since 1.4
* @author Patryk Wasik
* @deprecated use {@link org.springframework.data.annotation.Version} instead.
*/
@Deprecated
@Documented
@Target({ FIELD })
@Retention(RUNTIME)
@org.springframework.data.annotation.Version
public @interface Version {
}
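
A hedged migration sketch for the deprecation above (not part of the diff); Account and its fields are assumed examples. The Spring Data Commons annotation is the drop-in replacement:

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;

public class Account {

	@Id private String id;

	// Incremented on every save; saving an instance carrying a stale version value is
	// rejected, which implements optimistic locking.
	@Version private Long version;
}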

View File

@@ -0,0 +1,53 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import org.springframework.context.ApplicationListener;
import org.springframework.data.auditing.AuditingHandler;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.util.Assert;
/**
* Event listener to populate auditing related fields on an entity about to be saved.
*
* @author Oliver Gierke
*/
public class AuditingEventListener implements ApplicationListener<BeforeConvertEvent<Object>> {
private final IsNewAwareAuditingHandler<Object> auditingHandler;
/**
* Creates a new {@link AuditingEventListener} using the given {@link MappingContext} and {@link AuditingHandler}.
*
* @param auditingHandler must not be {@literal null}.
*/
public AuditingEventListener(IsNewAwareAuditingHandler<Object> auditingHandler) {
Assert.notNull(auditingHandler, "IsNewAwareAuditingHandler must not be null!");
this.auditingHandler = auditingHandler;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(BeforeConvertEvent<Object> event) {
Object entity = event.getSource();
auditingHandler.markAudited(entity);
}
}
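
To illustrate what the listener achieves, a hedged sketch of an audited entity (not part of the diff); Customer and its field names are assumed examples, and the auditing annotations are taken from Spring Data Commons. When a BeforeConvertEvent is published, the IsNewAwareAuditingHandler populates the annotated fields before the object is written:

import java.util.Date;

import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedDate;

public class Customer {

	// Set once when the entity is considered new.
	@CreatedDate private Date createdAt;

	// Updated on every subsequent save.
	@LastModifiedDate private Date lastModifiedAt;
}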

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,17 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Event being thrown before a domain object is converted to be persisted.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class BeforeConvertEvent<T> extends MongoMappingEvent<T> {
private static final long serialVersionUID = 1L;
private static final long serialVersionUID = 252614269008845243L;
public BeforeConvertEvent(T source) {
super(source, null);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,30 +16,42 @@
package org.springframework.data.mongodb.core.mapreduce;
/**
* Value object to encapsulate results of a map-reduce count.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class MapReduceCounts {
private final int inputCount;
private final int emitCount;
private final int outputCount;
public static MapReduceCounts NONE = new MapReduceCounts(-1, -1, -1);
private final long inputCount;
private final long emitCount;
private final long outputCount;
/**
* Creates a new {@link MapReduceCounts} using the given input count, emit count, and output count.
*
* @param inputCount
* @param emitCount
* @param outputCount
*/
public MapReduceCounts(long inputCount, long emitCount, long outputCount) {
public MapReduceCounts(int inputCount, int emitCount, int outputCount) {
super();
this.inputCount = inputCount;
this.emitCount = emitCount;
this.outputCount = outputCount;
}
public int getInputCount() {
public long getInputCount() {
return inputCount;
}
public int getEmitCount() {
public long getEmitCount() {
return emitCount;
}
public int getOutputCount() {
public long getOutputCount() {
return outputCount;
}
@@ -59,12 +71,15 @@ public class MapReduceCounts {
*/
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
long result = 1;
result = prime * result + emitCount;
result = prime * result + inputCount;
result = prime * result + outputCount;
return result;
return Long.valueOf(result).intValue();
}
/*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,33 +26,39 @@ import com.mongodb.DBObject;
* Collects the results of performing a MapReduce operation.
*
* @author Mark Pollack
*
* @param <T> The class in which the results are mapped onto, accessible via an interator.
* @author Oliver Gierke
* @param <T> The class in which the results are mapped onto, accessible via an iterator.
*/
public class MapReduceResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private final DBObject rawResults;
private final String outputCollection;
private final MapReduceTiming mapReduceTiming;
private final MapReduceCounts mapReduceCounts;
private DBObject rawResults;
private MapReduceTiming mapReduceTiming;
private MapReduceCounts mapReduceCounts;
private String outputCollection;
/**
* Creates a new {@link MapReduceResults} from the given mapped results and the raw one.
*
* @param mappedResults must not be {@literal null}.
* @param rawResults must not be {@literal null}.
*/
public MapReduceResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = mappedResults;
this.rawResults = rawResults;
parseTiming(rawResults);
parseCounts(rawResults);
if (rawResults.get("result") != null) {
this.outputCollection = (String) rawResults.get("result");
}
this.mapReduceTiming = parseTiming(rawResults);
this.mapReduceCounts = parseCounts(rawResults);
this.outputCollection = parseOutputCollection(rawResults);
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}
@@ -73,28 +79,70 @@ public class MapReduceResults<T> implements Iterable<T> {
return rawResults;
}
protected void parseTiming(DBObject rawResults) {
private MapReduceTiming parseTiming(DBObject rawResults) {
DBObject timing = (DBObject) rawResults.get("timing");
if (timing != null) {
if (timing.get("mapTime") != null && timing.get("emitLoop") != null && timing.get("total") != null) {
mapReduceTiming = new MapReduceTiming((Long) timing.get("mapTime"), (Integer) timing.get("emitLoop"),
(Integer) timing.get("total"));
}
} else {
mapReduceTiming = new MapReduceTiming(-1, -1, -1);
if (timing == null) {
return new MapReduceTiming(-1, -1, -1);
}
if (timing.get("mapTime") != null && timing.get("emitLoop") != null && timing.get("total") != null) {
return new MapReduceTiming(getAsLong(timing, "mapTime"), getAsLong(timing, "emitLoop"),
getAsLong(timing, "total"));
}
return new MapReduceTiming(-1, -1, -1);
}
protected void parseCounts(DBObject rawResults) {
/**
* Returns the value of the source's field with the given key as {@link Long}.
*
* @param source
* @param key
* @return
*/
private Long getAsLong(DBObject source, String key) {
Object raw = source.get(key);
return raw instanceof Long ? (Long) raw : (Integer) raw;
}
/**
* Parses the raw {@link DBObject} result into a {@link MapReduceCounts} value object.
*
* @param rawResults
* @return
*/
private MapReduceCounts parseCounts(DBObject rawResults) {
DBObject counts = (DBObject) rawResults.get("counts");
if (counts != null) {
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
mapReduceCounts = new MapReduceCounts((Integer) counts.get("input"), (Integer) counts.get("emit"),
(Integer) counts.get("output"));
}
} else {
mapReduceCounts = new MapReduceCounts(-1, -1, -1);
if (counts == null) {
return MapReduceCounts.NONE;
}
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
return new MapReduceCounts(getAsLong(counts, "input"), getAsLong(counts, "emit"), getAsLong(counts, "output"));
}
return MapReduceCounts.NONE;
}
/**
* Parses the output collection from the raw {@link DBObject} result.
*
* @param rawResults
* @return
*/
private String parseOutputCollection(DBObject rawResults) {
Object resultField = rawResults.get("result");
if (resultField == null) {
return null;
}
return resultField instanceof DBObject ? ((DBObject) resultField).get("collection").toString() : resultField
.toString();
}
}
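
A hedged sketch of the raw document shapes the parsing above has to cope with (not part of the diff); the values, MyValue and mappedValues are assumed examples. The getAsLong(…) helper exists because the server may report counts and timings either as Integer or as Long:

DBObject raw = new BasicDBObject("result", "myOutputCollection");
raw.put("counts", new BasicDBObject("input", 100).append("emit", 100L).append("output", 5));
raw.put("timing", new BasicDBObject("mapTime", 10L).append("emitLoop", 15).append("total", 25));

MapReduceResults<MyValue> results = new MapReduceResults<MyValue>(mappedValues, raw);
// assuming the existing getOutputCollection() accessor: results.getOutputCollection() -> "myOutputCollection"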

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
@@ -23,13 +25,14 @@ import java.util.List;
import java.util.regex.Pattern;
import org.bson.BSON;
import org.bson.types.BasicBSONList;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -397,7 +400,7 @@ public class Criteria implements CriteriaDefinition {
* @param criteria
*/
public Criteria orOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$or").is(bsonList));
return this;
}
@@ -408,7 +411,7 @@ public class Criteria implements CriteriaDefinition {
* @param criteria
*/
public Criteria norOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$nor").is(bsonList));
return this;
}
@@ -419,7 +422,7 @@ public class Criteria implements CriteriaDefinition {
* @param criteria
*/
public Criteria andOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$and").is(bsonList));
return this;
}
@@ -429,11 +432,9 @@ public class Criteria implements CriteriaDefinition {
}
/*
* (non-Javadoc)
*
* @see org.springframework.datastore.document.mongodb.query.Criteria#
* getCriteriaObject(java.lang.String)
*/
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
public DBObject getCriteriaObject() {
if (this.criteriaChain.size() == 1) {
return criteriaChain.get(0).getSingleCriteriaObject();
@@ -477,8 +478,8 @@ public class Criteria implements CriteriaDefinition {
return queryCriteria;
}
private BasicBSONList createCriteriaList(Criteria[] criteria) {
BasicBSONList bsonList = new BasicBSONList();
private BasicDBList createCriteriaList(Criteria[] criteria) {
BasicDBList bsonList = new BasicDBList();
for (Criteria c : criteria) {
bsonList.add(c.getCriteriaObject());
}
@@ -496,4 +497,82 @@ public class Criteria implements CriteriaDefinition {
}
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Criteria that = (Criteria) obj;
if (this.criteriaChain.size() != that.criteriaChain.size()) {
return false;
}
for (int i = 0; i < this.criteriaChain.size(); i++) {
Criteria left = this.criteriaChain.get(i);
Criteria right = that.criteriaChain.get(i);
if (!simpleCriteriaEquals(left, right)) {
return false;
}
}
return true;
}
private boolean simpleCriteriaEquals(Criteria left, Criteria right) {
boolean keyEqual = left.key == null ? right.key == null : left.key.equals(right.key);
boolean criteriaEqual = left.criteria.equals(right.criteria);
boolean valueEqual = isEqual(left.isValue, right.isValue);
return keyEqual && criteriaEqual && valueEqual;
}
/**
* Checks the given objects for equality. Handles {@link Pattern} and arrays correctly.
*
* @param left
* @param right
* @return
*/
private boolean isEqual(Object left, Object right) {
if (left == null) {
return right == null;
}
if (left instanceof Pattern) {
return right instanceof Pattern ? ((Pattern) left).pattern().equals(((Pattern) right).pattern()) : false;
}
return ObjectUtils.nullSafeEquals(left, right);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += nullSafeHashCode(key);
result += criteria.hashCode();
result += nullSafeHashCode(isValue);
return result;
}
}
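
For illustration (not part of the diff), a hedged usage sketch of the logical operators touched above; the field names are assumed examples:

Criteria criteria = new Criteria().andOperator(
		Criteria.where("age").gte(18),
		new Criteria().orOperator(Criteria.where("country").is("DE"), Criteria.where("country").is("AT")));

DBObject dbObject = criteria.getCriteriaObject();
// { "$and" : [ { "age" : { "$gte" : 18 } },
//              { "$or" : [ { "country" : "DE" }, { "country" : "AT" } ] } ] }

The switch to BasicDBList keeps the nested criteria as plain driver types, and the new equals(…)/hashCode() implementations make such compound criteria comparable in tests.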

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.geo.Metrics;
@@ -31,10 +32,12 @@ import com.mongodb.DBObject;
*/
public class NearQuery {
private final DBObject criteria;
private final Point point;
private Query query;
private Double maxDistance;
private Distance maxDistance;
private Metric metric;
private boolean spherical;
private Integer num;
/**
* Creates a new {@link NearQuery}.
@@ -45,13 +48,11 @@ public class NearQuery {
Assert.notNull(point);
this.criteria = new BasicDBObject();
this.criteria.put("near", point.asList());
this.point = point;
this.spherical = false;
this.metric = metric;
if (metric != null) {
spherical(true);
distanceMultiplier(metric);
in(metric);
}
}
@@ -105,12 +106,13 @@ public class NearQuery {
}
/**
* Returns the {@link Metric} underlying the actual query.
* Returns the {@link Metric} underlying the actual query. If no metric was set explicitly {@link Metrics#NEUTRAL}
* will be returned.
*
* @return
* @return will never be {@literal null}.
*/
public Metric getMetric() {
return metric;
return metric == null ? Metrics.NEUTRAL : metric;
}
/**
@@ -120,68 +122,96 @@ public class NearQuery {
* @return
*/
public NearQuery num(int num) {
this.criteria.put("num", num);
this.num = num;
return this;
}
/**
* Sets the max distance results shall have from the configured origin. Will normalize the given value using a
* potentially already configured {@link Metric}.
* Sets the max distance results shall have from the configured origin. If a {@link Metric} was set before, the given
* value will be interpreted as a value in that metric. E.g.
*
* <pre>
* NearQuery query = near(10.0, 20.0, Metrics.KILOMETERS).maxDistance(150);
* </pre>
*
* Will set the maximum distance to 150 kilometers.
*
* @param maxDistance
* @return
*/
public NearQuery maxDistance(double maxDistance) {
this.maxDistance = getNormalizedDistance(maxDistance, this.metric);
return this;
return maxDistance(new Distance(maxDistance, getMetric()));
}
/**
* Sets the maximum distance supplied in a given metric. Will normalize the distance but not reconfigure the query's
* {@link Metric}.
* result {@link Metric} if one was configured before.
*
* @param maxDistance
* @param metric must not be {@literal null}.
* @return
*/
public NearQuery maxDistance(double maxDistance, Metric metric) {
Assert.notNull(metric);
this.spherical(true);
return maxDistance(getNormalizedDistance(maxDistance, metric));
return maxDistance(new Distance(maxDistance, metric));
}
/**
* Sets the maximum distance to the given {@link Distance}.
* Sets the maximum distance to the given {@link Distance}. Will set the returned {@link Metric} to be the one of the
* given {@link Distance} if no {@link Metric} was set before.
*
* @param distance
* @param distance must not be {@literal null}.
* @return
*/
public NearQuery maxDistance(Distance distance) {
Assert.notNull(distance);
return maxDistance(distance.getValue(), distance.getMetric());
if (distance.getMetric() != Metrics.NEUTRAL) {
this.spherical(true);
}
if (this.metric == null) {
in(distance.getMetric());
}
this.maxDistance = distance;
return this;
}
/**
* Configures a distance multiplier the resulting distances get applied.
* Returns the maximum {@link Distance}.
*
* @return
*/
public Distance getMaxDistance() {
return this.maxDistance;
}
/**
* Configures a {@link CustomMetric} with the given multiplier.
*
* @param distanceMultiplier
* @return
*/
public NearQuery distanceMultiplier(double distanceMultiplier) {
this.criteria.put("distanceMultiplier", distanceMultiplier);
this.metric = new CustomMetric(distanceMultiplier);
return this;
}
/**
* Configures the distance multiplier to the multiplier of the given {@link Metric}. Does <em>not</em> recalculate the
* {@link #maxDistance(double)}.
* Configures the distance multiplier to the multiplier of the given {@link Metric}.
*
* @deprecated use {@link #in(Metric)} instead.
* @param metric must not be {@literal null}.
* @return
*/
@Deprecated
public NearQuery distanceMultiplier(Metric metric) {
Assert.notNull(metric);
return distanceMultiplier(metric.getMultiplier());
return in(metric);
}
/**
@@ -191,10 +221,19 @@ public class NearQuery {
* @return
*/
public NearQuery spherical(boolean spherical) {
this.criteria.put("spherical", spherical);
this.spherical = spherical;
return this;
}
/**
* Returns whether spherical values will be returned.
*
* @return
*/
public boolean isSpherical() {
return this.spherical;
}
/**
* Will cause the results' distances to be returned in kilometers. Sets {@link #distanceMultiplier(double)} and
* {@link #spherical(boolean)} accordingly.
@@ -215,6 +254,18 @@ public class NearQuery {
return adaptMetric(Metrics.MILES);
}
/**
* Will cause the results' distances to be returned in the given metric. Sets {@link #distanceMultiplier(double)}
* accordingly as well as {@link #spherical(boolean)} if the given {@link Metric} is not {@link Metrics#NEUTRAL}.
*
* @param metric the metric the results shall be returned in. Uses {@link Metrics#NEUTRAL} if {@literal null} is
* passed.
* @return
*/
public NearQuery in(Metric metric) {
return adaptMetric(metric == null ? Metrics.NEUTRAL : metric);
}
/**
* Configures the given {@link Metric} to be used as the base for this query and recalculates the maximum distance if no
* metric was set before.
@@ -223,12 +274,12 @@ public class NearQuery {
*/
private NearQuery adaptMetric(Metric metric) {
if (this.metric == null && maxDistance != null) {
maxDistance(this.maxDistance, metric);
if (metric != Metrics.NEUTRAL) {
spherical(true);
}
spherical(true);
return distanceMultiplier(metric);
this.metric = metric;
return this;
}
/**
@@ -249,18 +300,27 @@ public class NearQuery {
*/
public DBObject toDBObject() {
BasicDBObject dbObject = new BasicDBObject(criteria.toMap());
BasicDBObject dbObject = new BasicDBObject();
if (query != null) {
dbObject.put("query", query.getQueryObject());
}
if (maxDistance != null) {
dbObject.put("maxDistance", maxDistance);
dbObject.put("maxDistance", this.maxDistance.getNormalizedValue());
}
if (metric != null) {
dbObject.put("distanceMultiplier", metric.getMultiplier());
}
if (num != null) {
dbObject.put("num", num);
}
dbObject.put("near", point.asList());
dbObject.put("spherical", spherical);
return dbObject;
}
private double getNormalizedDistance(double distance, Metric metric) {
return metric == null ? distance : distance / metric.getMultiplier();
}
}
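
A hedged sketch of the reworked metric handling (not part of the diff), mirroring the Javadoc example above:

NearQuery query = NearQuery.near(10.0, 20.0, Metrics.KILOMETERS).maxDistance(150);

DBObject dbObject = query.toDBObject();
// "maxDistance"        -> 150 normalized using the KILOMETERS multiplier
// "distanceMultiplier" -> Metrics.KILOMETERS.getMultiplier()
// "spherical"          -> true, as KILOMETERS is not Metrics.NEUTRAL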

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,21 +15,32 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Thomas Risberg
* @author Oliver Gierke
*/
public class Query {
private LinkedHashMap<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private Field fieldSpec;
private Sort sort;
private Sort coreSort;
@SuppressWarnings("deprecation")
private org.springframework.data.mongodb.core.query.Sort sort;
private int skip;
private int limit;
private String hint;
@@ -93,14 +104,61 @@ public class Query {
return this;
}
public Sort sort() {
/**
* Returns a {@link org.springframework.data.mongodb.core.query.Sort} instance to define ordering properties.
*
* @deprecated use {@link #with(Sort)} instead
* @return
*/
@Deprecated
public org.springframework.data.mongodb.core.query.Sort sort() {
if (this.sort == null) {
this.sort = new Sort();
this.sort = new org.springframework.data.mongodb.core.query.Sort();
}
return this.sort;
}
/**
* Sets the given pagination information on the {@link Query} instance. Will transparently set {@code skip} and
* {@code limit} as well as apply the {@link Sort} instance defined with the {@link Pageable}.
*
* @param pageable
* @return
*/
public Query with(Pageable pageable) {
if (pageable == null) {
return this;
}
this.limit = pageable.getPageSize();
this.skip = pageable.getOffset();
return with(pageable.getSort());
}
/**
* Adds a {@link Sort} to the {@link Query} instance.
*
* @param sort
* @return
*/
public Query with(Sort sort) {
if (sort == null) {
return this;
}
if (this.coreSort == null) {
this.coreSort = sort;
} else {
this.coreSort = this.coreSort.and(sort);
}
return this;
}
public DBObject getQueryObject() {
DBObject dbo = new BasicDBObject();
for (String k : criteria.keySet()) {
@@ -118,11 +176,26 @@ public class Query {
return fieldSpec.getFieldsObject();
}
@SuppressWarnings("deprecation")
public DBObject getSortObject() {
if (this.sort == null) {
if (this.coreSort == null && this.sort == null) {
return null;
}
return this.sort.getSortObject();
DBObject dbo = new BasicDBObject();
if (this.coreSort != null) {
for (org.springframework.data.domain.Sort.Order order : this.coreSort) {
dbo.put(order.getProperty(), order.isAscending() ? 1 : -1);
}
}
if (this.sort != null) {
dbo.putAll(this.sort.getSortObject());
}
return dbo;
}
public int getSkip() {
@@ -140,4 +213,60 @@ public class Query {
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Query: %s, Fields: %s, Sort: %s", serializeToJsonSafely(getQueryObject()),
serializeToJsonSafely(getFieldsObject()), serializeToJsonSafely(getSortObject()));
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Query that = (Query) obj;
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean fieldsEqual = this.fieldSpec == null ? that.fieldSpec == null : this.fieldSpec.equals(that.fieldSpec);
boolean sortEqual = this.sort == null ? that.sort == null : this.sort.equals(that.sort);
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * criteria.hashCode();
result += 31 * nullSafeHashCode(fieldSpec);
result += 31 * nullSafeHashCode(sort);
result += 31 * nullSafeHashCode(hint);
result += 31 * skip;
result += 31 * limit;
return result;
}
}
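
A hedged usage sketch of the new with(…) methods (not part of the diff); the field names are assumed examples and Sort/PageRequest refer to the org.springframework.data.domain types:

Query query = new Query(Criteria.where("lastname").is("Matthews"));
query.with(new Sort(Sort.Direction.ASC, "firstname"));
query.with(new PageRequest(0, 20)); // sets skip = 0 and limit = 20

// query.getSortObject() -> { "firstname" : 1 }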

View File

@@ -0,0 +1,110 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.Collection;
import java.util.Iterator;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Utility methods for JSON serialization.
*
* @author Oliver Gierke
*/
public abstract class SerializationUtils {
private SerializationUtils() {
}
/**
* Serializes the given object into pseudo-JSON, i.e. it tries to create a JSON representation as far as possible but
* falls back to the given object's {@link Object#toString()} method if a value is not serializable. Useful for
* printing raw {@link DBObject}s containing complex values before actually converting them into Mongo native types.
*
* @param value
* @return
*/
public static String serializeToJsonSafely(Object value) {
if (value == null) {
return null;
}
try {
return JSON.serialize(value);
} catch (Exception e) {
if (value instanceof Collection) {
return toString((Collection<?>) value);
} else if (value instanceof Map) {
return toString((Map<?, ?>) value);
} else if (value instanceof DBObject) {
return toString(((DBObject) value).toMap());
} else {
return String.format("{ $java : %s }", value.toString());
}
}
}
private static String toString(Map<?, ?> source) {
return iterableToDelimitedString(source.entrySet(), "{ ", " }", new Converter<Entry<?, ?>, Object>() {
public Object convert(Entry<?, ?> source) {
return String.format("\"%s\" : %s", source.getKey(), serializeToJsonSafely(source.getValue()));
}
});
}
private static String toString(Collection<?> source) {
return iterableToDelimitedString(source, "[ ", " ]", new Converter<Object, Object>() {
public Object convert(Object source) {
return serializeToJsonSafely(source);
}
});
}
/**
* Creates a string representation from the given {@link Iterable} prepending the prefix, applying the given
* {@link Converter} to each element before adding it to the result {@link String}, concatenating each element with
* {@literal ,} and applying the postfix.
*
* @param source
* @param prefix
* @param postfix
* @param transformer
* @return
*/
private static <T> String iterableToDelimitedString(Iterable<T> source, String prefix, String postfix,
Converter<? super T, Object> transformer) {
StringBuilder builder = new StringBuilder(prefix);
Iterator<T> iterator = source.iterator();
while (iterator.hasNext()) {
builder.append(transformer.convert(iterator.next()));
if (iterator.hasNext()) {
builder.append(", ");
}
}
return builder.append(postfix).toString();
}
}
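
A hedged sketch of the fallback behaviour (not part of the diff); the values are assumed examples:

DBObject dbObject = new BasicDBObject("firstname", "Dave");
dbObject.put("payload", new Object()); // not serializable by com.mongodb.util.JSON

String json = SerializationUtils.serializeToJsonSafely(dbObject);
// e.g. { "firstname" : "Dave", "payload" : { $java : java.lang.Object@6d06d69c } }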

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,15 @@ import java.util.Map;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Helper class to define sorting criteria for a Query instance.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @deprecated use {@link org.springframework.data.domain.Sort} instead. See
* {@link Query#with(org.springframework.data.domain.Sort)}.
*/
@Deprecated
public class Sort {
private Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();
@@ -40,7 +49,7 @@ public class Sort {
public DBObject getSortObject() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, (fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1));
dbo.put(k, fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1);
}
return dbo;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,13 +15,23 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.dao.InvalidDataAccessApiUsageException;
/**
* Class to easily construct MongoDB update clauses.
*
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
*/
public class Update {
public enum Position {
@@ -40,6 +50,31 @@ public class Update {
return new Update().set(key, value);
}
/**
* Creates an {@link Update} instance from the given {@link DBObject}. Allows fields to be explicitly excluded from
* making it into the created {@link Update} object.
*
* @param object the source {@link DBObject} to create the update from.
* @param exclude the fields to exclude.
* @return
*/
public static Update fromDBObject(DBObject object, String... exclude) {
Update update = new Update();
List<String> excludeList = Arrays.asList(exclude);
for (String key : object.keySet()) {
if (excludeList.contains(key)) {
continue;
}
update.set(key, object.get(key));
}
return update;
}
/**
* Update using the $set update modifier
*

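A hedged sketch of the new factory method (not part of the diff); the field values are assumed examples:

DBObject source = new BasicDBObject("_id", 4711).append("firstname", "Dave").append("lastname", "Matthews");

Update update = Update.fromDBObject(source, "_id");
// update.getUpdateObject() -> { "$set" : { "firstname" : "Dave" , "lastname" : "Matthews" } }
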
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import java.io.InputStream;
import java.util.List;
import org.springframework.core.io.support.ResourcePatternResolver;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBObject;
@@ -29,6 +30,7 @@ import com.mongodb.gridfs.GridFSFile;
* Collection of operations to store and read files from MongoDB GridFS.
*
* @author Oliver Gierke
* @author Philipp Schneider
*/
public interface GridFsOperations extends ResourcePatternResolver {
@@ -41,30 +43,65 @@ public interface GridFsOperations extends ResourcePatternResolver {
*/
GridFSFile store(InputStream content, String filename);
/**
* Stores the given content into a file with the given name and content type.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, String contentType);
/**
* Stores the given content into a file with the given name using the given metadata. The metadata object will be
* marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, Object metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, String contentType, Object metadata);
/**
* Stores the given content into a file with the given name using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, DBObject metadata);
/**
* Returns all files matching the given query.
* Stores the given content into a file with the given name and content type using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, String contentType, DBObject metadata);
/**
* Returns all files matching the given query. Note that currently {@link Sort} criteria defined on the
* {@link Query} will not be regarded, as MongoDB does not support ordering for GridFS file access.
*
* @see https://jira.mongodb.org/browse/JAVA-431
* @param query
* @return
*/
@@ -102,4 +139,4 @@ public interface GridFsOperations extends ResourcePatternResolver {
* @see ResourcePatternResolver#getResources(String)
*/
GridFsResource[] getResources(String filenamePattern);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -42,6 +42,7 @@ import com.mongodb.gridfs.GridFSInputFile;
* {@link GridFsOperations} implementation to store content into MongoDB GridFS.
*
* @author Oliver Gierke
* @author Philipp Schneider
*/
public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver {
@@ -87,15 +88,37 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
return store(content, filename, (Object) null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String)
*/
public GridFSFile store(InputStream content, String filename, String contentType) {
return store(content, filename, contentType, (Object) null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.Object)
*/
public GridFSFile store(InputStream content, String filename, Object metadata) {
DBObject dbObject = new BasicDBObject();
converter.write(metadata, dbObject);
return store(content, filename, dbObject);
return store(content, filename, null, metadata);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String, java.lang.Object)
*/
public GridFSFile store(InputStream content, String filename, String contentType, Object metadata) {
DBObject dbObject = null;
if (metadata != null) {
dbObject = new BasicDBObject();
converter.write(metadata, dbObject);
}
return store(content, filename, contentType, dbObject);
}
/*
@@ -103,16 +126,30 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, com.mongodb.DBObject)
*/
public GridFSFile store(InputStream content, String filename, DBObject metadata) {
return this.store(content, filename, null, metadata);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, com.mongodb.DBObject)
*/
public GridFSFile store(InputStream content, String filename, String contentType, DBObject metadata) {
Assert.notNull(content);
Assert.hasText(filename);
Assert.notNull(metadata);
GridFSInputFile file = getGridFs().createFile(content);
file.setFilename(filename);
file.setMetaData(metadata);
file.save();
if (metadata != null) {
file.setMetaData(metadata);
}
if (contentType != null) {
file.setContentType(contentType);
}
file.save();
return file;
}
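
A hedged usage sketch of the new overload (not part of the diff); gridFsOperations is an assumed, pre-configured GridFsTemplate and the file name is an example. Content type and metadata may now be null and are then simply not set on the created file:

InputStream content = new ByteArrayInputStream("spring".getBytes());
DBObject metadata = new BasicDBObject("owner", "dave");

GridFSFile file = gridFsOperations.store(content, "greeting.txt", "text/plain", metadata);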

View File

@@ -0,0 +1,122 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.context.annotation.ComponentScan.Filter;
import org.springframework.context.annotation.Import;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
/**
* Annotation to activate MongoDB repositories. If no base package is configured through either {@link #value()},
* {@link #basePackages()} or {@link #basePackageClasses()} it will trigger scanning of the package of annotated class.
*
* @author Oliver Gierke
*/
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Inherited
@Import(MongoRepositoriesRegistrar.class)
public @interface EnableMongoRepositories {
/**
* Alias for the {@link #basePackages()} attribute. Allows for more concise annotation declarations e.g.:
* {@code @EnableMongoRepositories("org.my.pkg")} instead of {@code @EnableMongoRepositories(basePackages="org.my.pkg")}.
*/
String[] value() default {};
/**
* Base packages to scan for annotated components. {@link #value()} is an alias for (and mutually exclusive with) this
* attribute. Use {@link #basePackageClasses()} for a type-safe alternative to String-based package names.
*/
String[] basePackages() default {};
/**
* Type-safe alternative to {@link #basePackages()} for specifying the packages to scan for annotated components. The
* package of each class specified will be scanned. Consider creating a special no-op marker class or interface in
* each package that serves no purpose other than being referenced by this attribute.
*/
Class<?>[] basePackageClasses() default {};
/**
* Specifies which types are eligible for component scanning. Further narrows the set of candidate components from
* everything in {@link #basePackages()} to everything in the base packages that matches the given filter or filters.
*/
Filter[] includeFilters() default {};
/**
* Specifies which types are not eligible for component scanning.
*/
Filter[] excludeFilters() default {};
/**
* Returns the postfix to be used when looking up custom repository implementations. Defaults to {@literal Impl}. So
* for a repository named {@code PersonRepository} the corresponding implementation class will be looked up scanning
* for {@code PersonRepositoryImpl}.
*
* @return
*/
String repositoryImplementationPostfix() default "";
/**
* Configures the location of where to find the Spring Data named queries properties file. Will default to
* {@code META-INF/mongo-named-queries.properties}.
*
* @return
*/
String namedQueriesLocation() default "";
/**
* Returns the key of the {@link QueryLookupStrategy} to be used for lookup queries for query methods. Defaults to
* {@link Key#CREATE_IF_NOT_FOUND}.
*
* @return
*/
Key queryLookupStrategy() default Key.CREATE_IF_NOT_FOUND;
/**
* Returns the {@link FactoryBean} class to be used for each repository instance. Defaults to
* {@link MongoRepositoryFactoryBean}.
*
* @return
*/
Class<?> repositoryFactoryBeanClass() default MongoRepositoryFactoryBean.class;
/**
* Configures the name of the {@link MongoTemplate} bean to be used with the repositories detected.
*
* @return
*/
String mongoTemplateRef() default "mongoTemplate";
/**
* Whether to automatically create indexes for query methods defined in the repository interface.
*
* @return
*/
boolean createIndexesForQueryMethods() default false;
}
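
A hedged JavaConfig sketch for the annotation (not part of the diff); the package and database names are assumed examples. The bean name mongoTemplate matches the default mongoTemplateRef():

@Configuration
@EnableMongoRepositories(basePackages = "com.acme.repositories", createIndexesForQueryMethods = true)
class ApplicationConfig {

	@Bean
	public MongoTemplate mongoTemplate() throws Exception {
		return new MongoTemplate(new Mongo(), "database");
	}
}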

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import java.lang.annotation.Annotation;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* Mongo-specific {@link ImportBeanDefinitionRegistrar}.
*
* @author Oliver Gierke
*/
class MongoRepositoriesRegistrar extends RepositoryBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoRepositories.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport#getExtension()
*/
@Override
protected RepositoryConfigurationExtension getExtension() {
return new MongoRepositoryConfigurationExtension();
}
}

View File

@@ -1,54 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.data.mongodb.repository.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration;
import org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser;
import org.w3c.dom.Element;
/**
* {@link org.springframework.beans.factory.xml.BeanDefinitionParser} to create Mongo DB repositories from classpath
* scanning or manual definition.
*
* @author Oliver Gierke
*/
public class MongoRepositoryConfigParser extends
AbstractRepositoryConfigDefinitionParser<SimpleMongoRepositoryConfiguration, MongoRepositoryConfiguration> {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser#getGlobalRepositoryConfigInformation(org.w3c.dom.Element)
*/
@Override
protected SimpleMongoRepositoryConfiguration getGlobalRepositoryConfigInformation(Element element) {
return new SimpleMongoRepositoryConfiguration(element);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser#postProcessBeanDefinition(org.springframework.data.repository.config.SingleRepositoryConfigInformation, org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.beans.factory.support.BeanDefinitionRegistry, java.lang.Object)
*/
@Override
protected void postProcessBeanDefinition(MongoRepositoryConfiguration context, BeanDefinitionBuilder builder,
BeanDefinitionRegistry registry, Object beanSource) {
builder.addPropertyReference("mongoOperations", context.getMongoTemplateRef());
builder.addPropertyValue("createIndexesForQueryMethods", context.getCreateQueryIndexes());
}
}

View File

@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport;
import org.springframework.data.repository.config.XmlRepositoryConfigurationSource;
import org.w3c.dom.Element;
/**
* {@link RepositoryConfigurationExtension} for MongoDB.
*
* @author Oliver Gierke
*/
public class MongoRepositoryConfigurationExtension extends RepositoryConfigurationExtensionSupport {
private static final String MONGO_TEMPLATE_REF = "mongo-template-ref";
private static final String CREATE_QUERY_INDEXES = "create-query-indexes";
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModulePrefix()
*/
@Override
protected String getModulePrefix() {
return "mongo";
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtension#getRepositoryFactoryClassName()
*/
public String getRepositoryFactoryClassName() {
return MongoRepositoryFactoryBean.class.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.XmlRepositoryConfigurationSource)
*/
@Override
public void postProcess(BeanDefinitionBuilder builder, XmlRepositoryConfigurationSource config) {
Element element = config.getElement();
ParsingUtils.setPropertyReference(builder, element, MONGO_TEMPLATE_REF, "mongoOperations");
ParsingUtils.setPropertyValue(builder, element, CREATE_QUERY_INDEXES, "createIndexesForQueryMethods");
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource)
*/
@Override
public void postProcess(BeanDefinitionBuilder builder, AnnotationRepositoryConfigurationSource config) {
AnnotationAttributes attributes = config.getAttributes();
builder.addPropertyReference("mongoOperations", attributes.getString("mongoTemplateRef"));
builder.addPropertyValue("createIndexesForQueryMethods", attributes.getBoolean("createIndexesForQueryMethods"));
}
}

View File

@@ -1,196 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AutomaticRepositoryConfigInformation;
import org.springframework.data.repository.config.ManualRepositoryConfigInformation;
import org.springframework.data.repository.config.RepositoryConfig;
import org.springframework.data.repository.config.SingleRepositoryConfigInformation;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link RepositoryConfig} implementation to create {@link MongoRepositoryConfiguration} instances for both automatic
* and manual configuration.
*
* @author Oliver Gierke
*/
public class SimpleMongoRepositoryConfiguration
extends
RepositoryConfig<SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration, SimpleMongoRepositoryConfiguration> {
private static final String MONGO_TEMPLATE_REF = "mongo-template-ref";
private static final String CREATE_QUERY_INDEXES = "create-query-indexes";
private static final String DEFAULT_MONGO_TEMPLATE_REF = "mongoTemplate";
/**
* Creates a new {@link SimpleMongoRepositoryConfiguration} for the given {@link Element}.
*
* @param repositoriesElement
*/
protected SimpleMongoRepositoryConfiguration(Element repositoriesElement) {
super(repositoriesElement, MongoRepositoryFactoryBean.class.getName());
}
/**
* Returns the bean name of the {@link org.springframework.data.mongodb.core.core.MongoTemplate} to be referenced.
*
* @return
*/
public String getMongoTemplateRef() {
String templateRef = getSource().getAttribute(MONGO_TEMPLATE_REF);
return StringUtils.hasText(templateRef) ? templateRef : DEFAULT_MONGO_TEMPLATE_REF;
}
/**
* Returns whether to create indexes for query methods.
*
* @return
*/
public boolean getCreateQueryIndexes() {
String createQueryIndexes = getSource().getAttribute(CREATE_QUERY_INDEXES);
return StringUtils.hasText(createQueryIndexes) ? Boolean.parseBoolean(createQueryIndexes) : false;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.config.GlobalRepositoryConfigInformation
* #getAutoconfigRepositoryInformation(java.lang.String)
*/
public MongoRepositoryConfiguration getAutoconfigRepositoryInformation(String interfaceName) {
return new AutomaticMongoRepositoryConfiguration(interfaceName, this);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfig#getNamedQueriesLocation()
*/
public String getNamedQueriesLocation() {
return "classpath*:META-INF/mongo-named-queries.properties";
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.config.RepositoryConfig#
* createSingleRepositoryConfigInformationFor(org.w3c.dom.Element)
*/
@Override
protected MongoRepositoryConfiguration createSingleRepositoryConfigInformationFor(Element element) {
return new ManualMongoRepositoryConfiguration(element, this);
}
/**
* Simple interface for configuration values specific to Mongo repositories.
*
* @author Oliver Gierke
*/
public interface MongoRepositoryConfiguration extends
SingleRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> {
String getMongoTemplateRef();
boolean getCreateQueryIndexes();
}
/**
* Implements manual lookup of the additional attributes.
*
* @author Oliver Gierke
*/
private static class ManualMongoRepositoryConfiguration extends
ManualRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> implements MongoRepositoryConfiguration {
/**
* Creates a new {@link ManualMongoRepositoryConfiguration} for the given {@link Element} and parent.
*
* @param element
* @param parent
*/
public ManualMongoRepositoryConfiguration(Element element, SimpleMongoRepositoryConfiguration parent) {
super(element, parent);
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.mongodb.repository.config.
* SimpleMongoRepositoryConfiguration
* .MongoRepositoryConfiguration#getMongoTemplateRef()
*/
public String getMongoTemplateRef() {
return getAttribute(MONGO_TEMPLATE_REF);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration#getCreateQueryIndexes()
*/
public boolean getCreateQueryIndexes() {
String attribute = getAttribute(CREATE_QUERY_INDEXES);
return attribute == null ? false : Boolean.parseBoolean(attribute);
}
}
/**
* Implements the lookup of the additional attributes during automatic configuration.
*
* @author Oliver Gierke
*/
private static class AutomaticMongoRepositoryConfiguration extends
AutomaticRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> implements MongoRepositoryConfiguration {
/**
* Creates a new {@link AutomaticMongoRepositoryConfiguration} for the given interface and parent.
*
* @param interfaceName
* @param parent
*/
public AutomaticMongoRepositoryConfiguration(String interfaceName, SimpleMongoRepositoryConfiguration parent) {
super(interfaceName, parent);
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.mongodb.repository.config.
* SimpleMongoRepositoryConfiguration
* .MongoRepositoryConfiguration#getMongoTemplateRef()
*/
public String getMongoTemplateRef() {
return getParent().getMongoTemplateRef();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration#getCreateQueryIndexes()
*/
public boolean getCreateQueryIndexes() {
return getParent().getCreateQueryIndexes();
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.repository.query;
import static org.springframework.data.mongodb.repository.query.QueryUtils.*;
import java.util.List;
import org.springframework.data.domain.PageImpl;
@@ -85,7 +83,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
} else if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
} else if (method.isCollectionQuery()) {
return new CollectionExecution().execute(query);
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
@@ -119,7 +117,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
protected List<?> readCollection(Query query) {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
MongoEntityMetadata<?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
return operations.find(query, metadata.getJavaType(), collectionName);
@@ -133,13 +131,19 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
class CollectionExecution extends Execution {
private final Pageable pageable;
CollectionExecution(Pageable pageable) {
this.pageable = pageable;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Object execute(Query query) {
return readCollection(query);
return readCollection(query.with(pageable));
}
}
@@ -171,11 +175,10 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
@SuppressWarnings({ "rawtypes", "unchecked" })
Object execute(Query query) {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(query, metadata.getCollectionName());
List<?> result = operations.find(applyPagination(query, pageable), metadata.getJavaType(),
metadata.getCollectionName());
List<?> result = operations.find(query.with(pageable), metadata.getJavaType(), metadata.getCollectionName());
return new PageImpl(result, pageable, count);
}
@@ -195,8 +198,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
@Override
Object execute(Query query) {
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return operations.findOne(query, entityInformation.getJavaType());
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return operations.findOne(query, metadata.getJavaType());
}
}
@@ -233,8 +236,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
Object execute(Query query, Query countQuery) {
MongoEntityInformation<?, ?> information = method.getEntityInformation();
long count = operations.count(countQuery, information.getCollectionName());
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(countQuery, metadata.getCollectionName());
return new GeoPage<Object>(doExecuteQuery(query), accessor.getPageable(), count);
}
@@ -251,12 +254,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Distance maxDistance = accessor.getMaxDistance();
if (maxDistance != null) {
nearQuery.maxDistance(maxDistance);
nearQuery.maxDistance(maxDistance).in(maxDistance.getMetric());
}
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, entityInformation.getJavaType(),
entityInformation.getCollectionName());
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}
private boolean isListOfGeoResult() {
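The executions above now fold paging and sorting into the Query itself via with(…) rather than going through the old QueryUtils helpers. A rough template-level sketch of what CollectionExecution and PagedExecution end up issuing; Person, the collection name and the criterion are made up for illustration:

import java.util.List;

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class PagedExecutionSketch {

	static List<Person> firstPage(MongoOperations operations) {
		Query query = new Query(Criteria.where("lastname").is("Matthews")); // made-up criterion
		Pageable pageable = new PageRequest(0, 20, new Sort(Sort.Direction.ASC, "firstname"));

		// paging and sorting travel with the query; PagedExecution issues the count separately
		return operations.find(query.with(pageable), Person.class, "person");
	}

	static class Person {}
}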

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,11 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -24,7 +28,11 @@ import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import com.mongodb.DBRef;
/**
* Custom {@link ParameterAccessor} that uses a {@link MongoWriter} to serialize parameters into Mongo format.
@@ -78,12 +86,12 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return delegate.getSort();
}
/* (non-Javadoc)
* @see org.springframework.data.repository.query.ParameterAccessor#getBindableParameter(int)
*/
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ParameterAccessor#getBindableValue(int)
*/
public Object getBindableValue(int index) {
return getConvertedValue(delegate.getBindableValue(index));
return getConvertedValue(delegate.getBindableValue(index), null);
}
/*
@@ -94,7 +102,8 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return delegate.getMaxDistance();
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoParameterAccessor#getGeoNearLocation()
*/
public Point getGeoNearLocation() {
@@ -104,11 +113,12 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
/**
* Converts the given value with the underlying {@link MongoWriter}.
*
* @param value
* @param value can be {@literal null}.
* @param typeInformation can be {@literal null}.
* @return
*/
private Object getConvertedValue(Object value) {
return writer.convertToMongoType(value);
private Object getConvertedValue(Object value, TypeInformation<?> typeInformation) {
return writer.convertToMongoType(value, typeInformation == null ? null : typeInformation.getActualType());
}
/*
@@ -158,7 +168,28 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
* @see org.springframework.data.mongodb.repository.ConvertingParameterAccessor.PotentiallConvertingIterator#nextConverted()
*/
public Object nextConverted(MongoPersistentProperty property) {
return property.isAssociation() ? writer.toDBRef(next(), property) : getConvertedValue(next());
Object next = next();
if (next == null) {
return null;
}
if (property.isAssociation()) {
if (next.getClass().isArray() || next instanceof Iterable) {
List<DBRef> dbRefs = new ArrayList<DBRef>();
for (Object element : asCollection(next)) {
dbRefs.add(writer.toDBRef(element, property));
}
return dbRefs;
} else {
return writer.toDBRef(next, property);
}
}
return getConvertedValue(next, property.getTypeInformation());
}
/*
@@ -170,6 +201,33 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
}
}
/**
* Returns the given object as {@link Collection}. Will copy it if it implements {@link Iterable} or is an
* array. Will return an empty {@link Collection} in case {@literal null} is given. Will wrap all other types into a
* single-element collection.
*
* @param source
* @return
*/
private static Collection<?> asCollection(Object source) {
if (source instanceof Iterable) {
List<Object> result = new ArrayList<Object>();
for (Object element : (Iterable<?>) source) {
result.add(element);
}
return result;
}
if (source == null) {
return Collections.emptySet();
}
return source.getClass().isArray() ? CollectionUtils.arrayToList(source) : Collections.singleton(source);
}
/**
* Custom {@link Iterator} that adds a method to access elements in a converted manner.
*

View File

@@ -1,47 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import java.io.Serializable;
/**
* Interface for components able to provide {@link MongoEntityInformation} instances for a given {@link Class}.
*
* @author Oliver Gierke
*/
public interface EntityInformationCreator {
/**
* Returns a {@link MongoEntityInformation} for the given domain class.
*
* @param domainClass the domain class to create the {@link MongoEntityInformation} for, must not be {@literal null}.
* @return
*/
<T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass);
/**
* Returns a {@link MongoEntityInformation} for the given domain class and class to retrieve the collection to query
* against from.
*
* @param domainClass the domain class to create the {@link MongoEntityInformation} for, must not be {@literal null}.
* @param collectionClass the class to derive the collection name from, i.e. the collection that queries for the
* domain class shall be run against, must not be {@literal null}.
* @return
*/
<T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
Class<?> collectionClass);
}

View File

@@ -0,0 +1,33 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.repository.core.EntityMetadata;
/**
* Extension of {@link EntityMetadata} to additionally expose the collection name an entity shall be persisted to.
*
* @author Oliver Gierke
*/
public interface MongoEntityMetadata<T> extends EntityMetadata<T> {
/**
* Returns the name of the collection the entity shall be persisted to.
*
* @return
*/
String getCollectionName();
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2010 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -87,9 +87,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#create(org.springframework.data.repository.query.parser.Part, java.util.Iterator)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#create(org.springframework.data.repository.query.parser.Part, java.util.Iterator)
*/
@Override
protected Criteria create(Part part, Iterator<Object> iterator) {
@@ -107,9 +107,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#and(org.springframework.data.repository.query.parser.Part, java.lang.Object, java.util.Iterator)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#and(org.springframework.data.repository.query.parser.Part, java.lang.Object, java.util.Iterator)
*/
@Override
protected Criteria and(Part part, Criteria base, Iterator<Object> iterator) {
@@ -120,20 +120,15 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
MongoPersistentProperty property = path.getLeafProperty();
Criteria criteria = from(part.getType(), property,
where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
return from(part.getType(), property,
base.and(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return criteria.andOperator(criteria);
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.query.parser.AbstractQueryCreator
* #or(java.lang.Object, java.lang.Object)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#or(java.lang.Object, java.lang.Object)
*/
@Override
protected Criteria or(Criteria base, Criteria criteria) {
@@ -142,12 +137,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.query.parser.AbstractQueryCreator
* #complete(java.lang.Object, org.springframework.data.domain.Sort)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#complete(java.lang.Object, org.springframework.data.domain.Sort)
*/
@Override
protected Query complete(Criteria criteria, Sort sort) {
@@ -155,11 +147,10 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return null;
}
Query query = new Query(criteria);
QueryUtils.applySorting(query, sort);
Query query = new Query(criteria).with(sort);
if (LOG.isDebugEnabled()) {
LOG.debug("Created query " + query.getQueryObject());
LOG.debug("Created query " + query);
}
return query;
@@ -236,7 +227,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case SIMPLE_PROPERTY:
return criteria.is(parameters.nextConverted(property));
case NEGATING_SIMPLE_PROPERTY:
return criteria.not().is(parameters.nextConverted(property));
return criteria.ne(parameters.nextConverted(property));
}
throw new IllegalArgumentException("Unsupported keyword!");
@@ -290,4 +281,4 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return source.replaceAll("\\*", ".*");
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,9 +20,12 @@ import java.util.Arrays;
import java.util.List;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.Parameters;
@@ -33,8 +36,7 @@ import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* TODO - Extract methods for {@link #getAnnotatedQuery()} into superclass as it is currently copied from Spring Data
* JPA
* Mongo specific implementation of {@link QueryMethod}.
*
* @author Oliver Gierke
*/
@@ -45,19 +47,24 @@ public class MongoQueryMethod extends QueryMethod {
.asList(GeoResult.class, GeoResults.class, GeoPage.class);
private final Method method;
private final MongoEntityInformation<?, ?> entityInformation;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private MongoEntityMetadata<?> metadata;
/**
* Creates a new {@link MongoQueryMethod} from the given {@link Method}.
*
* @param method
*/
public MongoQueryMethod(Method method, RepositoryMetadata metadata, EntityInformationCreator entityInformationCreator) {
public MongoQueryMethod(Method method, RepositoryMetadata metadata,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(method, metadata);
Assert.notNull(entityInformationCreator, "DefaultEntityInformationCreator must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
this.method = method;
this.entityInformation = entityInformationCreator.getEntityInformation(metadata.getReturnedDomainClass(method),
getDomainClass());
this.mappingContext = mappingContext;
}
/*
@@ -101,14 +108,30 @@ public class MongoQueryMethod extends QueryMethod {
return StringUtils.hasText(value) ? value : null;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryMethod#getEntityInformation()
*/
@Override
public MongoEntityInformation<?, ?> getEntityInformation() {
@SuppressWarnings("unchecked")
public MongoEntityMetadata<?> getEntityInformation() {
return entityInformation;
if (metadata == null) {
Class<?> returnedObjectType = getReturnedObjectType();
Class<?> domainClass = getDomainClass();
MongoPersistentEntity<?> returnedEntity = mappingContext.getPersistentEntity(getReturnedObjectType());
MongoPersistentEntity<?> managedEntity = mappingContext.getPersistentEntity(domainClass);
returnedEntity = returnedEntity == null ? managedEntity : returnedEntity;
MongoPersistentEntity<?> collectionEntity = domainClass.isAssignableFrom(returnedObjectType) ? returnedEntity
: managedEntity;
this.metadata = new SimpleMongoEntityMetadata<Object>((Class<Object>) returnedEntity.getType(),
collectionEntity.getCollection());
}
return this.metadata;
}
/*
@@ -121,12 +144,11 @@ public class MongoQueryMethod extends QueryMethod {
}
/**
* Returns whether the query is a geoNear query.
* Returns whether the query is a geo near query.
*
* @return
*/
public boolean isGeoNearQuery() {
return isGeoNearQuery(this.method);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,12 +15,13 @@
*/
package org.springframework.data.mongodb.repository.query;
import com.mongodb.DBCursor;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBCursor;
/**
* Collection of utility methods to apply sorting and pagination to a {@link DBCursor}.
*
@@ -36,10 +37,12 @@ public abstract class QueryUtils {
* Applies the given {@link Pageable} to the given {@link Query}. Will do nothing if {@link Pageable} is
* {@literal null}.
*
* @param query
* @deprecated use {@link Query#with(Pageable)}.
* @param query must not be {@literal null}.
* @param pageable
* @return
*/
@Deprecated
public static Query applyPagination(Query query, Pageable pageable) {
if (pageable == null) {
@@ -49,16 +52,18 @@ public abstract class QueryUtils {
query.limit(pageable.getPageSize());
query.skip(pageable.getOffset());
return applySorting(query, pageable.getSort());
return query.with(pageable.getSort());
}
/**
* Applies the given {@link Sort} to the {@link Query}. Will do nothing if {@link Sort} is {@literal null}.
*
* @param query
* @deprecated use {@link Query#with(Pageable)}.
* @param query must not be {@literal null}.
* @param sort
* @return
*/
@Deprecated
public static Query applySorting(Query query, Sort sort) {
if (sort == null) {
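Both helpers are kept but deprecated; callers should chain with(…) on the Query directly, as the @deprecated notes above suggest. A small before/after sketch (the sort property is illustrative):

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.QueryUtils;

class SortingMigrationSketch {

	@SuppressWarnings("deprecation")
	static Query sortedQuery() {
		Sort sort = new Sort(Sort.Direction.DESC, "createdDate"); // made-up property

		QueryUtils.applySorting(new Query(), sort); // old helper, still works but deprecated

		return new Query().with(sort); // preferred replacement
	}
}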

View File

@@ -0,0 +1,60 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.util.Assert;
/**
* Bean based implementation of {@link MongoEntityMetadata}.
*
* @author Oliver Gierke
*/
class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
private final Class<T> type;
private final String collectionName;
/**
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and collection name.
*
* @param type must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
*/
public SimpleMongoEntityMetadata(Class<T> type, String collectionName) {
Assert.notNull(type, "Type must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.type = type;
this.collectionName = collectionName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.EntityMetadata#getJavaType()
*/
public Class<T> getJavaType() {
return type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoEntityMetadata#getCollectionName()
*/
public String getCollectionName() {
return collectionName;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -73,7 +73,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
query = new BasicQuery(queryString);
}
QueryUtils.applySorting(query, accessor.getSort());
query.with(accessor.getSort());
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s", query.getQueryObject()));

View File

@@ -1,66 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.EntityInformationCreator;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.util.Assert;
/**
* Simple {@link EntityInformationCreator} to create {@link MongoEntityInformation} instances based on a
* {@link MappingContext}.
*
* @author Oliver Gierke
*/
public class DefaultEntityInformationCreator implements EntityInformationCreator {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
public DefaultEntityInformationCreator(
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
Assert.notNull(mappingContext);
this.mappingContext = mappingContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.EntityInformationCreator#getEntityInformation(java.lang.Class)
*/
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return getEntityInformation(domainClass, null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.EntityInformationCreator#getEntityInformation(java.lang.Class, java.lang.Class)
*/
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
Class<?> collectionClass) {
MongoPersistentEntity<T> persistentEntity = (MongoPersistentEntity<T>) mappingContext
.getPersistentEntity(domainClass);
String customCollectionName = collectionClass == null ? null : mappingContext.getPersistentEntity(collectionClass)
.getCollection();
return new MappingMongoEntityInformation<T, ID>(persistentEntity, customCollectionName);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.query.MongoEntityMetadata;
import org.springframework.data.mongodb.repository.query.PartTreeMongoQuery;
import org.springframework.data.mongodb.repository.query.QueryUtils;
import org.springframework.data.repository.core.support.QueryCreationListener;
@@ -85,7 +85,7 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
}
}
MongoEntityInformation<?, ?> metadata = query.getQueryMethod().getEntityInformation();
MongoEntityMetadata<?> metadata = query.getQueryMethod().getEntityInformation();
operations.indexOps(metadata.getCollectionName()).ensureIndex(index);
LOG.debug(String.format("Created %s!", index));
}
@@ -99,4 +99,4 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
org.springframework.data.domain.Sort.Order order = sort.getOrderFor(property);
return order == null ? Order.DESCENDING : order.isAscending() ? Order.ASCENDING : Order.DESCENDING;
}
}
}

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -57,7 +58,8 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends A
this.customCollectionName = customCollectionName;
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.repository.support.EntityInformation#getId(java.lang.Object)
*/
@SuppressWarnings("unchecked")
@@ -65,6 +67,10 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends A
MongoPersistentProperty idProperty = entityMetadata.getIdProperty();
if (idProperty == null) {
return null;
}
try {
return (ID) BeanWrapper.create(entity, null).getProperty(idProperty);
} catch (Exception e) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,10 +21,11 @@ import java.io.Serializable;
import java.lang.reflect.Method;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.query.EntityInformationCreator;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.query.MongoQueryMethod;
import org.springframework.data.mongodb.repository.query.PartTreeMongoQuery;
@@ -46,20 +47,18 @@ import org.springframework.util.Assert;
public class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoOperations mongoOperations;
private final EntityInformationCreator entityInformationCreator;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoTemplate} and {@link MappingContext}.
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoOperations}.
*
* @param template must not be {@literal null}
* @param mappingContext
* @param mongoOperations must not be {@literal null}
*/
public MongoRepositoryFactory(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations);
this.mongoOperations = mongoOperations;
this.entityInformationCreator = new DefaultEntityInformationCreator(mongoOperations.getConverter()
.getMappingContext());
this.mappingContext = mongoOperations.getConverter().getMappingContext();
}
/*
@@ -117,7 +116,7 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, entityInformationCreator);
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, mappingContext);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
@@ -136,7 +135,16 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getEntityInformation(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return entityInformationCreator.getEntityInformation(domainClass);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(domainClass);
if (entity == null) {
throw new MappingException(String.format("Could not lookup mapping metadata for domain class %s!",
domainClass.getName()));
}
return new MappingMongoEntityInformation<T, ID>((MongoPersistentEntity<T>) entity);
}
}
}
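From a client's perspective the factory behaves as before; it now simply derives the MongoEntityInformation from the MappingContext directly. A hedged setup sketch (the repository interface, domain type and database name are placeholders):

import com.mongodb.Mongo;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactory;

class RepositoryFactorySketch {

	interface PersonRepository extends MongoRepository<Person, String> {}

	static PersonRepository createRepository() throws Exception {
		MongoOperations operations = new MongoTemplate(new Mongo(), "example"); // assumed database
		MongoRepositoryFactory factory = new MongoRepositoryFactory(operations);
		// getEntityInformation(…) above throws a MappingException if Person is unknown to the MappingContext
		return factory.getRepository(PersonRepository.class);
	}

	static class Person {}
}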

View File

@@ -41,8 +41,8 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
* @param operations the operations to set
*/
public void setMongoOperations(MongoOperations operations) {
this.operations = operations;
setMappingContext(operations.getConverter().getMappingContext());
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,19 +17,14 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import java.util.List;
import java.util.regex.Pattern;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.querydsl.EntityPathResolver;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
@@ -37,14 +32,10 @@ import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityMetadata;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbQuery;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.EntityPath;
import com.mysema.query.types.Expression;
import com.mysema.query.types.OrderSpecifier;
import com.mysema.query.types.Path;
import com.mysema.query.types.PathMetadata;
import com.mysema.query.types.Predicate;
import com.mysema.query.types.path.PathBuilder;
@@ -56,7 +47,6 @@ import com.mysema.query.types.path.PathBuilder;
public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleMongoRepository<T, ID> implements
QueryDslPredicateExecutor<T> {
private final MongodbSerializer serializer;
private final PathBuilder<T> builder;
/**
@@ -67,7 +57,6 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @param template
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
}
@@ -86,40 +75,27 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
Assert.notNull(resolver);
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.serializer = new SpringDataMongodbSerializer(mongoOperations.getConverter());
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findOne(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findOne(com.mysema.query.types.Predicate)
*/
public T findOne(Predicate predicate) {
return createQueryFor(predicate).uniqueResult();
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findAll(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate)
*/
public List<T> findAll(Predicate predicate) {
return createQueryFor(predicate).list();
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findAll(com.mysema.query.types.Predicate,
* com.mysema.query.types.OrderSpecifier<?>[])
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, com.mysema.query.types.OrderSpecifier<?>[])
*/
public List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
@@ -128,11 +104,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findAll(com.mysema.query.types.Predicate,
* org.springframework.data.domain.Pageable)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, org.springframework.data.domain.Pageable)
*/
public Page<T> findAll(Predicate predicate, Pageable pageable) {
@@ -144,13 +116,9 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #count(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#count(com.mysema.query.types.Predicate)
*/
public long count(Predicate predicate) {
return createQueryFor(predicate).count();
}
@@ -164,7 +132,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
Class<T> domainType = getEntityInformation().getJavaType();
MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), serializer, domainType);
MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), domainType);
return query.where(predicate);
}
@@ -219,40 +187,4 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
return new OrderSpecifier(order.isAscending() ? com.mysema.query.types.Order.ASC
: com.mysema.query.types.Order.DESC, property);
}
/**
* Custom {@link MongodbSerializer} to take mapping information into account when building keys for constraints.
*
* @author Oliver Gierke
*/
static class SpringDataMongodbSerializer extends MongodbSerializer {
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
*
* @param mappingContext
*/
public SpringDataMongodbSerializer(MongoConverter converter) {
this.mappingContext = converter.getMappingContext();
this.converter = converter;
}
@Override
protected String getKeyForPath(Path<?> expr, PathMetadata<?> metadata) {
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}
@Override
protected DBObject asDBObject(String key, Object value) {
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}
}
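Client code using the Querydsl executor is unaffected by this cleanup; only the serializer wiring moved into SpringDataMongodbQuery (see the extracted class further below). A hedged usage sketch via a PathBuilder, so no generated Q-classes are needed; the domain type and property are invented:

import org.springframework.data.querydsl.QueryDslPredicateExecutor;

import com.mysema.query.types.Predicate;
import com.mysema.query.types.path.PathBuilder;

class QuerydslExecutorSketch {

	static Iterable<Person> findDaves(QueryDslPredicateExecutor<Person> executor) {
		PathBuilder<Person> person = new PathBuilder<Person>(Person.class, "person");
		Predicate predicate = person.getString("firstname").eq("Dave"); // made-up property

		return executor.findAll(predicate);
	}

	static class Person {}
}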

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,6 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mysema.query.mongodb.MongodbQuery;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.EntityPath;
/**
@@ -33,7 +32,6 @@ public abstract class QuerydslRepositorySupport {
private final MongoOperations template;
private final MappingContext<? extends MongoPersistentEntity<?>, ?> context;
private final MongodbSerializer serializer;
/**
* Creates a new {@link QuerydslRepositorySupport} for the given {@link MongoOperations}.
@@ -41,10 +39,11 @@ public abstract class QuerydslRepositorySupport {
* @param operations must not be {@literal null}
*/
public QuerydslRepositorySupport(MongoOperations operations) {
Assert.notNull(operations);
this.template = operations;
this.context = operations.getConverter().getMappingContext();
this.serializer = new MongodbSerializer();
}
/**
@@ -72,6 +71,6 @@ public abstract class QuerydslRepositorySupport {
Assert.notNull(path);
Assert.hasText(collection);
return new SpringDataMongodbQuery<T>(template, serializer, path.getType(), collection);
return new SpringDataMongodbQuery<T>(template, path.getType(), collection);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,7 +34,6 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.query.QueryUtils;
import org.springframework.util.Assert;
/**
@@ -115,8 +114,11 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
public boolean exists(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.findOne(new Query(Criteria.where("_id").is(id)), Object.class,
entityInformation.getCollectionName()) != null;
final Query idQuery = getIdQuery(id);
idQuery.fields();
return mongoOperations.findOne(idQuery, entityInformation.getJavaType(), entityInformation.getCollectionName()) != null;
}
/*
@@ -197,7 +199,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
public Page<T> findAll(final Pageable pageable) {
Long count = count();
List<T> list = findAll(QueryUtils.applyPagination(new Query(), pageable));
List<T> list = findAll(new Query().with(pageable));
return new PageImpl<T>(list, pageable, count);
}
@@ -206,9 +208,8 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* (non-Javadoc)
* @see org.springframework.data.repository.PagingAndSortingRepository#findAll(org.springframework.data.domain.Sort)
*/
public List<T> findAll(final Sort sort) {
return findAll(QueryUtils.applySorting(new Query(), sort));
public List<T> findAll(Sort sort) {
return findAll(new Query().with(sort));
}
private List<T> findAll(Query query) {
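At the repository level the same with(…)-based queries now back findAll(Pageable) and findAll(Sort). A short usage sketch (the domain type and sort property are placeholders):

import java.util.List;

import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.repository.MongoRepository;

class RepositoryPagingSketch {

	static void listPeople(MongoRepository<Person, String> repository) {
		// both calls end up in the new Query().with(…) based lookups shown above
		Page<Person> firstPage = repository.findAll(new PageRequest(0, 10));
		List<Person> sorted = repository.findAll(new Sort(Sort.Direction.ASC, "lastname")); // made-up property
	}

	static class Person {}
}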

View File

@@ -21,7 +21,6 @@ import com.google.common.base.Function;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbQuery;
import com.mysema.query.mongodb.MongodbSerializer;
/**
* Spring Data specific {@link MongodbQuery} implementation.
@@ -36,30 +35,26 @@ class SpringDataMongodbQuery<T> extends MongodbQuery<T> {
* Creates a new {@link SpringDataMongodbQuery}.
*
* @param operations must not be {@literal null}.
* @param serializer must not be {@literal null}.
* @param type must not be {@literal null}.
*/
public SpringDataMongodbQuery(final MongoOperations operations, final MongodbSerializer serializer,
final Class<? extends T> type) {
this(operations, serializer, type, operations.getCollectionName(type));
public SpringDataMongodbQuery(final MongoOperations operations, final Class<? extends T> type) {
this(operations, type, operations.getCollectionName(type));
}
/**
* Creates a new {@link SpringDataMongodbQuery} to query the given collection.
*
* @param operations must not be {@literal null}.
* @param serializer must not be {@literal null}.
* @param type must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
*/
public SpringDataMongodbQuery(final MongoOperations operations, final MongodbSerializer serializer,
final Class<? extends T> type, String collectionName) {
public SpringDataMongodbQuery(final MongoOperations operations, final Class<? extends T> type, String collectionName) {
super(operations.getCollection(collectionName), new Function<DBObject, T>() {
public T apply(DBObject input) {
return operations.getConverter().read(type, input);
}
}, serializer);
}, new SpringDataMongodbSerializer(operations.getConverter()));
this.operations = operations;
}
@@ -72,4 +67,4 @@ class SpringDataMongodbQuery<T> extends MongodbQuery<T> {
protected DBCollection getCollection(Class<?> type) {
return operations.getCollection(operations.getCollectionName(type));
}
}
}

View File

@@ -0,0 +1,83 @@
/*
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import java.util.regex.Pattern;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.Path;
import com.mysema.query.types.PathMetadata;
/**
* Custom {@link MongodbSerializer} to take mapping information into account when building keys for constraints.
*
* @author Oliver Gierke
*/
class SpringDataMongodbSerializer extends MongodbSerializer {
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
*
* @param mappingContext
*/
public SpringDataMongodbSerializer(MongoConverter converter) {
Assert.notNull(converter, "MongoConverter must not be null!");
this.mappingContext = converter.getMappingContext();
this.converter = converter;
this.mapper = new QueryMapper(converter);
}
/*
* (non-Javadoc)
* @see com.mysema.query.mongodb.MongodbSerializer#getKeyForPath(com.mysema.query.types.Path, com.mysema.query.types.PathMetadata)
*/
@Override
protected String getKeyForPath(Path<?> expr, PathMetadata<?> metadata) {
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}
/*
* (non-Javadoc)
* @see com.mysema.query.mongodb.MongodbSerializer#asDBObject(java.lang.String, java.lang.Object)
*/
@Override
protected DBObject asDBObject(String key, Object value) {
if ("_id".equals(key)) {
return super.asDBObject(key, mapper.convertId(value));
}
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}
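The practical effect is that Querydsl predicates are rendered against the mapped field names, and id values are converted, rather than using the raw Java property names. A hedged illustration; the @Field mapping and domain type are invented:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;

import com.mysema.query.types.path.PathBuilder;

class SerializerEffectSketch {

	static class Person {
		@Id String id;
		@Field("lname") String lastname; // custom field name, invented for illustration
	}

	static Iterable<Person> byLastname(QueryDslPredicateExecutor<Person> executor) {
		PathBuilder<Person> person = new PathBuilder<Person>(Person.class, "person");

		// getKeyForPath(…) above resolves "lastname" to the mapped key "lname";
		// a predicate on "id" would additionally have its String value converted to an ObjectId
		return executor.findAll(person.getString("lastname").eq("Matthews"));
	}
}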

View File

@@ -1,3 +1,4 @@
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.1.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd=org/springframework/data/mongodb/config/spring-mongo-1.0.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.1.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.2.xsd=org/springframework/data/mongodb/config/spring-mongo-1.2.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.2.xsd

View File

@@ -14,7 +14,7 @@
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
@@ -44,8 +44,9 @@ The name of the mongo definition (by default "mongoDbFactory").]]></xsd:document
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>

View File

@@ -44,8 +44,9 @@ The name of the mongo definition (by default "mongoDbFactory").]]></xsd:document
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -103,15 +104,6 @@ The Mongo URI string.]]></xsd:documentation>
</xsd:complexType>
</xsd:element>
<xsd:complexType name="mongo-repository">
<xsd:complexContent>
<xsd:extension base="repository:repository">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
@@ -134,9 +126,6 @@ The Mongo URI string.]]></xsd:documentation>
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:sequence>
<xsd:element name="repository" minOccurs="0" maxOccurs="unbounded" type="mongo-repository"/>
</xsd:sequence>
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>

View File

@@ -0,0 +1,483 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/mongo"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.MongoFactoryBean"><![CDATA[
Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.Mongo"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="db-factory">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="create-query-indexes" type="xsd:boolean" default="false">
<xsd:annotation>
<xsd:documentation>
Enables creation of indexes for queries that get derived from the method name
and thus reference domain class properties. Defaults to false.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attributeGroup>
<xsd:element name="repositories">
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="custom-converters" minOccurs="0">
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject.]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="base-package" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The base package in which to scan for entities annotated with @Document
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mapping.model.MappingContext">
The reference to a MappingContext. Will default to 'mappingContext'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoTemplate">
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
Disables JSR-303 validation on MongoDB documents before they are saved. By default it is set to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the Mongo object that determines which server to monitor (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="auditing">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="org.springframework.data.mongodb.core.mapping.event.AuditingEventListener" />
<tool:exports type="org.springframework.data.auditing.IsNewAwareAuditingHandler" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:attributeGroup ref="repository:auditing-attributes" />
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" />
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mapping.model.MappingContext"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoTemplateRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoTemplate"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoFactoryBean"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
<xsd:restriction base="xsd:string">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to the MongoDB server. Defaults to 27017.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Defaults to localhost.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma-delimited list of host:port entries to use for a replica set or pair.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum number of connections allowed per host. Requests will block once the pool is exhausted. Defaults to 10; the system property MONGO.POOLSIZE can override this.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier of connectionsPerHost that determines how many threads are allowed to block
waiting for a connection. Defaults to 5. If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
up to 50 threads can block; any further thread will get an exception.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum time in milliseconds a thread blocks waiting for a connection. Defaults to 120000 ms (2 minutes).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connection timeout in milliseconds. Defaults to 0, which means an infinite timeout.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout in milliseconds. Defaults to 0, which means an infinite timeout.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep-alive flag that controls whether socket keep-alive is enabled. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver automatically retries on connect. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Defaults to 0, which means a default of 15 seconds is used if autoConnectRetry is enabled.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-number" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of servers to wait for on write operations, and the exception-raising behavior; corresponds to the 'w' option of the getlasterror command. Defaults to 0.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The timeout for write operations in milliseconds; corresponds to the 'wtimeout' option of the getlasterror command. Defaults to 0 (wait indefinitely); a value greater than zero is the number of milliseconds to wait.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-fsync" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether writes fsync to disk; corresponds to the 'fsync' option of the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver is allowed to read from secondaries (slaves). Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:group name="beanElementGroup">
<xsd:choice>
<xsd:element ref="beans:bean"/>
<xsd:element ref="beans:ref"/>
</xsd:choice>
</xsd:group>
<xsd:complexType name="customConverterType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Element defining a custom converter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:group ref="beanElementGroup" minOccurs="0" maxOccurs="1"/>
<xsd:attribute name="ref" type="xsd:string">
<xsd:annotation>
<xsd:documentation>
A reference to a custom converter.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref"/>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:schema>
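To illustrate how the elements declared in this schema are meant to be used, here is a minimal, hypothetical application context; the host, port, database name and base package are assumed values and are not part of this changeset.
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sketch of the spring-mongo namespace defined by the schema above.
     All concrete values (host, port, dbname, base-package) are assumptions. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:mongo="http://www.springframework.org/schema/data/mongo"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/data/mongo
           http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">

  <!-- com.mongodb.Mongo instance, configured via the mongoType/optionsType declared above -->
  <mongo:mongo id="mongo" host="localhost" port="27017">
    <mongo:options connections-per-host="10" max-wait-time="120000"/>
  </mongo:mongo>

  <!-- MongoDbFactory for a concrete database, with a default WriteConcern -->
  <mongo:db-factory id="mongoDbFactory" mongo-ref="mongo" dbname="db" write-concern="SAFE"/>

  <!-- MongoTemplate registered under the default name expected by mongo-template-ref -->
  <bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg ref="mongoDbFactory"/>
  </bean>

  <!-- Repository scanning using the mongo-repository-attributes group -->
  <mongo:repositories base-package="com.example.repositories" create-query-indexes="false"/>

</beans>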

View File

@@ -0,0 +1,97 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.mapping.Document;
import com.mongodb.Mongo;
/**
* Unit tests for {@link AbstractMongoConfiguration}.
*
* @author Oliver Gierke
*/
public class AbstractMongoConfigurationUnitTests {
/**
* @see DATAMONGO-496
*/
@Test
public void usesConfigClassPackageAsBaseMappingPackage() throws ClassNotFoundException {
AbstractMongoConfiguration configuration = new SampleMongoConfiguration();
assertThat(configuration.getMappingBasePackage(), is(SampleMongoConfiguration.class.getPackage().getName()));
assertThat(configuration.getInitialEntitySet(), hasSize(1));
assertThat(configuration.getInitialEntitySet(), hasItem(Entity.class));
}
/**
* @see DATAMONGO-496
*/
@Test
public void doesNotScanPackageIfMappingPackageIsNull() throws ClassNotFoundException {
assertScanningDisabled(null);
}
/**
* @see DATAMONGO-496
*/
@Test
public void doesNotScanPackageIfMappingPackageIsEmpty() throws ClassNotFoundException {
assertScanningDisabled("");
assertScanningDisabled(" ");
}
private static void assertScanningDisabled(final String value) throws ClassNotFoundException {
AbstractMongoConfiguration configuration = new SampleMongoConfiguration() {
@Override
protected String getMappingBasePackage() {
return value;
}
};
assertThat(configuration.getMappingBasePackage(), is(value));
assertThat(configuration.getInitialEntitySet(), hasSize(0));
}
static class SampleMongoConfiguration extends AbstractMongoConfiguration {
@Override
protected String getDatabaseName() {
return "database";
}
@Bean
@Override
public Mongo mongo() throws Exception {
return new Mongo();
}
}
@Document
static class Entity {
}
}

View File

@@ -0,0 +1,68 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.joda.time.DateTime;
import org.junit.Test;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
/**
* Integration test for the auditing support.
*
* @author Oliver Gierke
*/
public class AuditingIntegrationTests {
@Test
public void enablesAuditingAndSetsPropertiesAccordingly() {
ApplicationContext context = new ClassPathXmlApplicationContext("auditing.xml", getClass());
Entity entity = new Entity();
BeforeConvertEvent<Entity> event = new BeforeConvertEvent<Entity>(entity);
context.publishEvent(event);
assertThat(entity.created, is(notNullValue()));
assertThat(entity.modified, is(entity.created));
entity.id = 1L;
event = new BeforeConvertEvent<Entity>(entity);
context.publishEvent(event);
assertThat(entity.created, is(notNullValue()));
assertThat(entity.modified, is(not(entity.created)));
}
class Entity {
@CreatedDate
DateTime created;
@LastModifiedDate
DateTime modified;
@Id
Long id;
}
}
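The auditing.xml resource this test loads is not included in the diff; the following is a plausible sketch only, assuming a MongoMappingContext bean is exposed under the name the auditing element references.
<!-- Hypothetical auditing.xml; the actual test resource is not shown in this changeset. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:mongo="http://www.springframework.org/schema/data/mongo"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/data/mongo
           http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">

  <!-- MappingContext the IsNewAwareAuditingHandler uses to decide whether an entity is new -->
  <bean id="mappingContext" class="org.springframework.data.mongodb.core.mapping.MongoMappingContext"/>

  <!-- Registers the AuditingEventListener that populates @CreatedDate/@LastModifiedDate
       when the BeforeConvertEvent published by the test is handled -->
  <mongo:auditing mapping-context-ref="mappingContext"/>

</beans>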

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.CommandResult;
import com.mongodb.Mongo;
@@ -36,47 +38,45 @@ import com.mongodb.ServerAddress;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
public class MongoNamespaceReplicaSetTests {
@Autowired
private ApplicationContext ctx;
@Test
@SuppressWarnings("unchecked")
public void testParsingMongoWithReplicaSets() throws Exception {
assertTrue(ctx.containsBean("replicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&replicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("127.0.0.1", replicaSetSeeds.get(0).getHost());
assertEquals(10001, replicaSetSeeds.get(0).getPort());
assertEquals("localhost", replicaSetSeeds.get(1).getHost());
assertEquals(10002, replicaSetSeeds.get(1).getPort());
List<ServerAddress> replicaSetSeeds = (List<ServerAddress>) ReflectionTestUtils.getField(mfb, "replicaSetSeeds");
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasItems(new ServerAddress("127.0.0.1", 10001), new ServerAddress("localhost", 10002)));
}
@Test
@SuppressWarnings("unchecked")
public void testParsingWithPropertyPlaceHolder() throws Exception {
assertTrue(ctx.containsBean("manyReplicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&manyReplicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("192.168.174.130", replicaSetSeeds.get(0).getHost());
assertEquals(27017, replicaSetSeeds.get(0).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(1).getHost());
assertEquals(27018, replicaSetSeeds.get(1).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(2).getHost());
assertEquals(27019, replicaSetSeeds.get(2).getPort());
List<ServerAddress> replicaSetSeeds = (List<ServerAddress>) ReflectionTestUtils.getField(mfb, "replicaSetSeeds");
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasSize(3));
assertThat(
replicaSetSeeds,
hasItems(new ServerAddress("192.168.174.130", 27017), new ServerAddress("192.168.174.130", 27018),
new ServerAddress("192.168.174.130", 27019)));
}
@Test
@Ignore("CI infrastructure does not yet support replica sets")
public void testMongoWithReplicaSets() {
Mongo mongo = ctx.getBean(Mongo.class);
assertEquals(2, mongo.getAllAddress().size());
List<ServerAddress> servers = mongo.getAllAddress();
@@ -88,6 +88,5 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
MongoTemplate template = new MongoTemplate(mongo, "admin");
CommandResult result = template.executeCommand("{replSetGetStatus : 1}");
assertEquals("blort", result.getString("set"));
}
}
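The XML context backing these assertions is not part of the diff; a hypothetical excerpt, with the host:port values taken from the assertions above, could look like this.
<!-- Hypothetical excerpt of the test's context configuration (not shown in this changeset).
     The replica-set attribute carries the comma-delimited host:port list that the namespace
     parsing turns into the replicaSetSeeds of the MongoFactoryBean inspected above. -->
<mongo:mongo id="replicaSetMongo" replica-set="127.0.0.1:10001,localhost:10002"/>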

View File

@@ -1,42 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.reflect.Field;
public class NamespaceTestSupport {
@SuppressWarnings({ "unchecked" })
public static <T> T readField(String name, Object target) throws Exception {
Field field = null;
Class<?> clazz = target.getClass();
do {
try {
field = clazz.getDeclaredField(name);
} catch (Exception ex) {
}
clazz = clazz.getSuperclass();
} while (field == null && !clazz.equals(Object.class));
if (field == null)
throw new IllegalArgumentException("Cannot find field '" + name + "' in the class hierarchy of "
+ target.getClass());
field.setAccessible(true);
return (T) field.get(target);
}
}

View File

@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.ServerAddress;
/**
* Unit tests for {@link ServerAddressPropertyEditor}.
*
* @author Oliver Gierke
*/
public class ServerAddressPropertyEditorUnitTests {
ServerAddressPropertyEditor editor;
@Before
public void setUp() {
editor = new ServerAddressPropertyEditor();
}
/**
* @see DATAMONGO-454
*/
@Test(expected = IllegalArgumentException.class)
public void rejectsAddressConfigWithoutASingleParsableServerAddress() {
editor.setAsText("foo, bar");
}
/**
* @see DATAMONGO-454
*/
@Test
public void skipsUnparsableAddressIfAtLeastOneIsParsable() throws UnknownHostException {
editor.setAsText("foo, localhost");
assertSingleAddressOfLocalhost(editor.getValue());
}
/**
* @see DATAMONGO-454
*/
@Test
public void handlesEmptyAddressAsParseError() throws UnknownHostException {
editor.setAsText(", localhost");
assertSingleAddressOfLocalhost(editor.getValue());
}
private static void assertSingleAddressOfLocalhost(Object result) throws UnknownHostException {
assertThat(result, is(instanceOf(ServerAddress[].class)));
Collection<ServerAddress> addresses = Arrays.asList((ServerAddress[]) result);
assertThat(addresses, hasSize(1));
assertThat(addresses, hasItem(new ServerAddress("localhost")));
}
}

View File

@@ -0,0 +1,81 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Helper classes to ease assertions on {@link DBObject}s.
*
* @author Oliver Gierke
*/
public abstract class DBObjectUtils {
private DBObjectUtils() {
}
/**
* Expects the field with the given key to be not {@literal null} and a {@link DBObject} in turn and returns it.
*
* @param source the {@link DBObject} in which to look up the nested one
* @param key the key of the field holding the nested {@link DBObject}
* @return
*/
public static DBObject getAsDBObject(DBObject source, String key) {
return getTypedValue(source, key, DBObject.class);
}
/**
* Expects the field with the given key to be not {@literal null} and a {@link BasicDBList}.
*
* @param source the {@link DBObject} to look up the {@link BasicDBList} in
* @param key the key of the field to find the {@link BasicDBList} in
* @return
*/
public static BasicDBList getAsDBList(DBObject source, String key) {
return getTypedValue(source, key, BasicDBList.class);
}
/**
* Expects the list element with the given index to be a non-{@literal null} {@link DBObject} and returns it.
*
* @param source the {@link BasicDBList} to look up the {@link DBObject} element in
* @param index the index of the element expected to contain a {@link DBObject}
* @return
*/
public static DBObject getAsDBObject(BasicDBList source, int index) {
assertThat(source.size(), greaterThanOrEqualTo(index + 1));
Object value = source.get(index);
assertThat(value, is(instanceOf(DBObject.class)));
return (DBObject) value;
}
@SuppressWarnings("unchecked")
private static <T> T getTypedValue(DBObject source, String key, Class<T> type) {
Object value = source.get(key);
assertThat(value, is(notNullValue()));
assertThat(value, is(instanceOf(type)));
return (T) value;
}
}

View File

@@ -0,0 +1,124 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import org.springframework.dao.DataAccessException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.scheduling.concurrent.ThreadPoolExecutorFactoryBean;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
/**
* Integration tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
*/
public class MongoDbUtilsIntegrationTests {
static final String DATABASE_NAME = "dbAuthTests";
static final UserCredentials CREDENTIALS = new UserCredentials("admin", "admin");
static Mongo mongo;
static MongoTemplate template;
static ThreadPoolExecutorFactoryBean factory;
static ExecutorService service;
Exception exception;
@BeforeClass
public static void setUp() throws Exception {
mongo = new Mongo();
template = new MongoTemplate(mongo, DATABASE_NAME);
// Create sample user
template.execute(new DbCallback<Void>() {
public Void doInDB(DB db) throws MongoException, DataAccessException {
db.addUser("admin", "admin".toCharArray());
return null;
}
});
factory = new ThreadPoolExecutorFactoryBean();
factory.setCorePoolSize(2);
factory.setMaxPoolSize(10);
factory.setWaitForTasksToCompleteOnShutdown(true);
factory.afterPropertiesSet();
service = factory.getObject();
}
@AfterClass
public static void tearDown() {
factory.destroy();
// Remove test database
template.execute(new DbCallback<Void>() {
public Void doInDB(DB db) throws MongoException, DataAccessException {
db.dropDatabase();
return null;
}
});
}
/**
* @see DATAMONGO-585
*/
@Test
public void authenticatesCorrectlyInMultithreadedEnvironment() throws Exception {
Callable<Void> callable = new Callable<Void>() {
public Void call() throws Exception {
try {
DB db = MongoDbUtils.getDB(mongo, DATABASE_NAME, CREDENTIALS);
assertThat(db, is(notNullValue()));
} catch (Exception o_O) {
MongoDbUtilsIntegrationTests.this.exception = o_O;
}
return null;
}
};
List<Callable<Void>> callables = new ArrayList<Callable<Void>>();
for (int i = 0; i < 10; i++) {
callables.add(callable);
}
service.invokeAll(callables);
if (exception != null) {
fail("Exception occurred!" + exception);
}
}
}

View File

@@ -0,0 +1,84 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoDbUtilsUnitTests {
@Mock
Mongo mongo;
@Before
public void setUp() throws Exception {
when(mongo.getDB(anyString())).then(new Answer<DB>() {
public DB answer(InvocationOnMock invocation) throws Throwable {
return mock(DB.class);
}
});
TransactionSynchronizationManager.initSynchronization();
}
@After
public void tearDown() {
for (Object key : TransactionSynchronizationManager.getResourceMap().keySet()) {
TransactionSynchronizationManager.unbindResource(key);
}
TransactionSynchronizationManager.clearSynchronization();
}
@Test
public void returnsNewInstanceForDifferentDatabaseName() {
DB first = MongoDbUtils.getDB(mongo, "first");
DB second = MongoDbUtils.getDB(mongo, "second");
assertThat(second, is(not(first)));
}
@Test
public void returnsSameInstanceForSameDatabaseName() {
DB first = MongoDbUtils.getDB(mongo, "first");
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(sameInstance(first)));
}
}

View File

@@ -0,0 +1,159 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.io.IOException;
import java.net.UnknownHostException;
import org.junit.Before;
import org.junit.Test;
import org.springframework.core.NestedRuntimeException;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import com.mongodb.ServerAddress;
/**
* Unit tests for {@link MongoExceptionTranslator}.
*
* @author Michal Vich
* @author Oliver Gierke
*/
public class MongoExceptionTranslatorUnitTests {
MongoExceptionTranslator translator;
@Before
public void setUp() {
translator = new MongoExceptionTranslator();
}
@Test
public void translateDuplicateKey() {
DuplicateKey exception = new DuplicateKey(1, "Duplicated key");
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
expectExceptionWithCauseMessage(translatedException, DuplicateKeyException.class, "Duplicated key");
}
@Test
public void translateNetwork() {
Network exception = new Network("IOException", new IOException("IOException"));
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
expectExceptionWithCauseMessage(translatedException, DataAccessResourceFailureException.class, "IOException");
}
@Test
public void translateCursorNotFound() throws UnknownHostException {
MongoException.CursorNotFound exception = new MongoException.CursorNotFound(1, new ServerAddress());
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
expectExceptionWithCauseMessage(translatedException, DataAccessResourceFailureException.class);
}
@Test
public void translateToDuplicateKeyException() {
checkTranslatedMongoException(DuplicateKeyException.class, 11000);
checkTranslatedMongoException(DuplicateKeyException.class, 11001);
}
@Test
public void translateToDataAccessResourceFailureException() {
checkTranslatedMongoException(DataAccessResourceFailureException.class, 12000);
checkTranslatedMongoException(DataAccessResourceFailureException.class, 13440);
}
@Test
public void translateToInvalidDataAccessApiUsageException() {
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 10003);
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 12001);
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 12010);
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 12011);
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 12012);
}
@Test
public void translateToUncategorizedMongoDbException() {
MongoException exception = new MongoException(0, "");
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
expectExceptionWithCauseMessage(translatedException, UncategorizedMongoDbException.class);
}
@Test
public void translateMongoInternalException() {
MongoInternalException exception = new MongoInternalException("Internal exception");
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
expectExceptionWithCauseMessage(translatedException, InvalidDataAccessResourceUsageException.class);
}
@Test
public void translateUnsupportedException() {
RuntimeException exception = new RuntimeException();
assertThat(translator.translateExceptionIfPossible(exception), is(nullValue()));
}
private void checkTranslatedMongoException(Class<? extends Exception> clazz, int code) {
try {
translator.translateExceptionIfPossible(new MongoException(code, ""));
fail("Expected exception of type " + clazz.getName() + "!");
} catch (NestedRuntimeException e) {
Throwable cause = e.getRootCause();
assertThat(cause, is(instanceOf(MongoException.class)));
assertThat(((MongoException) cause).getCode(), is(code));
}
}
private static void expectExceptionWithCauseMessage(NestedRuntimeException e,
Class<? extends NestedRuntimeException> type) {
expectExceptionWithCauseMessage(e, type, null);
}
private static void expectExceptionWithCauseMessage(NestedRuntimeException e,
Class<? extends NestedRuntimeException> type, String message) {
assertThat(e, is(instanceOf(type)));
if (message != null) {
assertThat(e.getRootCause(), is(notNullValue()));
assertThat(e.getRootCause().getMessage(), containsString(message));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,7 @@ import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.util.TypeInformation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -78,7 +79,7 @@ public abstract class MongoOperationsUnitTests {
return null;
}
public Object convertToMongoType(Object obj) {
public Object convertToMongoType(Object obj, TypeInformation<?> typeInformation) {
return null;
}

Some files were not shown because too many files have changed in this diff.