Compare commits

..

92 Commits

Author SHA1 Message Date
Spring Buildmaster
b171f4192d DATAMONGO-690 - Release version 1.3.0.M1. 2013-06-04 11:37:13 -07:00
Oliver Gierke
21a1ce985c DATAMONGO-690 - Prepare 1.3 M1 release.
Upgraded to Spring Data Build 1.1.RELEASE, Spring Data Commons 1.6 M1. Switched to milestone repository. Updated changelog. Refer to latest docs from Spring Data Commons.
2013-06-04 20:07:43 +02:00
Oliver Gierke
97caba50bf DATAMONGO-680, DATAMONGO-681 - Expose MongoOperations.exists(…).
We're now exposing dedicated exists(…) methods on MongoOperations which simply look up a cursor and inspect it for the presence of at least one element. The SimpleMongoRepository implementation now also uses this optimized exists check.
2013-05-24 09:39:25 +02:00
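
A minimal usage sketch of the new method, assuming a hypothetical mapped Person type and an already configured MongoOperations instance:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

class ExistsExample {

    // Hypothetical domain type standing in for a mapped entity.
    static class Person {
        String lastname;
    }

    // Returns true as soon as the cursor contains at least one matching document,
    // without materializing any results.
    static boolean personExists(MongoOperations operations, String lastname) {
        return operations.exists(query(where("lastname").is(lastname)), Person.class);
    }
}
```
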
A. B. M. Kowser Patwary
818f739d5a DATAMONGO-676 - SimpleMongoRepository now consistently uses EntityInformation.
SimpleMongoRepository.findOne(…) & ….delete(…) now use entityInformation.getCollectionName() to resolve the collection name to interact with. This allows global customizations of the collection name on repositories.
2013-05-24 09:39:07 +02:00
Oliver Gierke
a44c1fdd2d DATAMONGO-683 - QueryMapper now handles default ID names if mapping metadata is missing.
Changed the implementation so that _id is considered an id field if no metadata is present. Heavily refactored QueryMapper internals so that the conversion code is more readable.
2013-05-24 09:16:57 +02:00
Oliver Gierke
6b35ca80d4 DATAMONGO-679 - Fixed saving plain JSON strings in MongoTemplate.
MongoTemplate.doSave(…) didn't handle plain JSON strings correctly. It now correctly passes the marshaled object to the underlying method.
2013-05-23 20:40:16 +02:00
Patryk Wąsik
23b276745c DATAMONGO-677 - QueryMapper now handles DBRefs in Maps correctly.
QueryMapper now handles Maps with DBRef values, which is needed for MongoTemplate.doUpdate(…) to save versioned documents correctly.
2013-05-23 20:10:18 +02:00
Oliver Gierke
be0092d3f5 DATAMONGO-682 - Performance improvement for mapping hotspots.
This commit includes the MongoDB specific parts for the mapping subsystem performance improvements. Reworked PerformanceTest to output more reasonable numbers.

Heavily inspired by Patryk Wasik's contribution at https://github.com/SpringSource/spring-data-mongodb/pull/37.

GitHub PR: #37
2013-05-23 19:45:09 +02:00
Oliver Gierke
f36792d419 DATAMONGO-569 - Improved AbstractMongoConfiguration.
AbstractMongoConfiguration doesn't expose a Mongo instance as a bean anymore unless you explicitly make it one by annotating the implementing method in the configuration subclass with @Bean.

Removed the custom call to MongoMappingContext.initialize() as Spring calls the lifecycle method for us anyway.
2013-05-17 19:50:35 +02:00
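
A sketch of what that looks like in a configuration subclass; the database name and the plain new Mongo() connection are illustrative:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;

@Configuration
class ApplicationConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "springdata";
    }

    // Only the explicit @Bean annotation turns the Mongo instance into a bean
    // of its own; without it the instance stays internal to the configuration.
    @Override
    @Bean
    public Mongo mongo() throws Exception {
        return new Mongo();
    }
}
```
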
Oliver Gierke
31393db6ff DATACMNS-379 - Adapt to changes in Spring Data Commons. 2013-05-13 09:29:04 +02:00
Martin Baumgartner
8b50af07ce DATAMONGO-545 - Add BeforeDeleteEvent and AfterDeleteEvent.
Added events for before and after deletion of documents. Polished logging code in AbstractMongoEventListener.
2013-04-29 15:45:14 +02:00
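
A sketch of a listener reacting to the new events; the exact callback signatures (receiving the raw DBObject) are assumed from this commit's description:

```java
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;

import com.mongodb.DBObject;

// Register this as a Spring bean to get notified around document deletion.
class DeleteLoggingListener extends AbstractMongoEventListener<Object> {

    private static final Log LOG = LogFactory.getLog(DeleteLoggingListener.class);

    @Override
    public void onBeforeDelete(DBObject dbo) {
        LOG.info("About to delete " + dbo);
    }

    @Override
    public void onAfterDelete(DBObject dbo) {
        LOG.info("Deleted " + dbo);
    }
}
```
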
Oliver Gierke
0eb315a758 DATAMONGO-667 - Removed deprecated types and added further deprecations.
Removed the MongoDB-specific Sort in favor of Sort in Spring Data Commons. Deprecated Order in favor of Direction in Spring Data Commons. Changed the implementation to use the types from Spring Data Commons.
2013-04-29 14:02:10 +02:00
Oliver Gierke
976f5dd0e3 DATAMONGO-666 - Fixed architecture violation caused by exception class.
Moved MongoDataIntegrityViolationException into the core package to break up a package cycle. Updated the Sonargraph architecture description to capture issues more closely.
2013-04-29 13:04:24 +02:00
Philipp Schneider
f614364918 DATAMONGO-554 - Add background attribute to @Indexed and @CompoundIndex.
@Indexed and @CompoundIndex now carry a background flag that can be used to enable background indexing. Updated MongoPersistentEntityIndexCreator to consider the flag and hand it to the ensureIndex(…) call.
2013-04-29 11:18:44 +02:00
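
For illustration, a hypothetical entity enabling background index creation via the new attribute:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

// MongoPersistentEntityIndexCreator hands the flag to ensureIndex(…), so this
// index gets built in the background on the server.
@Document
class LogEntry {

    @Id String id;

    @Indexed(background = true)
    String createdBy;
}
```
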
Philipp Schneider
38a86033be DATAMONGO-663 - Added equals(…) and hashCode() to Field. 2013-04-26 15:37:51 +02:00
Patryk Wąsik
d11c20d548 DATAMONGO-657 - Support DBRef for maps
Added capability to write DBRef instances for map values. Reading had been supported before but writing wasn't.
2013-04-26 14:44:31 +02:00
Oliver Gierke
3e2387ae6b Fix path to repo to make sure snapshot build sees all repositories. 2013-04-26 14:44:19 +02:00
Oliver Gierke
e8a1caec53 Fixed pom.xml to refer to correct snapshot repo. 2013-04-18 13:33:28 +02:00
Oliver Gierke
44c0b14018 DATAMONGO-658 - Major overhaul of README.md.
Simplified repository usage. Added JavaConfig sample. Use Markdown syntax consistently.
2013-04-18 10:38:17 +02:00
Arthur Loder
3daf3fc95b DATAMONGO-658 - Fixes in README.md. 2013-04-18 10:20:05 +02:00
Oliver Gierke
bd3aac8342 DATAMONGO-658 - Pull forward changes made to README in 1.2.x branch. 2013-04-18 10:11:06 +02:00
Oliver Gierke
94f697da10 DATAMONGO-656 - Fixed potential NPE in MongoTemplate. 2013-04-17 10:25:01 +02:00
Oliver Gierke
0cdec56a27 DATAMONGO-651 - MongoTemplate now throws Mongo-specific exception with WriteResult.
If the WriteResultChecking is set to EXCEPTION on a MongoTemplate, we now throw a Mongo-specific exception that captures both the WriteResult and MongoActionOperation for further evaluation.
2013-04-15 17:03:35 +02:00
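
A minimal sketch of opting into that behavior on a template instance:

```java
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;

class WriteCheckingSetup {

    // With EXCEPTION checking enabled, failing writes surface as the
    // Mongo-specific exception carrying WriteResult and MongoActionOperation.
    static void enableStrictChecking(MongoTemplate template) {
        template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
    }
}
```
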
Patryk Wąsik
9d83331f9f DATAMONGO-652 - Added support for $elemMatch and $ positional operator. 2013-04-12 16:13:58 +02:00
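
A sketch of the two operators as used through the query and update API; the field and value names are purely illustrative:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

class PositionalOperatorExample {

    // Matches documents whose "items" array contains an element named "widget" ($elemMatch).
    static Query matchingItem() {
        return query(where("items").elemMatch(where("name").is("widget")));
    }

    // Renames exactly the matched array element via the $ positional operator.
    static Update renameMatchedItem() {
        return new Update().set("items.$.name", "gadget");
    }
}
```
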
Oliver Gierke
071cd1647f DATAMONGO-571 - Fixed setting null values during update of versioned entities.
When updating a versioned object, the Update object is now constructed from plain key-value pairs, not using $set anymore. This correctly sets the null values in the updated document.
2013-04-11 12:01:38 +02:00
Oliver Gierke
92af5aa345 DATAMONGO-650 - Added snapshot repository to resolve Spring Data Commons. 2013-04-11 10:57:04 +02:00
Oliver Gierke
c07ad0fdf6 DATAMONGO-648 - Replaced XSD ids with XSD strings.
This change allows Spring Data MongoDB XML namespace elements to be used together with <bean /> elements inside a profile, which can lead to e.g. two <mongo:db-factory /> declarations in the same XML file.
2013-04-10 20:33:31 +02:00
Oliver Gierke
04e0f5c4a7 DATAMONGO-642 - MongoChangeSetPersister now considers mapped collection.
So far the change set persister has used the plain domain type name to persist data. We now consider the collection name defined by the object mapping (through @Document(collection = "…")).
2013-04-02 11:42:36 +02:00
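
For example, a mapping like the following (collection name illustrative) is now honoured by the change set persister instead of the fully qualified class name:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Change sets for this type now end up in the "people" collection.
@Document(collection = "people")
class Person {

    @Id Long id;
    String firstname, lastname;
}
```
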
Oliver Gierke
133975fb44 DATAMONGO-641 - Fixed potential NullPointerException in MongoLog4jAppender.
Log4j appender now only closes Mongo instance if available.
2013-04-02 10:56:34 +02:00
Oliver Gierke
9a372a57e0 DATAMONGO-641 - Reformatting to prepare fix. 2013-04-02 10:53:25 +02:00
Oliver Gierke
e67644094a DATAMONGO-628 - Allow dependency of config layer to GridFS.
Updated the Sonargraph architecture description to allow a dependency from the configuration layer to the GridFS layer. The dependency was introduced by c5a99b5b5e.
2013-04-02 10:31:08 +02:00
Martin Baumgartner
c5a99b5b5e DATAMONGO-140, DATAMONGO-628 - Namespace elements for MongoTemplate and GridFsTemplate. 2013-04-02 09:50:45 +02:00
Oliver Gierke
9564bcb280 DATAMONGO-603 - Expose abbreviate field names flag in JavaConfig.
We're now exposing an abbreviateFieldNames() flag in AbstractMongoConfiguration to allow easy enabling of field name abbreviation.
2013-03-28 17:16:50 +01:00
Oliver Gierke
f1e961a1ee DATAMONGO-607 - Introduced FieldNamingStrategy SPI interface.
MongoMappingContext can now get a FieldNamingStrategy configured to allow the customization of the field names used to persist property values in case *no manual mapping is defined* (e.g. through @Field). The default strategy will simply use the property name as it did before.

We now also expose an abbreviate-field-names attribute on the <mongo:mapping-context /> XML namespace element to transparently register a CamelCaseAbbreviatingFieldNamingStrategy which abbreviates the property's name to the first letters of its camel-case parts. A property fooBar would then be persisted to a field named fb. If you're not using the XML namespace, simply configure the strategy on your MongoMappingContext instance.

To avoid field name mapping ambiguities being introduced through a custom FieldNamingStrategy (imagine the camel-case strategy just mentioned and two properties lastname and level which would both map to l) the PersistentEntity implementation verifies the mapping metadata and throws an exception in case an ambiguity is found.
2013-03-28 16:41:42 +01:00
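
A Java-based sketch of plugging in the abbreviating strategy; the package location of the strategy type and the setter name on MongoMappingContext are assumptions for this milestone:

```java
// Package locations assumed for the 1.3 M1 timeframe.
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class FieldNamingSetup {

    // A property "fooBar" on an entity managed by this context would be
    // persisted to a field named "fb".
    static MongoMappingContext abbreviatingMappingContext() {
        MongoMappingContext context = new MongoMappingContext();
        context.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
        return context;
    }
}
```
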
Oliver Gierke
0c69c87787 DATAMONGO-638 - MappingContext does not create PersistentEntities for AbstractMaps. 2013-03-28 16:28:02 +01:00
Oliver Gierke
48b4a88a6a DATAMONGO-629 - Fixed QueryMapper not to massage queries without type information.
So far the QueryMapper applied the id massaging (especially interpreting the default id keys) even if there was no persistence metadata available to do so. This caused e.g. queries handed into MongoTemplate.count(Query, String) to get "id" keys massaged into "_id", which shouldn't be the case as we cannot assume anything about the documents and the keys contained in them.

So we now only apply the defaults if persistence metadata is present. This means that for methods on MongoOperations that don't take type information of any kind, queries have to be defined in terms of the document, not the object model, as we cannot refer to it.
2013-03-27 12:01:53 +01:00
Andrey Bloschetsov
d0c0866f88 DATAMONGO-637 - Fixed typo in Query.query(…).
Corrected the misspelled "critera". Polished the JavaDoc a little.
2013-03-27 11:23:35 +01:00
Oliver Gierke
ab18bb5b96 DATAMONGO-636 - Added support for countBy in derived queries.
We now change the query execution to a count execution in case a derived query has PartTree.isCountProjection() set to true or a query defined via @Query has the newly introduced count() attribute set to true.

As the native Mongo count() returns a long, we use a default ConversionService to potentially massage the query result into other numerical types.
2013-03-26 17:51:46 +01:00
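
A repository sketch combining both variants; the Person domain type and the method names are assumptions for illustration:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

// Hypothetical domain type used by the repository below.
class Person {
    @Id String id;
    String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

    // Derived query: the countBy prefix switches the execution to a count.
    long countByLastname(String lastname);

    // Manually defined query executed as a count via the new count attribute.
    @Query(value = "{ 'lastname' : ?0 }", count = true)
    long countPeopleWithLastname(String lastname);
}
```
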
Oliver Gierke
509be3d681 DATAMONGO-633 - Upgraded to Querydsl APT plugin 1.0.8. 2013-03-26 17:26:33 +01:00
Oliver Gierke
8e01f95b29 DATAMONGO-635 - Fixed some Sonar warnings. 2013-03-25 18:13:08 +01:00
Oliver Gierke
4dcec1f6e2 DATAMONGO-634 - Expand scope of CDI bean to application scope.
We simply inherit the application scope from the parent implementation.
2013-03-25 17:16:32 +01:00
Oliver Gierke
d3bf6c0a19 DATAMONGO-632 - Removed schemaLocation attribute from namespace references. 2013-03-25 17:16:26 +01:00
Oliver Gierke
3410a0589c DATAMONGO-631 - Explicitly reject Order instances with ignore-case flag. 2013-03-25 17:13:02 +01:00
Oliver Gierke
7a64766496 DATAMONGO-633 - Upgraded to Querydsl 3.0.0.
Adapted custom APT serializer implementation to the new API.
2013-03-25 17:12:55 +01:00
Oliver Gierke
e56a8597b8 DATAMONGO-622 - Saving unversioned object now uses doInsert(…).
We now use doInsert(…) when a versioned object is saved for the first time to prevent accidental updates that don't bump the version number.
2013-02-26 11:31:48 +01:00
Oliver Gierke
e13208b4b3 DATAMONGO-621 - Initializing version property now uses ConversionService.
The BeanWrapper used in MongoTemplate.initializeVersionProperty(…) now uses the ConversionService held in the MongoConverter.
2013-02-26 11:29:55 +01:00
Oliver Gierke
6139e83d8d DATAMONGO-620 - Updating versioned object uses explicit collection name.
We're now handing the collection name given to doSaveVersioned(…) to the update method.
2013-02-26 11:29:40 +01:00
Oliver Gierke
f33790013f DATAMONGO-617 - Fixed potential NullPointerException in MongoTemplate.insert(…).
If MongoTemplate.insert(…) was called with a Mongo-simple type (such as a raw DBObject) it caused a NullPointerException during the lookup of a version property. This is now fixed by correcting the guard.
2013-02-20 12:04:24 +01:00
Oliver Gierke
bf81d95d21 DATAMONGO-613 - Readded JConsole image and upgrade to Spring Data Build 1.1. 2013-02-11 12:40:41 +01:00
Oliver Gierke
158e4f033c DATAMONGO-612 - Fixed reconfiguration of dist.id. 2013-02-11 12:38:40 +01:00
Spring Buildmaster
81097061ad DATAMONGO-609 - Prepare next development iteration. 2013-02-08 04:15:50 -08:00
Spring Buildmaster
2bc6ebc250 DATAMONGO-609 - Release version 1.2.0.RELEASE 2013-02-08 04:15:46 -08:00
Oliver Gierke
909110cf4e DATAMONGO-609 - Upgraded to parent project and Spring Data Commons in release versions. 2013-02-08 13:05:49 +01:00
Oliver Gierke
ba094da5a7 DATAMONGO-611 - Upgraded to MongoDB Java driver 2.10.1. 2013-02-08 13:04:33 +01:00
Oliver Gierke
876b31bc52 DATAMONGO-609 - Updated changelog. 2013-02-08 13:00:20 +01:00
Oliver Gierke
dc8e8281eb DATAMONGO-606 - Add JodaTime specific Converter implementations if JodaTime is present on classpath. 2013-02-06 17:53:41 +01:00
Oliver Gierke
48deb1a150 DATAMONGO-603 - Polishing unit tests. 2013-02-06 17:39:33 +01:00
Michal Vich
48625956b7 DATAMONGO-568 - MongoTemplate.find(…) does not throw NullPointerException anymore.
Trigger findAll(…) if the Query object given to a find(…) method is null.
2013-02-06 15:04:39 +01:00
Michal Vich
782cf6e10d DATAMONGO-81 - Added more unit tests for MongoExceptionTranslator. 2013-02-06 14:41:56 +01:00
Oliver Gierke
c28e51cf86 DATAMONGO-603 - Make sure geo queries and repositories get correct Metric applied.
Heavily refactored NearQuery to simplify implementation and testability. Made sure the metric of the maximum distance is applied if no metric had been defined before. Adapted the query preparation code to re-enforce the correct metric being set, so that it doesn't rely on NearQuery's auto-application behavior and stays safe against potential further refactorings.
2013-02-01 15:38:29 +01:00
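
A sketch of the behavior this guards, assuming the geo types still live in org.springframework.data.mongodb.core.geo at this point:

```java
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

class NearQueryExample {

    // No metric is set explicitly on the query, so the metric of the maximum
    // distance (kilometers here) is applied to the whole near query.
    static NearQuery withinTenKilometers(double x, double y) {
        return NearQuery.near(new Point(x, y)).maxDistance(new Distance(10, Metrics.KILOMETERS));
    }
}
```
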
Oliver Gierke
4b1065cac5 DATAMONGO-598 - Set needed properties and activate distribution artifact upload. 2013-01-29 20:36:41 +01:00
Oliver Gierke
c807b2abcf DATAMONGO-601 - Fixed exposing password by MongoDbUtils.
MongoDbUtils no longer exposes the plain password in the exception message piped into CannotGetMongoDbConnectionException but uses the newly introduced toString() method of UserCredentials (see DATACMNS-275).
2013-01-29 13:16:06 +01:00
Oliver Gierke
19ad2d3aac DATACMNS-274 - Adapt to moved type MappingContextIsNewStrategyFactory. 2013-01-29 10:54:36 +01:00
Oliver Gierke
bd7fe5bfd3 DATAMONGO-600 - Fixed parameter binding for derived queries on properties using polymorphism.
We need to retain the type in the serialized DBObject in case a derived query binds parameters for properties that use polymorphism, as we serialize the object as a nested document and this only matches in an all-or-nothing way. So if the type information is missing, we won't see any results.

So we now hand the TypeInformation of the property into the conversion logic and transparently skip the type information removal in case the types differ.
2013-01-25 11:29:39 +01:00
Oliver Gierke
a237999037 DATAMONGO-598 - Upgraded to new build infrastructure.
Fixed test ordering issue in Cross-Store module.
2013-01-18 14:45:48 +01:00
Oliver Gierke
d8a9752724 DATAMONGO-592 - Fix handling of SpEL evaluation values.
During constructor parameter value resolution the value to be used can result from evaluating a SpEL expression on the root document. This value has to be converted recursively if it is a nested document. Adapted to the altered API in SD Commons and overrode potentiallyConvertSpelValue(…) to re-invoke the converter.
2012-12-16 15:26:25 +01:00
Philipp Schneider
4c8bf0dec2 DATAMONGO-583 - Using while (…) instead of for (…) for DBCursors to avoid memory leak.
Instead of iterating over the DBCursor using a for-loop we now use a while-loop to avoid the potential memory leak outlined in [0].

[0] https://jira.mongodb.org/browse/JAVA-664
2012-12-13 12:07:11 +05:30
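
In driver terms the pattern looks roughly like this (a sketch, not the template's actual code):

```java
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;

class CursorIteration {

    // Iterate via while(hasNext()) on the cursor itself and close it afterwards,
    // instead of a for-loop that may leak an additional cursor copy.
    static int countDocuments(DBCollection collection) {
        int seen = 0;
        DBCursor cursor = collection.find();
        try {
            while (cursor.hasNext()) {
                DBObject document = cursor.next();
                seen++; // process 'document' here
            }
        } finally {
            cursor.close();
        }
        return seen;
    }
}
```
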
Oliver Gierke
cffc27d83a DATAMONGO-590 - Code cleanups in MongoTemplate and error handling APIs. 2012-12-13 12:06:55 +05:30
Oliver Gierke
42ab4cb7b4 DATAMONGO-588 - Version attribute now gets initialized on insert.
The version attribute of an entity was not correctly initialized to 0 when inserting objects using MongoTemplate.insert(…). We now set it to 0 correctly.
2012-12-10 17:38:16 +09:00
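
For illustration, a hypothetical versioned entity; after MongoTemplate.insert(…) its version field now holds 0 instead of null:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Account {

    @Id String id;

    // Initialized to 0 on insert, then bumped on subsequent saves.
    @Version Long version;
}
```
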
Oliver Gierke
42df25434f DATAMONGO-585 - Fix concurrency issue in database authentication.
So far the authentication of the database was synchronized *after* the check whether the database under consideration had already been authenticated. This could cause threads to concurrently try to authenticate against the database, which is rejected by the driver. Improved the synchronization block to include both the ….isAuthenticated() check and the authentication itself.
2012-12-03 11:37:06 +01:00
Oliver Gierke
4e16f7ebe2 DATACMNS-378 - Fixed ClassCastException in MapReduceResults.
The parseCount(…) method now safely converts both integer and long values from the source DBObject.
2012-11-29 13:22:38 +01:00
Oliver Gierke
04a87b373d DATAMONGO-577 - Cleaned up obsolete property.
The version property in BasicMongoPersistentEntity became obsolete after the upgrade in Spring Data Commons.
2012-11-28 17:39:20 +01:00
Oliver Gierke
c4eca7eadd DATAMONGO-581 - Expose the PersistentEntity managed in the repository factory bean.
MongoRepositoryFactoryBean now populates the MappingContext of the super class when the MongoOperations instance is set.
2012-11-28 17:35:47 +01:00
Philipp Schneider
3a9dc2b98c DATAMONGO-503 - Added support for content type on GridFsOperations.
Added methods to take the content type of the file to be created. Clarified nullability of arguments in JavaDoc.

Pull request: #19.
2012-11-28 14:30:05 +01:00
Oliver Gierke
c568b7cbc2 DATAMONGO-580 - Improved BeanDefinitionParser for MappingMongoConverter. 2012-11-27 14:44:00 +01:00
Oliver Gierke
9482350062 DATAMONGO-577 - Added support for auditing.
Introduced AuditingEventListener to invoke auditing subsystem available through Spring Data Commons. Added <mongo:auditing /> namespace element to transparently activate it.
2012-11-27 14:00:52 +01:00
Oliver Gierke
772a140def DATAMONGO-576 - Configure java.util.logging to prevent verbose test logging.
Added Slf4j bridge and configured surefire to configure JUL accordingly.
2012-11-23 10:50:29 +01:00
Oliver Gierke
936259a766 DATAMONGO-575 - Improved implementation of entity metadata lookup in MongoQueryMethod.
Got rid of obsolete EntityInformationCreator API and moved to a custom EntityMetadata extension instead.
2012-11-23 10:00:49 +01:00
Oliver Gierke
533d21281e DATAMONGO-570 - Guard against null values in references.
QueryMapper does not try to convert null values into DBRef objects anymore but returns the plain null value as is.
2012-11-13 18:23:02 +01:00
Oliver Gierke
6977fa87e6 DATAMONGO-573 - Moved to Logback for test logging. 2012-11-13 17:54:19 +01:00
Oliver Gierke
41dcebb010 DATAMONGO-559 - Upgraded to Spring Data Commons 1.5.0.BUILD-SNAPSHOT. 2012-11-13 17:54:19 +01:00
Oliver Gierke
ef077182f6 DATAMONGO-563 - Upgraded MongoDB Java driver to 2.9.2. 2012-11-13 17:54:19 +01:00
Dave Syer
21e7c63766 Update README.md 2012-10-26 18:02:05 +02:00
Jan Kronquist
4b018a9d7d DATAMONGO-562 - Saving versioned entity with already set id now works.
Instead of inspecting the id property, we now rely on the version property being unset to decide about initialization and work from there.
2012-10-24 16:07:26 +02:00
Spring Buildmaster
ecf15b93e0 DATAMONGO-559 - Prepare next development iteration. 2012-10-17 15:00:21 -07:00
Spring Buildmaster
6323a86560 DATAMONGO-559 - Release version 1.1.1.RELEASE. 2012-10-17 14:59:59 -07:00
Oliver Gierke
3b0b7315e1 DATAMONGO-559 - Prepare 1.1.1.RELEASE release. 2012-10-17 17:53:39 -04:00
Oliver Gierke
4aeba6f92d DATAMONGO-553 - MappingMongoConverter now uses MongoPersistentProperty.usePropertyAccess() to decide whether to use property access.
Introduced a usePropertyAccess() method on MongoPersistentProperty and added an implementation that forces it to true for Throwable.cause, as using the field would cause a cycle (it points to this in the first place) whereas the getter massages the value.
2012-10-17 17:27:15 -04:00
Oliver Gierke
342f9ae837 DATAMONGO-551 - MongoTemplate can now persist plain Strings if they are valid JSON.
Added some code to MongoTemplate that inspects whether the object to be saved is a String. If it is, we try to parse the given String into a JSON document and continue as if we had been given a DBObject initially. Non-parseable Strings are rejected with a MappingException.
2012-10-15 11:49:29 -04:00
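
A minimal sketch of the feature; the collection name "person" is illustrative:

```java
import org.springframework.data.mongodb.core.MongoOperations;

class JsonStringSave {

    // The string is parsed into a DBObject and then saved like any other document.
    static void saveRawJson(MongoOperations operations) {
        operations.save("{ 'firstname' : 'Oliver', 'lastname' : 'Gierke' }", "person");
    }
}
```
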
Oliver Gierke
66d98b355e DATAMONGO-550 - Fixed potential NullPointerExceptions in MongoTemplate.
Guarded access to results of mappingContext.getPersistentEntity(…) to prevent NullPointerExceptions in case the template is used with non-entity types like a plain DBObject. Added some custom logic for plain BasicDBObjects to get the _id field populated appropriately.
2012-10-15 11:38:00 -04:00
Oliver Gierke
cabbe747f8 DATAMONGO-549 - Fixed potential NullPointerException in MongoTemplate.
Added assertions and guards against MongoPersistentEntity lookups resulting in null values.
2012-10-11 08:49:49 +02:00
Spring Buildmaster
5ff3064acd DATAMONGO-541 - Prepare next development iteration. 2012-10-10 05:30:21 -07:00
219 changed files with 7100 additions and 4267 deletions

README.md (197 changed lines)

@@ -1,150 +1,141 @@
Spring Data MongoDB
======================
# Spring Data MongoDB
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB aims to provide a familiar and consistent Spring-based programming model for for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a Repository style data access layer
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
Getting Help
------------
## Getting Help
For a comprehensive treatmet of all the Spring Data MongoDB features, please refer to the The [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
For a comprehensive treatmet of all the Spring Data MongoDB features, please refer to:
The [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
The home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
For more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
* the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
* the [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
Quick Start
-----------
## Quick Start
## MongoDB
### Maven configuration
For those in a hurry:
Add the Maven dependency:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.1.RELEASE</version>
</dependency>
```
* Download the jar through Maven:
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.0.0.BUILD-SNAPSHOT</version>
</dependency>
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.1.RELEASE</version>
</dependency>
<repository>
<id>spring-maven-snapshot</id>
<snapshots><enabled>true</enabled></snapshots>
<name>Springframework Maven SNAPSHOT Repository</name>
<url>http://maven.springframework.org/snapshot</url>
</repository>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.springsource.org/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides
MongoTemplate is the central support class for Mongo database operations. It provides:
* Basic POJO mapping support to and from BSON
* Connection Affinity callback
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/dao.html#dao-exceptions).
Future plans are to support optional logging and/or exception throwing based on WriteResult return value, common map-reduce operations, GridFS operations. A simple API for partial document updates is also planned.
### Spring Data repositories
### Easy Data Repository generation
To simplify the creation of data repositories Sprin Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
To simplify the creation of Data Repositories a generic Repository interface and default implementation is provided. Furthermore, Spring will automatically create a Repository implementation for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
The Repository interface is
```java
public interface PersonRepository extends CrudRepository<Person, Long> {
public interface Repository<T, ID extends Serializable> {
List<Person> findByLastname(String lastname);
T save(T entity);
List<Person> findByFirstnameLike(String firstname);
}
```
List<T> save(Iterable<? extends T> entities);
The queries issued on execution will be derived from the method name. Exending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
T findById(ID id);
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
boolean exists(ID id);
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
List<T> findAll();
@Override
public Mongo mongo() throws Exception {
return new Mongo();
}
Long count();
@Override
protected String getDatabaseName() {
return "springdata";
}
}
```
void delete(T entity);
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
void delete(Iterable<? extends T> entities);
```xml
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
</bean>
void deleteAll();
}
<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
The MongoRepository extends Repository and will in future add more Mongo specific methods.
```java
@Service
public class MyService {
public interface MongoRepository<T, ID extends Serializable> extends
Repository<T, ID> {
}
private final PersonRepository repository;
SimpleMongoRepository is the out of the box implementation of the MongoRepository you can use for basid CRUD operations.
@Autowired
public MyService(PersonRepository repository) {
this.repository = repository;
}
To go beyond basic CRUD, extend the MongoRepository interface and supply your own finder methods that follow simple naming conventions such that they can be easily converted into queries.
public void doWork() {
For example, given a Person class with first and last name properties, a PersonRepository interface that can query for Person by last name and when the first name matches a regular expression is shown below
repository.deleteAll();
public interface PersonRepository extends MongoRepository<Person, Long> {
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> findByLastname(String lastname);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
```
List<Person> findByFirstnameLike(String firstname);
}
You can have Spring automatically generate the implemention as shown below
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
<property name="defaultCollectionName" value="springdata" />
</bean>
<bean class="org.springframework.data.document.mongodb.repository.MongoRepositoryFactoryBean">
<property name="template" ref="template" />
<property name="repositoryInterface" value="org.springframework.data.document.mongodb.repository.PersonRepository" />
</bean>
This will register an object in the container named PersonRepository. You can use it as shown below
@Service
public class MyService {
@Autowired
PersonRepository repository;
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
Contributing to Spring Data
---------------------------
## Contributing to Spring Data
Here are some ways for you to get involved in the community:

pom.xml (380 changed lines)

@@ -1,297 +1,99 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<description>Spring Data project for MongoDB</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.1.0.RELEASE</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-parent</module>
</modules>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<developers>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<packaging>pom</packaging>
<licenses>
<license>
<name>Apache License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0</url>
<comments>
Copyright 2010 the original author or authors.
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.1.0.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-distribution</module>
</modules>
http://www.apache.org/licenses/LICENSE-2.0
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.6.0.M1</springdata.commons>
<mongo>2.10.1</mongo>
</properties>
<developers>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Project Lean</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied.
See the License for the specific language governing permissions and
limitations under the License.
</comments>
</license>
</licenses>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- dist.* properties are used by the antrun tasks below -->
<dist.id>spring-data-mongo</dist.id>
<dist.name>Spring Data Mongo</dist.name>
<dist.key>SDMONGO</dist.key>
<dist.version>${project.version}</dist.version>
<dist.releaseType>snapshot</dist.releaseType>
<dist.finalName>${dist.id}-${dist.version}</dist.finalName>
<dist.fileName>${dist.finalName}.zip</dist.fileName>
<dist.filePath>target/${dist.fileName}</dist.filePath>
<dist.bucketName>dist.springframework.org</dist.bucketName>
<!-- these properties should be in ~/.m2/settings.xml
<dist.accessKey>s3 access key</dist.accessKey>
<dist.secretKey>s3 secret key</dist.secretKey>
-->
</properties>
<build>
<extensions>
<extension>
<groupId>org.springframework.build.aws</groupId>
<artifactId>org.springframework.build.aws.maven</artifactId>
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<plugins>
<plugin>
<groupId>com.agilejava.docbkx</groupId>
<artifactId>docbkx-maven-plugin</artifactId>
<!-- yes it really needs to be this (2.0.7) otherwise pdf generation from a clean build doesn't work -->
<version>2.0.7</version>
<executions>
<execution>
<goals>
<goal>generate-html</goal>
<goal>generate-pdf</goal>
</goals>
<phase>pre-site</phase>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.docbook</groupId>
<artifactId>docbook-xml</artifactId>
<version>4.4</version>
<scope>runtime</scope>
</dependency>
</dependencies>
<configuration>
<includes>index.xml</includes>
<xincludeSupported>true</xincludeSupported>
<foCustomization>${project.basedir}/src/docbkx/resources/xsl/fopdf.xsl</foCustomization>
<htmlStylesheet>css/html.css</htmlStylesheet>
<chunkedOutput>false</chunkedOutput>
<htmlCustomization>${project.basedir}/src/docbkx/resources/xsl/html.xsl</htmlCustomization>
<useExtensions>1</useExtensions>
<highlightSource>1</highlightSource>
<highlightDefaultLanguage />
<!-- callouts -->
<entities>
<entity>
<name>version</name>
<value>${project.version}</value>
</entity>
</entities>
<postProcess>
<copy todir="${project.basedir}/target/site/reference">
<fileset dir="${project.basedir}/target/docbkx">
<include name="**/*.html" />
<include name="**/*.pdf" />
</fileset>
</copy>
<copy todir="${project.basedir}/target/site/reference/html">
<fileset dir="${project.basedir}/src/docbkx/resources">
<include name="**/*.css" />
<include name="**/*.png" />
<include name="**/*.gif" />
<include name="**/*.jpg" />
</fileset>
</copy>
<copy todir="${project.basedir}/target/site/reference/html">
<fileset dir="${project.basedir}/src/docbkx/resources/images">
<include name="*.png" />
</fileset>
</copy>
<move file="${project.basedir}/target/site/reference/pdf/index.pdf" tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf" failonerror="false" />
</postProcess>
</configuration>
</plugin>
<plugin>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.5</version>
<configuration>
<aggregate>true</aggregate>
<breakiterator>true</breakiterator>
<header>Spring Data Document</header>
<source>1.5</source>
<quiet>true</quiet>
<javadocDirectory>${project.basedir}/src/main/javadoc</javadocDirectory>
<overview>${project.basedir}/src/main/javadoc/overview.html</overview>
<stylesheetfile>${project.basedir}/src/main/javadoc/spring-javadoc.css</stylesheetfile>
<!-- copies doc-files subdirectory which contains image resources -->
<docfilessubdirs>true</docfilessubdirs>
<links>
<link>http://static.springframework.org/spring/docs/3.0.x/javadoc-api</link>
<link>http://download.oracle.com/javase/1.5.0/docs/api</link>
<link>http://api.mongodb.org/java/2.3</link>
</links>
</configuration>
</plugin>
<plugin><!--
run `mvn package assembly:assembly` to trigger assembly creation.
see http://www.sonatype.com/books/mvnref-book/reference/assemblies-set-dist-assemblies.html -->
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2-beta-5</version>
<inherited>false</inherited>
<executions>
<execution>
<id>distribution</id>
<goals>
<goal>single</goal>
</goals>
<phase>package</phase>
<configuration>
<descriptors>
<descriptor>${project.basedir}/src/assembly/distribution.xml</descriptor>
</descriptors>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<id>upload-dist</id>
<phase>deploy</phase>
<configuration>
<tasks>
<ant antfile="${basedir}/src/ant/upload-dist.xml">
<target name="upload-dist" />
</ant>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.springframework.build</groupId>
<artifactId>org.springframework.build.aws.ant</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
<dependency>
<groupId>net.java.dev.jets3t</groupId>
<artifactId>jets3t</artifactId>
<version>0.7.2</version>
</dependency>
</dependencies>
</plugin>
</plugins>
<!-- the name of this project is 'spring-data-mongo-dist';
make sure the zip file is just 'spring-data-mongo'. -->
<finalName>${dist.finalName}</finalName>
</build>
<pluginRepositories>
<!-- Necessary for the build extension -->
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
</pluginRepositories>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/spring-data</downloadUrl>
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<scm>
<url>https://github.com/SpringSource/spring-data-mongodb</url>
</scm>
<dependencies>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-lib-milestone</id>
<url>http://repo.springsource.org/libs-milestone-local</url>
</repository>
</repositories>
</project>


@@ -1,139 +1,145 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB Cross-store Persistence Support</name>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework.version.range}</version>
</dependency>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Persistence Support</name>
<properties>
<jpa>1.0.0.Final</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
<dependencies>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
<version>${data.commons.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.1.0.RELEASE</version>
</dependency>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${spring}</version>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.3.0.M1</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>1.0.0.Final</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>3.5.5-Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<scope>test</scope>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
</dependency>
</dependencies>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${hibernate}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<scope>test</scope>
</dependency>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.2</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
<!-- <aspectLibrary>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-aspects</artifactId>
</aspectLibrary> -->
</aspectLibraries>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
</plugins>
</build>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.4</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
<source>${source.level}</source>
<target>${source.level}</target>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,8 +17,8 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
@@ -34,6 +34,10 @@ import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
@@ -44,7 +48,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
protected final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
@@ -58,6 +62,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
@@ -100,6 +108,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
@@ -109,6 +121,10 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
return o;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
@@ -169,8 +185,13 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
return mongoTemplate.getCollectionName(entityClass);
}
}


@@ -21,13 +21,12 @@ import javax.persistence.EntityManager;
import javax.persistence.Transient;
import javax.persistence.Entity;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
@@ -44,7 +43,7 @@ import org.springframework.transaction.support.TransactionSynchronizationManager
*/
public aspect MongoDocumentBacking {
private static final Log LOGGER = LogFactory.getLog(MongoDocumentBacking.class);
private static final Logger LOGGER = LoggerFactory.getLogger(MongoDocumentBacking.class);
// Aspect shared config
private ChangeSetPersister<Object> changeSetPersister;


@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,9 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
@@ -26,7 +28,6 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
@@ -35,55 +36,76 @@ import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
MongoTemplate mongoTemplate;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
EntityManager entityManager;
@Autowired
PlatformTransactionManager transactionManager;
TransactionTemplate txTemplate;
@Before
public void setUp() {
txTemplate = new TransactionTemplate(transactionManager);
clearData(Person.class);
Address address = new Address(12, "MAin St.", "Boston", "MA", "02101");
Resume resume = new Resume();
resume.addEducation("Skanstulls High School, 1975");
resume.addEducation("Univ. of Stockholm, 1980");
resume.addJob("DiMark, DBA, 1990-2000");
resume.addJob("VMware, Developer, 2007-");
final Person person = new Person("Thomas", 20);
person.setAddress(address);
person.setResume(resume);
person.setId(1L);
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.persist(person);
return null;
}
});
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
@After
public void tearDown() {
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.remove(entityManager.find(Person.class, 1L));
return null;
}
});
}
private void clearData(Class<?> domainType) {
String collectionName = mongoTemplate.getCollectionName(domainType);
mongoTemplate.dropCollection(collectionName);
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "MAin St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
@@ -91,15 +113,18 @@ public class CrossStoreMongoTests {
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; " + "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
found.setAge(44);
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
entityManager.merge(found);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
@@ -111,14 +136,19 @@ public class CrossStoreMongoTests {
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
entityManager.detach(detached);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
Person result = entityManager.merge(detached);
entityManager.flush();
return result;
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
@@ -127,7 +157,7 @@ public class CrossStoreMongoTests {
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
@@ -154,8 +184,10 @@ public class CrossStoreMongoTests {
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
for (DBObject dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
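
The assertions above now resolve the collection through mongoTemplate.getCollectionName(Person.class) instead of hard-coding Person.class.getName(). A minimal sketch of that pattern, assuming an injected MongoTemplate and the Person type from these tests (the helper class and method names are hypothetical):

import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.MongoTemplate;

// Hypothetical helper mirroring the updated test: derive the collection name from the
// mapping metadata rather than from the fully-qualified class name.
class CrossStoreInspectionSketch {

	static void printEntityIds(MongoTemplate mongoTemplate, Class<?> domainType) {
		String collectionName = mongoTemplate.getCollectionName(domainType); // mapping-aware name
		DBCollection collection = mongoTemplate.getCollection(collectionName);
		for (DBObject document : collection.find()) {
			System.out.println(document.get("_entity_id")); // key asserted on in the test above
		}
	}
}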

View File

@@ -1,12 +0,0 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
log4j.category.org.springframework=INFO
log4j.category.org.springframework.data=TRACE
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG

View File

@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<!--
<logger name="org.springframework" level="debug" />
-->
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>
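
The deleted log4j.properties above is replaced by this Logback configuration. A small sketch, assuming an SLF4J-backed class (the class name is hypothetical), of how the root level shown here plays out: with the root at error, only error-level statements reach the console appender unless the commented-out org.springframework logger is enabled.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical class; its output is governed by the logback.xml above.
class LoggingSketch {

	private static final Logger LOGGER = LoggerFactory.getLogger(LoggingSketch.class);

	void run() {
		LOGGER.debug("suppressed by the root 'error' level");
		LOGGER.error("written to the console appender");
	}
}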

View File

@@ -0,0 +1,18 @@
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: SpringSource
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="0",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
org.springframework.*;version="${spring30:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"

View File

@@ -0,0 +1,38 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
<name>Spring Data MongoDB - Distribution</name>
<description>Distribution build for Spring Data MongoDB</description>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,40 +1,30 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
<name>Spring Data MongoDB Log4J Appender</name>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<dependencies>
<artifactId>spring-data-mongodb-log4j</artifactId>
<name>Spring Data MongoDB - Log4J Appender</name>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo.version}</version>
</dependency>
<properties>
<log4j>1.2.16</log4j>
</properties>
<!-- Logging -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<dependencies>
</dependencies>
<!-- Logging -->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j}</version>
</dependency>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</dependencies>
</project>

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
@@ -21,188 +20,207 @@ import java.util.Arrays;
import java.util.Calendar;
import java.util.Map;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.MDC;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.spi.LoggingEvent;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.WriteConcern;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Log4j appender writing log entries into a MongoDB instance.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoLog4jAppender extends AppenderSkeleton {
public static final String LEVEL = "level";
public static final String NAME = "name";
public static final String APP_ID = "applicationId";
public static final String TIMESTAMP = "timestamp";
public static final String PROPERTIES = "properties";
public static final String TRACEBACK = "traceback";
public static final String MESSAGE = "message";
public static final String YEAR = "year";
public static final String MONTH = "month";
public static final String DAY = "day";
public static final String HOUR = "hour";
public static final String LEVEL = "level";
public static final String NAME = "name";
public static final String APP_ID = "applicationId";
public static final String TIMESTAMP = "timestamp";
public static final String PROPERTIES = "properties";
public static final String TRACEBACK = "traceback";
public static final String MESSAGE = "message";
public static final String YEAR = "year";
public static final String MONTH = "month";
public static final String DAY = "day";
public static final String HOUR = "hour";
protected String host = "localhost";
protected int port = 27017;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
protected String host = "localhost";
protected int port = 27017;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
public MongoLog4jAppender() {
}
public MongoLog4jAppender() {
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
}
public String getHost() {
return host;
}
public String getHost() {
return host;
}
public void setHost(String host) {
this.host = host;
}
public void setHost(String host) {
this.host = host;
}
public int getPort() {
return port;
}
public int getPort() {
return port;
}
public void setPort(int port) {
this.port = port;
}
public void setPort(int port) {
this.port = port;
}
public String getDatabase() {
return database;
}
public String getDatabase() {
return database;
}
public void setDatabase(String database) {
this.database = database;
}
public void setDatabase(String database) {
this.database = database;
}
public String getCollectionPattern() {
return collectionPattern;
}
public String getCollectionPattern() {
return collectionPattern;
}
public void setCollectionPattern(String collectionPattern) {
this.collectionPattern = collectionPattern;
this.collectionLayout = new PatternLayout(collectionPattern);
}
public void setCollectionPattern(String collectionPattern) {
this.collectionPattern = collectionPattern;
this.collectionLayout = new PatternLayout(collectionPattern);
}
public String getApplicationId() {
return applicationId;
}
public String getApplicationId() {
return applicationId;
}
public void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public void setApplicationId(String applicationId) {
this.applicationId = applicationId;
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
public void setInfoOrLowerWriteConcern(String wc) {
this.infoOrLowerWriteConcern = WriteConcern.valueOf(wc);
}
public void setInfoOrLowerWriteConcern(String wc) {
this.infoOrLowerWriteConcern = WriteConcern.valueOf(wc);
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
@SuppressWarnings({"unchecked"})
@Override
protected void append(final LoggingEvent event) {
if (null == db) {
try {
connectToMongo();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)
*/
@Override
@SuppressWarnings({ "unchecked" })
protected void append(final LoggingEvent event) {
if (null == db) {
try {
connectToMongo();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
BasicDBObject dbo = new BasicDBObject();
if (null != applicationId) {
dbo.put(APP_ID, applicationId);
MDC.put(APP_ID, applicationId);
}
dbo.put(NAME, event.getLogger().getName());
dbo.put(LEVEL, event.getLevel().toString());
Calendar tstamp = Calendar.getInstance();
tstamp.setTimeInMillis(event.getTimeStamp());
dbo.put(TIMESTAMP, tstamp.getTime());
BasicDBObject dbo = new BasicDBObject();
if (null != applicationId) {
dbo.put(APP_ID, applicationId);
MDC.put(APP_ID, applicationId);
}
dbo.put(NAME, event.getLogger().getName());
dbo.put(LEVEL, event.getLevel().toString());
Calendar tstamp = Calendar.getInstance();
tstamp.setTimeInMillis(event.getTimeStamp());
dbo.put(TIMESTAMP, tstamp.getTime());
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
}
dbo.put(PROPERTIES, propsDbo);
}
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
}
dbo.put(PROPERTIES, propsDbo);
}
// Copy traceback info (if there is any) into the document
String[] traceback = event.getThrowableStrRep();
if (null != traceback && traceback.length > 0) {
BasicDBList tbDbo = new BasicDBList();
tbDbo.addAll(Arrays.asList(traceback));
dbo.put(TRACEBACK, tbDbo);
}
// Copy traceback info (if there is any) into the document
String[] traceback = event.getThrowableStrRep();
if (null != traceback && traceback.length > 0) {
BasicDBList tbDbo = new BasicDBList();
tbDbo.addAll(Arrays.asList(traceback));
dbo.put(TRACEBACK, tbDbo);
}
// Put the rendered message into the document
dbo.put(MESSAGE, event.getRenderedMessage());
// Put the rendered message into the document
dbo.put(MESSAGE, event.getRenderedMessage());
// Insert the document
Calendar now = Calendar.getInstance();
MDC.put(YEAR, now.get(Calendar.YEAR));
MDC.put(MONTH, String.format("%1$02d", now.get(Calendar.MONTH) + 1));
MDC.put(DAY, String.format("%1$02d", now.get(Calendar.DAY_OF_MONTH)));
MDC.put(HOUR, String.format("%1$02d", now.get(Calendar.HOUR_OF_DAY)));
// Insert the document
Calendar now = Calendar.getInstance();
MDC.put(YEAR, now.get(Calendar.YEAR));
MDC.put(MONTH, String.format("%1$02d", now.get(Calendar.MONTH) + 1));
MDC.put(DAY, String.format("%1$02d", now.get(Calendar.DAY_OF_MONTH)));
MDC.put(HOUR, String.format("%1$02d", now.get(Calendar.HOUR_OF_DAY)));
String coll = collectionLayout.format(event);
String coll = collectionLayout.format(event);
MDC.remove(YEAR);
MDC.remove(MONTH);
MDC.remove(DAY);
MDC.remove(HOUR);
if (null != applicationId) {
MDC.remove(APP_ID);
}
MDC.remove(YEAR);
MDC.remove(MONTH);
MDC.remove(DAY);
MDC.remove(HOUR);
if (null != applicationId) {
MDC.remove(APP_ID);
}
WriteConcern wc;
if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
wc = warnOrHigherWriteConcern;
} else {
wc = infoOrLowerWriteConcern;
}
db.getCollection(coll).insert(dbo, wc);
}
WriteConcern wc;
if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
wc = warnOrHigherWriteConcern;
} else {
wc = infoOrLowerWriteConcern;
}
db.getCollection(coll).insert(dbo, wc);
}
public void close() {
mongo.close();
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#close()
*/
public void close() {
public boolean requiresLayout() {
return true;
}
if (mongo != null) {
mongo.close();
}
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#requiresLayout()
*/
public boolean requiresLayout() {
return true;
}
}
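
A minimal sketch of configuring the appender programmatically (host, database and pattern values are hypothetical; the same properties are normally set through log4j.properties). The %X{…} conversions read the MDC keys the appender populates right before rendering the collection name, which is what yields date-based collections.

import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;
import org.springframework.data.mongodb.log4j.MongoLog4jAppender;

// Hypothetical setup class.
class MongoAppenderSetupSketch {

	static void configure() {
		MongoLog4jAppender appender = new MongoLog4jAppender();
		appender.setHost("localhost");
		appender.setPort(27017);
		appender.setDatabase("logs");
		appender.setCollectionPattern("%X{year}%X{month}"); // e.g. "201306", one collection per month
		appender.setLayout(new PatternLayout("%m")); // requiresLayout() returns true
		Logger.getRootLogger().addAppender(appender);

		// WARN and above are written with warnOrHigherWriteConcern (SAFE by default).
		Logger.getLogger(MongoAppenderSetupSketch.class).warn("stored in MongoDB");
	}
}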

View File

@@ -35,41 +35,41 @@ import org.junit.Test;
*/
public class AppenderTest {
private static final String NAME = AppenderTest.class.getName();
private Logger log = Logger.getLogger(NAME);
private Mongo mongo;
private DB db;
private String collection;
private static final String NAME = AppenderTest.class.getName();
private Logger log = Logger.getLogger(NAME);
private Mongo mongo;
private DB db;
private String collection;
@Before
public void setup() {
try {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
@Before
public void setup() {
try {
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
} catch (UnknownHostException e) {
throw new RuntimeException(e.getMessage(), e);
}
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import org.junit.Test;
/**
* Unit tests for {@link MongoLog4jAppender}.
*
* @author Oliver Gierke
*/
public class MongoLog4jAppenderUnitTests {
/**
* @see DATAMONGO-641
*/
@Test
public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close();
}
}

View File

@@ -5,6 +5,5 @@ Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo.version:[=.=,+1.0.0)}",
org.apache.log4j.*;version="[1.2.15, 2.0.0)",
org.apache.log4j.spi.*;version="[1.2.15, 2.0.0)"
com.mongodb.*;version="${mongo:[=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"

View File

@@ -1,353 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<description>Spring Data project for MongoDB</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.1.0.RELEASE</version>
<packaging>pom</packaging>
<developers>
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.SpringSource.com</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>-5</timezone>
</developer>
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
</developers>
<licenses>
<license>
<name>Apache License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0</url>
<comments>
Copyright 2010 the original author or authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied.
See the License for the specific language governing permissions and
limitations under the License.
</comments>
</license>
</licenses>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- versions for commonly-used dependencies -->
<mongo.version>2.9.1</mongo.version>
<junit.version>4.10</junit.version>
<log4j.version>1.2.16</log4j.version>
<org.mockito.version>1.9.0</org.mockito.version>
<org.slf4j.version>1.6.1</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version.30>3.0.7.RELEASE</org.springframework.version.30>
<org.springframework.version.range>3.1.2.RELEASE</org.springframework.version.range>
<data.commons.version>1.4.0.RELEASE</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
<bundlor.failOnWarnings>true</bundlor.failOnWarnings>
</properties>
<distributionManagement>
<!-- see 'staging' profile for dry-run deployment settings -->
<downloadUrl>http://www.springsource.com/download/community
</downloadUrl>
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/snapshot-site
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>s3://maven.springframework.org/snapshot</url>
</snapshotRepository>
</distributionManagement>
<dependencies>
<!-- Test dependencies -->
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-library</artifactId>
<version>1.2.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit-dep</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>${org.mockito.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>1.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${org.slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${org.slf4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${org.slf4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version.range}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<extensions>
<extension>
<!--
available only in the springframework maven repository. see
<repositories> section below
-->
<groupId>org.springframework.build.aws</groupId>
<artifactId>org.springframework.build.aws.maven</artifactId>
<version>3.1.0.RELEASE</version>
</extension>
</extensions>
<resources>
<resource>
<directory>${project.basedir}/src/main/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
<resource>
<directory>${project.basedir}/src/main/resources</directory>
<includes>
<include>**/*</include>
</includes>
</resource>
</resources>
<testResources>
<testResource>
<directory>${project.basedir}/src/test/java</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
<testResource>
<directory>${project.basedir}/src/test/resources</directory>
<includes>
<include>**/*</include>
</includes>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.5.1</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
<compilerArgument>-Xlint:-path</compilerArgument>
<showWarnings>true</showWarnings>
<showDeprecation>false</showDeprecation>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.3.1</version>
<configuration>
<useDefaultManifestFile>true</useDefaultManifestFile>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
<plugin>
<artifactId>maven-source-plugin</artifactId>
<version>2.1.2</version>
<executions>
<execution>
<id>attach-sources</id>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<!--
configures the springsource bundlor plugin, which generates
OSGI-compatible MANIFEST.MF files during the 'compile' phase of
the maven build. this plugin is declared within the
pluginManagement section because not every module that inherits
from this pom needs bundlor's services, e.g.:
spring-integration-samples and all its children. for this reason,
all modules that wish to use bundlor must declare it explicitly.
it is not necessary to specify the <version> or <configuration>
sections, but groupId and artifactId are required. see
http://static.springsource.org/s2-bundlor/1.0.x/user-guide/html/ch04s03.html
for more info
-->
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
<version>1.0.0.RELEASE</version>
<configuration>
<failOnWarnings>${bundlor.failOnWarnings}</failOnWarnings>
</configuration>
<executions>
<execution>
<id>bundlor</id>
<goals>
<goal>bundlor</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.springsource.org/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>querydsl</id>
<url>http://source.mysema.com/maven2/releases</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>spring-libs-release</id>
<url>http://repo.springsource.org/libs-release</url>
</repository>
</repositories>
<reporting>
<plugins>
<plugin>
<!--
significantly speeds up the 'Dependencies' report during site
creation see
http://old.nabble.com/Skipping-dependency-report-during-Maven2-site-generation-td20116761.html
-->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-project-info-reports-plugin</artifactId>
<version>2.1</version>
<configuration>
<dependencyLocationsEnabled>false</dependencyLocationsEnabled>
</configuration>
</plugin>
</plugins>
</reporting>
<scm>
<url>https://github.com/SpringSource/spring-data-mongodb</url>
</scm>
</project>

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.0.3.1152">
<context version="7.1.7.187">
<scope name="spring-data-mongodb" type="Project">
<element name="Filter" type="TypeFilterReferenceOverridden">
<element name="org.springframework.data.mongodb.**" type="IncludeTypePattern"/>
@@ -10,6 +10,7 @@
<element name="**.config.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|GridFS"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories"/>
</element>
@@ -93,6 +94,12 @@
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query"/>
</element>
</element>
<element name="API" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="org.springframework.data.mongodb.*" type="IncludeTypePattern"/>
</element>
<stereotype name="Public"/>
</element>
</architecture>
<workspace>
<element name="src/main/java" type="JavaRootDirectory">

View File

@@ -1,84 +1,80 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
<description>MongoDB support for Spring Data</description>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
<version>1.3.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB</name>
<properties>
<querydsl.version>2.8.0</querydsl.version>
<cdi.version>1.0</cdi.version>
<validation.version>1.0.0.GA</validation.version>
<webbeans.version>1.1.3</webbeans.version>
<validation>1.0.0.GA</validation>
</properties>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
<version>${org.springframework.version.range}</version>
<version>${spring}</version>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
<version>${data.commons.version}</version>
</dependency>
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo.version}</version>
<exclusions>
<exclusion>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
</exclusion>
</exclusions>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
<version>${springdata.commons}</version>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl.version}</version>
<version>${querydsl}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>jsr250-api</artifactId>
@@ -90,7 +86,7 @@
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>${cdi.version}</version>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
@@ -98,14 +94,14 @@
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi.version}</version>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans.version}</version>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
@@ -120,7 +116,7 @@
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation.version}</version>
<version>${validation}</version>
<optional>true</optional>
</dependency>
@@ -130,49 +126,28 @@
<version>4.2.0.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>performance-tests</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<includes>
<include>**/PerformanceTests.java</include>
</includes>
<excludes>
<exclude>none</exclude>
</excludes>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>maven-apt-plugin</artifactId>
<version>1.0.4</version>
<artifactId>apt-maven-plugin</artifactId>
<version>1.0.8</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
<version>${querydsl}</version>
</dependency>
</dependencies>
<executions>
@@ -188,7 +163,25 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</build>
</project>
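
The Querydsl setup above (querydsl-mongodb plus querydsl-apt, driven by the apt-maven-plugin) generates a query metamodel at build time. A hedged sketch of what that enables, assuming a hypothetical Person document; QPerson is the class the APT step would generate for it:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;

// Hypothetical domain type and repository.
@Document
class Person {

	@Id String id;
	String lastname;
}

interface PersonRepository extends MongoRepository<Person, String>, QueryDslPredicateExecutor<Person> {

	// With the generated QPerson metamodel on the classpath, type-safe predicates become possible:
	// repository.findAll(QPerson.person.lastname.eq("Matthews"));
}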

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,12 +27,16 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -55,12 +59,12 @@ public abstract class AbstractMongoConfiguration {
protected abstract String getDatabaseName();
/**
* Return the {@link Mongo} instance to connect to.
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
* @throws Exception
*/
@Bean
public abstract Mongo mongo() throws Exception;
/**
@@ -131,11 +135,25 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.initialize();
if (abbreviateFieldNames()) {
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
}
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
@@ -191,4 +209,15 @@ public abstract class AbstractMongoConfiguration {
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
}
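
A minimal sketch of a configuration subclass under the reworked contract (class and database names are hypothetical): per the updated Javadoc, annotating the overridden mongo() method with @Bean is what exposes the Mongo instance to the ApplicationContext, and abbreviateFieldNames() opts into the CamelCaseAbbreviatingFieldNamingStrategy set up above.

import com.mongodb.Mongo;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

@Configuration
class ApplicationConfig extends AbstractMongoConfiguration {

	@Override
	protected String getDatabaseName() {
		return "database";
	}

	@Override
	@Bean
	public Mongo mongo() throws Exception {
		return new Mongo("localhost", 27017);
	}

	@Override
	protected boolean abbreviateFieldNames() {
		return true; // abbreviates camel-case field names in the stored documents
	}
}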

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,11 +13,14 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Constants to declare bean names used by the namespace configuration.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
*/
public abstract class BeanNames {
@@ -26,4 +29,8 @@ public abstract class BeanNames {
static final String MONGO = "mongo";
static final String DB_FACTORY = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY = "isNewStrategyFactory";
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
static final String MONGO_TEMPLATE = "mongoTemplate";
static final String GRID_FS_TEMPLATE = "gridFsTemplate";
}
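
A small usage sketch for the newly introduced isNewStrategyFactory name (the helper is hypothetical and assumes the namespace configuration has registered the factory under that default name):

import org.springframework.context.ApplicationContext;
import org.springframework.data.support.IsNewStrategyFactory;

// Hypothetical helper looking up the factory registered under BeanNames.IS_NEW_STRATEGY_FACTORY.
class IsNewLookupSketch {

	static boolean isNew(ApplicationContext context, Object entity) {
		IsNewStrategyFactory factory = context.getBean("isNewStrategyFactory", IsNewStrategyFactory.class);
		return factory.getIsNewStrategy(entity.getClass()).isNew(entity);
	}
}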

View File

@@ -0,0 +1,78 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.gridfs.GridFsTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String converterRef = element.getAttribute("converter-ref");
String dbFactoryRef = element.getAttribute("db-factory-ref");
BeanDefinitionBuilder gridFsTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(GridFsTemplate.class);
if (StringUtils.hasText(dbFactoryRef)) {
gridFsTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
}
if (StringUtils.hasText(converterRef)) {
gridFsTemplateBuilder.addConstructorArgReference(converterRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
}
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE)
.getBeanDefinition();
}
}
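
When no id attribute is given, the parser falls back to the gridFsTemplate default name. A brief usage sketch of the bean it registers (the helper and file content are made up):

import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.springframework.context.ApplicationContext;
import org.springframework.data.mongodb.gridfs.GridFsTemplate;

// Hypothetical helper storing a file through the namespace-registered template.
class GridFsUsageSketch {

	static void storeFile(ApplicationContext context) {
		GridFsTemplate gridFs = context.getBean("gridFsTemplate", GridFsTemplate.class);
		InputStream content = new ByteArrayInputStream("hello".getBytes());
		gridFs.store(content, "hello.txt"); // stores the stream in GridFS under the given filename
	}
}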

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,12 +24,12 @@ import java.util.List;
import java.util.Set;
import org.springframework.beans.BeanMetadataElement;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -38,6 +38,7 @@ import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
@@ -49,9 +50,11 @@ import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
@@ -68,25 +71,28 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
*/
public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean jsr303Present = ClassUtils.isPresent("javax.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : "mappingConverter";
}
/* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : "mappingConverter";
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mapping Mongo Converter", element));
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition, id);
createIsNewStrategyFactoryBeanDefinition(ctxRef, parserContext, element);
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
@@ -111,18 +117,24 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(ctxRef));
indexHelperBuilder.addConstructorArgValue(new RuntimeBeanReference(dbFactoryRef));
registry.registerBeanDefinition(INDEX_HELPER, indexHelperBuilder.getBeanDefinition());
indexHelperBuilder.addConstructorArgReference(ctxRef);
indexHelperBuilder.addConstructorArgReference(dbFactoryRef);
indexHelperBuilder.addDependsOn(ctxRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
INDEX_HELPER));
}
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
if (validatingMongoEventListener != null) {
registry.registerBeanDefinition(VALIDATING_EVENT_LISTENER, validatingMongoEventListener);
parserContext.registerBeanComponent(new BeanComponentDefinition(validatingMongoEventListener,
VALIDATING_EVENT_LISTENER));
}
return converterBuilder.getBeanDefinition();
parserContext.registerBeanComponent(new BeanComponentDefinition(converterBuilder.getBeanDefinition(), id));
parserContext.popAndRegisterContainingComponent();
return null;
}
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
@@ -136,7 +148,6 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
RuntimeBeanReference validator = getValidator(builder, parserContext);
if (validator != null) {
builder.getRawBeanDefinition().setBeanClass(ValidatingMongoEventListener.class);
builder.addConstructorArgValue(validator);
@@ -158,13 +169,13 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
validatorDef.setSource(source);
validatorDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE);
String validatorName = parserContext.getReaderContext().registerWithGeneratedName(validatorDef);
parserContext.registerComponent(new BeanComponentDefinition(validatorDef, validatorName));
parserContext.registerBeanComponent(new BeanComponentDefinition(validatorDef, validatorName));
return new RuntimeBeanReference(validatorName);
}
private String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition) {
static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition, String converterId) {
String ctxRef = element.getAttribute("mapping-context-ref");
@@ -191,11 +202,15 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parserContext.getRegistry().registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
if ("true".equals(abbreviateFieldNames)) {
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
ctxRef = converterId + "." + MAPPING_CONTEXT;
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
return ctxRef;
}
@@ -233,7 +248,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
AbstractBeanDefinition conversionsBean = conversionsBuilder.getBeanDefinition();
conversionsBean.setSource(parserContext.extractSource(element));
parserContext.getRegistry().registerBeanDefinition("customConversions", conversionsBean);
parserContext.registerBeanComponent(new BeanComponentDefinition(conversionsBean, "customConversions"));
return conversionsBean;
}
@@ -241,7 +256,7 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
return null;
}
public Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
private static Set<String> getInititalEntityClasses(Element element, BeanDefinitionBuilder builder) {
String basePackage = element.getAttribute(BASE_PACKAGE);
@@ -280,6 +295,19 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
return null;
}
public static String createIsNewStrategyFactoryBeanDefinition(String mappingContextRef, ParserContext context,
Element element) {
BeanDefinitionBuilder mappingContextStrategyFactoryBuilder = BeanDefinitionBuilder
.rootBeanDefinition(MappingContextIsNewStrategyFactory.class);
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder, IS_NEW_STRATEGY_FACTORY));
return IS_NEW_STRATEGY_FACTORY;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*

View File

@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to register an {@link AuditingEventListener} to transparently set auditing information on
* an entity.
*
* @author Oliver Gierke
*/
public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEventListener.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
String mappingContextName = BeanNames.MAPPING_CONTEXT;
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT)) {
mappingContextName = MappingMongoConverterParser.potentiallyCreateMappingContext(element, parserContext, null,
BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
}
MappingMongoConverterParser.createIsNewStrategyFactoryBeanDefinition(mappingContextName, parserContext, element);
}
BeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(BeanNames.IS_NEW_STRATEGY_FACTORY);
BeanDefinition handlerBeanDefinition = parser.parse(element, parserContext);
builder.addConstructorArgValue(handlerBeanDefinition);
}
}
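For context, a minimal sketch of a document this listener would act on might look as follows; the class and property names are hypothetical, and it assumes the Spring Data Commons auditing annotations and Joda-Time are available on the classpath.

import org.joda.time.DateTime;

import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical audited document: the AuditingEventListener registered by the parser above
// is expected to populate the annotated properties before the entity is written.
@Document
class AuditedAccount {

	@Id String id;

	@CreatedDate DateTime createdDate;

	@LastModifiedDate DateTime lastModifiedDate;
}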

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,17 +21,17 @@ import org.springframework.data.repository.config.RepositoryBeanDefinitionParser
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB based repositories.
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
*
* @author Oliver Gierke
* @author Martin Baumgartner
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
*
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
@@ -42,5 +42,8 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());
registerBeanDefinitionParser("template", new MongoTemplateParser());
registerBeanDefinitionParser("gridFsTemplate", new GridFsTemplateParser());
}
}

View File

@@ -0,0 +1,86 @@
/*
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String converterRef = element.getAttribute("converter-ref");
String dbFactoryRef = element.getAttribute("db-factory-ref");
BeanDefinitionBuilder mongoTemplateBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoTemplate.class);
setPropertyValue(mongoTemplateBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(dbFactoryRef)) {
mongoTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
}
if (StringUtils.hasText(converterRef)) {
mongoTemplateBuilder.addConstructorArgReference(converterRef);
}
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder, BeanNames.MONGO_TEMPLATE)
.getBeanDefinition();
}
}
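In plain Java, the wiring produced by this parser roughly corresponds to the following sketch; host, port and database name are placeholders, not values taken from the change set.

import java.net.UnknownHostException;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.Mongo;

class TemplateSetupSample {

	// Roughly the equivalent of the XML element: a MongoTemplate backed by a db factory.
	MongoTemplate createTemplate() throws UnknownHostException {
		Mongo mongo = new Mongo("localhost", 27017);
		return new MongoTemplate(new SimpleMongoDbFactory(mongo, "database"));
	}
}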

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,59 +13,53 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Represents an action taken against the collection. Used by {@link WriteConcernResolver} to determine a custom
* WriteConcern based on this information.
*
* Properties that will always be not-null are collectionName and defaultWriteConcern. The EntityClass is null only for
* the MongoActionOperaton.INSERT_LIST.
*
* {@link WriteConcern} based on this information.
* <ul>
* <li>INSERT, SAVE have null query</li>
* <li>REMOVE has null document</li>
* <li>INSERT_LIST has null entityClass, document, and query</li>
* <li>INSERT_LIST has null entityType, document, and query</li>
* </ul>
*
* @author Mark Pollack
*
* @author Oliver Gierke
*/
public class MongoAction {
private String collectionName;
private WriteConcern defaultWriteConcern;
private Class<?> entityClass;
private MongoActionOperation mongoActionOperation;
private DBObject query;
private DBObject document;
private final String collectionName;
private final WriteConcern defaultWriteConcern;
private final Class<?> entityType;
private final MongoActionOperation mongoActionOperation;
private final DBObject query;
private final DBObject document;
/**
* Create an instance of a MongoAction
* Create an instance of a {@link MongoAction}.
*
* @param defaultWriteConcern the default write concern
* @param defaultWriteConcern the default write concern.
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name
* @param entityClass the POJO that is being operated against
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityClass, DBObject document, DBObject query) {
super();
String collectionName, Class<?> entityType, DBObject document, DBObject query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation;
this.collectionName = collectionName;
this.entityClass = entityClass;
this.entityType = entityType;
this.query = query;
this.document = document;
}
@@ -78,8 +72,16 @@ public class MongoAction {
return defaultWriteConcern;
}
/**
* @deprecated use {@link #getEntityType()} instead.
*/
@Deprecated
public Class<?> getEntityClass() {
return entityClass;
return entityType;
}
public Class<?> getEntityType() {
return entityType;
}
public MongoActionOperation getMongoActionOperation() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,8 +20,8 @@ package org.springframework.data.mongodb.core;
* for a given mutating operation
*
* @author Mark Pollack
* @author Oliver Gierke
* @see MongoAction
*
*/
public enum MongoActionOperation {

View File

@@ -0,0 +1,71 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.util.Assert;
import com.mongodb.WriteResult;
/**
* Mongo-specific {@link DataIntegrityViolationException}.
*
* @author Oliver Gierke
*/
public class MongoDataIntegrityViolationException extends DataIntegrityViolationException {
private static final long serialVersionUID = -186980521176764046L;
private final WriteResult writeResult;
private final MongoActionOperation actionOperation;
/**
* Creates a new {@link MongoDataIntegrityViolationException} using the given message and {@link WriteResult}.
*
* @param message the exception message
* @param writeResult the {@link WriteResult} that causes the exception, must not be {@literal null}.
* @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}.
*/
public MongoDataIntegrityViolationException(String message, WriteResult writeResult,
MongoActionOperation actionOperation) {
super(message);
Assert.notNull(writeResult, "WriteResult must not be null!");
Assert.notNull(actionOperation, "MongoActionOperation must not be null!");
this.writeResult = writeResult;
this.actionOperation = actionOperation;
}
/**
* Returns the {@link WriteResult} that caused the exception.
*
* @return the writeResult
*/
public WriteResult getWriteResult() {
return writeResult;
}
/**
* Returns the {@link MongoActionOperation} in which the current exception occurred.
*
* @return the actionOperation
*/
public MongoActionOperation getActionOperation() {
return actionOperation;
}
}
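A minimal sketch of how this exception could surface, assuming a template whose write result checking has been switched to EXCEPTION; the helper method and variable names are made up for illustration.

import org.springframework.data.mongodb.core.MongoDataIntegrityViolationException;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;

class WriteResultCheckingSample {

	// 'template' and 'document' are assumed to be set up elsewhere.
	void saveWithStrictChecking(MongoTemplate template, Object document) {

		template.setWriteResultChecking(WriteResultChecking.EXCEPTION);

		try {
			template.save(document);
		} catch (MongoDataIntegrityViolationException e) {
			// The failed WriteResult and the triggering operation are exposed by the exception.
			System.err.println(e.getActionOperation() + " failed: " + e.getWriteResult().getError());
		}
	}
}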

View File

@@ -104,15 +104,16 @@ public abstract class MongoDbUtils {
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
if (credentialsGiven && !db.isAuthenticated()) {
synchronized (db) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (credentialsGiven && !db.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
synchronized (db) {
if (!db.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName
+ "], username = [" + username + "], password = [" + password + "]", databaseName, credentials);
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,12 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DuplicateKeyException;
@@ -29,21 +23,26 @@ import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
* exception from the {@code org.springframework.dao} hierarchy. Return {@literal null} if no translation is
* appropriate: any other exception may have resulted from user code, and should not be translated.
*
* @author Oliver Gierke
* @author Michal Vich
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
/*
* (non-Javadoc)
*
* @see org.springframework.dao.support.PersistenceExceptionTranslator#
* translateExceptionIfPossible(java.lang.RuntimeException)
*/
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
// Check for well-known MongoException subclasses.
@@ -52,14 +51,23 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
if (ex instanceof DuplicateKey) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();
if (code == 11000 || code == 11001) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (code == 12000 || code == 13440) {
@@ -69,9 +77,6 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
// If we get here, we have an exception that resulted from user code,
// rather than the persistence provider, so we return null to indicate

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -117,6 +117,7 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@SuppressWarnings("deprecation")
public void afterPropertiesSet() throws Exception {
Mongo mongo;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -408,11 +408,16 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the converted object
*/
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
boolean exists(Query query, String collectionName);
boolean exists(Query query, Class<?> entityClass);
boolean exists(Query query, Class<?> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <p/>
@@ -442,7 +447,6 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the List of converted objects
*/
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
@@ -464,7 +468,6 @@ public interface MongoOperations {
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
*
* @param <T>
* @return
*/
@@ -510,7 +513,6 @@ public interface MongoOperations {
* specification
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
*
* @return the converted object
*/
<T> T findAndRemove(Query query, Class<T> entityClass, String collectionName);
@@ -713,11 +715,12 @@ public interface MongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param <T>
* @param query
* @param entityClass
*/
<T> void remove(Query query, Class<T> entityClass);
void remove(Query query, Class<?> entityClass);
void remove(Query query, Class<?> entityClass, String collectionName);
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
@@ -734,4 +737,4 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}
}
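A short usage sketch of the newly exposed exists(…) methods; the Person type and the lastname field are hypothetical.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class ExistsSample {

	// Returns whether at least one matching document is present, without loading it.
	boolean hasPersonWithLastname(MongoOperations operations, String lastname) {
		Query query = new Query(Criteria.where("lastname").is(lastname));
		return operations.exists(query, Person.class);
	}

	static class Person {
		String id;
		String lastname;
	}
}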

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,7 +44,6 @@ import org.springframework.core.convert.ConversionService;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.authentication.UserCredentials;
@@ -69,9 +68,11 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
@@ -101,6 +102,7 @@ import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
@@ -115,7 +117,7 @@ import com.mongodb.util.JSON;
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoTemplate.class);
private static final String ID = "_id";
private static final String ID_FIELD = "_id";
private static final WriteResultChecking DEFAULT_WRITE_RESULT_CHECKING = WriteResultChecking.NONE;
private static final Collection<String> ITERABLE_CLASSES;
@@ -129,32 +131,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
ITERABLE_CLASSES = Collections.unmodifiableCollection(iterableClasses);
}
/*
* WriteConcern to be used for write operations if it has been specified.
* Otherwise we should not use a WriteConcern defaulting to the one set for
* the DB or Collection.
*/
private WriteConcern writeConcern = null;
private WriteConcernResolver writeConcernResolver = new DefaultWriteConcernResolver();
/*
* WriteResultChecking to be used for write operations if it has been
* specified. Otherwise we should not do any checking.
*/
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
/**
* Set the ReadPreference when operating on a collection. See {@link #prepareCollection(DBCollection)}
*/
private ReadPreference readPreference = null;
private final MongoConverter mongoConverter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoDbFactory mongoDbFactory;
private final MongoExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
private final QueryMapper mapper;
private WriteConcern writeConcern;
private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
private ReadPreference readPreference;
private ApplicationEventPublisher eventPublisher;
private ResourceLoader resourceLoader;
private MongoPersistentEntityIndexCreator indexCreator;
@@ -162,8 +148,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration
*
* @param mongo
* @param databaseName
* @param mongo must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
*/
public MongoTemplate(Mongo mongo, String databaseName) {
this(new SimpleMongoDbFactory(mongo, databaseName), null);
@@ -173,8 +159,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Constructor used for a template configuration with user credentials in the form of
* {@link org.springframework.data.authentication.UserCredentials}
*
* @param mongo
* @param databaseName
* @param mongo must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param userCredentials
*/
public MongoTemplate(Mongo mongo, String databaseName, UserCredentials userCredentials) {
@@ -182,9 +168,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Constructor used for a basic template configuration
* Constructor used for a basic template configuration.
*
* @param mongoDbFactory
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoTemplate(MongoDbFactory mongoDbFactory) {
this(mongoDbFactory, null);
@@ -193,7 +179,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration.
*
* @param mongoDbFactory
* @param mongoDbFactory must not be {@literal null}.
* @param mongoConverter
*/
public MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter) {
@@ -227,7 +213,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Configures the {@link WriteConcern} to be used with the template.
* Configures the {@link WriteConcern} to be used with the template. If none is configured the {@link WriteConcern}
* configured on the {@link MongoDbFactory} will apply. If you configured a {@link Mongo} instance no
* {@link WriteConcern} will be used.
*
* @param writeConcern
*/
@@ -275,7 +263,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* can be found we manually add the internally created one as {@link ApplicationListener} to make sure indexes get
* created appropriately for entity types persisted through this {@link MongoTemplate} instance.
*
* @param context
* @param context must not be {@literal null}.
*/
private void prepareIndexCreator(ApplicationContext context) {
@@ -493,6 +481,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
public boolean exists(Query query, Class<?> entityClass) {
return exists(query, entityClass, determineCollectionName(entityClass));
}
public boolean exists(Query query, String collectionName) {
return exists(query, null, collectionName);
}
public boolean exists(Query query, Class<?> entityClass, String collectionName) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null");
}
DBObject mappedQuery = mapper.getMappedObject(query.getQueryObject(), getPersistentEntity(entityClass));
return execute(collectionName, new FindCallback(mappedQuery)).hasNext();
}
// Find methods that take a Query to express the query and that return a List of objects.
public <T> List<T> find(Query query, Class<T> entityClass) {
@@ -500,19 +506,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> List<T> find(final Query query, Class<T> entityClass, String collectionName) {
CursorPreparer cursorPreparer = query == null ? null : new QueryCursorPreparer(query);
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass, cursorPreparer);
if (query == null) {
return findAll(entityClass, collectionName);
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query));
}
public <T> T findById(Object id, Class<T> entityClass) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityClass);
return findById(id, entityClass, persistentEntity.getCollection());
return findById(id, entityClass, determineCollectionName(entityClass));
}
public <T> T findById(Object id, Class<T> entityClass, String collectionName) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityClass);
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
String idKey = idProperty == null ? ID : idProperty.getName();
MongoPersistentProperty idProperty = persistentEntity == null ? null : persistentEntity.getIdProperty();
String idKey = idProperty == null ? ID_FIELD : idProperty.getName();
return doFindOne(collectionName, new BasicDBObject(idKey, id), null, entityClass);
}
@@ -657,6 +667,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
assertUpdateableIdIfNotSet(objectToSave);
initializeVersionProperty(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
@@ -669,6 +681,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
}
private void initializeVersionProperty(Object entity) {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(entity,
this.mongoConverter.getConversionService());
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
}
}
public void insert(Collection<? extends Object> batchToSave, Class<?> entityClass) {
doInsertBatch(determineCollectionName(entityClass), batchToSave, this.mongoConverter);
}
@@ -713,6 +736,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
List<DBObject> dbObjectList = new ArrayList<DBObject>();
for (T o : batchToSave) {
initializeVersionProperty(o);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(o));
@@ -733,15 +758,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void save(Object objectToSave) {
Assert.notNull(objectToSave);
save(objectToSave, determineEntityCollectionName(objectToSave));
}
public void save(Object objectToSave, String collectionName) {
Assert.notNull(objectToSave);
Assert.hasText(collectionName);
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(objectToSave.getClass());
// No optimistic locking -> simple save
if (!mongoPersistentEntity.hasVersionProperty()) {
if (mongoPersistentEntity == null || !mongoPersistentEntity.hasVersionProperty()) {
doSave(collectionName, objectToSave, this.mongoConverter);
return;
}
@@ -755,18 +785,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
this.mongoConverter.getConversionService());
MongoPersistentProperty idProperty = entity.getIdProperty();
MongoPersistentProperty versionProperty = entity.getVersionProperty();
Object id = beanWrapper.getProperty(idProperty);
Number version = beanWrapper.getProperty(versionProperty, Number.class, !versionProperty.usePropertyAccess());
// Fresh instance -> initialize version property
if (id == null) {
beanWrapper.setProperty(versionProperty, 0);
doSave(collectionName, objectToSave, this.mongoConverter);
if (version == null) {
doInsert(collectionName, objectToSave, this.mongoConverter);
} else {
assertUpdateableIdIfNotSet(objectToSave);
// Create query for entity with the id and old version
Object version = beanWrapper.getProperty(versionProperty);
Object id = beanWrapper.getProperty(idProperty);
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
// Bump version number
@@ -779,9 +809,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
this.mongoConverter.write(objectToSave, dbObject);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbObject));
Update update = Update.fromDBObject(dbObject, ID);
Update update = Update.fromDBObject(dbObject, ID_FIELD);
updateFirst(query, update, objectToSave.getClass());
doUpdate(collectionName, query, update, objectToSave.getClass(), false, false);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbObject));
}
}
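The version-aware branch above can be illustrated with the following sketch; the Account type is hypothetical and both parameters are assumed to refer to the same persisted document.

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.dao.OptimisticLockingFailureException;

class VersionedSaveSample {

	// Hypothetical versioned document: a null version leads to an insert, a non-null version
	// to an update guarded by id and old version.
	static class Account {
		@Id String id;
		@Version Long version;
		String owner;
	}

	void demonstrate(MongoOperations operations, Account current, Account stale) {
		operations.save(current); // bumps the version in the database
		try {
			operations.save(stale); // stale still carries the old version
		} catch (OptimisticLockingFailureException e) {
			// expected: the guarded update matched no document
		}
	}
}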
@@ -790,10 +820,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
DBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
if (!(objectToSave instanceof String)) {
writer.write(objectToSave, dbDoc);
} else {
try {
dbDoc = (DBObject) JSON.parse((String) objectToSave);
} catch (JSONParseException e) {
throw new MappingException("Could not parse given String to save into a JSON document!", e);
}
}
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());
@@ -804,21 +843,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected Object insertDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("insert DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
LOGGER.debug("Inserting DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
wr = collection.insert(dbDoc);
} else {
wr = collection.insert(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "insert");
return dbDoc.get(ID);
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDoc) : collection.insert(dbDoc,
writeConcernToUse);
handleAnyWriteResultErrors(writeResult, dbDoc, MongoActionOperation.INSERT);
return dbDoc.get(ID_FIELD);
}
});
}
@@ -829,27 +864,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("insert list of DBObjects containing " + dbDocList.size() + " items");
LOGGER.debug("Inserting list of DBObjects containing " + dbDocList.size() + " items");
}
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
wr = collection.insert(dbDocList);
} else {
wr = collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
}
handleAnyWriteResultErrors(wr, null, "insert_list");
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDocList) : collection.insert(
dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
handleAnyWriteResultErrors(writeResult, null, MongoActionOperation.INSERT_LIST);
return null;
}
});
List<ObjectId> ids = new ArrayList<ObjectId>();
for (DBObject dbo : dbDocList) {
Object id = dbo.get(ID);
Object id = dbo.get(ID_FIELD);
if (id instanceof ObjectId) {
ids.add((ObjectId) id);
} else {
@@ -862,21 +893,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected Object saveDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("save DBObject containing fields: " + dbDoc.keySet());
LOGGER.debug("Saving DBObject containing fields: " + dbDoc.keySet());
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
wr = collection.save(dbDoc);
} else {
wr = collection.save(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "save");
return dbDoc.get(ID);
WriteResult writeResult = writeConcernToUse == null ? collection.save(dbDoc) : collection.save(dbDoc,
writeConcernToUse);
handleAnyWriteResultErrors(writeResult, dbDoc, MongoActionOperation.SAVE);
return dbDoc.get(ID_FIELD);
}
});
}
@@ -919,29 +946,25 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
LOGGER.debug("Calling update using query: " + queryObj + " and update: " + updateObj + " in collection: "
+ collectionName);
}
WriteResult wr;
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
entityClass, updateObj, queryObj);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
wr = collection.update(queryObj, updateObj, upsert, multi);
} else {
wr = collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
}
WriteResult writeResult = writeConcernToUse == null ? collection.update(queryObj, updateObj, upsert, multi)
: collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
if (entity != null && entity.hasVersionProperty() && !multi) {
if (wr.getN() == 0) {
if (writeResult.getN() == 0) {
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
+ updateObj.toMap().toString());
+ updateObj.toMap().toString() + " to collection " + collectionName);
}
}
handleAnyWriteResultErrors(wr, queryObj, "update with '" + updateObj + "'");
return wr;
handleAnyWriteResultErrors(writeResult, queryObj, MongoActionOperation.UPDATE);
return writeResult;
}
});
}
@@ -976,11 +999,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(object);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProp = entity.getIdProperty();
Class<?> objectType = object.getClass();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
if (idProp == null) {
throw new MappingException("No id property found for object of type " + entity.getType().getName());
throw new MappingException("No id property found for object of type " + objectType);
}
ConversionService service = mongoConverter.getConversionService();
@@ -993,7 +1017,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private void assertUpdateableIdIfNotSet(Object entity) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entity.getClass());
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
MongoPersistentProperty idProperty = persistentEntity == null ? null : persistentEntity.getIdProperty();
if (idProperty == null) {
return;
@@ -1009,42 +1033,55 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
public <T> void remove(Query query, Class<T> entityClass) {
Assert.notNull(query);
doRemove(determineCollectionName(entityClass), query, entityClass);
public void remove(Query query, String collectionName) {
remove(query, null, collectionName);
}
public void remove(Query query, Class<?> entityClass) {
remove(query, entityClass, determineCollectionName(entityClass));
}
public void remove(Query query, Class<?> entityClass, String collectionName) {
doRemove(collectionName, query, entityClass);
}
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
}
Assert.hasText(collectionName, "Collection name must not be null or empty!");
final DBObject queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass));
DBObject dboq = mapper.getMappedObject(queryObject, entity);
WriteResult wr = null;
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
entityClass, null, queryObject);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("remove using query: " + dboq + " in collection: " + collection.getName());
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { dboq, collection.getName() });
}
if (writeConcernToUse == null) {
wr = collection.remove(dboq);
} else {
wr = collection.remove(dboq, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dboq, "remove");
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
writeConcernToUse);
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass));
return null;
}
});
}
public void remove(final Query query, String collectionName) {
doRemove(collectionName, query, null);
}
public <T> List<T> findAll(Class<T> entityClass) {
return executeFindMultiInternal(new FindCallback(null), null, new ReadDbObjectCallback<T>(mongoConverter,
entityClass), determineCollectionName(entityClass));
@@ -1092,7 +1129,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("MapReduce command result = [%s]", serializeToJsonSafely(commandObject)));
LOGGER.debug("MapReduce command result = [{}]", serializeToJsonSafely(commandObject));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
@@ -1145,14 +1182,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Executing Group with DBObject [%s]", serializeToJsonSafely(commandObject)));
LOGGER.debug("Executing Group with DBObject [{}]", serializeToJsonSafely(commandObject));
}
CommandResult commandResult = executeCommand(commandObject, getDb().getOptions());
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
LOGGER.debug("Group command result = [{}]", commandResult);
}
@SuppressWarnings("unchecked")
@@ -1268,7 +1305,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBCollection coll = db.createCollection(collectionName, collectionOptions);
// TODO: Emit a collection created event
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created collection [" + coll.getFullName() + "]");
LOGGER.debug("Created collection [{}]", coll.getFullName());
}
return coll;
}
@@ -1296,14 +1333,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
* <p/>
* Can be overridden by subclasses.
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
* converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless configured
* otherwise, an instance of SimpleMongoConverter will be used. The query document is specified as a standard DBObject
* and so is the fields specification. Can be overridden by subclasses.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
@@ -1334,9 +1367,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter. The
* query document is specified as a standard DBObject and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
@@ -1433,6 +1465,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
if (savedObject instanceof BasicDBObject) {
DBObject dbObject = (DBObject) savedObject;
dbObject.put(ID_FIELD, id);
return;
}
MongoPersistentProperty idProp = getIdPropertyFor(savedObject.getClass());
if (idProp == null) {
@@ -1509,19 +1547,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
CursorPreparer preparer, DbObjectCallback<T> objectCallback, String collectionName) {
try {
DBCursor cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
DBCursor cursor = null;
try {
cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
}
List<T> result = new ArrayList<T>();
while (cursor.hasNext()) {
DBObject object = cursor.next();
result.add(objectCallback.doWith(object));
}
return result;
} finally {
if (cursor != null) {
cursor.close();
}
}
List<T> result = new ArrayList<T>();
for (DBObject object : cursor) {
result.add(objectCallback.doWith(object));
}
return result;
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
@@ -1531,15 +1582,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DocumentCallbackHandler callbackHandler, String collectionName) {
try {
DBCursor cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
DBCursor cursor = null;
try {
cursor = collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
cursor = preparer.prepare(cursor);
}
while (cursor.hasNext()) {
DBObject dbobject = cursor.next();
callbackHandler.processDocument(dbobject);
}
} finally {
if (cursor != null) {
cursor.close();
}
}
for (DBObject dbobject : cursor) {
callbackHandler.processDocument(dbobject);
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
@@ -1550,7 +1613,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
private MongoPersistentProperty getIdPropertyFor(Class<?> type) {
return mappingContext.getPersistentEntity(type).getIdProperty();
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(type);
return persistentEntity == null ? null : persistentEntity.getIdProperty();
}
private <T> String determineEntityCollectionName(T obj) {
@@ -1577,37 +1641,45 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Checks and handles any errors.
* <p/>
* Current implementation logs errors. Future version may make this configurable to log warning, errors or throw
* exception.
* Handles {@link WriteResult} errors based on the configured {@link WriteResultChecking}.
*
* @param writeResult
* @param query
* @param operation
*/
protected void handleAnyWriteResultErrors(WriteResult wr, DBObject query, String operation) {
protected void handleAnyWriteResultErrors(WriteResult writeResult, DBObject query, MongoActionOperation operation) {
if (WriteResultChecking.NONE == this.writeResultChecking) {
if (writeResultChecking == WriteResultChecking.NONE) {
return;
}
String error = wr.getError();
String error = writeResult.getError();
if (error != null) {
String message;
if (operation.equals("insert") || operation.equals("save")) {
// assuming the insert operations will begin with insert string
if (error == null) {
return;
}
String message;
switch (operation) {
case INSERT:
case SAVE:
message = String.format("Insert/Save for %s failed: %s", query, error);
} else if (operation.equals("insert_list")) {
break;
case INSERT_LIST:
message = String.format("Insert list failed: %s", error);
} else {
break;
default:
message = String.format("Execution of %s%s failed: %s", operation,
query == null ? "" : "' using '" + query.toString() + "' query", error);
}
query == null ? "" : " using query " + query.toString(), error);
}
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.error(message);
return;
}
if (writeResultChecking == WriteResultChecking.EXCEPTION) {
throw new MongoDataIntegrityViolationException(message, writeResult, operation);
} else {
LOGGER.error(message);
return;
}
}
@@ -1695,7 +1767,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private static class FindCallback implements CollectionCallback<DBCursor> {
private final DBObject query;
private final DBObject fields;
public FindCallback(DBObject query) {
@@ -1803,12 +1874,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private class DefaultWriteConcernResolver implements WriteConcernResolver {
private enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
class QueryCursorPreparer implements CursorPreparer {
@@ -1872,7 +1944,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Creates a new {@link GeoNearResultDbObjectCallback} using the given {@link DbObjectCallback} delegate for
* {@link GeoResult} content unmarshalling.
*
* @param delegate
* @param delegate must not be {@literal null}.
*/
public GeoNearResultDbObjectCallback(DbObjectCallback<T> delegate, Metric metric) {
Assert.notNull(delegate);

View File

@@ -47,7 +47,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* Create an instance of {@link SimpleMongoDbFactory} given the {@link Mongo} instance and database name.
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null}.
* @param databaseName database name, must not be {@literal null} or empty.
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, UserCredentials.NO_CREDENTIALS, false);
@@ -57,7 +57,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
*/
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,27 +13,25 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* A strategy interface to determine the WriteConcern to use for a given MongoDbAction.
*
* Return the passed in default WriteConcern (a property on MongoAction) if no determination can be made.
* A strategy interface to determine the {@link WriteConcern} to use for a given {@link MongoAction}. Return the passed
* in default {@link WriteConcern} (a property on {@link MongoAction}) if no determination can be made.
*
* @author Mark Pollack
*
* @author Oliver Gierke
*/
public interface WriteConcernResolver {
/**
* Resolve the WriteConcern given the MongoAction
* Resolve the {@link WriteConcern} given the {@link MongoAction}.
*
* @param action describes the context of the Mongo action. Contains a default WriteConcern to use if one should not
* be resolved.
* @return a WriteConcern based on the passed in MongoAction value, maybe null
* @param action describes the context of the Mongo action. Contains a default {@link WriteConcern} to use if one
* should not be resolved.
* @return a {@link WriteConcern} based on the passed in {@link MongoAction} value, maybe {@literal null}.
*/
WriteConcern resolve(MongoAction action);
}
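A minimal implementation sketch of the interface; the per-operation rule is made up for illustration and not part of this change set.

import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.MongoActionOperation;
import org.springframework.data.mongodb.core.WriteConcernResolver;

import com.mongodb.WriteConcern;

class OperationAwareWriteConcernResolver implements WriteConcernResolver {

	public WriteConcern resolve(MongoAction action) {

		// Acknowledge removes more strictly in this hypothetical policy.
		if (MongoActionOperation.REMOVE.equals(action.getMongoActionOperation())) {
			return WriteConcern.SAFE;
		}

		// Fall back to the default carried by the action itself.
		return action.getDefaultWriteConcern();
	}
}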

View File

@@ -1,5 +1,28 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Enum to represent how strict the check of {@link com.mongodb.WriteResult} shall be. It can either be skipped entirely
* (use {@link #NONE}), or errors can be logged ({@link #LOG}) or cause an exception to be thrown ({@link #EXCEPTION}).
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
public enum WriteResultChecking {
NONE, LOG, EXCEPTION
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.math.BigInteger;
@@ -33,8 +32,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObj
* Base class for {@link MongoConverter} implementations. Sets up a {@link GenericConversionService} and populates basic
* converters. Allows registering {@link CustomConversions}.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
* @author Jon Brisbin
* @author Oliver Gierke
*/
public abstract class AbstractMongoConverter implements MongoConverter, InitializingBean {
@@ -94,6 +93,14 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
conversions.registerConvertersIn(conversionService);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object)
*/
public Object convertToMongoType(Object obj) {
return convertToMongoType(obj, null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.convert.MongoConverter#getConversionService()

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,9 +17,11 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import org.slf4j.Logger;
@@ -31,6 +33,7 @@ import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.JodaTimeConverters;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
@@ -62,6 +65,7 @@ public class CustomConversions {
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
private final SimpleTypeHolder simpleTypeHolder;
private final Map<Class<?>, HashMap<Class<?>, CacheValue>> cache;
private final List<Object> converters;
@@ -84,6 +88,7 @@ public class CustomConversions {
this.readingPairs = new HashSet<ConvertiblePair>();
this.writingPairs = new HashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.cache = new HashMap<Class<?>, HashMap<Class<?>, CacheValue>>();
this.converters = new ArrayList<Object>();
this.converters.add(CustomToStringConverter.INSTANCE);
@@ -93,6 +98,7 @@ public class CustomConversions {
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
this.converters.addAll(converters);
for (Object c : this.converters) {
@@ -266,9 +272,11 @@ public class CustomConversions {
* @return
*/
public boolean hasCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
Assert.notNull(source);
Assert.notNull(expectedTargetType);
return getCustomTarget(source, expectedTargetType, readingPairs) != null;
return getCustomReadTarget(source, expectedTargetType) != null;
}
/**
@@ -297,8 +305,32 @@ public class CustomConversions {
return null;
}
private Class<?> getCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
Class<?> type = expectedTargetType == null ? PlaceholderType.class : expectedTargetType;
Map<Class<?>, CacheValue> map;
CacheValue toReturn;
if ((map = cache.get(source)) == null || (toReturn = map.get(type)) == null) {
Class<?> target = getCustomTarget(source, type, readingPairs);
if (cache.get(source) == null) {
cache.put(source, new HashMap<Class<?>, CacheValue>());
}
Map<Class<?>, CacheValue> value = cache.get(source);
toReturn = target == null ? CacheValue.NULL : new CacheValue(target);
value.put(type, toReturn);
}
return toReturn.clazz;
}
@WritingConverter
private enum CustomToStringConverter implements GenericConverter {
INSTANCE;
public Set<ConvertiblePair> getConvertibleTypes() {
@@ -311,4 +343,30 @@ public class CustomConversions {
return source.toString();
}
}
/**
* Placeholder type to allow registering not-found values in the converter cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
*/
private static class PlaceholderType {
}
/**
* Wrapper to safely store {@literal null} values in the type cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
*/
private static class CacheValue {
public static final CacheValue NULL = new CacheValue(null);
private final Class<?> clazz;
public CacheValue(Class<?> clazz) {
this.clazz = clazz;
}
}
}
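A short sketch of how the cached read-target lookup is exercised from client code; StringToLocaleConverter is a hypothetical converter registered only for illustration:

import java.util.Arrays;
import java.util.Locale;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

class CustomConversionsExample {

    @ReadingConverter
    static class StringToLocaleConverter implements Converter<String, Locale> {
        public Locale convert(String source) {
            return new Locale(source);
        }
    }

    public static void main(String[] args) {
        CustomConversions conversions = new CustomConversions(Arrays.asList(new StringToLocaleConverter()));
        // The first lookup resolves the target from the registered pairs and stores it as a CacheValue;
        // subsequent lookups for the same source/target combination are answered from the cache.
        System.out.println(conversions.hasCustomReadTarget(String.class, Locale.class)); // true
    }
}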

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 by the original author(s).
* Copyright 2011-2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,11 +31,13 @@ import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
@@ -46,6 +48,7 @@ import org.springframework.data.mapping.model.PersistentEntityParameterValueProv
import org.springframework.data.mapping.model.PropertyValueProvider;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELExpressionParameterValueProvider;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -68,6 +71,7 @@ import com.mongodb.DBRef;
*
* @author Oliver Gierke
* @author Jon Brisbin
* @author Patrik Wasik
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
@@ -215,9 +219,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, parent);
parameterProvider.setSpELEvaluator(evaluator);
return parameterProvider;
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
parent);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, Object parent) {
@@ -235,10 +239,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
boolean isConstructorProperty = entity.isConstructorArgument(prop);
boolean hasValueForProperty = dbo.containsField(prop.getFieldName());
if (!hasValueForProperty || isConstructorProperty) {
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
return;
}
@@ -346,13 +347,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
final BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(obj, conversionService);
// Write the ID
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
boolean fieldAccessOnly = idProperty.usePropertyAccess() ? false : useFieldAccessOnly;
try {
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
Object id = wrapper.getProperty(idProperty, Object.class, fieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {
}
@@ -366,7 +368,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
Object propertyObj = wrapper.getProperty(prop, prop.getType(), useFieldAccessOnly);
boolean fieldAccessOnly = prop.usePropertyAccess() ? false : useFieldAccessOnly;
Object propertyObj = wrapper.getProperty(prop, prop.getType(), fieldAccessOnly);
if (null != propertyObj) {
if (!conversions.isSimpleType(propertyObj.getClass())) {
@@ -408,8 +412,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (valueType.isMap()) {
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj, type);
DBObject mapDbObj = createMap((Map<Object, Object>) obj, prop);
dbo.put(name, mapDbObj);
return;
}
@@ -489,6 +492,42 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbList;
}
/**
* Writes the given {@link Map} using the given {@link MongoPersistentProperty} information.
*
* @param map must not be {@literal null}.
* @param property must not be {@literal null}.
* @return
*/
protected DBObject createMap(Map<Object, Object> map, MongoPersistentProperty property) {
Assert.notNull(map, "Given map must not be null!");
Assert.notNull(property, "PersistentProperty must not be null!");
if (!property.isDbReference()) {
return writeMapInternal(map, new BasicDBObject(), property.getTypeInformation());
}
BasicDBObject dbObject = new BasicDBObject();
for (Map.Entry<Object, Object> entry : map.entrySet()) {
Object key = entry.getKey();
Object value = entry.getValue();
if (conversions.isSimpleType(key.getClass())) {
String simpleKey = potentiallyEscapeMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property.getDBRef()) : null);
} else {
throw new MappingException("Cannot use a complex object as a key value.");
}
}
return dbObject;
}
/**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
*
@@ -810,8 +849,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return rootList;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
*/
@SuppressWarnings("unchecked")
public Object convertToMongoType(Object obj) {
public Object convertToMongoType(Object obj, TypeInformation<?> typeInformation) {
if (obj == null) {
return null;
@@ -822,7 +865,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return conversionService.convert(obj, target);
}
if (null != obj && conversions.isSimpleType(obj.getClass())) {
if (conversions.isSimpleType(obj.getClass())) {
// Doesn't need conversion
return getPotentiallyConvertedSimpleWrite(obj);
}
@@ -858,7 +901,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject newDbo = new BasicDBObject();
this.write(obj, newDbo);
return removeTypeInfoRecursively(newDbo);
if (typeInformation == null) {
return removeTypeInfoRecursively(newDbo);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
}
public BasicDBList maybeConvertList(Iterable<?> source) {
@@ -930,7 +978,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* (non-Javadoc)
* @see org.springframework.data.convert.PropertyValueProvider#getPropertyValue(org.springframework.data.mapping.PersistentProperty)
*/
@SuppressWarnings("unchecked")
public <T> T getPropertyValue(MongoPersistentProperty property) {
String expression = property.getSpelExpression();
@@ -940,20 +987,60 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
TypeInformation<?> type = property.getTypeInformation();
Class<?> rawType = type.getType();
return readValue(value, property.getTypeInformation(), parent);
}
}
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, ((DBRef) value).fetch(), parent));
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
/**
* Extension of {@link SpELExpressionParameterValueProvider} to recursively trigger value conversion on the raw
* resolved SpEL value.
*
* @author Oliver Gierke
*/
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final Object parent;
/**
* Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
*
* @param evaluator must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, Object parent) {
super(evaluator, conversionService, delegate);
this.parent = parent;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.SpELExpressionParameterValueProvider#potentiallyConvertSpelValue(java.lang.Object, org.springframework.data.mapping.PreferredConstructor.Parameter)
*/
@Override
protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
return readValue(object, parameter.getType(), parent);
}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
Class<?> rawType = type.getType();
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) (rawType.equals(DBRef.class) ? value : read(type, ((DBRef) value).fetch(), parent));
} else if (value instanceof BasicDBList) {
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
} else {
return (T) getPotentiallyConvertedSimpleRead(value, rawType);
}
}
}
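The new createMap(…) path means a Map property annotated with @DBRef is written as a document of DBRefs keyed by the (potentially escaped) map key instead of being embedded. A hypothetical domain model exercising it:

import java.util.Map;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Account {
    @Id String id;
}

@Document
class Customer {

    @Id String id;

    // Each map value is stored as a DBRef; complex (non-simple) map keys are rejected with a MappingException.
    @DBRef
    Map<String, Account> accounts;
}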

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.EntityWriter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -35,11 +36,21 @@ public interface MongoWriter<T> extends EntityWriter<T, DBObject> {
* Converts the given object into one Mongo will be able to store natively. If the given object can already be stored
* as is, no conversion will happen.
*
* @param obj
* @param obj can be {@literal null}.
* @return
*/
Object convertToMongoType(Object obj);
/**
* Converts the given object into one Mongo will be able to store natively but retains the type information in case
* the given {@link TypeInformation} differs from the given object type.
*
* @param obj can be {@literal null}.
* @param typeInformation can be {@literal null}.
* @return
*/
Object convertToMongoType(Object obj, TypeInformation<?> typeInformation);
/**
* Creates a {@link DBRef} to refer to the given object.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -41,6 +41,7 @@ import com.mongodb.DBRef;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Patryk Wasik
*/
public class QueryMapper {
@@ -83,11 +84,8 @@ public class QueryMapper {
for (String key : query.keySet()) {
MongoPersistentProperty targetProperty = getTargetProperty(key, entity);
String newKey = determineKey(key, entity);
Object value = query.get(key);
result.put(newKey, getMappedValue(value, targetProperty, newKey));
Field field = entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
result.put(field.getMappedKey(), getMappedValue(query.get(key), field));
}
return result;
@@ -125,13 +123,13 @@ public class QueryMapper {
* @param property
* @return
*/
public DBObject getMappedKeyword(Keyword keyword, MongoPersistentProperty property) {
public DBObject getMappedKeyword(Keyword keyword, Field property) {
if (property.isAssociation()) {
convertAssociation(keyword.value, property);
convertAssociation(keyword.value, property.getProperty());
}
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property, keyword.key));
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property.with(keyword.key)));
}
/**
@@ -143,13 +141,9 @@ public class QueryMapper {
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Object source, MongoPersistentProperty property, String newKey) {
private Object getMappedValue(Object source, Field key) {
if (property == null) {
return convertSimpleOrDBObject(source, null);
}
if (property.isIdProperty() || "_id".equals(newKey)) {
if (key.isIdField()) {
if (source instanceof DBObject) {
DBObject valueDbo = (DBObject) source;
@@ -173,53 +167,12 @@ public class QueryMapper {
}
}
if (property.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), property) : convertAssociation(source,
property);
if (key.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), key) : convertAssociation(source,
key.getProperty());
}
return convertSimpleOrDBObject(source, mappingContext.getPersistentEntity(property));
}
private MongoPersistentProperty getTargetProperty(String key, MongoPersistentEntity<?> entity) {
if (isIdKey(key, entity)) {
return entity.getIdProperty();
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? null : path.getLeafProperty();
}
private PersistentPropertyPath<MongoPersistentProperty> getPath(String key, MongoPersistentEntity<?> entity) {
if (entity == null) {
return null;
}
try {
PropertyPath path = PropertyPath.from(key, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
} catch (PropertyReferenceException e) {
return null;
}
}
/**
* Returns the translated key assuming the given one is a property (path) reference.
*
* @param key the source key
* @param entity the base entity
* @return the translated key
*/
private String determineKey(String key, MongoPersistentEntity<?> entity) {
if (entity == null && DEFAULT_ID_NAMES.contains(key)) {
return "_id";
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? key : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
return convertSimpleOrDBObject(source, key.getPropertyEntity());
}
/**
@@ -263,29 +216,17 @@ public class QueryMapper {
return result;
}
return source instanceof DBRef ? source : converter.toDBRef(source, property);
}
/**
* Returns whether the given key will be considered an id key.
*
* @param key
* @param entity
* @return
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (entity == null) {
return false;
if (property.isMap()) {
BasicDBObject result = new BasicDBObject();
DBObject dbObject = (DBObject) source;
for (String key : dbObject.keySet()) {
Object o = dbObject.get(key);
result.put(key, o instanceof DBRef ? o : converter.toDBRef(o, property));
}
return result;
}
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}
return DEFAULT_ID_NAMES.contains(key);
return source == null || source instanceof DBRef ? source : converter.toDBRef(source, property);
}
/**
@@ -344,4 +285,192 @@ public class QueryMapper {
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
}
}
/**
* Value object to represent a field and its meta-information.
*
* @author Oliver Gierke
*/
private static class Field {
private static final String ID_KEY = "_id";
protected final String name;
/**
* Creates a new {@link Field} without backing meta-information, carrying the given name only.
*
* @param name must not be {@literal null} or empty.
*/
public Field(String name) {
Assert.hasText(name, "Name must not be null!");
this.name = name;
}
/**
* Returns a new {@link Field} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public Field with(String name) {
return new Field(name);
}
/**
* Returns whether the current field is the id field.
*
* @return
*/
public boolean isIdField() {
return ID_KEY.equals(name);
}
/**
* Returns the underlying {@link MongoPersistentProperty} backing the field.
*
* @return
*/
public MongoPersistentProperty getProperty() {
return null;
}
/**
* Returns the {@link MongoPersistentEntity} the field is contained in.
*
* @return
*/
public MongoPersistentEntity<?> getPropertyEntity() {
return null;
}
/**
* Returns whether the field represents an association.
*
* @return
*/
public boolean isAssociation() {
return false;
}
/**
* Returns the key to be used in the mapped document eventually.
*
* @return
*/
public String getMappedKey() {
return isIdField() ? ID_KEY : name;
}
}
/**
* Extension of {@link Field} to be backed with mapping metadata.
*
* @author Oliver Gierke
*/
private static class MetadataBackedField extends Field {
private final MongoPersistentEntity<?> entity;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoPersistentProperty property;
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
* {@link MappingContext}.
*
* @param name must not be {@literal null} or empty.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
super(name);
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
this.entity = entity;
this.mappingContext = context;
PersistentPropertyPath<MongoPersistentProperty> path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#with(java.lang.String)
*/
@Override
public MetadataBackedField with(String name) {
return new MetadataBackedField(name, entity, mappingContext);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#isIdKey()
*/
@Override
public boolean isIdField() {
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return idProperty.getName().equals(name) || idProperty.getFieldName().equals(name);
}
return DEFAULT_ID_NAMES.contains(name);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getProperty()
*/
@Override
public MongoPersistentProperty getProperty() {
return property;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getEntity()
*/
@Override
public MongoPersistentEntity<?> getPropertyEntity() {
MongoPersistentProperty property = getProperty();
return property == null ? null : mappingContext.getPersistentEntity(property);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#isAssociation()
*/
@Override
public boolean isAssociation() {
MongoPersistentProperty property = getProperty();
return property == null ? false : property.isAssociation();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTargetKey()
*/
@Override
public String getMappedKey() {
PersistentPropertyPath<MongoPersistentProperty> path = getPath(name);
return path == null ? name : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
}
private PersistentPropertyPath<MongoPersistentProperty> getPath(String name) {
try {
PropertyPath path = PropertyPath.from(name, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
} catch (PropertyReferenceException e) {
return null;
}
}
}
}
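A sketch of the mapper at work, assuming a MappingMongoConverter and MongoMappingContext are already wired up and that QueryMapper is constructed from the converter as in this codebase; Person is a hypothetical domain type:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import com.mongodb.DBObject;

import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class QueryMapperSketch {

    static class Person {
        String id;
        String firstname;
    }

    static DBObject mapIdQuery(MappingMongoConverter converter, MongoMappingContext context) {
        QueryMapper mapper = new QueryMapper(converter);
        // MetadataBackedField recognizes "id" as the id property, so the result is { "_id" : "4711" }.
        return mapper.getMappedObject(query(where("id").is("4711")).getQueryObject(),
                context.getPersistentEntity(Person.class));
    }
}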

View File

@@ -0,0 +1,43 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @author Oliver Gierke
*/
public class CustomMetric implements Metric {
private final double multiplier;
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
*/
public CustomMetric(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
}
}
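CustomMetric plugs in wherever the geo API expects a Metric, e.g. when building a Distance; the nautical-mile multiplier below is an illustrative assumption, not a normative value:

import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;

class NauticalMiles {

    // Roughly the earth's radius expressed in nautical miles.
    static final Metric NAUTICAL_MILES = new CustomMetric(3443.9);

    static Distance tenNauticalMiles() {
        return new Distance(10, NAUTICAL_MILES);
    }
}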

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,7 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
*/
@Target({ ElementType.TYPE })
@Documented
@@ -69,4 +70,12 @@ public @interface CompoundIndex {
* @return
*/
String collection() default "";
/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,18 +18,20 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {
public enum Duplicates {
RETAIN, DROP
}
private final Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();
private final Map<String, Direction> fieldSpec = new LinkedHashMap<String, Direction>();
private String name;
@@ -42,12 +44,37 @@ public class Index implements IndexDefinition {
public Index() {
}
public Index(String key, Order order) {
fieldSpec.put(key, order);
public Index(String key, Direction direction) {
fieldSpec.put(key, direction);
}
/**
* Creates a new {@link Indexed} on the given key and {@link Order}.
*
* @deprecated use {@link #Index(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
*/
@Deprecated
public Index(String key, Order order) {
this(key, order.toDirection());
}
/**
* Adds the given field to the index.
*
* @deprecated use {@link #on(String, Direction)} instead.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
* @return
*/
@Deprecated
public Index on(String key, Order order) {
fieldSpec.put(key, order);
return on(key, order.toDirection());
}
public Index on(String key, Direction direction) {
fieldSpec.put(key, direction);
return this;
}
@@ -76,7 +103,7 @@ public class Index implements IndexDefinition {
public DBObject getIndexKeys() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, (fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1));
dbo.put(k, fieldSpec.get(k).equals(Direction.ASC) ? 1 : -1);
}
return dbo;
}
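With Order deprecated, index definitions are now built from Sort.Direction; a brief sketch using the new constructor and on(…) overload introduced above:

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.Index;

class IndexDefinitions {

    // Equivalent to the deprecated Order-based calls, expressed with the Direction API.
    static Index lastnameAscAgeDesc() {
        return new Index("lastname", Direction.ASC).on("age", Direction.DESC);
    }
}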

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -24,30 +25,38 @@ import org.springframework.util.ObjectUtils;
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
public final class IndexField {
private final String key;
private final Order order;
private final Direction direction;
private final boolean isGeo;
private IndexField(String key, Order order, boolean isGeo) {
private IndexField(String key, Direction direction, boolean isGeo) {
Assert.hasText(key);
Assert.isTrue(order != null ^ isGeo);
Assert.isTrue(direction != null ^ isGeo);
this.key = key;
this.order = order;
this.direction = direction;
this.isGeo = isGeo;
}
/**
* Creates a default {@link IndexField} with the given key and {@link Order}.
*
* @deprecated use {@link #create(String, Direction)}.
* @param key must not be {@literal null} or empty.
* @param order must not be {@literal null}.
* @param direction must not be {@literal null}.
* @return
*/
@Deprecated
public static IndexField create(String key, Order order) {
Assert.notNull(order);
return new IndexField(key, order.toDirection(), false);
}
public static IndexField create(String key, Direction order) {
Assert.notNull(order);
return new IndexField(key, order, false);
}
@@ -70,12 +79,23 @@ public final class IndexField {
}
/**
* Returns the order of the {@link IndexField} or {@literal null} in case we have a geo index field.
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @return the order
* @deprecated use {@link #getDirection()} instead.
* @return the direction
*/
@Deprecated
public Order getOrder() {
return order;
return Direction.ASC.equals(direction) ? Order.ASCENDING : Order.DESCENDING;
}
/**
* Returns the direction of the {@link IndexField} or {@literal null} in case we have a geo index field.
*
* @return the direction
*/
public Direction getDirection() {
return direction;
}
/**
@@ -104,7 +124,8 @@ public final class IndexField {
IndexField that = (IndexField) obj;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.order, that.order) && this.isGeo == that.isGeo;
return this.key.equals(that.key) && ObjectUtils.nullSafeEquals(this.direction, that.direction)
&& this.isGeo == that.isGeo;
}
/*
@@ -116,7 +137,7 @@ public final class IndexField {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(key);
result += 31 * ObjectUtils.nullSafeHashCode(order);
result += 31 * ObjectUtils.nullSafeHashCode(direction);
result += 31 * ObjectUtils.nullSafeHashCode(isGeo);
return result;
}
@@ -127,6 +148,6 @@ public final class IndexField {
*/
@Override
public String toString() {
return String.format("IndexField [ key: %s, order: %s, isGeo: %s]", key, order, isGeo);
return String.format("IndexField [ key: %s, direction: %s, isGeo: %s]", key, direction, isGeo);
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.ElementType;
@@ -24,7 +23,9 @@ import java.lang.annotation.Target;
/**
* Mark a field to be indexed using MongoDB's indexing feature.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -41,4 +42,12 @@ public @interface Indexed {
String name() default "";
String collection() default "";
/**
* If {@literal true} the index will be created in the background.
*
* @see http://docs.mongodb.org/manual/core/indexes/#background-construction
* @return
*/
boolean background() default false;
}
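A hypothetical document using the new attribute; the single-field index on lastname is then created with MongoDB's background option:

import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Person {

    // Translated into ensureIndex(…, background = true) by MongoPersistentEntityIndexCreator.
    @Indexed(background = true)
    String lastname;
}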

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -43,6 +43,7 @@ import com.mongodb.util.JSON;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Philipp Schneider
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -106,7 +107,8 @@ public class MongoPersistentEntityIndexCreator implements
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
@@ -140,7 +142,8 @@ public class MongoPersistentEntityIndexCreator implements
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
@@ -191,13 +194,14 @@ public class MongoPersistentEntityIndexCreator implements
* @param sparse sparse or not
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
boolean dropDups, boolean sparse, boolean background) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
opts.put("background", background);
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,12 +16,17 @@
package org.springframework.data.mongodb.core.mapping;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoCollectionUtils;
@@ -38,17 +43,15 @@ import org.springframework.util.StringUtils;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Patryk Wasik
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T>, ApplicationContextAware {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private final String collection;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
private MongoPersistentProperty versionProperty;
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
* collection name to the entity's simple type name.
@@ -73,27 +76,6 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.MutablePersistentEntity#addPersistentProperty(P)
*/
@Override
public void addPersistentProperty(MongoPersistentProperty property) {
if (property.isVersionProperty()) {
if (this.versionProperty != null) {
throw new MappingException(String.format(
"Attempt to add version property %s but already have property %s registered "
+ "as version. Check your mapping configuration!", property.getField(), versionProperty.getField()));
}
this.versionProperty = property;
}
super.addPersistentProperty(property);
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
@@ -110,25 +92,21 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
Expression expression = parser.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
return expression.getValue(context, String.class);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getVersionProperty()
* @see org.springframework.data.mapping.model.BasicPersistentEntity#verify()
*/
public MongoPersistentProperty getVersionProperty() {
return versionProperty;
}
@Override
public void verify() {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasVersionProperty()
*/
public boolean hasVersionProperty() {
return getVersionProperty() != null;
AssertFieldNameUniquenessHandler handler = new AssertFieldNameUniquenessHandler();
doWithProperties(handler);
doWithAssociations(handler);
}
/**
@@ -157,4 +135,37 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
return o1.getFieldOrder() - o2.getFieldOrder();
}
}
/**
* Handler to collect {@link MongoPersistentProperty} instances and check that each of them is mapped to a distinct
* field name.
*
* @author Oliver Gierke
*/
private static class AssertFieldNameUniquenessHandler implements PropertyHandler<MongoPersistentProperty>,
AssociationHandler<MongoPersistentProperty> {
private final Map<String, MongoPersistentProperty> properties = new HashMap<String, MongoPersistentProperty>();
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
assertUniqueness(persistentProperty);
}
public void doWithAssociation(Association<MongoPersistentProperty> association) {
assertUniqueness(association.getInverse());
}
private void assertUniqueness(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
MongoPersistentProperty existingProperty = properties.get(fieldName);
if (existingProperty != null) {
throw new MappingException(String.format(AMBIGUOUS_FIELD_MAPPING, property.toString(),
existingProperty.toString(), fieldName));
}
properties.put(fieldName, property);
}
}
}
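As an illustration, the new verify() step rejects a mapping like the hypothetical one below, because both properties would end up under the field name "name":

import org.springframework.data.mongodb.core.mapping.Field;

class ConflictingDocument {

    @Field("name")
    String firstname;

    String name; // triggers the AMBIGUOUS_FIELD_MAPPING MappingException during verification
}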

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,7 +26,9 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
@@ -46,6 +48,8 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
private static final Set<Class<?>> SUPPORTED_ID_TYPES = new HashSet<Class<?>>();
private static final Set<String> SUPPORTED_ID_PROPERTY_NAMES = new HashSet<String>();
private static final Field CAUSE_FIELD;
static {
SUPPORTED_ID_TYPES.add(ObjectId.class);
SUPPORTED_ID_TYPES.add(String.class);
@@ -53,8 +57,12 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
SUPPORTED_ID_PROPERTY_NAMES.add("id");
SUPPORTED_ID_PROPERTY_NAMES.add("_id");
CAUSE_FIELD = ReflectionUtils.findField(Throwable.class, "cause");
}
private final FieldNamingStrategy fieldNamingStrategy;
/**
* Creates a new {@link BasicMongoPersistentProperty}.
*
@@ -62,10 +70,14 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
*/
public BasicMongoPersistentProperty(Field field, PropertyDescriptor propertyDescriptor,
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder, FieldNamingStrategy fieldNamingStrategy) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
this.fieldNamingStrategy = fieldNamingStrategy == null ? PropertyNameFieldNamingStrategy.INSTANCE
: fieldNamingStrategy;
if (isIdProperty() && getFieldName() != ID_FIELD_NAME) {
LOG.warn("Customizing field name for id property not allowed! Custom name will not be considered!");
@@ -108,9 +120,20 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return ID_FIELD_NAME;
}
org.springframework.data.mongodb.core.mapping.Field annotation = getField().getAnnotation(
org.springframework.data.mongodb.core.mapping.Field.class);
return annotation != null && StringUtils.hasText(annotation.value()) ? annotation.value() : field.getName();
org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);
if (annotation != null && StringUtils.hasText(annotation.value())) {
return annotation.value();
}
String fieldName = fieldNamingStrategy.getFieldName(this);
if (!StringUtils.hasText(fieldName)) {
throw new MappingException(String.format("Invalid (null or empty) field name returned for property %s by %s!",
this, fieldNamingStrategy.getClass()));
}
return fieldName;
}
/*
@@ -148,11 +171,11 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return getField().getAnnotation(DBRef.class);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isVersionProperty()
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#usePropertyAccess()
*/
public boolean isVersionProperty() {
return getField().isAnnotationPresent(Version.class);
public boolean usePropertyAccess() {
return CAUSE_FIELD.equals(getField());
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,10 +38,11 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
*/
public CachingMongoPersistentProperty(Field field, PropertyDescriptor propertyDescriptor,
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
MongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder, FieldNamingStrategy fieldNamingStrategy) {
super(field, propertyDescriptor, owner, simpleTypeHolder, fieldNamingStrategy);
}
/*

View File

@@ -0,0 +1,46 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.util.Locale;
/**
* {@link FieldNamingStrategy} that abbreviates field names by using the very first letter of the camel case parts of
* the {@link MongoPersistentProperty}'s name.
*
* @since 1.3
* @author Oliver Gierke
*/
public class CamelCaseAbbreviatingFieldNamingStrategy implements FieldNamingStrategy {
private static final String CAMEL_CASE_PATTERN = "(?<!(^|[A-Z]))(?=[A-Z])|(?<!^)(?=[A-Z][a-z])";
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public String getFieldName(MongoPersistentProperty property) {
String[] parts = property.getName().split(CAMEL_CASE_PATTERN);
StringBuilder builder = new StringBuilder();
for (String part : parts) {
builder.append(part.substring(0, 1).toLowerCase(Locale.US));
}
return builder.toString();
}
}

View File

@@ -0,0 +1,36 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
/**
* SPI interface to determine how to name document fields in case the field name is not manually defined.
*
* @see Field
* @see PropertyNameFieldNamingStrategy
* @see CamelCaseAbbreviatingFieldNamingStrategy
* @since 1.3
* @author Oliver Gierke
*/
public interface FieldNamingStrategy {
/**
* Returns the field name to be used for the given {@link MongoPersistentProperty}.
*
* @param property must not be {@literal null}.
* @return
*/
String getFieldName(MongoPersistentProperty property);
}
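A custom strategy only needs to implement this single method; the snake_case variant below is a hypothetical example:

import java.util.Locale;

import org.springframework.data.mongodb.core.mapping.FieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

class SnakeCaseFieldNamingStrategy implements FieldNamingStrategy {

    public String getFieldName(MongoPersistentProperty property) {
        // "firstName" becomes "first_name"
        return property.getName().replaceAll("(?<!^)(?=[A-Z])", "_").toLowerCase(Locale.US);
    }
}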

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.util.AbstractMap;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
@@ -36,6 +37,9 @@ import org.springframework.data.util.TypeInformation;
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty>
implements ApplicationContextAware {
private static final FieldNamingStrategy DEFAULT_NAMING_STRATEGY = PropertyNameFieldNamingStrategy.INSTANCE;
private FieldNamingStrategy fieldNamingStrategy = DEFAULT_NAMING_STRATEGY;
private ApplicationContext context;
/**
@@ -45,13 +49,24 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
}
/**
* Configures the {@link FieldNamingStrategy} to be used to determine the field name if no manual mapping is applied.
* Defaults to a strategy using the plain property name.
*
* @param fieldNamingStrategy the {@link FieldNamingStrategy} to be used to determine the field name if no manual
* mapping is applied.
*/
public void setFieldNamingStrategy(FieldNamingStrategy fieldNamingStrategy) {
this.fieldNamingStrategy = fieldNamingStrategy == null ? DEFAULT_NAMING_STRATEGY : fieldNamingStrategy;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#shouldCreatePersistentEntityFor(org.springframework.data.util.TypeInformation)
*/
@Override
protected boolean shouldCreatePersistentEntityFor(TypeInformation<?> type) {
return !MongoSimpleTypes.HOLDER.isSimpleType(type.getType());
return !MongoSimpleTypes.HOLDER.isSimpleType(type.getType()) && !AbstractMap.class.isAssignableFrom(type.getType());
}
/*
@@ -61,7 +76,7 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
@Override
public MongoPersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
BasicMongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder);
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder, fieldNamingStrategy);
}
/*
@@ -86,8 +101,6 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
*/
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.context = applicationContext;
super.setApplicationContext(applicationContext);
}
}
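A configuration sketch wiring the abbreviating strategy into the mapping context; with it, a property like firstName is persisted as "fn" unless a @Field annotation overrides the name:

import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class MappingContextConfiguration {

    static MongoMappingContext abbreviatingContext() {
        MongoMappingContext context = new MongoMappingContext();
        // Passing null would silently fall back to PropertyNameFieldNamingStrategy.INSTANCE.
        context.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
        return context;
    }
}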

View File

@@ -21,7 +21,6 @@ import org.springframework.data.mapping.PersistentEntity;
* MongoDB specific {@link PersistentEntity} abstraction.
*
* @author Oliver Gierke
* @author Patryk Wasik
*/
public interface MongoPersistentEntity<T> extends PersistentEntity<T, MongoPersistentProperty> {
@@ -31,19 +30,4 @@ public interface MongoPersistentEntity<T> extends PersistentEntity<T, MongoPersi
* @return
*/
String getCollection();
/**
* Returns the {@link MongoPersistentProperty} that represents the version attribute of an entity. Will not be
* {@literal null} if {@link #hasVersionProperty()}.
*
* @return
*/
MongoPersistentProperty getVersionProperty();
/**
* Returns whether the entity has a property representing the version of the entity.
*
* @return
*/
boolean hasVersionProperty();
}

View File

@@ -56,13 +56,6 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
*/
DBRef getDBRef();
/**
* Returns whether the property is representing the version attribute of an entity.
*
* @return
*/
boolean isVersionProperty();
/**
* Simple {@link Converter} implementation to transform a {@link MongoPersistentProperty} into its field name.
*
@@ -80,4 +73,12 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
return source.getFieldName();
}
}
/**
* Returns whether property access shall be used for reading the property value. This means it will use the getter
* instead of field access.
*
* @return
*/
boolean usePropertyAccess();
}

View File

@@ -0,0 +1,35 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
/**
* {@link FieldNamingStrategy} simply using the {@link MongoPersistentProperty}'s name.
*
* @since 1.3
* @author Oliver Gierke
*/
public enum PropertyNameFieldNamingStrategy implements FieldNamingStrategy {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public String getFieldName(MongoPersistentProperty property) {
return property.getName();
}
}

View File

@@ -1,160 +0,0 @@
/*
* Copyright 2012-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.context.AbstractMappingContext;
import org.springframework.data.mapping.model.AbstractPersistentProperty;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
/**
* @deprecated use {@link MongoMappingContext} instead.
* @author Oliver Gierke
*/
@Deprecated
public class SimpleMongoMappingContext extends
AbstractMappingContext<SimpleMongoMappingContext.SimpleMongoPersistentEntity<?>, MongoPersistentProperty> {
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation)
*/
@Override
protected <T> SimpleMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
return new SimpleMongoPersistentEntity<T>(typeInformation);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.mapping.model.MutablePersistentEntity, org.springframework.data.mapping.model.SimpleTypeHolder)
*/
@Override
protected SimplePersistentProperty createPersistentProperty(Field field, PropertyDescriptor descriptor,
SimpleMongoPersistentEntity<?> owner, SimpleTypeHolder simpleTypeHolder) {
return new SimplePersistentProperty(field, descriptor, owner, simpleTypeHolder);
}
static class SimplePersistentProperty extends AbstractPersistentProperty<MongoPersistentProperty> implements
MongoPersistentProperty {
private static final List<String> ID_FIELD_NAMES = Arrays.asList("id", "_id");
/**
* Creates a new {@link SimplePersistentProperty}.
*
* @param field
* @param propertyDescriptor
* @param information
*/
public SimplePersistentProperty(Field field, PropertyDescriptor propertyDescriptor, MongoPersistentEntity<?> owner,
SimpleTypeHolder simpleTypeHolder) {
super(field, propertyDescriptor, owner, simpleTypeHolder);
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.BasicPersistentProperty#isIdProperty()
*/
public boolean isIdProperty() {
return ID_FIELD_NAMES.contains(field.getName());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getKey()
*/
public String getFieldName() {
return isIdProperty() ? "_id" : getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getFieldOrder()
*/
public int getFieldOrder() {
return Integer.MAX_VALUE;
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.AbstractPersistentProperty#createAssociation()
*/
@Override
protected Association<MongoPersistentProperty> createAssociation() {
return new Association<MongoPersistentProperty>(this, null);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isDbReference()
*/
public boolean isDbReference() {
return false;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#getDBRef()
*/
public DBRef getDBRef() {
return null;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentProperty#isVersion()
*/
public boolean isVersionProperty() {
return false;
}
}
static class SimpleMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T> {
/**
* @param information
*/
public SimpleMongoPersistentEntity(TypeInformation<T> information) {
super(information);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
return MongoCollectionUtils.getPreferredCollectionName(getType());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.mapping.MongoPersistentEntity#getVersionProperty()
*/
public MongoPersistentProperty getVersionProperty() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#hasVersionProperty()
*/
public boolean hasVersionProperty() {
return false;
}
}
}

View File

@@ -27,10 +27,13 @@ import java.lang.annotation.Target;
*
* @since 1.4
* @author Patryk Wasik
* @deprecated use {@link org.springframework.data.annotation.Version} instead.
*/
@Deprecated
@Documented
@Target({ FIELD })
@Retention(RUNTIME)
@org.springframework.data.annotation.Version
public @interface Version {
}

View File

@@ -0,0 +1,50 @@
/*
* Copyright 2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import com.mongodb.DBObject;
/**
* Base class for delete events.
*
* @author Martin Baumgartner
*/
public abstract class AbstractDeleteEvent<T> extends MongoMappingEvent<DBObject> {
private static final long serialVersionUID = 1L;
private final Class<T> type;
/**
* Creates a new {@link AbstractDeleteEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type can be {@literal null}.
*/
public AbstractDeleteEvent(DBObject dbo, Class<T> type) {
super(dbo, dbo);
this.type = type;
}
/**
* Returns the type the {@link AbstractDeleteEvent} shall be invoked for.
*
* @return
*/
public Class<T> getType() {
return type;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 by the original author(s).
* Copyright 2011-2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import com.mongodb.DBObject;
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
*/
public abstract class AbstractMongoEventListener<E> implements ApplicationListener<MongoMappingEvent<?>> {
@@ -45,6 +46,7 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
@SuppressWarnings("rawtypes")
public void onApplicationEvent(MongoMappingEvent<?> event) {
if (event instanceof AfterLoadEvent) {
@@ -57,6 +59,22 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
return;
}
if (event instanceof AbstractDeleteEvent) {
Class<?> eventDomainType = ((AbstractDeleteEvent) event).getType();
if (eventDomainType != null && domainClass.isAssignableFrom(eventDomainType)) {
if (event instanceof BeforeDeleteEvent) {
onBeforeDelete(event.getDBObject());
}
if (event instanceof AfterDeleteEvent) {
onAfterDelete(event.getDBObject());
}
}
return;
}
@SuppressWarnings("unchecked")
E source = (E) event.getSource();
@@ -78,31 +96,43 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
public void onBeforeConvert(E source) {
if (LOG.isDebugEnabled()) {
LOG.debug("onBeforeConvert(" + source + ")");
LOG.debug("onBeforeConvert({})", source);
}
}
public void onBeforeSave(E source, DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onBeforeSave(" + source + ", " + dbo + ")");
LOG.debug("onBeforeSave({}, {})", source, dbo);
}
}
public void onAfterSave(E source, DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterSave(" + source + ", " + dbo + ")");
LOG.debug("onAfterSave({}, {})", source, dbo);
}
}
public void onAfterLoad(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterLoad(" + dbo + ")");
LOG.debug("onAfterLoad({})", dbo);
}
}
public void onAfterConvert(DBObject dbo, E source) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert(" + dbo + "," + source + ")");
LOG.debug("onAfterConvert({}, {})", dbo, source);
}
}
public void onAfterDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
}
}
public void onBeforeDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
}
}
}
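With the new onBeforeDelete(…)/onAfterDelete(…) callbacks in place, a listener only needs to override the methods it cares about. A minimal sketch, assuming a hypothetical Person domain type and that the listener is registered as a Spring bean:

import com.mongodb.DBObject;

public class PersonDeleteListener extends AbstractMongoEventListener<Person> {

	@Override
	public void onBeforeDelete(DBObject dbo) {
		// dbo carries the query document driving the delete
		System.out.println("About to delete Person documents matching " + dbo);
	}

	@Override
	public void onAfterDelete(DBObject dbo) {
		System.out.println("Deleted Person documents matching " + dbo);
	}
}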

View File

@@ -0,0 +1,39 @@
/*
* Copyright 2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import com.mongodb.DBObject;
/**
* Event being thrown after one or more documents have been deleted. The {@link DBObject} held in the event
* will be the query document <em>after</em> it has been mapped onto the domain type handled.
*
* @author Martin Baumgartner
*/
public class AfterDeleteEvent<T> extends AbstractDeleteEvent<T> {
private static final long serialVersionUID = 1L;
/**
* Creates a new {@link AfterDeleteEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type can be {@literal null}.
*/
public AfterDeleteEvent(DBObject dbo, Class<T> type) {
super(dbo, type);
}
}

View File

@@ -0,0 +1,53 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import org.springframework.context.ApplicationListener;
import org.springframework.data.auditing.AuditingHandler;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.util.Assert;
/**
* Event listener to populate auditing-related fields on an entity about to be saved.
*
* @author Oliver Gierke
*/
public class AuditingEventListener implements ApplicationListener<BeforeConvertEvent<Object>> {
private final IsNewAwareAuditingHandler<Object> auditingHandler;
/**
* Creates a new {@link AuditingEventListener} using the given {@link MappingContext} and {@link AuditingHandler}.
*
* @param auditingHandler must not be {@literal null}.
*/
public AuditingEventListener(IsNewAwareAuditingHandler<Object> auditingHandler) {
Assert.notNull(auditingHandler, "IsNewAwareAuditingHandler must not be null!");
this.auditingHandler = auditingHandler;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(BeforeConvertEvent<Object> event) {
Object entity = event.getSource();
auditingHandler.markAudited(entity);
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,17 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Event being thrown before a domain object is converted to be persisted.
*
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class BeforeConvertEvent<T> extends MongoMappingEvent<T> {
private static final long serialVersionUID = 1L;
private static final long serialVersionUID = 252614269008845243L;
public BeforeConvertEvent(T source) {
super(source, null);

View File

@@ -0,0 +1,39 @@
/*
* Copyright 2013 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping.event;
import com.mongodb.DBObject;
/**
* Event being thrown before a document is deleted. The {@link DBObject} held in the event will represent the query
* document <em>before</em> being mapped based on the domain class handled.
*
* @author Martin Baumgartner
*/
public class BeforeDeleteEvent<T> extends AbstractDeleteEvent<T> {
private static final long serialVersionUID = -2627547705679734497L;
/**
* Creates a new {@link BeforeDeleteEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type can be {@literal null}.
*/
public BeforeDeleteEvent(DBObject dbo, Class<T> type) {
super(dbo, type);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,15 +17,19 @@ package org.springframework.data.mongodb.core.mapping.event;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import com.mongodb.DBObject;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* {@link ApplicationListener} for Mongo mapping events that logs them.
*
* @author Jon Brisbin
* @author Martin Baumgartner
*/
public class LoggingEventListener extends AbstractMongoEventListener<Object> {
private static final Logger log = LoggerFactory.getLogger(LoggingEventListener.class);
private static final Logger LOGGER = LoggerFactory.getLogger(LoggingEventListener.class);
/*
* (non-Javadoc)
@@ -33,7 +37,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onBeforeConvert(Object source) {
log.info("onBeforeConvert: " + source);
LOGGER.info("onBeforeConvert: {}", source);
}
/*
@@ -42,10 +46,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onBeforeSave(Object source, DBObject dbo) {
try {
log.info("onBeforeSave: " + source + ", " + dbo);
} catch (Throwable ignored) {
}
LOGGER.info("onBeforeSave: {}, {}", source, dbo);
}
/*
@@ -54,7 +55,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onAfterSave(Object source, DBObject dbo) {
log.info("onAfterSave: " + source + ", " + dbo);
LOGGER.info("onAfterSave: {}, {}", source, dbo);
}
/*
@@ -63,7 +64,7 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onAfterLoad(DBObject dbo) {
log.info("onAfterLoad: " + dbo);
LOGGER.info("onAfterLoad: {}", dbo);
}
/*
@@ -72,6 +73,24 @@ public class LoggingEventListener extends AbstractMongoEventListener<Object> {
*/
@Override
public void onAfterConvert(DBObject dbo, Object source) {
log.info("onAfterConvert: " + dbo + ", " + source);
LOGGER.info("onAfterConvert: {}, {}", dbo, source);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onAfterDelete(com.mongodb.DBObject)
*/
@Override
public void onAfterDelete(DBObject dbo) {
LOGGER.info("onAfterDelete: {}", dbo);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onBeforeDelete(com.mongodb.DBObject)
*/
@Override
public void onBeforeDelete(DBObject dbo) {
LOGGER.info("onBeforeDelete: {}", dbo);
}
}
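To pick up this logging, including the new delete callbacks, the listener only has to be registered as a bean; a sketch assuming a Java-config setup (the configuration class name is made up):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class MongoEventLoggingConfig {

	@Bean
	public LoggingEventListener mongoEventLogger() {
		return new LoggingEventListener();
	}
}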

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,30 +16,42 @@
package org.springframework.data.mongodb.core.mapreduce;
/**
* Value object to encapsulate results of a map-reduce count.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class MapReduceCounts {
private final int inputCount;
private final int emitCount;
private final int outputCount;
public static final MapReduceCounts NONE = new MapReduceCounts(-1, -1, -1);
private final long inputCount;
private final long emitCount;
private final long outputCount;
/**
* Creates a new {@link MapReduceCounts} using the given input count, emit count, and output count.
*
* @param inputCount
* @param emitCount
* @param outputCount
*/
public MapReduceCounts(long inputCount, long emitCount, long outputCount) {
public MapReduceCounts(int inputCount, int emitCount, int outputCount) {
super();
this.inputCount = inputCount;
this.emitCount = emitCount;
this.outputCount = outputCount;
}
public int getInputCount() {
public long getInputCount() {
return inputCount;
}
public int getEmitCount() {
public long getEmitCount() {
return emitCount;
}
public int getOutputCount() {
public long getOutputCount() {
return outputCount;
}
@@ -59,12 +71,15 @@ public class MapReduceCounts {
*/
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
long result = 1;
result = prime * result + emitCount;
result = prime * result + inputCount;
result = prime * result + outputCount;
return result;
return Long.valueOf(result).intValue();
}
/*

View File

@@ -118,15 +118,14 @@ public class MapReduceResults<T> implements Iterable<T> {
DBObject counts = (DBObject) rawResults.get("counts");
if (counts == null) {
return new MapReduceCounts(-1, -1, -1);
return MapReduceCounts.NONE;
}
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
return new MapReduceCounts((Integer) counts.get("input"), (Integer) counts.get("emit"),
(Integer) counts.get("output"));
return new MapReduceCounts(getAsLong(counts, "input"), getAsLong(counts, "emit"), getAsLong(counts, "output"));
}
return new MapReduceCounts(-1, -1, -1);
return MapReduceCounts.NONE;
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,15 +17,26 @@ package org.springframework.data.mongodb.core.query;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Patryk Wasik
*/
public class Field {
private Map<String, Integer> criteria = new HashMap<String, Integer>();
private Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
private String postionKey;
private int positionValue;
public Field include(String key) {
criteria.put(key, Integer.valueOf(1));
@@ -47,14 +58,104 @@ public class Field {
return this;
}
public Field elemMatch(String key, Criteria elemMatchCriteria) {
elemMatchs.put(key, elemMatchCriteria);
return this;
}
/**
* The array field must appear in the query. Only one positional {@code $} operator can appear in the projection and
* only one array field can appear in the query.
*
* @param field query array field, must not be {@literal null} or empty.
* @param value
* @return
*/
public Field position(String field, int value) {
Assert.hasText(field, "Field must not be null or empty!");
postionKey = field;
positionValue = value;
return this;
}
public DBObject getFieldsObject() {
DBObject dbo = new BasicDBObject();
for (String k : criteria.keySet()) {
dbo.put(k, (criteria.get(k)));
dbo.put(k, criteria.get(k));
}
for (String k : slices.keySet()) {
dbo.put(k, new BasicDBObject("$slice", (slices.get(k))));
dbo.put(k, new BasicDBObject("$slice", slices.get(k)));
}
for (Entry<String, Criteria> entry : elemMatchs.entrySet()) {
DBObject dbObject = new BasicDBObject("$elemMatch", entry.getValue().getCriteriaObject());
dbo.put(entry.getKey(), dbObject);
}
if (postionKey != null) {
dbo.put(postionKey + ".$", positionValue);
}
return dbo;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object object) {
if (this == object) {
return true;
}
if (!(object instanceof Field)) {
return false;
}
Field that = (Field) object;
if (!this.criteria.equals(that.criteria)) {
return false;
}
if (!this.slices.equals(that.slices)) {
return false;
}
if (!this.elemMatchs.equals(that.elemMatchs)) {
return false;
}
boolean samePositionKey = this.postionKey == null ? that.postionKey == null : this.postionKey
.equals(that.postionKey);
boolean samePositionValue = this.positionValue == that.positionValue;
return samePositionKey && samePositionValue;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(this.criteria);
result += 31 * ObjectUtils.nullSafeHashCode(this.elemMatchs);
result += 31 * ObjectUtils.nullSafeHashCode(this.slices);
result += 31 * ObjectUtils.nullSafeHashCode(this.postionKey);
result += 31 * ObjectUtils.nullSafeHashCode(this.positionValue);
return result;
}
}
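The new projection options combine fluently with the existing ones; a sketch (the field names "title", "comments" and "by" are made up for illustration):

Field fields = new Field();
fields.include("title")
		.elemMatch("comments", Criteria.where("by").is("dave"))
		.position("comments", 1);

// yields { "title" : 1, "comments" : { "$elemMatch" : { "by" : "dave" } }, "comments.$" : 1 }
DBObject projection = fields.getFieldsObject();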

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.geo.Metrics;
@@ -31,10 +32,12 @@ import com.mongodb.DBObject;
*/
public class NearQuery {
private final DBObject criteria;
private final Point point;
private Query query;
private Double maxDistance;
private Distance maxDistance;
private Metric metric;
private boolean spherical;
private Integer num;
/**
* Creates a new {@link NearQuery}.
@@ -45,13 +48,11 @@ public class NearQuery {
Assert.notNull(point);
this.criteria = new BasicDBObject();
this.criteria.put("near", point.asList());
this.point = point;
this.spherical = false;
this.metric = metric;
if (metric != null) {
spherical(true);
distanceMultiplier(metric);
in(metric);
}
}
@@ -105,12 +106,13 @@ public class NearQuery {
}
/**
* Returns the {@link Metric} underlying the actual query.
* Returns the {@link Metric} underlying the actual query. If no metric was set explicitly, {@link Metrics#NEUTRAL}
* will be returned.
*
* @return
* @return will never be {@literal null}.
*/
public Metric getMetric() {
return metric;
return metric == null ? Metrics.NEUTRAL : metric;
}
/**
@@ -120,68 +122,96 @@ public class NearQuery {
* @return
*/
public NearQuery num(int num) {
this.criteria.put("num", num);
this.num = num;
return this;
}
/**
* Sets the max distance results shall have from the configured origin. Will normalize the given value using a
* potentially already configured {@link Metric}.
* Sets the max distance results shall have from the configured origin. If a {@link Metric} was set before, the given
* value will be interpreted as a value in that metric. E.g.
*
* <pre>
* NearQuery query = near(10.0, 20.0, Metrics.KILOMETERS).maxDistance(150);
* </pre>
*
* Will set the maximum distance to 150 kilometers.
*
* @param maxDistance
* @return
*/
public NearQuery maxDistance(double maxDistance) {
this.maxDistance = getNormalizedDistance(maxDistance, this.metric);
return this;
return maxDistance(new Distance(maxDistance, getMetric()));
}
/**
* Sets the maximum distance supplied in a given metric. Will normalize the distance but not reconfigure the query's
* {@link Metric}.
* result {@link Metric} if one was configured before.
*
* @param maxDistance
* @param metric must not be {@literal null}.
* @return
*/
public NearQuery maxDistance(double maxDistance, Metric metric) {
Assert.notNull(metric);
this.spherical(true);
return maxDistance(getNormalizedDistance(maxDistance, metric));
return maxDistance(new Distance(maxDistance, metric));
}
/**
* Sets the maximum distance to the given {@link Distance}.
* Sets the maximum distance to the given {@link Distance}. Will set the returned {@link Metric} to be the one of the
* given {@link Distance} if no {@link Metric} was set before.
*
* @param distance
* @param distance must not be {@literal null}.
* @return
*/
public NearQuery maxDistance(Distance distance) {
Assert.notNull(distance);
return maxDistance(distance.getValue(), distance.getMetric());
if (distance.getMetric() != Metrics.NEUTRAL) {
this.spherical(true);
}
if (this.metric == null) {
in(distance.getMetric());
}
this.maxDistance = distance;
return this;
}
/**
* Configures a distance multiplier the resulting distances get applied.
* Returns the maximum {@link Distance}.
*
* @return
*/
public Distance getMaxDistance() {
return this.maxDistance;
}
/**
* Configures a {@link CustomMetric} with the given multiplier.
*
* @param distanceMultiplier
* @return
*/
public NearQuery distanceMultiplier(double distanceMultiplier) {
this.criteria.put("distanceMultiplier", distanceMultiplier);
this.metric = new CustomMetric(distanceMultiplier);
return this;
}
/**
* Configures the distance multiplier to the multiplier of the given {@link Metric}. Does <em>not</em> recalculate the
* {@link #maxDistance(double)}.
* Configures the distance multiplier to the multiplier of the given {@link Metric}.
*
* @deprecated use {@link #in(Metric)} instead.
* @param metric must not be {@literal null}.
* @return
*/
@Deprecated
public NearQuery distanceMultiplier(Metric metric) {
Assert.notNull(metric);
return distanceMultiplier(metric.getMultiplier());
return in(metric);
}
/**
@@ -191,10 +221,19 @@ public class NearQuery {
* @return
*/
public NearQuery spherical(boolean spherical) {
this.criteria.put("spherical", spherical);
this.spherical = spherical;
return this;
}
/**
* Returns whether spherical values will be returned.
*
* @return
*/
public boolean isSpherical() {
return this.spherical;
}
/**
* Will cause the results' distances to be returned in kilometers. Sets {@link #distanceMultiplier(double)} and
* {@link #spherical(boolean)} accordingly.
@@ -215,6 +254,18 @@ public class NearQuery {
return adaptMetric(Metrics.MILES);
}
/**
* Will cause the results' distances to be returned in the given metric. Sets {@link #distanceMultiplier(double)}
* accordingly as well as {@link #spherical(boolean)} if the given {@link Metric} is not {@link Metrics#NEUTRAL}.
*
* @param metric the metric the results shall be returned in. Uses {@link Metrics#NEUTRAL} if {@literal null} is
* passed.
* @return
*/
public NearQuery in(Metric metric) {
return adaptMetric(metric == null ? Metrics.NEUTRAL : metric);
}
/**
* Configures the given {@link Metric} to be used as the base for this query and recalculates the maximum distance if no
* metric was set before.
@@ -223,12 +274,12 @@ public class NearQuery {
*/
private NearQuery adaptMetric(Metric metric) {
if (this.metric == null && maxDistance != null) {
maxDistance(this.maxDistance, metric);
if (metric != Metrics.NEUTRAL) {
spherical(true);
}
spherical(true);
return distanceMultiplier(metric);
this.metric = metric;
return this;
}
/**
@@ -249,18 +300,27 @@ public class NearQuery {
*/
public DBObject toDBObject() {
BasicDBObject dbObject = new BasicDBObject(criteria.toMap());
BasicDBObject dbObject = new BasicDBObject();
if (query != null) {
dbObject.put("query", query.getQueryObject());
}
if (maxDistance != null) {
dbObject.put("maxDistance", maxDistance);
dbObject.put("maxDistance", this.maxDistance.getNormalizedValue());
}
if (metric != null) {
dbObject.put("distanceMultiplier", metric.getMultiplier());
}
if (num != null) {
dbObject.put("num", num);
}
dbObject.put("near", point.asList());
dbObject.put("spherical", spherical);
return dbObject;
}
private double getNormalizedDistance(double distance, Metric metric) {
return metric == null ? distance : distance / metric.getMultiplier();
}
}
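Since the criteria are now assembled lazily, the metric, distance multiplier and spherical flag only materialize when toDBObject() is called. A usage sketch based on the example in the Javadoc above:

NearQuery query = NearQuery.near(10.0, 20.0, Metrics.KILOMETERS).maxDistance(150);

// maxDistance, distanceMultiplier and spherical are derived from the configured Metric here
DBObject nearCommand = query.toDBObject();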

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,11 +15,31 @@
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.domain.Sort.Direction;
/**
* An enum that specifies the ordering for sort or index specifications
*
* @author trisberg
* @deprecated prefer {@link Direction}
* @author Thomas Risberg
* @author Oliver Gierke
*/
@Deprecated
public enum Order {
ASCENDING, DESCENDING
ASCENDING {
@Override
public Direction toDirection() {
return Direction.ASC;
}
},
DESCENDING {
@Override
public Direction toDirection() {
return Direction.DESC;
}
};
public abstract Direction toDirection();
}
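For callers still holding the deprecated enum, the bridge to Spring Data Commons is now a single call; a sketch:

Direction direction = Order.DESCENDING.toDirection(); // Direction.DESC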

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,7 @@ import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.util.Assert;
@@ -38,30 +39,39 @@ public class Query {
private LinkedHashMap<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private Field fieldSpec;
private Sort coreSort;
@SuppressWarnings("deprecation")
private org.springframework.data.mongodb.core.query.Sort sort;
private Sort sort;
private int skip;
private int limit;
private String hint;
/**
* Static factory method to create a Query using the provided criteria
* Static factory method to create a {@link Query} using the provided {@link Criteria}.
*
* @param critera
* @param criteria must not be {@literal null}.
* @return
*/
public static Query query(Criteria critera) {
return new Query(critera);
public static Query query(Criteria criteria) {
return new Query(criteria);
}
public Query() {
}
/**
* Creates a new {@link Query} using the given {@link Criteria}.
*
* @param criteria must not be {@literal null}.
*/
public Query(Criteria criteria) {
addCriteria(criteria);
}
/**
* Adds the given {@link Criteria} to the current {@link Query}.
*
* @param criteria must not be {@literal null}.
* @return
*/
public Query addCriteria(Criteria criteria) {
CriteriaDefinition existing = this.criteria.get(criteria.getKey());
String key = criteria.getKey();
@@ -104,21 +114,6 @@ public class Query {
return this;
}
/**
* Returns a {@link org.springframework.data.mongodb.core.query.Sort} instance to define ordering properties.
*
* @deprecated use {@link #with(Sort)} instead
* @return
*/
@Deprecated
public org.springframework.data.mongodb.core.query.Sort sort() {
if (this.sort == null) {
this.sort = new org.springframework.data.mongodb.core.query.Sort();
}
return this.sort;
}
/**
* Sets the given pagination information on the {@link Query} instance. Will transparently set {@code skip} and
* {@code limit} as well as applying the {@link Sort} instance defined with the {@link Pageable}.
@@ -150,10 +145,17 @@ public class Query {
return this;
}
if (this.coreSort == null) {
this.coreSort = sort;
for (Order order : sort) {
if (order.isIgnoreCase()) {
throw new IllegalArgumentException(String.format("Gven sort contained an Order for %s with ignore case! "
+ "MongoDB does not support sorting ignoreing case currently!", order.getProperty()));
}
}
if (this.sort == null) {
this.sort = sort;
} else {
this.coreSort = this.coreSort.and(sort);
this.sort = this.sort.and(sort);
}
return this;
@@ -176,25 +178,20 @@ public class Query {
return fieldSpec.getFieldsObject();
}
@SuppressWarnings("deprecation")
public DBObject getSortObject() {
if (this.coreSort == null && this.sort == null) {
if (this.sort == null) {
return null;
}
DBObject dbo = new BasicDBObject();
if (this.coreSort != null) {
for (org.springframework.data.domain.Sort.Order order : this.coreSort) {
if (this.sort != null) {
for (org.springframework.data.domain.Sort.Order order : this.sort) {
dbo.put(order.getProperty(), order.isAscending() ? 1 : -1);
}
}
if (this.sort != null) {
dbo.putAll(this.sort.getSortObject());
}
return dbo;
}
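After this consolidation, ordering is expressed solely through the Spring Data Commons Sort type, and a Sort carrying an ignore-case Order is rejected with an IllegalArgumentException. A sketch (property names are illustrative):

Query query = Query.query(Criteria.where("lastname").is("Matthews"))
		.with(new Sort(Direction.ASC, "firstname"));

// { "firstname" : 1 }
DBObject sortObject = query.getSortObject();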

View File

@@ -1,56 +0,0 @@
/*
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.LinkedHashMap;
import java.util.Map;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Helper class to define sorting criterias for a Query instance.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @deprecated use {@link org.springframework.data.domain.Sort} instead. See
* {@link Query#with(org.springframework.data.domain.Sort)}.
*/
@Deprecated
public class Sort {
private Map<String, Order> fieldSpec = new LinkedHashMap<String, Order>();
public Sort() {
}
public Sort(String key, Order order) {
fieldSpec.put(key, order);
}
public Sort on(String key, Order order) {
fieldSpec.put(key, order);
return this;
}
public DBObject getSortObject() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, fieldSpec.get(k).equals(Order.ASCENDING) ? 1 : -1);
}
return dbo;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -52,7 +52,10 @@ public class Update {
/**
* Creates an {@link Update} instance from the given {@link DBObject}. Allows explicitly excluding fields from making
* it into the created {@link Update} object.
* it into the created {@link Update} object. Note that this will set attributes directly and <em>not</em> use
* {@literal $set}. This means fields not given in the {@link DBObject} will be nulled when executing the update. To
* create an only-updating {@link Update} instance from a {@link DBObject}, call {@link #set(String, Object)} for each
* value in it.
*
* @param object the source {@link DBObject} to create the update from.
* @param exclude the fields to exclude.
@@ -69,7 +72,7 @@ public class Update {
continue;
}
update.set(key, object.get(key));
update.modifierOps.put(key, object.get(key));
}
return update;
@@ -160,7 +163,7 @@ public class Update {
* @return
*/
public Update pop(String key, Position pos) {
addMultiFieldOperation("$pop", key, (pos == Position.FIRST ? -1 : 1));
addMultiFieldOperation("$pop", key, pos == Position.FIRST ? -1 : 1);
return this;
}
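The clarified Javadoc matters in practice: the factory copies attributes verbatim instead of wrapping them in $set, so the resulting update replaces the matched document. A sketch (the document shape is made up; the factory is assumed to be the fromDBObject(…) method documented above):

DBObject source = new BasicDBObject("firstname", "Dave").append("lastname", "Matthews");

// copies the attributes as-is; fields not present in source are nulled when the update executes
Update replacing = Update.fromDBObject(source, "_id");

// to only patch the given fields, build the update via $set instead
Update patching = new Update().set("firstname", "Dave").set("lastname", "Matthews");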

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,6 +30,7 @@ import com.mongodb.gridfs.GridFSFile;
* Collection of operations to store and read files from MongoDB GridFS.
*
* @author Oliver Gierke
* @author Philipp Schneider
*/
public interface GridFsOperations extends ResourcePatternResolver {
@@ -42,27 +43,60 @@ public interface GridFsOperations extends ResourcePatternResolver {
*/
GridFSFile store(InputStream content, String filename);
/**
* Stores the given content into a file with the given name and content type.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, String contentType);
/**
* Stores the given content into a file with the given name using the given metadata. The metadata object will be
* marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, Object metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, String contentType, Object metadata);
/**
* Stores the given content into a file with the given name using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, DBObject metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, String filename, String contentType, DBObject metadata);
/**
* Returns all files matching the given query. Note that currently {@link Sort} criteria defined on the
* {@link Query} will not be regarded, as MongoDB does not support ordering for GridFS file access.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -42,6 +42,7 @@ import com.mongodb.gridfs.GridFSInputFile;
* {@link GridFsOperations} implementation to store content into MongoDB GridFS.
*
* @author Oliver Gierke
* @author Philipp Schneider
*/
public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver {
@@ -87,15 +88,37 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
return store(content, filename, (Object) null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String)
*/
public GridFSFile store(InputStream content, String filename, String contentType) {
return store(content, filename, contentType, (Object) null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.Object)
*/
public GridFSFile store(InputStream content, String filename, Object metadata) {
DBObject dbObject = new BasicDBObject();
converter.write(metadata, dbObject);
return store(content, filename, dbObject);
return store(content, filename, null, metadata);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String, java.lang.Object)
*/
public GridFSFile store(InputStream content, String filename, String contentType, Object metadata) {
DBObject dbObject = null;
if (metadata != null) {
dbObject = new BasicDBObject();
converter.write(metadata, dbObject);
}
return store(content, filename, contentType, dbObject);
}
/*
@@ -103,16 +126,30 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, com.mongodb.DBObject)
*/
public GridFSFile store(InputStream content, String filename, DBObject metadata) {
return this.store(content, filename, null, metadata);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, com.mongodb.DBObject)
*/
public GridFSFile store(InputStream content, String filename, String contentType, DBObject metadata) {
Assert.notNull(content);
Assert.hasText(filename);
Assert.notNull(metadata);
GridFSInputFile file = getGridFs().createFile(content);
file.setFilename(filename);
file.setMetaData(metadata);
file.save();
if (metadata != null) {
file.setMetaData(metadata);
}
if (contentType != null) {
file.setContentType(contentType);
}
file.save();
return file;
}
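With the new overloads the content type can be handed in alongside (optional) metadata; a fragment-level sketch assuming an already configured GridFsOperations instance named gridFsOperations:

DBObject metadata = new BasicDBObject("owner", "dave");
InputStream content = new ByteArrayInputStream("hello".getBytes());

GridFSFile file = gridFsOperations.store(content, "hello.txt", "text/plain", metadata);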

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,7 +15,11 @@
*/
package org.springframework.data.mongodb.repository;
import java.lang.annotation.*;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* Annotation to declare finder queries directly on repository methods. Both attributes allow using a placeholder
@@ -43,4 +47,12 @@ public @interface Query {
* @return
*/
String fields() default "";
/**
* Returns whether the defined query should be executed as a count projection.
*
* @since 1.3
* @return
*/
boolean count() default false;
}
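The new attribute lets a manually defined query be executed as a count; a sketch (repository, domain type and property names are made up):

public interface PersonRepository extends MongoRepository<Person, String> {

	@Query(value = "{ 'lastname' : ?0 }", count = true)
	long countByLastnameManually(String lastname);
}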

View File

@@ -53,14 +53,6 @@ public class MongoRepositoryBean<T> extends CdiRepositoryBean<T> {
this.operations = operations;
}
/*
* (non-Javadoc)
* @see javax.enterprise.inject.spi.Bean#getScope()
*/
public Class<? extends Annotation> getScope() {
return operations.getScope();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.cdi.CdiRepositoryBean#create(javax.enterprise.context.spi.CreationalContext, java.lang.Class)

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.repository.query;
import java.util.List;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.MongoOperations;
@@ -39,6 +41,8 @@ import org.springframework.util.Assert;
*/
public abstract class AbstractMongoQuery implements RepositoryQuery {
private static final ConversionService CONVERSION_SERVICE = new DefaultConversionService();
private final MongoQueryMethod method;
private final MongoOperations operations;
@@ -86,18 +90,22 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
return new SingleEntityExecution().execute(query);
}
}
/**
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
Object result = new SingleEntityExecution(isCountQuery()).execute(query);
if (result == null) {
return result;
}
Class<?> expectedReturnType = method.getReturnType().getType();
if (expectedReturnType.isAssignableFrom(result.getClass())) {
return result;
}
return CONVERSION_SERVICE.convert(result, expectedReturnType);
}
/**
* Creates a {@link Query} instance using the given {@link ConvertingParameterAccessor}. Will delegate to
@@ -111,13 +119,28 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return createQuery(accessor);
}
/**
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
/**
* Returns whether the query should get a count projection applied.
*
* @return
*/
protected abstract boolean isCountQuery();
private abstract class Execution {
abstract Object execute(Query query);
protected List<?> readCollection(Query query) {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
MongoEntityMetadata<?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
return operations.find(query, metadata.getJavaType(), collectionName);
@@ -175,7 +198,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
@SuppressWarnings({ "rawtypes", "unchecked" })
Object execute(Query query) {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(query, metadata.getCollectionName());
List<?> result = operations.find(query.with(pageable), metadata.getJavaType(), metadata.getCollectionName());
@@ -191,6 +214,12 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
class SingleEntityExecution extends Execution {
private final boolean countProjection;
private SingleEntityExecution(boolean countProjection) {
this.countProjection = countProjection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.core.query.Query)
@@ -198,8 +227,9 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
@Override
Object execute(Query query) {
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return operations.findOne(query, entityInformation.getJavaType());
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return countProjection ? operations.count(query, metadata.getJavaType()) : operations.findOne(query,
metadata.getJavaType());
}
}
@@ -236,8 +266,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
Object execute(Query query, Query countQuery) {
MongoEntityInformation<?, ?> information = method.getEntityInformation();
long count = operations.count(countQuery, information.getCollectionName());
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(countQuery, metadata.getCollectionName());
return new GeoPage<Object>(doExecuteQuery(query), accessor.getPageable(), count);
}
@@ -254,18 +284,23 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Distance maxDistance = accessor.getMaxDistance();
if (maxDistance != null) {
nearQuery.maxDistance(maxDistance);
nearQuery.maxDistance(maxDistance).in(maxDistance.getMetric());
}
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, entityInformation.getJavaType(),
entityInformation.getCollectionName());
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
}
private boolean isListOfGeoResult() {
TypeInformation<?> returnType = method.getReturnType();
return returnType.getType().equals(List.class) && GeoResult.class.equals(returnType.getComponentType());
if (!returnType.getType().equals(List.class)) {
return false;
}
TypeInformation<?> componentType = returnType.getComponentType();
return componentType == null ? false : GeoResult.class.equals(componentType.getType());
}
}
}
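For geo-near repository queries the maximum distance parameter now also drives the metric of the returned distances; a sketch, assuming a hypothetical repository declaring GeoResults<Person> findByLocationNear(Point location, Distance maxDistance):

// distances in the result are reported in kilometers because of the Distance's Metric
GeoResults<Person> results = repository.findByLocationNear(new Point(10.0, 20.0),
		new Distance(150, Metrics.KILOMETERS));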

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
@@ -85,12 +86,12 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return delegate.getSort();
}
/* (non-Javadoc)
* @see org.springframework.data.repository.query.ParameterAccessor#getBindableParameter(int)
*/
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ParameterAccessor#getBindableValue(int)
*/
public Object getBindableValue(int index) {
return getConvertedValue(delegate.getBindableValue(index));
return getConvertedValue(delegate.getBindableValue(index), null);
}
/*
@@ -101,7 +102,8 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return delegate.getMaxDistance();
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoParameterAccessor#getGeoNearLocation()
*/
public Point getGeoNearLocation() {
@@ -111,11 +113,12 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
/**
* Converts the given value with the underlying {@link MongoWriter}.
*
* @param value
* @param value can be {@literal null}.
* @param typeInformation can be {@literal null}.
* @return
*/
private Object getConvertedValue(Object value) {
return writer.convertToMongoType(value);
private Object getConvertedValue(Object value, TypeInformation<?> typeInformation) {
return writer.convertToMongoType(value, typeInformation == null ? null : typeInformation.getActualType());
}
/*
@@ -186,7 +189,7 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
}
}
return getConvertedValue(next);
return getConvertedValue(next, property.getTypeInformation());
}
/*

View File

@@ -1,47 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import java.io.Serializable;
/**
* Interface for components being able to provide {@link EntityInformationCreator} for a given {@link Class}.
*
* @author Oliver Gierke
*/
public interface EntityInformationCreator {
/**
* Returns a {@link MongoEntityInformation} for the given domain class.
*
* @param domainClass the domain class to create the {@link MongoEntityInformation} for, must not be {@literal null}.
* @return
*/
<T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass);
/**
* Returns a {@link MongoEntityInformation} for the given domain class and class to retrieve the collection to query
* against from.
*
* @param domainClass the domain class to create the {@link MongoEntityInformation} for, must not be {@literal null}.
* @param collectionClass the class to derive the collection from queries to retrieve the domain classes from shall be
* ran against, must not be {@literal null}.
* @return
*/
<T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
Class<?> collectionClass);
}

View File

@@ -0,0 +1,33 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.repository.core.EntityMetadata;
/**
* Extension of {@link EntityMetadata} to additionally expose the collection name an entity shall be persisted to.
*
* @author Oliver Gierke
*/
public interface MongoEntityMetadata<T> extends EntityMetadata<T> {
/**
* Returns the name of the collection the entity shall be persisted to.
*
* @return
*/
String getCollectionName();
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -169,68 +169,68 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PotentiallyConvertingIterator parameters) {
switch (type) {
case AFTER:
case GREATER_THAN:
return criteria.gt(parameters.nextConverted(property));
case GREATER_THAN_EQUAL:
return criteria.gte(parameters.nextConverted(property));
case BEFORE:
case LESS_THAN:
return criteria.lt(parameters.nextConverted(property));
case LESS_THAN_EQUAL:
return criteria.lte(parameters.nextConverted(property));
case BETWEEN:
return criteria.gt(parameters.nextConverted(property)).lt(parameters.nextConverted(property));
case IS_NOT_NULL:
return criteria.ne(null);
case IS_NULL:
return criteria.is(null);
case NOT_IN:
return criteria.nin(nextAsArray(parameters, property));
case IN:
return criteria.in(nextAsArray(parameters, property));
case LIKE:
case STARTING_WITH:
case ENDING_WITH:
case CONTAINING:
String value = parameters.next().toString();
return criteria.regex(toLikeRegex(value, type));
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
return criteria.exists((Boolean) parameters.next());
case TRUE:
return criteria.is(true);
case FALSE:
return criteria.is(false);
case NEAR:
case AFTER:
case GREATER_THAN:
return criteria.gt(parameters.nextConverted(property));
case GREATER_THAN_EQUAL:
return criteria.gte(parameters.nextConverted(property));
case BEFORE:
case LESS_THAN:
return criteria.lt(parameters.nextConverted(property));
case LESS_THAN_EQUAL:
return criteria.lte(parameters.nextConverted(property));
case BETWEEN:
return criteria.gt(parameters.nextConverted(property)).lt(parameters.nextConverted(property));
case IS_NOT_NULL:
return criteria.ne(null);
case IS_NULL:
return criteria.is(null);
case NOT_IN:
return criteria.nin(nextAsArray(parameters, property));
case IN:
return criteria.in(nextAsArray(parameters, property));
case LIKE:
case STARTING_WITH:
case ENDING_WITH:
case CONTAINING:
String value = parameters.next().toString();
return criteria.regex(toLikeRegex(value, type));
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
return criteria.exists((Boolean) parameters.next());
case TRUE:
return criteria.is(true);
case FALSE:
return criteria.is(false);
case NEAR:
Distance distance = accessor.getMaxDistance();
Point point = accessor.getGeoNearLocation();
point = point == null ? nextAs(parameters, Point.class) : point;
Distance distance = accessor.getMaxDistance();
Point point = accessor.getGeoNearLocation();
point = point == null ? nextAs(parameters, Point.class) : point;
if (distance == null) {
return criteria.near(point);
} else {
if (distance.getMetric() != null) {
criteria.nearSphere(point);
if (distance == null) {
return criteria.near(point);
} else {
criteria.near(point);
if (distance.getMetric() != null) {
criteria.nearSphere(point);
} else {
criteria.near(point);
}
criteria.maxDistance(distance.getNormalizedValue());
}
criteria.maxDistance(distance.getNormalizedValue());
}
return criteria;
return criteria;
case WITHIN:
Object parameter = parameters.next();
return criteria.within((Shape) parameter);
case SIMPLE_PROPERTY:
return criteria.is(parameters.nextConverted(property));
case NEGATING_SIMPLE_PROPERTY:
return criteria.ne(parameters.nextConverted(property));
default:
throw new IllegalArgumentException("Unsupported keyword!");
}
}
/**
@@ -268,15 +268,16 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
private String toLikeRegex(String source, Type type) {
switch (type) {
case STARTING_WITH:
source = source + "*";
break;
case ENDING_WITH:
source = "*" + source;
break;
case CONTAINING:
source = "*" + source + "*";
break;
default:
}
return source.replaceAll("\\*", ".*");
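
For orientation, a hedged sketch of how these keywords surface in derived repository queries. The Person domain class and PersonRepository interface below are hypothetical and only illustrate which Criteria branches of the switch above a given method name would exercise.

import java.util.List;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical repository; each comment names the keyword branch triggered above.
public interface PersonRepository extends MongoRepository<Person, String> {

    List<Person> findByAgeGreaterThan(int age);                     // GREATER_THAN -> criteria.gt(...)

    List<Person> findByAgeBetween(int from, int to);                // BETWEEN -> criteria.gt(...).lt(...)

    List<Person> findByLastnameStartingWith(String prefix);         // STARTING_WITH -> regex "prefix.*" via toLikeRegex

    List<Person> findByLocationNear(Point location, Distance max);  // NEAR -> criteria.near(...)/nearSphere(...) plus maxDistance
}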

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,9 +20,12 @@ import java.util.Arrays;
import java.util.List;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.Parameters;
@@ -33,8 +36,7 @@ import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* TODO - Extract methods for {@link #getAnnotatedQuery()} into superclass as it is currently copied from Spring Data
* JPA
* Mongo specific implementation of {@link QueryMethod}.
*
* @author Oliver Gierke
*/
@@ -45,19 +47,24 @@ public class MongoQueryMethod extends QueryMethod {
.asList(GeoResult.class, GeoResults.class, GeoPage.class);
private final Method method;
private final MongoEntityInformation<?, ?> entityInformation;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private MongoEntityMetadata<?> metadata;
/**
* Creates a new {@link MongoQueryMethod} from the given {@link Method}.
*
* @param method
*/
public MongoQueryMethod(Method method, RepositoryMetadata metadata, EntityInformationCreator entityInformationCreator) {
public MongoQueryMethod(Method method, RepositoryMetadata metadata,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(method, metadata);
Assert.notNull(entityInformationCreator, "DefaultEntityInformationCreator must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!");
this.method = method;
this.entityInformation = entityInformationCreator.getEntityInformation(metadata.getReturnedDomainClass(method),
getDomainClass());
this.mappingContext = mappingContext;
}
/*
@@ -101,14 +108,30 @@ public class MongoQueryMethod extends QueryMethod {
return StringUtils.hasText(value) ? value : null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryMethod#getEntityInformation()
*/
@Override
public MongoEntityInformation<?, ?> getEntityInformation() {
@SuppressWarnings("unchecked")
public MongoEntityMetadata<?> getEntityInformation() {
return entityInformation;
if (metadata == null) {
Class<?> returnedObjectType = getReturnedObjectType();
Class<?> domainClass = getDomainClass();
MongoPersistentEntity<?> returnedEntity = mappingContext.getPersistentEntity(getReturnedObjectType());
MongoPersistentEntity<?> managedEntity = mappingContext.getPersistentEntity(domainClass);
returnedEntity = returnedEntity == null ? managedEntity : returnedEntity;
MongoPersistentEntity<?> collectionEntity = domainClass.isAssignableFrom(returnedObjectType) ? returnedEntity
: managedEntity;
this.metadata = new SimpleMongoEntityMetadata<Object>((Class<Object>) returnedEntity.getType(),
collectionEntity.getCollection());
}
return this.metadata;
}
/*
@@ -121,12 +144,11 @@ public class MongoQueryMethod extends QueryMethod {
}
/**
* Returns whether the query is a geoNear query.
* Returns whether the query is a geo near query.
*
* @return
*/
public boolean isGeoNearQuery() {
return isGeoNearQuery(this.method);
}
@@ -149,7 +171,7 @@ public class MongoQueryMethod extends QueryMethod {
*
* @return
*/
private Query getQueryAnnotation() {
Query getQueryAnnotation() {
return method.getAnnotation(Query.class);
}
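
A minimal, hypothetical example of the kind of method MongoQueryMethod inspects: hasAnnotatedQuery() and getQueryAnnotation() pick up the @Query annotation, while getEntityInformation() now resolves the collection through the MappingContext. Person and the query string are made up.

import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

public interface PersonRepository extends MongoRepository<Person, String> {

    // getQueryAnnotation() returns this @Query instance; the collection is derived from Person.
    @Query("{ 'lastname' : ?0 }")
    List<Person> findByLastname(String lastname);
}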

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2010 the original author or authors.
* Copyright 2002-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -77,4 +77,13 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return new MongoQueryCreator(tree, accessor, context, false).createQuery();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
return tree.isCountProjection();
}
}
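
tree.isCountProjection() corresponds to derived count queries; a short hypothetical repository fragment:

public interface PersonRepository extends MongoRepository<Person, String> {

    // The part tree is a count projection, so PartTreeMongoQuery.isCountQuery()
    // returns true and the derived query is executed as a count instead of a find.
    long countByLastname(String lastname);
}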

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,10 +15,7 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBCursor;
@@ -27,6 +24,7 @@ import com.mongodb.DBCursor;
*
* @author Oliver Gierke
*/
@Deprecated
public abstract class QueryUtils {
private QueryUtils() {
@@ -34,51 +32,13 @@ public abstract class QueryUtils {
}
/**
* Applies the given {@link Pageable} to the given {@link Query}. Will do nothing if {@link Pageable} is
* {@literal null}.
* Turns an {@link Order} into an {@link org.springframework.data.mongodb.core.query.Order}.
*
* @deprecated use {@link Query#with(Pageable)}.
* @param query must not be {@literal null}.
* @param pageable
* @deprecated use {@link Order} directly.
* @param order
* @return
*/
@Deprecated
public static Query applyPagination(Query query, Pageable pageable) {
if (pageable == null) {
return query;
}
query.limit(pageable.getPageSize());
query.skip(pageable.getOffset());
return query.with(pageable.getSort());
}
/**
* Applies the given {@link Sort} to the {@link Query}. Will do nothing if {@link Sort} is {@literal null}.
*
* @deprecated use {@link Query#with(Pageable)}.
* @param query must not be {@literal null}.
* @param sort
* @return
*/
@Deprecated
public static Query applySorting(Query query, Sort sort) {
if (sort == null) {
return query;
}
org.springframework.data.mongodb.core.query.Sort bSort = query.sort();
for (Order order : sort) {
bSort.on(order.getProperty(), toOrder(order));
}
return query;
}
public static org.springframework.data.mongodb.core.query.Order toOrder(Order order) {
return order.isAscending() ? org.springframework.data.mongodb.core.query.Order.ASCENDING
: org.springframework.data.mongodb.core.query.Order.DESCENDING;
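
With QueryUtils deprecated, pagination and sorting move onto Query itself; a minimal sketch of the replacement (criteria and property names are illustrative):

import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Query;
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

// Instead of QueryUtils.applyPagination(query, pageable) / applySorting(query, sort):
Query query = query(where("lastname").is("Matthews"));
query.with(new PageRequest(0, 20, Sort.Direction.ASC, "firstname"));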

View File

@@ -0,0 +1,60 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.util.Assert;
/**
* Bean based implementation of {@link MongoEntityMetadata}.
*
* @author Oliver Gierke
*/
class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
private final Class<T> type;
private final String collectionName;
/**
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and collection name.
*
* @param type must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
*/
public SimpleMongoEntityMetadata(Class<T> type, String collectionName) {
Assert.notNull(type, "Type must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.type = type;
this.collectionName = collectionName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.EntityMetadata#getJavaType()
*/
public Class<T> getJavaType() {
return type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoEntityMetadata#getCollectionName()
*/
public String getCollectionName() {
return collectionName;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,17 +38,21 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private final String query;
private final String fieldSpec;
private final boolean isCountQuery;
/**
* Creates a new {@link StringBasedMongoQuery}.
*
* @param method
* @param template
* @param method must not be {@literal null}.
* @param template must not be {@literal null}.
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations) {
super(method, mongoOperations);
this.query = query;
this.fieldSpec = method.getFieldSpecification();
this.isCountQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().count() : false;
}
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
@@ -82,6 +86,15 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
return isCountQuery;
}
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor) {
Matcher matcher = PLACEHOLDER.matcher(input);
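
isCountQuery() is driven by the count attribute of @Query; a hypothetical usage:

// Hypothetical repository method: count = true makes StringBasedMongoQuery
// execute the declared query as a count rather than a find.
@Query(value = "{ 'lastname' : ?0 }", count = true)
long countPeopleByLastname(String lastname);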

View File

@@ -1,66 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.EntityInformationCreator;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.util.Assert;
/**
* Simple {@link EntityInformationCreator} to create {@link MongoEntityInformation} instances based on a
* {@link MappingContext}.
*
* @author Oliver Gierke
*/
public class DefaultEntityInformationCreator implements EntityInformationCreator {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
public DefaultEntityInformationCreator(
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
Assert.notNull(mappingContext);
this.mappingContext = mappingContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.EntityInformationCreator#getEntityInformation(java.lang.Class)
*/
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return getEntityInformation(domainClass, null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.support.EntityInformationCreator#getEntityInformation(java.lang.Class, java.lang.Class)
*/
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
Class<?> collectionClass) {
MongoPersistentEntity<T> persistentEntity = (MongoPersistentEntity<T>) mappingContext
.getPersistentEntity(domainClass);
String customCollectionName = collectionClass == null ? null : mappingContext.getPersistentEntity(collectionClass)
.getCollection();
return new MappingMongoEntityInformation<T, ID>(persistentEntity, customCollectionName);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,12 +22,11 @@ import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.query.MongoEntityMetadata;
import org.springframework.data.mongodb.repository.query.PartTreeMongoQuery;
import org.springframework.data.mongodb.repository.query.QueryUtils;
import org.springframework.data.repository.core.support.QueryCreationListener;
import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.Type;
@@ -74,29 +73,29 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
return;
}
String property = part.getProperty().toDotPath();
Order order = toOrder(sort, property);
Direction order = toDirection(sort, property);
index.on(property, order);
}
// Add fixed sorting criteria to index
if (sort != null) {
for (Sort.Order order : sort) {
index.on(order.getProperty(), QueryUtils.toOrder(order));
index.on(order.getProperty(), order.getDirection());
}
}
MongoEntityInformation<?, ?> metadata = query.getQueryMethod().getEntityInformation();
MongoEntityMetadata<?> metadata = query.getQueryMethod().getEntityInformation();
operations.indexOps(metadata.getCollectionName()).ensureIndex(index);
LOG.debug(String.format("Created %s!", index));
}
private static Order toOrder(Sort sort, String property) {
private static Direction toDirection(Sort sort, String property) {
if (sort == null) {
return Order.DESCENDING;
return Direction.DESC;
}
org.springframework.data.domain.Sort.Order order = sort.getOrderFor(property);
return order == null ? Order.DESCENDING : order.isAscending() ? Order.ASCENDING : Order.DESCENDING;
return order == null ? Direction.DESC : order.isAscending() ? Direction.ASC : Direction.DESC;
}
}
}
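
The listener now builds indexes with Spring Data Commons' Sort.Direction; roughly what it ends up doing for a derived query on a single property (property and collection names are illustrative, mongoOperations is assumed to be available):

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.Index;

// Hand-written equivalent: index 'lastname' ascending on the entity's collection.
Index index = new Index();
index.on("lastname", Direction.ASC);
mongoOperations.indexOps("person").ensureIndex(index);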

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -57,7 +58,8 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends A
this.customCollectionName = customCollectionName;
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.repository.support.EntityInformation#getId(java.lang.Object)
*/
@SuppressWarnings("unchecked")
@@ -65,6 +67,10 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends A
MongoPersistentProperty idProperty = entityMetadata.getIdProperty();
if (idProperty == null) {
return null;
}
try {
return (ID) BeanWrapper.create(entity, null).getProperty(idProperty);
} catch (Exception e) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,7 +39,6 @@ import com.mysema.query.apt.DefaultConfiguration;
*
* @author Oliver Gierke
*/
@SuppressWarnings("restriction")
@SupportedAnnotationTypes({ "com.mysema.query.annotations.*", "org.springframework.data.mongodb.core.mapping.*" })
@SupportedSourceVersion(SourceVersion.RELEASE_6)
public class MongoAnnotationProcessor extends AbstractQuerydslProcessor {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,10 +21,11 @@ import java.io.Serializable;
import java.lang.reflect.Method;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.query.EntityInformationCreator;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.query.MongoQueryMethod;
import org.springframework.data.mongodb.repository.query.PartTreeMongoQuery;
@@ -46,20 +47,18 @@ import org.springframework.util.Assert;
public class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoOperations mongoOperations;
private final EntityInformationCreator entityInformationCreator;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoTemplate} and {@link MappingContext}.
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoOperations}.
*
* @param template must not be {@literal null}
* @param mappingContext
* @param mongoOperations must not be {@literal null}
*/
public MongoRepositoryFactory(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations);
this.mongoOperations = mongoOperations;
this.entityInformationCreator = new DefaultEntityInformationCreator(mongoOperations.getConverter()
.getMappingContext());
this.mappingContext = mongoOperations.getConverter().getMappingContext();
}
/*
@@ -117,7 +116,7 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, entityInformationCreator);
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, mappingContext);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
@@ -136,7 +135,16 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getEntityInformation(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return entityInformationCreator.getEntityInformation(domainClass);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(domainClass);
if (entity == null) {
throw new MappingException(String.format("Could not lookup mapping metadata for domain class %s!",
domainClass.getName()));
}
return new MappingMongoEntityInformation<T, ID>((MongoPersistentEntity<T>) entity);
}
}
}
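
For illustration, standalone use of the factory; PersonRepository and mongoOperations are assumed to exist, and entity metadata is now looked up in the MappingContext backing the given MongoOperations:

import org.springframework.data.mongodb.repository.support.MongoRepositoryFactory;

// Hypothetical standalone bootstrap of a repository proxy.
MongoRepositoryFactory factory = new MongoRepositoryFactory(mongoOperations);
PersonRepository repository = factory.getRepository(PersonRepository.class);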

View File

@@ -41,8 +41,8 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
* @param operations the operations to set
*/
public void setMongoOperations(MongoOperations operations) {
this.operations = operations;
setMappingContext(operations.getConverter().getMappingContext());
}
/**

View File

@@ -49,13 +49,14 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
/**
* Creates a new {@link SimpleMongoRepository} for the given {@link MongoEntityInformation} and {@link MongoTemplate}.
*
* @param metadata
* @param template
* @param metadata must not be {@literal null}.
* @param template must not be {@literal null}.
*/
public SimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoOperations mongoOperations) {
Assert.notNull(mongoOperations);
Assert.notNull(metadata);
this.entityInformation = metadata;
this.mongoOperations = mongoOperations;
}
@@ -96,7 +97,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
*/
public T findOne(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.findById(id, entityInformation.getJavaType());
return mongoOperations.findById(id, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
private Query getIdQuery(Object id) {
@@ -114,11 +115,8 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
public boolean exists(ID id) {
Assert.notNull(id, "The given id must not be null!");
final Query idQuery = getIdQuery(id);
idQuery.fields();
return mongoOperations.findOne(idQuery, entityInformation.getJavaType(), entityInformation.getCollectionName()) != null;
return mongoOperations.exists(getIdQuery(id), entityInformation.getJavaType(),
entityInformation.getCollectionName());
}
/*
@@ -126,7 +124,6 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @see org.springframework.data.repository.CrudRepository#count()
*/
public long count() {
return mongoOperations.getCollection(entityInformation.getCollectionName()).count();
}
@@ -136,7 +133,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
*/
public void delete(ID id) {
Assert.notNull(id, "The given id must not be null!");
mongoOperations.remove(getIdQuery(id), entityInformation.getJavaType());
mongoOperations.remove(getIdQuery(id), entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/*
@@ -166,7 +163,6 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @see org.springframework.data.repository.CrudRepository#deleteAll()
*/
public void deleteAll() {
mongoOperations.remove(new Query(), entityInformation.getCollectionName());
}
@@ -227,7 +223,6 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @return
*/
protected MongoOperations getMongoOperations() {
return this.mongoOperations;
}
@@ -235,7 +230,6 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
* @return the entityInformation
*/
protected MongoEntityInformation<T, ID> getEntityInformation() {
return entityInformation;
}
}
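
The exists(…) change delegates to the new MongoOperations.exists(…), so no document is materialized; roughly (domain type and collection name are illustrative):

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

// What SimpleMongoRepository.exists(id) now boils down to.
boolean exists = mongoOperations.exists(query(where("_id").is(id)), Person.class, "person");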

View File

@@ -63,7 +63,8 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getName());
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}

View File

@@ -1,3 +1,5 @@
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.1.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd=org/springframework/data/mongodb/config/spring-mongo-1.0.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.1.xsd=org/springframework/data/mongodb/config/spring-mongo-1.1.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.2.xsd=org/springframework/data/mongodb/config/spring-mongo-1.2.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.3.xsd=org/springframework/data/mongodb/config/spring-mongo-1.3.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.3.xsd

View File

@@ -11,8 +11,7 @@
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd" />

View File

@@ -11,8 +11,7 @@
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />

View File

@@ -0,0 +1,482 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/mongo"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoFactoryBean"><![CDATA[
Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.Mongo"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="db-factory">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="create-query-indexes" type="xsd:boolean" default="false">
<xsd:annotation>
<xsd:documentation>
Enables creation of indexes for queries that get derived from the method name
and thus reference domain class properties. Defaults to false.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attributeGroup>
<xsd:element name="repositories">
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="custom-converters" minOccurs="0">
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="base-package" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The base package in which to scan for entities annotated with @Document
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mapping.model.MappingContext">
The reference to a MappingContext. Will default to 'mappingContext'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoTemplate">
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
Disables JSR-303 validation on MongoDB documents before they are saved. By default it is set to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the Mongo object that determines which server to monitor (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="auditing">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="org.springframework.data.mongodb.core.mapping.event.AuditingEventListener" />
<tool:exports type="org.springframework.data.auditing.IsNewAwareAuditingHandler" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:attributeGroup ref="repository:auditing-attributes" />
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" />
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mapping.model.MappingContext"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoTemplateRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoTemplate"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoFactoryBean"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
<xsd:restriction base="xsd:string">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Callers block once the pool is exhausted. Default is 10. The system property MONGO.POOLSIZE can override this.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost that caps how many threads may block waiting for a connection. Default is 5.
If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
then up to 50 threads can block waiting for a connection; any more and an exception is thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum time in milliseconds that a thread waits for a connection to become available. Default is 120000 ms (2 minutes).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. 0 is default and infinite.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout. 0 is default and infinite.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not on a connect, the system retries automatically. Default is false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Default is 0, which means to use the default 15s if autoConnectRetry is on.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-number" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This specifies the number of servers to wait for on the write operation, and exception raising behavior. The 'w' option to the getlasterror command. Defaults to 0.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls timeout for write operations in milliseconds. The 'wtimeout' option to the getlasterror command. Defaults to 0 (indefinite). Greater than zero is number of milliseconds to wait.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-fsync" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver is allowed to read from secondaries or slaves. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:group name="beanElementGroup">
<xsd:choice>
<xsd:element ref="beans:bean"/>
<xsd:element ref="beans:ref"/>
</xsd:choice>
</xsd:group>
<xsd:complexType name="customConverterType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Element defining a custom converter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:group ref="beanElementGroup" minOccurs="0" maxOccurs="1"/>
<xsd:attribute name="ref" type="xsd:string">
<xsd:annotation>
<xsd:documentation>
A reference to a custom converter.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref"/>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:schema>
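
The <mongo:options> attributes documented above mirror the public fields of com.mongodb.MongoOptions in the legacy 2.x driver these schemas target; a rough, hand-written Java counterpart of a <mongo:mongo> definition with nested options (host, port and values are illustrative, not defaults taken from the schema):

import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
import com.mongodb.ServerAddress;

public class MongoOptionsSketch {

    public static void main(String[] args) throws Exception {
        // Each field corresponds to an attribute of <mongo:options>.
        MongoOptions options = new MongoOptions();
        options.connectionsPerHost = 10;                          // connections-per-host
        options.threadsAllowedToBlockForConnectionMultiplier = 5; // threads-allowed-to-block-for-connection-multiplier
        options.maxWaitTime = 120000;                             // max-wait-time (ms)
        options.connectTimeout = 10000;                           // connect-timeout (ms)

        Mongo mongo = new Mongo(new ServerAddress("localhost", 27017), options);
    }
}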

View File

@@ -0,0 +1,597 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/mongo"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoFactoryBean"><![CDATA[
Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.Mongo"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="db-factory">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="create-query-indexes" type="xsd:boolean" default="false">
<xsd:annotation>
<xsd:documentation>
Enables creation of indexes for queries that get derived from the method name
and thus reference domain class properties. Defaults to false.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attributeGroup>
<xsd:element name="repositories">
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="custom-converters" minOccurs="0">
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="base-package" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The base package in which to scan for entities annotated with @Document
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a Mongo. Will default to 'mongo'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mapping.model.MappingContext">
The reference to a MappingContext. Will default to 'mappingContext'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.core.MongoTemplate">
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
Disables JSR-303 validation on MongoDB documents before they are saved. By default it is set to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="abbreviate-field-names" use="optional" default="false">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy">
Enables abbreviating the field names for domain class properties to the
first character of their camel case names, e.g. fooBar -> fb.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the Mongo object that determines which server to monitor (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="auditing">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="org.springframework.data.mongodb.core.mapping.event.AuditingEventListener" />
<tool:exports type="org.springframework.data.auditing.IsNewAwareAuditingHandler" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:attributeGroup ref="repository:auditing-attributes" />
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" />
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mapping.model.MappingContext"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoTemplateRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoTemplate"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.core.MongoFactoryBean"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
<xsd:restriction base="xsd:string">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Callers block once the pool is exhausted. Default is 10. The system property MONGO.POOLSIZE can override this.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost that caps how many threads may block waiting for a connection. Default is 5.
If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
then up to 50 threads can block waiting for a connection; any more and an exception is thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum time in milliseconds that a thread waits for a connection to become available. Default is 120000 ms (2 minutes).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. 0 is default and infinite.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout in milliseconds. Defaults to 0, which means no timeout (wait indefinitely).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep-alive flag, which controls whether socket keep-alive is enabled. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver automatically retries a failed connection attempt. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Defaults to 0, which means the driver default of 15 seconds is used when autoConnectRetry is enabled.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-number" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Specifies the number of servers a write operation waits for, and thus its exception-raising behavior; corresponds to the 'w' option of the getlasterror command. Defaults to 0.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The timeout for write operations in milliseconds; corresponds to the 'wtimeout' option of the getlasterror command. Defaults to 0 (wait indefinitely); a value greater than zero is the number of milliseconds to wait.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-fsync" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether or not writes fsync to disk; corresponds to the 'fsync' option of the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver is allowed to read from secondaries (slaves). Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
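<!--
	Illustrative usage sketch, not part of the schema: optionsType is intended to be nested
	inside the mongo element as <mongo:options/>. The attribute values below are placeholders,
	not recommended settings:

	<mongo:mongo id="mongo" host="localhost" port="27017">
		<mongo:options connections-per-host="25"
			threads-allowed-to-block-for-connection-multiplier="5"
			connect-timeout="1000"
			socket-timeout="1500"
			max-wait-time="1500"
			auto-connect-retry="true"
			write-fsync="true"/>
	</mongo:mongo>
-->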
<xsd:group name="beanElementGroup">
<xsd:choice>
<xsd:element ref="beans:bean"/>
<xsd:element ref="beans:ref"/>
</xsd:choice>
</xsd:group>
<xsd:complexType name="customConverterType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Element defining a custom converter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:group ref="beanElementGroup" minOccurs="0" maxOccurs="1"/>
<xsd:attribute name="ref" type="xsd:string">
<xsd:annotation>
<xsd:documentation>
A reference to a custom converter.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref"/>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
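<!--
	Illustrative usage sketch, not part of the schema: customConverterType is assumed to back
	the <mongo:converter/> element used inside <mongo:custom-converters/> (declared elsewhere
	in this schema). A converter can be referenced by bean name or declared inline:

	<mongo:custom-converters>
		<mongo:converter ref="myConverter"/>
		<mongo:converter>
			<bean class="com.example.MyOtherConverter"/>
		</mongo:converter>
	</mongo:custom-converters>
-->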
<xsd:simpleType name="converterRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:element name="template">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoTemplate for working with a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string"
use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to
type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern to be used as the default for write operations performed through this template
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
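<!--
	Illustrative usage sketch, not part of the schema: a template wired against an existing
	MongoDbFactory bean. Bean names are placeholders:

	<mongo:template id="mongoTemplate" db-factory-ref="mongoDbFactory"
		converter-ref="mappingConverter" write-concern="SAFE"/>
-->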
<xsd:element name="gridFsTemplate">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a GridFsTemplate for working with GridFS in a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
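<!--
	Illustrative usage sketch, not part of the schema: a gridFsTemplate wired against an
	existing MongoDbFactory and converter. Bean names are placeholders:

	<mongo:gridFsTemplate id="gridFsTemplate" db-factory-ref="mongoDbFactory"
		converter-ref="mappingConverter"/>
-->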
</xsd:schema>

Some files were not shown because too many files have changed in this diff.