Compare commits


58 Commits

Author SHA1 Message Date
Spring Buildmaster
22933e4493 DATAMONGO-859 - Release version 1.5.0.M1. 2014-03-31 08:04:03 -07:00
Thomas Darimont
40aa6bbdd5 DATAMONGO-859 - Prepare release 1.5 M1.
Updated readme.md and mongodb.xml to reflect the recent version. Updated Spring Data Commons and Spring Data Build versions in pom.xml. Updated pom.xml to use the release repository. Updated docbkx to use the recent Spring Data Commons version. Updated changelog to reflect changes and releases.

Original pull request: #161.
2014-03-31 16:47:15 +02:00
Christoph Strobl
5e43f5846a DATAMONGO-471 - Add support for $each when using $addToSet.
In addition to Update.addToSet(String, Object), the method addToSet(String) has been introduced. It returns a builder that allows creating an $addToSet command for either a single value or multiple values using $each.

Using value:
new Update().addToSet("key").value("spring");

Using each:
new Update().addToSet("key").each("spring", "data", "mongodb");

Original Pull Request: #157.
2014-03-31 15:25:14 +02:00
Thomas Darimont
2cfd4781bc DATAMONGO-884 - Improved handling for Object methods in LazyLoadingInterceptor.
We now handle invocations of equals(…), hashCode() and toString() methods that are not overridden with custom proxy-aware logic. This avoids potential NullPointerExceptions and makes it easier to debug code that deals with proxies (due to a proper toString() representation of the proxy).

Original pull request: #158.
2014-03-31 15:19:30 +02:00
Thomas Darimont
031ab0c07b DATACMNS-482 - Fix compiler error due to changes in SD Commons.
Fixed a compiler error that was introduced by making the geospatial types in Spring Data Commons serializable.
2014-03-31 15:14:08 +02:00
Thomas Darimont
10f69f6623 DATAMONGO-884 - Fix potential NullPointerException for lazy DBRefs.
We now initialize the proxy in case an Object method is called that is overridden in the target class. Removed the additional check for initialization and to-DBRef methods as they're repeated in the target method.

Original pull requests: #152, #153.
2014-03-27 17:57:39 +01:00
Thomas Darimont
d7b03915a7 DATAMONGO-858 - Revised rendering of geo spatial structures.
Switched back to the old style (as in 1.4.x) of rendering DBObjects when they are used as values in persistent domain objects and adjusted the GeoConverters accordingly. In order to render geo structures correctly when they are used within a query, we now wrap them in a GeoCommand that triggers a different Shape rendering.

We now render the metric that was used in the Distance definition of the radius of a Circle or Sphere.
2014-03-27 13:18:55 +01:00
Oliver Gierke
ed55d48a53 DATAMONGO-858 - Polishing.
Moved to use the newly introduced geo types from Spring Data Commons. Added deprecation warning suppression everywhere else.

Adapted Sonargraph architecture description file and split up namespace registration into repository specific stuff and everything else.
2014-03-27 13:18:55 +01:00
Thomas Darimont
d5ed4e0ac2 DATAMONGO-858 - Add support for common geospatial structures.
Backed the geo spatial structures of SD MongoDB by the new geo spatial structures in SD commons. Deprecated the MongoDB geo spatial types to make users aware that we're going to remove them in one of the following development iterations. Added custom conversions for basic geo spatial types.

We deliberately chose not to let Circle extend CMNS geo.Circle since it would break clients that use the legacy Circle API (getRadius() returns a Distance in CMNS whereas it returns a plain double in Mongo).
2014-03-27 13:18:55 +01:00
Oliver Gierke
75194730e9 DATAMONGO-887 - Added unit tests to verify TreeMaps can be converted. 2014-03-27 08:58:53 +01:00
Oliver Gierke
a09183d2eb DATAMONGO-880 - Minor polishing in lazy-loading area.
Took the chance to add @since tags to the types introduced for lazy loading. Polished JavaDoc where necessary. Removed methods solely existing for testing purposes and use reflection in tests to minimize the API being published.
2014-03-20 09:27:37 +01:00
Thomas Darimont
45dd3cd988 DATAMONGO-880 - Improved handling of persistence of lazy-loaded DBRefs.
Added a LazyLoadingProxy interface that is implemented by every lazy-loading proxy created by the DefaultDbRefResolver. Clients can now cast those proxies to this interface and call its methods to initialize a proxy explicitly or to obtain the referenced DBRef if possible.

We now keep a reference to the DBRef that led to the creation of a LazyLoadingProxy in order to be able to reuse it in case one assigns the proxy to a field that should be a DBRef. This avoids unnecessary conversion.

Previously, saving proxies wasn't possible since the mapping infrastructure did not know how to extract the entity information from the proxy. We now either store the DBRef backing the proxy directly or we initialize the proxy first and use the result of LazyLoadingProxy.initialize().

Original pull request: #151.
2014-03-20 09:26:08 +01:00
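A minimal sketch of how client code might interact with such a proxy; the method names toDBRef() and getTarget() are assumptions based on the description above:

```java
import com.mongodb.DBRef;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;

class DbRefProxyInspection {

    // 'value' is assumed to have been read from a lazily loaded @DBRef property
    static void inspect(Object value) {
        if (value instanceof LazyLoadingProxy) {
            LazyLoadingProxy proxy = (LazyLoadingProxy) value;
            DBRef dbRef = proxy.toDBRef();     // underlying DBRef, reused when the proxy is saved again
            Object target = proxy.getTarget(); // explicitly initializes the proxy and returns the resolved object
        }
    }
}
```
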
Oliver Gierke
b24e34c360 DATAMONGO-883 - Adapted to changes in auditing config in Spring Data MongoDB. 2014-03-18 20:10:43 +01:00
Oliver Gierke
fa9b5efdab DATAMONGO-882 - Adapted to removal of obsolete generics in BeanWrapper. 2014-03-18 20:08:22 +01:00
Oliver Gierke
8f2ced8ada DATAMONGO-881 - Allow custom conversions to override default conversions.
User provided converters are now registered *after* the default converters to make sure they enjoy precedence over the default ones. 

This is achieved by inverting the order of converters after the conversions have been registered. This is necessary as the registration order for convertible pairs is different from the one of the converters. For the pairs, earlier registered instances take precedence, while for the actual converter instances, instances registered later trump ones registered before.
2014-03-18 09:32:28 +01:00
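A sketch of how a user-provided converter benefits from the new precedence; the DateToLongConverter below is hypothetical and only illustrates a converter that now overrides the default Date handling:

```java
import java.util.Arrays;
import java.util.Date;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

class ConversionSetup {

    // hypothetical user converter; with this change it takes precedence over the default Date conversion
    static class DateToLongConverter implements Converter<Date, Long> {
        @Override
        public Long convert(Date source) {
            return source.getTime();
        }
    }

    static CustomConversions customConversions() {
        // user-provided converters end up registered after the defaults and therefore win
        return new CustomConversions(Arrays.asList(new DateToLongConverter()));
    }
}
```
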
Oliver Gierke
ff92cf1429 DATAMONGO-566 - Polishing.
Inlined a few methods to reduce the number of indirections. Added a bit of missing JavaDoc here and there. StringBasedMongoQuery now prevents a manually defined query from being marked as both count and delete query.

Polished test cases a little.

Original pull request: #147.
2014-03-17 17:51:54 +01:00
Christoph Strobl
ba48290a3e DATAMONGO-566 - Add support for derived delete-by queries.
Using the keywords remove or delete in a derived query, or setting @Query(delete = true), removes the documents matching the query. If the return type is assignable to Number, the total number of affected documents is returned. If the return type is collection-like, the query is executed against the store first, and all documents included in the resulting collection are deleted in a subsequent call.

Additionally findAllAndRemove(…) methods have been added to MongoTemplate.

Original pull request: #147.
2014-03-17 17:51:08 +01:00
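A sketch of repository methods using the new delete derivation; the Person domain type and the queries are hypothetical:

```java
import java.util.List;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

class Person {
    String id, firstname, lastname;
}

interface PersonRepository extends Repository<Person, String> {

    // returns the removed documents
    List<Person> deleteByLastname(String lastname);

    // returns the number of removed documents
    Long removeByFirstname(String firstname);

    // manually defined query marked as a delete query
    @Query(value = "{ 'lastname' : ?0 }", delete = true)
    List<Person> removeByQuery(String lastname);
}
```
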
Oliver Gierke
70e5efd0d9 DATAMONGO-877 - Added guard against null-package in AbstractMappingConfiguration.
AbstractMappingConfiguration.getMappingBasePackage() now guards against a null package returned for the configuration class. This can happen if the class resides in the default package.
2014-03-10 12:47:26 +01:00
Oliver Gierke
4eae229bff DATAMONGO-876 - Adapt to API changes introduced for better property access config.
Adapted usage of BeanWrapper as the property access is now solely defined via the PersistentProperty. Adapted MongoPersistentEntityIndexCreator to look up annotations via PersistentProperty instead of the backing field. Removed code from BasicMongoPersistentProperty which is now already implemented in the Spring Data Commons types.
2014-03-07 14:38:23 +01:00
Oliver Gierke
47f0607c49 DATAMONGO-809 - Polishing.
Added ticket references to test cases in GridFsTemplateIntegrationTests.
2014-03-07 08:30:43 +01:00
Martin Baumgartner
753e794194 DATAMONGO-809 - Filename is now optional when storing files to GridFS.
Added method overloads to GridFsOperations and GridFsTemplate to store files without a filename given.

Original pull request: #119.
2014-03-07 08:30:43 +01:00
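A small sketch of storing content without a filename; the exact new overload (content plus metadata only) is assumed from the description above:

```java
import java.io.InputStream;

import com.mongodb.BasicDBObject;
import org.springframework.data.mongodb.gridfs.GridFsOperations;

class GridFsStoreExample {

    static void store(GridFsOperations gridFs, InputStream content) {
        gridFs.store(content, new BasicDBObject("origin", "upload")); // no filename given
        gridFs.store(content, "image.png");                           // filename still supported
    }
}
```
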
Thomas Darimont
d27bec8ed5 DATAMONGO-773 - Verify that @DBRef fields can be included in query.
Added test cases to verify that projection search with included @DBRef fields works as expected.

Original pull request: #142.
2014-03-06 13:34:17 +01:00
Christoph Strobl
c63f7f75dc DATAMONGO-868 - MongoTemplate.findAndModify(…) increases version if not handled manually.
MongoTemplate.findAndModify(…) increments the version property in case it's not manually set in the Update object given.

Original Pull Request: #141.
2014-03-06 11:51:13 +01:00
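A sketch of the behavior with a hypothetical versioned Person document; the Update below never touches the version property, so it is now incremented automatically:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class FindAndModifyVersionExample {

    static class Person {
        @Id String id;
        String name;
        @Version Long version;
    }

    static Person rename(MongoOperations operations, String id, String name) {
        // the version property is not part of the Update, so a $inc is added for it
        return operations.findAndModify(query(where("id").is(id)), new Update().set("name", name), Person.class);
    }
}
```
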
Christoph Strobl
84040518cf DATAMONGO-863 - UpdateMapper doesn't convert raw DBObjects anymore.
UpdateMapper now only performs a simple conversion if it encounters a DBObject, instead of a deep inspection of the keywords used. This allows using custom clauses nested in an Update for operations not directly supported.

Original Pull Request: #138.
2014-03-06 11:45:35 +01:00
Christoph Strobl
c66b9a538c DATAMONGO-821 - Fixed handling of keyword expressions for DBRefs.
QueryMapper skips DBRef conversion in case the given source value is a nested DBObject. This allows using MongoDB operators wrapped in a DBObject directly on association properties.

Original Pull Request: #139.
2014-03-06 11:22:39 +01:00
Oliver Gierke
a0c6b9aa64 DATAMONGO-843 - Register default MongoMappingContext for auditing.
The MongoAuditingRegistrar now also registers a fallback MongoMappingContext in case none is present in the BeanDefinitionRegistry.
2014-03-06 09:10:31 +01:00
Thomas Darimont
9370c1ee01 DATAMONGO-843 - Improvements in auditing configuration.
Repositories now declare a fallback MappingContext in case none is configured explicitly to make sure @EnableMongoAuditing also works without an explicit MappingContext bean defined.

AuditingEntityListener is now referring to the IsNewAwareAuditingHandler via an intermediate ObjectFactory to prevent the downstream dependencies from being instantiated eagerly at listener init time.

This is to prevent circular initialization dependencies, as Spring accesses ApplicationEventListener beans very early in the container lifecycle to check whether they might be interested in a certain event, and they are just dropped immediately afterwards.

Changed BeanNames.MAPPING_CONTEXT constant to mongoMappingContext to let the XML configuration be consistent with AbstractMongoConfiguration.mongoMappingContext().
2014-03-05 19:50:27 +01:00
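A minimal configuration sketch: with the fallback beans registered automatically, enabling auditing no longer requires an explicit mappingContext bean:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing
class AuditingConfiguration {
    // no explicit MongoMappingContext bean required; a fallback is registered if none is present
}
```
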
Oliver Gierke
2839e9491f DATAMONGO-871 - Add support for arrays as query method return types.
Changed AbstractMongoQuery to potentially convert all query execution results using the DefaultConversionService in case the query result doesn't match the expected return value.

This allows arrays to be returned for collection queries as the conversion service can transparently convert between collections and arrays.
2014-03-05 10:03:09 +01:00
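A sketch of a query method returning an array; the repository and domain type are hypothetical:

```java
import org.springframework.data.repository.Repository;

class Person {
    String id, lastname;
}

interface PersonRepository extends Repository<Person, String> {

    // the collection result is converted into an array by the conversion service
    Person[] findByLastname(String lastname);
}
```
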
Oliver Gierke
6963f9e07a DATAMONGO-870 - Added support for sliced query execution.
Added support for Slice as a return type for query methods. The execution will expand the requested page size by one to read one more element than actually requested. If that additional element is returned, it is considered an indicator that a next slice is available.

Related issues: DATACMNS-397.
2014-03-04 17:37:26 +01:00
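A sketch of a sliced query method; the repository and domain type are hypothetical:

```java
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.repository.Repository;

class Person {
    String id, lastname;
}

interface PersonRepository extends Repository<Person, String> {

    // reads pageSize + 1 elements; the extra element only signals whether a next slice exists
    Slice<Person> findByLastname(String lastname, Pageable pageable);
}
```
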
Thomas Darimont
8dd08a36a0 DATAMONGO-865 - Adjust test dependencies to avoid ClassNotFoundException during test runs.
Added jul-to-slf4j dependency to avoid exceptions being logged during test runs.

Original pull request: #135.
2014-03-04 09:43:49 +01:00
Christoph Strobl
a908e89ef7 DATAMONGO-829 - NearQuery should not default 'num' to zero.
NearQuery now ignores a query.getLimit() of zero when adding a Query to a NearQuery. This has to be done as the limit defaults to zero within Query, which otherwise results in unintended propagation of the parameter.

In case 'num' should be explicitly set to zero one might use 'NearQuery.num(0)' as an alternative to the query approach.

Introduced 'null' check for 'NearQuery.query(Query)' and 'NearQuery.with(Pageable)' along the way.

Original Pull Request: #133.
2014-03-03 15:24:09 +01:00
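A sketch of the two cases described above (the field name and coordinates are illustrative only):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.NearQuery;

class NearQueryNumExample {

    static void nearQueries() {
        // the Query's default limit of 0 is ignored and no longer propagated to 'num'
        NearQuery filtered = NearQuery.near(-73.99, 40.73).query(query(where("age").gt(18)));

        // explicitly setting 'num' to zero is still possible
        NearQuery unlimited = NearQuery.near(-73.99, 40.73).num(0);
    }
}
```
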
Christoph Strobl
5ace4032ed DATAMONGO-862 - Fixed handling of unmapped paths for updates.
UpdateMapper uses key instead of cleaned property path when not directly pointing to a property.

Original pull request: #132.
2014-02-27 16:55:54 +01:00
Oliver Gierke
621b299f6f DATAMONGO-833 - Add support for reading EnumSets and EnumMaps.
Switched to use Spring Data Commons' CollectionFactory that is capable of creating EnumSets and EnumMaps. Added unit test inspired by pull request #113 for EnumSets and an additional one for EnumMaps.

Slightly refactored the algorithm for reading maps to prevent repeated type lookups.

Related pull request: #113.
2014-02-26 05:35:55 +01:00
Spring Buildmaster
a2628d1b74 DATAMONGO-854 - Prepare next development iteration. 2014-02-24 15:31:41 +01:00
Spring Buildmaster
294616432d DATAMONGO-854 - Release version 1.4.0 RELEASE. 2014-02-24 06:25:32 -08:00
Christoph Strobl
47dd512f95 DATAMONGO-854 - Prepare 1.4.0.RELEASE.
Update artifact version in readme for release and snapshot.
Use commons 1.7.0 resources in docbkx.
Update changelog.
Update version information in notice and readme.

Original pull request: #130.
2014-02-24 15:11:40 +01:00
Thomas Darimont
f16e8d85e5 DATAMONGO-856 - Documentation update.
Removed outdated why-spring-data-doc. Removed CouchDB reference from requirements document. Fixed some typos. Added missing opening <para>-element. Fixed rendering of author information. Added copyright and product name information.

Original pull request: #128.
2014-02-24 11:43:58 +01:00
Christoph Strobl
eb03ae61f2 DATAMONGO-856 - Documentation updates.
Updated references from springsource.org to spring.io. Updated references to mongodb.org. Updated vendor to Pivotal Software, Inc. Updated required/recommended versions. Updated CustomConversions configuration section. Added missing section ids. Fixed some typos. Added missing JavaDoc.

Original pull request: #128.
2014-02-24 11:42:59 +01:00
Christoph Strobl
5be66a3fee DATAMONGO-853 - Update does not allow null keys anymore.
Added a check for blank/null keys when adding a key to an Update.

Original pull request: #129.
2014-02-24 10:51:48 +01:00
Thomas Darimont
d88e4c0e3e DATAMONGO-468 - Verify that one can use a domain object in DbRef field updates.
Added test case to demonstrate that using a domain object as a value for a DbRef field update is already supported.

Original pull request: #127.
2014-02-21 14:03:35 +01:00
Christoph Strobl
57d1449008 DATAMONGO-852 - Update keeps track of fields to be modified.
Update holds a set of fields that modifications are registered for. This information is used to determine if a modification is registered for the version field of a versioned entity. The change was introduced since the previous solution did not correctly find the version property within the DBObject resulting from the mapped update.

In case the version property is already included in the Update, the automatic version update via $inc will be skipped.

Original pull request: #126.
2014-02-21 13:57:58 +01:00
Oliver Gierke
8d00a0d926 DATAMONGO-404 - Polishing of DBRef creation in Update clauses.
Refactored the internals of UpdateMapper to simplify the code a little. Removed the special converter in favor of handling the mapped key generation directly. This can be removed again, once DATACMNS-444 is fixed.

MetadataBackedField.getPath(String) now also rejects PersistentPropertyPaths that refer to anything other than the id property in case the path traverses an association.

Changed MetadataBackedField to return the association property in calls to ….getProperty() as it is the PersistentProperty to hand to the mapping infrastructure for object conversion.

Changed MappingMongoConverter to also check whether the given source object handed into DBRef creation is of the ID type and, if so, simply use that for DBRef creation. This allows creating DBRefs from ids as well.
2014-02-19 21:13:01 +01:00
Oliver Gierke
e3fa844488 DATAMONGO-854 - Upgraded to Spring Data Commons snapshots.
Upgraded to snapshots of Spring Data Commons and the build parent.
2014-02-19 20:02:34 +01:00
Thomas Darimont
58bee75a6b DATAMONGO-404 - Fixed Update.pull(…) handling to work with DBRefs.
We now support pointing to DBRef-mapped properties in Update.pull(…) and also allow referring to the id of the DBRef to avoid having to create an instance of the entity.
2014-02-19 18:00:49 +01:00
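A sketch of both variants; the User and Book types are hypothetical and the id-based pull relies on the behavior described above:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.query.Update;

class PullDbRefExample {

    static class Book { String id; }

    static class User {
        String id;
        @DBRef List<Book> books;
    }

    static void removeBook(MongoOperations operations, String userId, Book book) {
        // pull by the referenced entity …
        operations.updateFirst(query(where("id").is(userId)), new Update().pull("books", book), User.class);

        // … or simply by its id, without instantiating the entity
        operations.updateFirst(query(where("id").is(userId)), new Update().pull("books", book.id), User.class);
    }
}
```
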
Oliver Gierke
a402395f5c DATAMONGO-849 - Fixed invalid class reference in readme.
Minor sample code optimizations to use MongoClient instead of Mongo. Fixed repository URL.
2014-02-17 16:38:25 +01:00
Oliver Gierke
9d5f8f3ba0 DATAMONGO-848 - Added tweaks to be compatible with Java driver 2.12.
Added a build profile to be able to build against the next Mongo Java driver version (currently 2.12.0-rc0). Tweaked Bundlor version replacements to allow binding non-OSGi-compatible Mongo driver versions.

The exception translator now handles the newly introduced MongoServerSelectionException, which the driver throws as of version 2.12 to indicate that it can't connect to a MongoDB instance. GridFsTemplate now uses an empty query object instead of null to indicate that no query should be used.

Adapted test cases to be able to deal with the slightly changed representation of serverUsed in command results (2.12 removed leading slash).
2014-02-17 12:42:19 +01:00
Christoph Strobl
7ebf953063 DATAMONGO-354 - Update.pushAll(…) now supports multiple values.
Update.pushAll(…) is now a multi-field operation, which allows sending values for different fields within one command.

Original Pull Request: #122.
2014-02-17 11:45:15 +01:00
Christoph Strobl
617ebe0ca7 DATAMONGO-828 - Fixed version checks for updates in MongoTemplate.
Added inspection of the query object to check if the update should only apply to a given version. If so, and no documents have been updated, we still throw an OptimisticLockingException. For all other cases - like updateFirst - zero affected documents is fine.

Original Pull Request: #121.
2014-02-17 11:30:38 +01:00
Christoph Strobl
7f76789664 DATAMONGO-410 - Added test case to show that UpdateMapper considers custom converter.
Original pull request: #124.
2014-02-17 11:03:52 +01:00
Oliver Gierke
81e5919ace DATAMONGO-812 - Added assumptions to not break tests on old MongoDB versions. 2014-02-11 18:50:09 +01:00
Oliver Gierke
efd74956dc DATAMONGO-812 - Polishing.
Changed convertToMongoType(…) to forward type hints to recursive calls to make sure type information is written if a TypeInformation was provided initially. Also made sure that UpdateMapper hands an initial type hint to the converter so that type information gets written.

Changed the signature of QueryMapper.getMappedObjectForField(…) to allow customizing the entire entry being added to the result. This is in preparation of more advanced mappings that might have to customize the mapped key.

Fixed newly introduced test cases in MongoTemplateTests.

Original pull request: #112.
2014-02-11 17:39:08 +01:00
Christoph Strobl
49eee40f7e DATAMONGO-812 - Add support for $push $each since $pushAll is deprecated.
$pushAll has been deprecated in MongoDB 2.4. Instead of calling pushAll one can use push in combination with each. The abstraction for pushAll will remain in code for now but may be removed in a subsequent version.

Original pull request: #112.
2014-02-11 17:39:07 +01:00
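A small sketch of the new builder, in the same style as the addToSet(…) example above (the field name and values are illustrative only):

```java
import org.springframework.data.mongodb.core.query.Update;

class PushEachExample {

    static Update pushSeveral() {
        // roughly renders { "$push" : { "categories" : { "$each" : [ "spring", "data", "mongodb" ] } } }
        return new Update().push("categories").each("spring", "data", "mongodb");
    }
}
```
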
Thomas Darimont
8e93b844c7 DATAMONGO-830 - Prevent NullPointerException during cache warmup in CustomConversions.
We now use a ConcurrentHashMap to cache the results of custom read target lookups in order to avoid having to traverse the readingPairs for every lookup. The use of ConcurrentHashMap should also prevent potential NullPointerExceptions from being thrown if custom conversions are initialized in heavily threaded environments.

Original pull request: #117.
2014-02-10 18:49:01 +01:00
Thomas Darimont
3e64432f1a DATAMONGO-842 - Improve documentation in GridFS section.
Rephrased wording for better understanding.

Original pull request: #120.
2014-02-10 10:30:57 +01:00
Thomas Darimont
88c968ad36 DATAMONGO-840 - Improve support for nested field references in SpEL expressions within Projections.
We now correctly add a compound expression that represents a field reference to the previous operation arguments if necessary.

Original pull request: #118.
2014-02-10 10:18:14 +01:00
Thomas Darimont
99eefe0773 DATAMONGO-838 - Cannot refer to expression based field in group operation.
Previously we didn't set a proper target value for the generated expression field. As a potential fix we just use the alias as the target field.

Original pull request: #116.
2014-02-10 09:59:15 +01:00
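A sketch of an aggregation that relies on the fix; the field names and the SpEL expression are hypothetical:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class GroupOnExpressionFieldExample {

    static Aggregation totalPerLastname() {
        // 'total' is produced by a SpEL expression and can now be referenced from the group stage
        return newAggregation(
                project("lastname").andExpression("netPrice * quantity").as("total"),
                group("lastname").sum("total").as("sum"));
    }
}
```
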
Oliver Gierke
3d4569be14 DATAMONGO-826 - Remove obsolete milestone repository. 2014-02-09 14:33:27 +01:00
Spring Buildmaster
57455c4a26 DATAMONGO-826 - Prepare next development iteration. 2014-01-29 06:04:53 -08:00
146 changed files with 6514 additions and 1475 deletions


@@ -1,6 +1,6 @@
# Spring Data MongoDB
The primary goal of the [Spring Data](http://www.springsource.org/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
@@ -8,12 +8,12 @@ The Spring Data MongoDB project aims to provide a familiar and consistent Spring
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://static.springsource.org/spring-data/data-mongodb/docs/current/reference/html/)
* the [JavaDocs](http://static.springsource.org/spring-data/data-mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://www.springsource.org/spring-data/mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.springsource.org/forumdisplay.php?f=80).
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.spring.io/forum/spring-projects/data/nosql).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://www.springsource.org/projects).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
## Quick Start
@@ -26,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.3.3.RELEASE</version>
<version>1.4.1.RELEASE</version>
</dependency>
```
@@ -36,13 +36,13 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.0.BUILD-SNAPSHOT</version>
<version>1.5.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.springsource.org/libs-snapshot</url>
<url>http://repo.spring.io/libs-snapshot</url>
</repository>
```
@@ -53,7 +53,7 @@ MongoTemplate is the central support class for Mongo database operations. It pro
* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/dao.html#dao-exceptions).
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
### Spring Data repositories
@@ -81,7 +81,7 @@ class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public Mongo mongo() throws Exception {
return new Mongo();
return new MongoClient();
}
@Override
@@ -94,9 +94,9 @@ class ApplicationConfig extends AbstractMongoConfiguration {
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
```xml
<bean id="template" class="org.springframework.data.document.mongodb.MongoTemplate">
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.Mongo">
<bean class="com.mongodb.MongoClient">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
@@ -139,9 +139,9 @@ public class MyService {
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.springsource.org/forumdisplay.php?f=80) by responding to questions and joining the debate.
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.spring.io/forum/spring-projects/data/nosql) by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://www.springsource.org/node/feed) to springframework.org
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

pom.xml

@@ -5,20 +5,20 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RC1</version>
<version>1.5.0.M1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<url>http://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.3.0.RC1</version>
<version>1.4.0.M1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
@@ -29,8 +29,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.7.0.RC1</springdata.commons>
<springdata.commons>1.8.0.M1</springdata.commons>
<mongo>2.11.4</mongo>
<mongo-osgi>${mongo}</mongo-osgi>
</properties>
<developers>
@@ -102,6 +103,16 @@
</developer>
</developers>
<profiles>
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.12.0-rc0</mongo>
<mongo-osgi>2.12.0</mongo-osgi>
</properties>
</profile>
</profiles>
<dependencies>
<!-- MongoDB -->
<dependency>
@@ -112,14 +123,9 @@
</dependencies>
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>http://repo.spring.io/libs-snapshot</url>
</repository>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.springsource.org/libs-milestone-local</url>
<url>http://repo.spring.io/libs-milestone/</url>
</repository>
</repositories>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RC1</version>
<version>1.5.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.0.RC1</version>
<version>1.5.0.M1</version>
</dependency>
<dependency>


@@ -1,13 +1,13 @@
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: SpringSource
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="0",
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",


@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RC1</version>
<version>1.5.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RC1</version>
<version>1.5.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -1,9 +1,9 @@
Bundle-SymbolicName: org.springframework.data.mongodb.log4j
Bundle-Name: Spring Data Mongo DB Log4J Appender
Bundle-Vendor: SpringSource
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo:[=.=,+1.0.0)}",
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"


@@ -5,15 +5,6 @@
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
</element>
<architecture>
<element type="Layer" name="Config">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|GridFS" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Monitoring" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories" type="AllowedDependency"/>
</element>
<element type="Layer" name="Repositories">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.repository.**"/>
@@ -40,10 +31,20 @@
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
<element type="Layer" name="Config">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|GridFS" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Monitoring" type="AllowedDependency"/>
</element>
<element type="Layer" name="Monitoring">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.monitor.**"/>
@@ -57,41 +58,39 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
<element type="Layer" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.**"/>
</element>
<element type="TypeFilter" name="Assignment"/>
<element type="Subsystem" name="Mapping">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.mapping.**"/>
<element type="IncludeTypePattern" name="**.core.mapping.**"/>
</element>
</element>
<element type="Subsystem" name="Geospatial">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.geo.**"/>
<element type="IncludeTypePattern" name="**.core.geo.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Query">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.query.**"/>
<element type="IncludeTypePattern" name="**.core.query.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.convert.**"/>
<element type="IncludeTypePattern" name="**.core.convert.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.spel.**"/>
<element type="IncludeTypePattern" name="**.core.spel.**"/>
</element>
</element>
<element type="Subsystem" name="Aggregation">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.aggregation.**"/>
<element type="IncludeTypePattern" name="**.core.aggregation.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
@@ -100,7 +99,7 @@
</element>
<element type="Subsystem" name="Index">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.index.**"/>
<element type="IncludeTypePattern" name="**.core.index.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
@@ -116,6 +115,13 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.util.**"/>
</element>
<stereotype name="Unrestricted"/>
<stereotype name="Public"/>
</element>
</element>
<element type="Subsystem" name="API">
<element type="TypeFilter" name="Assignment">


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RC1</version>
<version>1.5.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -137,6 +137,13 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
@@ -145,7 +152,7 @@
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
<version>1.0.8</version>
<version>${apt}</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>


@@ -116,7 +116,9 @@ public abstract class AbstractMongoConfiguration {
* entities.
*/
protected String getMappingBasePackage() {
return getClass().getPackage().getName();
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,13 +24,14 @@ package org.springframework.data.mongodb.config;
*/
public abstract class BeanNames {
static final String MAPPING_CONTEXT = "mappingContext";
static final String INDEX_HELPER = "indexCreationHelper";
static final String MONGO = "mongo";
static final String DB_FACTORY = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY = "isNewStrategyFactory";
public static final String MAPPING_CONTEXT_BEAN_NAME = "mongoMappingContext";
static final String INDEX_HELPER_BEAN_NAME = "indexCreationHelper";
static final String MONGO_BEAN_NAME = "mongo";
static final String DB_FACTORY_BEAN_NAME = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER_BEAN_NAME = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY_BEAN_NAME = "isNewStrategyFactory";
static final String DEFAULT_CONVERTER_BEAN_NAME = "mappingConverter";
static final String MONGO_TEMPLATE = "mongoTemplate";
static final String GRID_FS_TEMPLATE = "gridFsTemplate";
static final String MONGO_TEMPLATE_BEAN_NAME = "mongoTemplate";
static final String GRID_FS_TEMPLATE_BEAN_NAME = "gridFsTemplate";
}


@@ -43,7 +43,7 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE;
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
/*
@@ -64,7 +64,7 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
if (StringUtils.hasText(dbFactoryRef)) {
gridFsTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
gridFsTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY_BEAN_NAME);
}
if (StringUtils.hasText(converterRef)) {
@@ -77,7 +77,7 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
gridFsTemplateBuilder.addConstructorArgValue(bucket);
}
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE)
return (AbstractBeanDefinition) helper.getComponentIdButFallback(gridFsTemplateBuilder, BeanNames.GRID_FS_TEMPLATE_BEAN_NAME)
.getBeanDefinition();
}
}


@@ -86,7 +86,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : "mappingConverter";
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mapping Mongo Converter", element));
@@ -98,7 +98,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
dbFactoryRef = DB_FACTORY_BEAN_NAME;
}
// Converter
@@ -116,10 +116,10 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
}
try {
registry.getBeanDefinition(INDEX_HELPER);
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
} catch (NoSuchBeanDefinitionException ignored) {
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY;
dbFactoryRef = DB_FACTORY_BEAN_NAME;
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
@@ -128,14 +128,14 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
indexHelperBuilder.addDependsOn(ctxRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
INDEX_HELPER));
INDEX_HELPER_BEAN_NAME));
}
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
if (validatingMongoEventListener != null) {
parserContext.registerBeanComponent(new BeanComponentDefinition(validatingMongoEventListener,
VALIDATING_EVENT_LISTENER));
VALIDATING_EVENT_LISTENER_BEAN_NAME));
}
parserContext.registerBeanComponent(new BeanComponentDefinition(converterBuilder.getBeanDefinition(), id));
@@ -180,7 +180,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return new RuntimeBeanReference(validatorName);
}
static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition, String converterId) {
String ctxRef = element.getAttribute("mapping-context-ref");
@@ -215,7 +215,8 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
ctxRef = converterId + "." + MAPPING_CONTEXT;
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
return ctxRef;
@@ -310,9 +311,10 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder, IS_NEW_STRATEGY_FACTORY));
context.registerBeanComponent(builder.getComponent(mappingContextStrategyFactoryBuilder,
IS_NEW_STRATEGY_FACTORY_BEAN_NAME));
return IS_NEW_STRATEGY_FACTORY;
return IS_NEW_STRATEGY_FACTORY_BEAN_NAME;
}
/**


@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,19 @@
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
import org.springframework.data.auditing.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
@@ -58,23 +63,24 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
String mappingContextRef = element.getAttribute("mapping-context-ref");
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
if (!StringUtils.hasText(mappingContextRef)) {
String mappingContextName = BeanNames.MAPPING_CONTEXT;
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT)) {
mappingContextName = MappingMongoConverterParser.potentiallyCreateMappingContext(element, parserContext, null,
BeanNames.DEFAULT_CONVERTER_BEAN_NAME);
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, new RootBeanDefinition(MongoMappingContext.class));
}
MappingMongoConverterParser.createIsNewStrategyFactoryBeanDefinition(mappingContextName, parserContext, element);
mappingContextRef = MAPPING_CONTEXT_BEAN_NAME;
}
BeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(BeanNames.IS_NEW_STRATEGY_FACTORY);
BeanDefinition handlerBeanDefinition = parser.parse(element, parserContext);
IsNewAwareAuditingHandlerBeanDefinitionParser parser = new IsNewAwareAuditingHandlerBeanDefinitionParser(
mappingContextRef);
parser.parse(element, parserContext);
builder.addConstructorArgValue(handlerBeanDefinition);
builder.addConstructorArgValue(getObjectFactoryBeanDefinition(parser.getResolvedBeanName(),
parserContext.extractSource(element)));
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,17 +15,22 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AnnotationAuditingConfiguration;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
@@ -47,6 +52,15 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return EnableMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
@@ -57,22 +71,22 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
registerIsNewStrategyFactoryIfNecessary(registry);
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AnnotationAuditingConfiguration)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AnnotationAuditingConfiguration configuration) {
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AnnotationAuditingConfiguration must not be null!");
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
return configureDefaultAuditHandlerAttributes(configuration,
BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class)).addConstructorArgReference(
BeanNames.IS_NEW_STRATEGY_FACTORY);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
@@ -86,20 +100,31 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
registerInfrastructureBeanWithId(BeanDefinitionBuilder.rootBeanDefinition(AuditingEventListener.class)
.addConstructorArgValue(auditingHandlerDefinition).getRawBeanDefinition(),
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
}
/**
* @param registry, the {@link BeanDefinitionRegistry} to use to register an {@link IsNewStrategyFactory} to.
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
*/
private void registerIsNewStrategyFactoryIfNecessary(BeanDefinitionRegistry registry) {
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
registry.registerBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY,
BeanDefinitionBuilder.rootBeanDefinition(MappingContextIsNewStrategyFactory.class)
.addConstructorArgReference(BeanNames.MAPPING_CONTEXT).getBeanDefinition());
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
}
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 by the original author(s).
* Copyright 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -54,7 +54,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY;
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
/*
@@ -103,7 +103,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY)
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY_BEAN_NAME)
.getBeanDefinition();
}


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,9 +16,6 @@
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryBeanDefinitionParser;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
@@ -34,10 +31,6 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
*/
public void init() {
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
RepositoryBeanDefinitionParser repositoryBeanDefinitionParser = new RepositoryBeanDefinitionParser(extension);
registerBeanDefinitionParser("repositories", repositoryBeanDefinitionParser);
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -58,7 +58,7 @@ public class MongoParser implements BeanDefinitionParser {
MongoParsingUtils.parseMongoOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO;
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -35,6 +35,7 @@ import org.w3c.dom.Element;
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
*
* @author Martin Baumgartner
* @author Oliver Gierke
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
@@ -47,7 +48,7 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
throws BeanDefinitionStoreException {
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE;
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
/*
@@ -68,7 +69,7 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
if (StringUtils.hasText(dbFactoryRef)) {
mongoTemplateBuilder.addConstructorArgReference(dbFactoryRef);
} else {
mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY);
mongoTemplateBuilder.addConstructorArgReference(BeanNames.DB_FACTORY_BEAN_NAME);
}
if (StringUtils.hasText(converterRef)) {
@@ -80,7 +81,7 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder, BeanNames.MONGO_TEMPLATE)
.getBeanDefinition();
return (AbstractBeanDefinition) helper.getComponentIdButFallback(mongoTemplateBuilder,
BeanNames.MONGO_TEMPLATE_BEAN_NAME).getBeanDefinition();
}
}

View File

@@ -60,6 +60,11 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
// Driver 2.12 throws this to indicate connection problems. String comparison to avoid hard dependency
if (ex.getClass().getName().equals("com.mongodb.MongoServerSelectionException")) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}

View File

@@ -1,6 +1,6 @@
/*
* Copyright 2010-2013 the original author or authors.
*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
@@ -23,7 +23,6 @@ import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
@@ -50,7 +49,10 @@ import com.mongodb.WriteResult;
* @author Oliver Gierke
* @author Tobias Trelle
* @author Chuong Ngo
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**
@@ -413,7 +415,7 @@ public interface MongoOperations {
MapReduceOptions mapReduceOptions, Class<T> entityClass);
/**
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is run against.
*
* @param near must not be {@literal null}.
@@ -423,7 +425,7 @@ public interface MongoOperations {
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}.
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
@@ -468,10 +470,32 @@ public interface MongoOperations {
*/
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects.
* @return
*/
boolean exists(Query query, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @return
*/
boolean exists(Query query, Class<?> entityClass);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @param collectionName name of the collection to check for objects.
* @return
*/
boolean exists(Query query, Class<?> entityClass, String collectionName);
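A minimal usage sketch for the new exists(…) overloads (Person, the field name and the collection name are made up for illustration; template is an available MongoOperations instance, query(…)/where(…) are the usual static imports from Query and Criteria):

boolean hasDaves = template.exists(query(where("firstname").is("Dave")), Person.class);
boolean hasDavesInCollection = template.exists(query(where("firstname").is("Dave")), "people");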
/**
@@ -529,12 +553,58 @@ public interface MongoOperations {
*/
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param entityClass the parameterized type.
* @return
*/
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param entityClass the parameterized type.
* @param collectionName the collection to query.
* @return
*/
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parameterized type.
* @return
*/
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parameterized type.
* @param collectionName the collection to query.
* @return
*/
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName);
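A hedged sketch of how these overloads are typically combined (Person and its fields are hypothetical; returnNew(true) asks for the modified document instead of the previous state; template is an assumed MongoOperations instance):

Query query = query(where("firstname").is("Dave"));
Update update = new Update().inc("visits", 1);

Person previousState = template.findAndModify(query, update, Person.class);
Person updatedState = template.findAndModify(query, update, new FindAndModifyOptions().returnNew(true), Person.class);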
@@ -598,9 +668,9 @@ public interface MongoOperations {
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
* Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -655,9 +725,9 @@ public interface MongoOperations {
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Conversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
* Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection
*/
@@ -672,9 +742,9 @@ public interface MongoOperations {
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Spring 3.0's new Type Cobnversion API.
* See <a href="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">Spring 3 Type
* Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
* Type Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
@@ -795,7 +865,7 @@ public interface MongoOperations {
*
* @param object
*/
void remove(Object object);
WriteResult remove(Object object);
/**
* Removes the given object from the given collection.
@@ -803,7 +873,7 @@ public interface MongoOperations {
* @param object
* @param collection must not be {@literal null} or empty.
*/
void remove(Object object, String collection);
WriteResult remove(Object object, String collection);
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
@@ -812,9 +882,17 @@ public interface MongoOperations {
* @param query
* @param entityClass
*/
void remove(Query query, Class<?> entityClass);
WriteResult remove(Query query, Class<?> entityClass);
void remove(Query query, Class<?> entityClass, String collectionName);
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query
* @param entityClass
* @param collectionName
*/
WriteResult remove(Query query, Class<?> entityClass, String collectionName);
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
@@ -823,7 +901,40 @@ public interface MongoOperations {
* @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will be removed
*/
void remove(Query query, String collectionName);
WriteResult remove(Query query, String collectionName);
/**
* Returns and removes all documents from the specified collection that match the provided query.
*
* @param query
* @param collectionName
* @return
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, String collectionName);
/**
* Returns and removes all documents matching the given query from the collection used to store the entityClass.
*
* @param query
* @param entityClass
* @return
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass);
/**
* Returns and removes all documents that match the provided query document criteria from the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
*
* @param query
* @param entityClass
* @param collectionName
* @return
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
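A short sketch contrasting the new findAllAndRemove(…) methods with the remove(…) variants that now return the driver's WriteResult (entity type and field name are hypothetical):

List<Person> removedPersons = template.findAllAndRemove(query(where("active").is(false)), Person.class);

WriteResult writeResult = template.remove(query(where("active").is(false)), Person.class);
int removedCount = writeResult.getN();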
/**
* Returns the underlying {@link MongoConverter}.

View File

@@ -1,8 +1,26 @@
/*
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.transaction.support.ResourceHolder;
import org.springframework.transaction.support.ResourceHolderSynchronization;
/**
* @author Oliver Gierke
*/
class MongoSynchronization extends ResourceHolderSynchronization<ResourceHolder, Object> {
public MongoSynchronization(ResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -27,6 +27,7 @@ import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Scanner;
import java.util.Set;
@@ -47,9 +48,12 @@ import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.annotation.Id;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
@@ -67,10 +71,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -95,6 +96,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.jca.cci.core.ConnectionCallback;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
@@ -113,7 +115,6 @@ import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
*
@@ -129,6 +130,7 @@ import com.mongodb.util.JSONParseException;
* @author Chuong Ngo
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoTemplate.class);
@@ -741,8 +743,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
if (mongoPersistentEntity != null && mongoPersistentEntity.hasVersionProperty()) {
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(entity,
this.mongoConverter.getConversionService());
BeanWrapper<Object> wrapper = BeanWrapper.create(entity, this.mongoConverter.getConversionService());
wrapper.setProperty(mongoPersistentEntity.getVersionProperty(), 0);
}
}
@@ -836,12 +837,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private <T> void doSaveVersioned(T objectToSave, MongoPersistentEntity<?> entity, String collectionName) {
BeanWrapper<PersistentEntity<T, ?>, T> beanWrapper = BeanWrapper.create(objectToSave,
this.mongoConverter.getConversionService());
BeanWrapper<T> beanWrapper = BeanWrapper.create(objectToSave, this.mongoConverter.getConversionService());
MongoPersistentProperty idProperty = entity.getIdProperty();
MongoPersistentProperty versionProperty = entity.getVersionProperty();
Number version = beanWrapper.getProperty(versionProperty, Number.class, !versionProperty.usePropertyAccess());
Number version = beanWrapper.getProperty(versionProperty, Number.class);
// Fresh instance -> initialize version property
if (version == null) {
@@ -855,7 +855,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
// Bump version number
Number number = beanWrapper.getProperty(versionProperty, Number.class, false);
Number number = beanWrapper.getProperty(versionProperty, Number.class);
beanWrapper.setProperty(versionProperty, number.longValue() + 1);
BasicDBObject dbObject = new BasicDBObject();
@@ -1016,7 +1016,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
: collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
if (entity != null && entity.hasVersionProperty() && !multi) {
if (writeResult.getN() == 0) {
if (writeResult.getN() == 0 && dbObjectContainsVersionProperty(queryObj, entity)) {
throw new OptimisticLockingFailureException("Optimistic lock exception on saving entity: "
+ updateObj.toMap().toString() + " to collection " + collectionName);
}
@@ -1031,32 +1031,64 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
private void increaseVersionForUpdateIfNecessary(MongoPersistentEntity<?> persistentEntity, Update update) {
if (persistentEntity != null && persistentEntity.hasVersionProperty()) {
String versionPropertyField = persistentEntity.getVersionProperty().getFieldName();
if (!update.getUpdateObject().containsField(versionPropertyField)) {
update.inc(versionPropertyField, 1L);
String versionFieldName = persistentEntity.getVersionProperty().getFieldName();
if (!update.modifies(versionFieldName)) {
update.inc(versionFieldName, 1L);
}
}
}
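Sketch of the effect on versioned entities, assuming a hypothetical VersionedDoc with an @Version property mapped to the field "version" and id holding an existing document id: the version is only bumped automatically when the Update does not touch it itself.

// version field is incremented by 1 automatically
template.updateFirst(query(where("_id").is(id)), new Update().set("name", "updated"), VersionedDoc.class);

// automatic increment is skipped because the Update already modifies "version"
template.updateFirst(query(where("_id").is(id)), new Update().set("name", "updated").inc("version", 5), VersionedDoc.class);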
public void remove(Object object) {
private boolean dbObjectContainsVersionProperty(DBObject dbObject, MongoPersistentEntity<?> persistentEntity) {
if (object == null) {
return;
if (persistentEntity == null || !persistentEntity.hasVersionProperty()) {
return false;
}
remove(getIdQueryFor(object), object.getClass());
return dbObject.containsField(persistentEntity.getVersionProperty().getFieldName());
}
public void remove(Object object, String collection) {
public WriteResult remove(Object object) {
if (object == null) {
return null;
}
return remove(getIdQueryFor(object), object.getClass());
}
public WriteResult remove(Object object, String collection) {
Assert.hasText(collection);
if (object == null) {
return;
return null;
}
doRemove(collection, getIdQueryFor(object), object.getClass());
return doRemove(collection, getIdQueryFor(object), object.getClass());
}
/**
* Returns {@link Entry} containing the {@link MongoPersistentProperty} defining the {@literal id} as
* {@link Entry#getKey()} and the {@link Id}s property value as its {@link Entry#getValue()}.
*
* @param object
* @return
*/
private Map.Entry<MongoPersistentProperty, Object> extractIdPropertyAndValue(Object object) {
Assert.notNull(object, "Id cannot be extracted from 'null'.");
Class<?> objectType = object.getClass();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
if (idProp == null) {
throw new MappingException("No id property found for object of type " + objectType);
}
Object idValue = BeanWrapper.create(object, mongoConverter.getConversionService())
.getProperty(idProp, Object.class);
return Collections.singletonMap(idProp, idValue).entrySet().iterator().next();
}
/**
@@ -1067,21 +1099,31 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
private Query getIdQueryFor(Object object) {
Assert.notNull(object);
Map.Entry<MongoPersistentProperty, Object> id = extractIdPropertyAndValue(object);
return new Query(where(id.getKey().getFieldName()).is(id.getValue()));
}
Class<?> objectType = object.getClass();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(objectType);
MongoPersistentProperty idProp = entity == null ? null : entity.getIdProperty();
/**
* Returns a {@link Query} for the given entities by their ids.
*
* @param objects must not be {@literal null} or {@literal empty}.
* @return
*/
private Query getIdInQueryFor(Collection<?> objects) {
if (idProp == null) {
throw new MappingException("No id property found for object of type " + objectType);
Assert.notEmpty(objects, "Cannot create Query for empty collection.");
Iterator<?> it = objects.iterator();
Map.Entry<MongoPersistentProperty, Object> firstEntry = extractIdPropertyAndValue(it.next());
ArrayList<Object> ids = new ArrayList<Object>(objects.size());
ids.add(firstEntry.getValue());
while (it.hasNext()) {
ids.add(extractIdPropertyAndValue(it.next()).getValue());
}
ConversionService service = mongoConverter.getConversionService();
Object idProperty = null;
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
return new Query(where(idProp.getFieldName()).is(idProperty));
return new Query(where(firstEntry.getKey().getFieldName()).in(ids));
}
private void assertUpdateableIdIfNotSet(Object entity) {
@@ -1094,7 +1136,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
ConversionService service = mongoConverter.getConversionService();
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class, true);
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class);
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
throw new InvalidDataAccessApiUsageException(String.format(
@@ -1103,19 +1145,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
public void remove(Query query, String collectionName) {
remove(query, null, collectionName);
public WriteResult remove(Query query, String collectionName) {
return remove(query, null, collectionName);
}
public void remove(Query query, Class<?> entityClass) {
remove(query, entityClass, determineCollectionName(entityClass));
public WriteResult remove(Query query, Class<?> entityClass) {
return remove(query, entityClass, determineCollectionName(entityClass));
}
public void remove(Query query, Class<?> entityClass, String collectionName) {
doRemove(collectionName, query, entityClass);
public WriteResult remove(Query query, Class<?> entityClass, String collectionName) {
return doRemove(collectionName, query, entityClass);
}
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
protected <T> WriteResult doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
@@ -1126,8 +1168,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
final DBObject queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return execute(collectionName, new CollectionCallback<WriteResult>() {
public WriteResult doInCollection(DBCollection collection) throws MongoException, DataAccessException {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass));
@@ -1143,11 +1185,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
writeConcernToUse);
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass));
return null;
return wr;
}
});
}
@@ -1303,6 +1346,54 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return aggregate(aggregation, collectionName, outputType, null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findAllAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.String)
*/
@Override
public <T> List<T> findAllAndRemove(Query query, String collectionName) {
return findAndRemove(query, null, collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findAllAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
@Override
public <T> List<T> findAllAndRemove(Query query, Class<T> entityClass) {
return findAllAndRemove(query, entityClass, determineCollectionName(entityClass));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findAllAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
@Override
public <T> List<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName) {
return doFindAndDelete(collectionName, query, entityClass);
}
/**
* Retrieve and remove all documents matching the given {@code query} by calling {@link #find(Query, Class, String)}
* and {@link #remove(Query, Class, String)}, whereas the {@link Query} for {@link #remove(Query, Class, String)} is
* constructed out of the find result.
*
* @param collectionName
* @param query
* @param entityClass
* @return
*/
protected <T> List<T> doFindAndDelete(String collectionName, Query query, Class<T> entityClass) {
List<T> result = find(query, entityClass, collectionName);
if (!CollectionUtils.isEmpty(result)) {
remove(getIdInQueryFor(result), entityClass, collectionName);
}
return result;
}
protected <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
AggregationOperationContext context) {
@@ -1565,6 +1656,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
increaseVersionForUpdateIfNecessary(entity, update);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
@@ -1602,9 +1695,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
ConversionService conversionService = mongoConverter.getConversionService();
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(savedObject, conversionService);
BeanWrapper<Object> wrapper = BeanWrapper.create(savedObject, conversionService);
Object idValue = wrapper.getProperty(idProp, idProp.getType(), true);
Object idValue = wrapper.getProperty(idProp, idProp.getType());
if (idValue != null) {
return;

View File

@@ -262,7 +262,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
@Override
public ProjectionOperation as(String alias) {
Field expressionField = Fields.field(alias, "expr");
Field expressionField = Fields.field(alias, alias);
return this.operation.and(new ExpressionProjection(expressionField, this.value.toString(), params));
}
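The builder touched here is typically used as in the following hedged sketch (field names invented, newAggregation/project assumed to be statically imported from Aggregation); with this change the expression is projected onto the given alias instead of a synthetic "expr" field:

newAggregation(project().andExpression("netPrice * 2").as("doublePrice"));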

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -495,7 +495,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
if (currentNode.hasfirstChildNotOfType(Indexer.class)) {
// we have a property path expression like: foo.bar -> render as reference
return context.getFieldReference().toString();
return context.addToPreviousOrReturn(context.getFieldReference().toString());
}
return context.addToPreviousOrReturn(currentNode.getValue());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,13 +17,14 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -56,6 +57,7 @@ import org.springframework.util.Assert;
* .
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class CustomConversions {
@@ -67,7 +69,7 @@ public class CustomConversions {
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
private final SimpleTypeHolder simpleTypeHolder;
private final Map<Class<?>, HashMap<Class<?>, CacheValue>> cache;
private final ConcurrentMap<ConvertiblePair, CacheValue> customReadTargetTypes;
private final List<Object> converters;
@@ -90,24 +92,31 @@ public class CustomConversions {
this.readingPairs = new LinkedHashSet<ConvertiblePair>();
this.writingPairs = new LinkedHashSet<ConvertiblePair>();
this.customSimpleTypes = new HashSet<Class<?>>();
this.cache = new HashMap<Class<?>, HashMap<Class<?>, CacheValue>>();
this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
this.converters = new ArrayList<Object>();
this.converters.addAll(converters);
this.converters.add(CustomToStringConverter.INSTANCE);
this.converters.add(BigDecimalToStringConverter.INSTANCE);
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.add(DBObjectToStringConverter.INSTANCE);
this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
List<Object> toRegister = new ArrayList<Object>();
for (Object c : this.converters) {
// Add user provided converters to make sure they can override the defaults
toRegister.addAll(converters);
toRegister.add(CustomToStringConverter.INSTANCE);
toRegister.add(BigDecimalToStringConverter.INSTANCE);
toRegister.add(StringToBigDecimalConverter.INSTANCE);
toRegister.add(BigIntegerToStringConverter.INSTANCE);
toRegister.add(StringToBigIntegerConverter.INSTANCE);
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
toRegister.addAll(GeoConverters.getConvertersToRegister());
for (Object c : toRegister) {
registerConversion(c);
}
Collections.reverse(toRegister);
this.converters = Collections.unmodifiableList(toRegister);
this.simpleTypeHolder = new SimpleTypeHolder(customSimpleTypes, MongoSimpleTypes.HOLDER);
}
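A hedged sketch of what the reordering is about: user supplied converters are registered so that they take precedence over the defaults (the Point-to-String converter below is purely hypothetical):

Converter<Point, String> pointToString = new Converter<Point, String>() {
  public String convert(Point source) {
    return source.getX() + "," + source.getY();
  }
};

CustomConversions conversions = new CustomConversions(Arrays.asList(pointToString));
conversions.getCustomWriteTarget(Point.class, String.class); // String.class, the user converter is picked up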
@@ -195,25 +204,25 @@ public class CustomConversions {
*
* @param pair
*/
private void register(ConverterRegistration context) {
private void register(ConverterRegistration converterRegistration) {
ConvertiblePair pair = context.getConvertiblePair();
ConvertiblePair pair = converterRegistration.getConvertiblePair();
if (context.isReading()) {
if (converterRegistration.isReading()) {
readingPairs.add(pair);
if (LOG.isWarnEnabled() && !context.isSimpleSourceType()) {
if (LOG.isWarnEnabled() && !converterRegistration.isSimpleSourceType()) {
LOG.warn(String.format(READ_CONVERTER_NOT_SIMPLE, pair.getSourceType(), pair.getTargetType()));
}
}
if (context.isWriting()) {
if (converterRegistration.isWriting()) {
writingPairs.add(pair);
customSimpleTypes.add(pair.getSourceType());
if (LOG.isWarnEnabled() && !context.isSimpleTargetType()) {
if (LOG.isWarnEnabled() && !converterRegistration.isSimpleTargetType()) {
LOG.warn(String.format(WRITE_CONVERTER_NOT_SIMPLE, pair.getSourceType(), pair.getTargetType()));
}
}
@@ -223,11 +232,11 @@ public class CustomConversions {
* Returns the target type to convert to in case we have a custom conversion registered to convert the given source
* type into a Mongo native one.
*
* @param source must not be {@literal null}
* @param sourceType must not be {@literal null}
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> source) {
return getCustomWriteTarget(source, null);
public Class<?> getCustomWriteTarget(Class<?> sourceType) {
return getCustomWriteTarget(sourceType, null);
}
/**
@@ -235,72 +244,78 @@ public class CustomConversions {
* of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
* first target type matching or {@literal null} if no conversion can be found.
*
* @param source must not be {@literal null}
* @param expectedTargetType
* @param sourceType must not be {@literal null}
* @param requestedTargetType
* @return
*/
public Class<?> getCustomWriteTarget(Class<?> source, Class<?> expectedTargetType) {
public Class<?> getCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(source);
return getCustomTarget(source, expectedTargetType, writingPairs);
Assert.notNull(sourceType);
return getCustomTarget(sourceType, requestedTargetType, writingPairs);
}
/**
* Returns whether we have a custom conversion registered to write into a Mongo native type. The returned type might
* be a subclass oth the given expected type though.
* be a subclass of the given expected type though.
*
* @param source must not be {@literal null}
* @param sourceType must not be {@literal null}
* @return
*/
public boolean hasCustomWriteTarget(Class<?> source) {
return hasCustomWriteTarget(source, null);
public boolean hasCustomWriteTarget(Class<?> sourceType) {
Assert.notNull(sourceType);
return hasCustomWriteTarget(sourceType, null);
}
/**
* Returns whether we have a custom conversion registered to write an object of the given source type into an object
* of the given Mongo native target type.
*
* @param source must not be {@literal null}.
* @param expectedTargetType
* @param sourceType must not be {@literal null}.
* @param requestedTargetType
* @return
*/
public boolean hasCustomWriteTarget(Class<?> source, Class<?> expectedTargetType) {
return getCustomWriteTarget(source, expectedTargetType) != null;
public boolean hasCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(sourceType);
return getCustomWriteTarget(sourceType, requestedTargetType) != null;
}
/**
* Returns whether we have a custom conversion registered to read the given source into the given target type.
*
* @param source must not be {@literal null}
* @param expectedTargetType must not be {@literal null}
* @param sourceType must not be {@literal null}
* @param requestedTargetType must not be {@literal null}
* @return
*/
public boolean hasCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
public boolean hasCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Assert.notNull(source);
Assert.notNull(expectedTargetType);
Assert.notNull(sourceType);
Assert.notNull(requestedTargetType);
return getCustomReadTarget(source, expectedTargetType) != null;
return getCustomReadTarget(sourceType, requestedTargetType) != null;
}
/**
* Inspects the given {@link ConvertiblePair} for ones that have a source compatible type as source. Additionally
* checks assignabilty of the target type if one is given.
* checks assignability of the target type if one is given.
*
* @param source must not be {@literal null}
* @param expectedTargetType
* @param pairs must not be {@literal null}
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @param pairs must not be {@literal null}.
* @return
*/
private static Class<?> getCustomTarget(Class<?> source, Class<?> expectedTargetType, Iterable<ConvertiblePair> pairs) {
private static Class<?> getCustomTarget(Class<?> sourceType, Class<?> requestedTargetType,
Iterable<ConvertiblePair> pairs) {
Assert.notNull(source);
Assert.notNull(sourceType);
Assert.notNull(pairs);
for (ConvertiblePair typePair : pairs) {
if (typePair.getSourceType().isAssignableFrom(source)) {
if (typePair.getSourceType().isAssignableFrom(sourceType)) {
Class<?> targetType = typePair.getTargetType();
if (expectedTargetType == null || targetType.isAssignableFrom(expectedTargetType)) {
if (requestedTargetType == null || targetType.isAssignableFrom(requestedTargetType)) {
return targetType;
}
}
@@ -309,27 +324,33 @@ public class CustomConversions {
return null;
}
private Class<?> getCustomReadTarget(Class<?> source, Class<?> expectedTargetType) {
/**
* Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
* returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
*
* @param sourceType must not be {@literal null}.
* @param requestedTargetType can be {@literal null}.
* @return
*/
private Class<?> getCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
Class<?> type = expectedTargetType == null ? PlaceholderType.class : expectedTargetType;
Assert.notNull(sourceType);
Map<Class<?>, CacheValue> map;
CacheValue toReturn;
if ((map = cache.get(source)) == null || (toReturn = map.get(type)) == null) {
Class<?> target = getCustomTarget(source, type, readingPairs);
if (cache.get(source) == null) {
cache.put(source, new HashMap<Class<?>, CacheValue>());
}
Map<Class<?>, CacheValue> value = cache.get(source);
toReturn = target == null ? CacheValue.NULL : new CacheValue(target);
value.put(type, toReturn);
if (requestedTargetType == null) {
return null;
}
return toReturn.clazz;
ConvertiblePair lookupKey = new ConvertiblePair(sourceType, requestedTargetType);
CacheValue readTargetTypeValue = customReadTargetTypes.get(lookupKey);
if (readTargetTypeValue != null) {
return readTargetTypeValue.getType();
}
readTargetTypeValue = CacheValue.of(getCustomTarget(sourceType, requestedTargetType, readingPairs));
CacheValue cacheValue = customReadTargetTypes.putIfAbsent(lookupKey, readTargetTypeValue);
return cacheValue != null ? cacheValue.getType() : readTargetTypeValue.getType();
}
@WritingConverter
@@ -338,8 +359,10 @@ public class CustomConversions {
INSTANCE;
public Set<ConvertiblePair> getConvertibleTypes() {
ConvertiblePair localeToString = new ConvertiblePair(Locale.class, String.class);
ConvertiblePair booleanToString = new ConvertiblePair(Character.class, String.class);
return new HashSet<ConvertiblePair>(Arrays.asList(localeToString, booleanToString));
}
@@ -348,29 +371,29 @@ public class CustomConversions {
}
}
/**
* Placeholder type to allow registering not-found values in the converter cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
*/
private static class PlaceholderType {
}
/**
* Wrapper to safely store {@literal null} values in the type cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
* @author Thomas Darimont
*/
private static class CacheValue {
public static final CacheValue NULL = new CacheValue(null);
private final Class<?> clazz;
private static final CacheValue ABSENT = new CacheValue(null);
public CacheValue(Class<?> clazz) {
this.clazz = clazz;
private final Class<?> type;
public CacheValue(Class<?> type) {
this.type = type;
}
public Class<?> getType() {
return type;
}
static CacheValue of(Class<?> type) {
return type == null ? ABSENT : new CacheValue(type);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,15 +24,22 @@ import com.mongodb.DBRef;
* Used to resolve associations annotated with {@link org.springframework.data.mongodb.core.mapping.DBRef}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.4
*/
public interface DbRefResolver {
/**
* Resolves the given {@link DBRef} into an object of the given {@link MongoPersistentProperty}'s type. The method
* might return a proxy object for the {@link DBRef} or resolve it immediately. In both cases the
* {@link DbRefResolverCallback} will be used to obtain the actual backing object.
*
* @param property will never be {@literal null}.
* @param dbref the {@link DBRef} to resolve.
* @param callback will never be {@literal null}.
* @return
*/
Object resolveDbRef(MongoPersistentProperty property, DbRefResolverCallback callback);
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
/**
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.util.ReflectionUtils.*;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
@@ -51,6 +53,7 @@ import com.mongodb.DBRef;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.4
*/
public class DefaultDbRefResolver implements DbRefResolver {
@@ -78,13 +81,13 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
*/
@Override
public Object resolveDbRef(MongoPersistentProperty property, DbRefResolverCallback callback) {
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, callback);
return createLazyLoadingProxy(property, dbref, callback);
}
return callback.resolve(property);
@@ -109,10 +112,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
* eventually resolve the value of the property.
*
* @param property must not be {@literal null}.
* @param dbref can be {@literal null}.
* @param callback must not be {@literal null}.
* @return
*/
private Object createLazyLoadingProxy(MongoPersistentProperty property, DbRefResolverCallback callback) {
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
ProxyFactory proxyFactory = new ProxyFactory();
Class<?> propertyType = property.getType();
@@ -121,7 +125,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
proxyFactory.addInterface(type);
}
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, exceptionTranslator, callback);
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
proxyFactory.addInterface(LazyLoadingProxy.class);
if (propertyType.isInterface()) {
proxyFactory.addInterface(propertyType);
@@ -141,7 +147,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
}
/**
* @param property
* Returns whether the property shall be resolved lazily.
*
* @param property must not be {@literal null}.
* @return
*/
private boolean isLazyDbRef(MongoPersistentProperty property) {
@@ -154,31 +162,46 @@ public class DefaultDbRefResolver implements DbRefResolver {
* guaranteed to be performed only once.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
private final DbRefResolverCallback callback;
private final MongoPersistentProperty property;
private final PersistenceExceptionTranslator exceptionTranslator;
private volatile boolean resolved;
private Object result;
private DBRef dbref;
static {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
} catch (Exception e) {
throw new RuntimeException(e);
}
}
/**
* Creates a new {@link LazyLoadingInterceptor} for the given {@link MongoPersistentProperty},
* {@link PersistenceExceptionTranslator} and {@link DbRefResolverCallback}.
*
* @param property must not be {@literal null}.
* @param dbref can be {@literal null}.
* @param callback must not be {@literal null}.
*/
public LazyLoadingInterceptor(MongoPersistentProperty property, PersistenceExceptionTranslator exceptionTranslator,
DbRefResolverCallback callback) {
public LazyLoadingInterceptor(MongoPersistentProperty property, DBRef dbref,
PersistenceExceptionTranslator exceptionTranslator, DbRefResolverCallback callback) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(exceptionTranslator, "Exception translator must not be null!");
Assert.notNull(callback, "Callback must not be null!");
this.dbref = dbref;
this.callback = callback;
this.exceptionTranslator = exceptionTranslator;
this.property = property;
@@ -199,9 +222,95 @@ public class DefaultDbRefResolver implements DbRefResolver {
*/
@Override
public Object intercept(Object obj, Method method, Object[] args, MethodProxy proxy) throws Throwable {
return ReflectionUtils.isObjectMethod(method) ? method.invoke(obj, args) : method.invoke(ensureResolved(), args);
if (INITIALIZE_METHOD.equals(method)) {
return ensureResolved();
}
if (TO_DBREF_METHOD.equals(method)) {
return this.dbref;
}
if (isObjectMethod(method) && Object.class.equals(method.getDeclaringClass())) {
if (ReflectionUtils.isToStringMethod(method)) {
return proxyToString(proxy);
}
if (ReflectionUtils.isEqualsMethod(method)) {
return proxyEquals(proxy, args[0]);
}
if (ReflectionUtils.isHashCodeMethod(method)) {
return proxyHashCode(proxy);
}
}
Object target = ensureResolved();
if (target == null) {
return null;
}
return method.invoke(target, args);
}
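Roughly what the new dispatching means for client code (Person, getCoach() and the printed collection name are invented; no database round trip happens before initialize() or a non-Object method is invoked):

Person person = template.findOne(query(where("_id").is(id)), Person.class);
Object coach = person.getCoach();               // lazy @DBRef, still a proxy

coach.toString();                               // e.g. "coach:4711$LazyLoadingProxy", does not resolve the proxy
((LazyLoadingProxy) coach).toDBRef();           // the underlying com.mongodb.DBRef
((LazyLoadingProxy) coach).initialize();        // explicitly resolves the backing document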
/**
* Returns the toString representation for the given {@code proxy}.
*
* @param proxy
* @return
*/
private String proxyToString(Object proxy) {
StringBuilder description = new StringBuilder();
if (dbref != null) {
description.append(dbref.getRef());
description.append(":");
description.append(dbref.getId());
} else {
description.append(System.identityHashCode(proxy));
}
description.append("$").append(LazyLoadingProxy.class.getSimpleName());
return description.toString();
}
/**
* Returns the hashcode for the given {@code proxy}.
*
* @param proxy
* @return
*/
private int proxyHashCode(Object proxy) {
return proxyToString(proxy).hashCode();
}
/**
* Performs an equality check for the given {@code proxy}.
*
* @param proxy
* @param that
* @return
*/
private boolean proxyEquals(Object proxy, Object that) {
if (!(that instanceof LazyLoadingProxy)) {
return false;
}
if (that == proxy) {
return true;
}
return proxyToString(proxy).equals(that.toString());
}
/**
* Will trigger the resolution if the proxy is not resolved already or return a previously resolved result.
*
* @return
*/
private Object ensureResolved() {
if (!resolved) {
@@ -212,16 +321,28 @@ public class DefaultDbRefResolver implements DbRefResolver {
return this.result;
}
/**
* Callback method for serialization.
*
* @param out
* @throws IOException
*/
private void writeObject(ObjectOutputStream out) throws IOException {
ensureResolved();
out.writeObject(this.result);
}
/**
* Callback method for deserialization.
*
* @param in
* @throws IOException
*/
private void readObject(ObjectInputStream in) throws IOException {
try {
this.resolved = true; // Object is guaranteed to be resolved after serializations
this.resolved = true;
this.result = in.readObject();
} catch (ClassNotFoundException e) {
throw new LazyLoadingException("Could not deserialize result", e);
@@ -229,6 +350,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
}
/**
* Resolves the proxy into its backing object.
*
* @return
*/
private synchronized Object resolve() {
@@ -248,14 +371,6 @@ public class DefaultDbRefResolver implements DbRefResolver {
return result;
}
public boolean isResolved() {
return resolved;
}
public Object getResult() {
return result;
}
}
/**
@@ -273,6 +388,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
Factory factory = (Factory) OBJENESIS.newInstance(enhancer.createClass());
factory.setCallbacks(new Callback[] { interceptor });

View File

@@ -0,0 +1,520 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.geo.Box;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Wrapper class containing useful geo structure converters for usage with Mongo.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.5
*/
abstract class GeoConverters {
/**
* Private constructor to prevent instantiation.
*/
private GeoConverters() {}
/**
* Returns the geo converters to be registered.
*
* @return
*/
@SuppressWarnings("unchecked")
public static Collection<? extends Object> getConvertersToRegister() {
return Arrays.asList( //
BoxToDbObjectConverter.INSTANCE //
, PolygonToDbObjectConverter.INSTANCE //
, CircleToDbObjectConverter.INSTANCE //
, LegacyCircleToDbObjectConverter.INSTANCE //
, SphereToDbObjectConverter.INSTANCE //
, DbObjectToBoxConverter.INSTANCE //
, DbObjectToPolygonConverter.INSTANCE //
, DbObjectToCircleConverter.INSTANCE //
, DbObjectToLegacyCircleConverter.INSTANCE //
, DbObjectToSphereConverter.INSTANCE //
, DbObjectToPointConverter.INSTANCE //
, PointToDbObjectConverter.INSTANCE //
, GeoCommandToDbObjectConverter.INSTANCE);
}
/**
* Converts a {@link List} of {@link Double}s into a {@link Point}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Point convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"), (Double) source.get("y"));
}
}
/**
* Converts a {@link Point} into a {@link List} of {@link Double}s.
*
* @author Thomas Darimont
* @since 1.5
*/
public static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(Point source) {
return source == null ? null : new BasicDBObject("x", source.getX()).append("y", source.getY());
}
}
/**
* Converts a {@link Box} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
@WritingConverter
public static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(Box source) {
if (source == null) {
return null;
}
BasicDBObject result = new BasicDBObject();
result.put("first", PointToDbObjectConverter.INSTANCE.convert(source.getFirst()));
result.put("second", PointToDbObjectConverter.INSTANCE.convert(source.getSecond()));
return result;
}
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public Box convert(DBObject source) {
if (source == null) {
return null;
}
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
return new org.springframework.data.mongodb.core.geo.Box(first, second);
}
}
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
public static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius().getNormalizedValue());
result.put("metric", source.getRadius().getMetric().toString());
return result;
}
}
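For illustration, a Circle with a metric radius would roughly be rendered like this (the radius is stored as the normalized value, i.e. in radians):

Circle circle = new Circle(new Point(1.0, 2.0), new Distance(5.0, Metrics.KILOMETERS));
DBObject dbObject = CircleToDbObjectConverter.INSTANCE.convert(circle);
// { "center" : { "x" : 1.0 , "y" : 2.0 } , "radius" : <5 km normalized to radians> , "metric" : "KILOMETERS" }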
/**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject center = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(center, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Distance distance = new Distance(radius);
if (source.containsField("metric")) {
String metricString = (String) source.get("metric");
Assert.notNull(metricString, "Metric must not be null!");
distance = distance.in(Metrics.valueOf(metricString));
}
return new Circle(DbObjectToPointConverter.INSTANCE.convert(center), distance);
}
}
/**
* Converts a legacy {@link org.springframework.data.mongodb.core.geo.Circle} into a {@link DBObject}.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public static enum LegacyCircleToDbObjectConverter implements
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius());
return result;
}
}
/**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
@SuppressWarnings("deprecation")
public static enum DbObjectToLegacyCircleConverter implements
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject centerSource = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(centerSource, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
}
}
/**
* Converts a {@link Sphere} into a {@link DBObject}.
*
* @author Thomas Darimont
* @since 1.5
*/
public static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(Sphere source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius().getNormalizedValue());
result.put("metric", source.getRadius().getMetric().toString());
return result;
}
}
/**
* Converts a {@link DBObject} into a {@link Sphere}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public Sphere convert(DBObject source) {
if (source == null) {
return null;
}
DBObject center = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(center, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Distance distance = new Distance(radius);
if (source.containsField("metric")) {
String metricString = (String) source.get("metric");
Assert.notNull(metricString, "Metric must not be null!");
distance = distance.in(Metrics.valueOf(metricString));
}
return new Sphere(DbObjectToPointConverter.INSTANCE.convert(center), distance);
}
}
/**
* Converts a {@link Polygon} into a {@link DBObject}.
*
* @author Thomas Darimont
* @since 1.5
*/
public static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(Polygon source) {
if (source == null) {
return null;
}
List<Point> points = source.getPoints();
List<DBObject> pointTuples = new ArrayList<DBObject>(points.size());
for (Point point : points) {
pointTuples.add(PointToDbObjectConverter.INSTANCE.convert(point));
}
DBObject result = new BasicDBObject();
result.put("points", pointTuples);
return result;
}
}
/**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Polygon}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
public static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings({ "deprecation", "unchecked" })
public Polygon convert(DBObject source) {
if (source == null) {
return null;
}
List<DBObject> points = (List<DBObject>) source.get("points");
List<Point> newPoints = new ArrayList<Point>(points.size());
for (DBObject element : points) {
Assert.notNull(element, "Point elements of polygon must not be null!");
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
}
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints);
}
}
/**
* Converts a {@link GeoCommand} into a {@link DBObject} suitable for use in a geo query.
*
* @author Thomas Darimont
* @since 1.5
*/
public static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("deprecation")
public DBObject convert(GeoCommand source) {
if (source == null) {
return null;
}
BasicDBList argument = new BasicDBList();
Shape shape = source.getShape();
if (shape instanceof Box) {
argument.add(toList(((Box) shape).getFirst()));
argument.add(toList(((Box) shape).getSecond()));
} else if (shape instanceof Circle) {
argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius().getNormalizedValue());
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter()));
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius());
} else if (shape instanceof Polygon) {
for (Point point : ((Polygon) shape).getPoints()) {
argument.add(toList(point));
}
} else if (shape instanceof Sphere) {
argument.add(toList(((Sphere) shape).getCenter()));
argument.add(((Sphere) shape).getRadius().getNormalizedValue());
}
return new BasicDBObject(source.getCommand(), argument);
}
}
static List<Double> toList(Point point) {
return Arrays.asList(point.getX(), point.getY());
}
}
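A brief usage sketch, not part of the commit, of what the shape converters above produce. It assumes the new Spring Data Commons geo types (org.springframework.data.geo.Point, Distance, Circle) and the converter enums from this file; the variable names are illustrative only.
Circle circle = new Circle(new Point(1.5, 2.5), new Distance(5));
DBObject dbo = CircleToDbObjectConverter.INSTANCE.convert(circle);
// dbo now looks roughly like { "center" : { "x" : 1.5, "y" : 2.5 }, "radius" : 5.0, "metric" : "NEUTRAL" }
Circle roundTripped = DbObjectToCircleConverter.INSTANCE.convert(dbo); // restores the metric via Metrics.valueOf(...)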

View File

@@ -0,0 +1,45 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;
import com.mongodb.DBRef;
/**
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @since 1.5
*/
public interface LazyLoadingProxy {
/**
* Initializes the proxy and returns the wrapped value.
*
* @return the initialized target object backing this proxy.
* @since 1.5
*/
Object initialize();
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}; may be {@literal null}.
*
* @return the {@link DBRef} backing this proxy or {@literal null} if none is available.
* @since 1.5
*/
DBRef toDBRef();
}
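A hedged sketch, not part of the commit, of how client code might interact with the interface; person and getFriend() are hypothetical names used only to demonstrate the two methods.
Object friend = person.getFriend(); // property mapped with @DBRef(lazy = true)
if (friend instanceof LazyLoadingProxy) {
LazyLoadingProxy proxy = (LazyLoadingProxy) friend;
DBRef ref = proxy.toDBRef(); // cached reference, no database round trip
Object target = proxy.initialize(); // forces resolution of the referenced document
}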

View File

@@ -29,10 +29,10 @@ import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
@@ -71,6 +71,7 @@ import com.mongodb.DBRef;
* @author Jon Brisbin
* @author Patrik Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
@@ -81,7 +82,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext;
protected boolean useFieldAccessOnly = true;
protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null;
@@ -165,17 +165,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return mappingContext;
}
/**
* Configures whether to use field access only for entity mapping. Setting this to true will force the
* {@link MongoConverter} to not go through getters or setters even if they are present for getting and setting
* property values.
*
* @param useFieldAccessOnly
*/
public void setUseFieldAccessOnly(boolean useFieldAccessOnly) {
this.useFieldAccessOnly = useFieldAccessOnly;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
@@ -253,7 +242,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider);
final BeanWrapper<MongoPersistentEntity<S>, S> wrapper = BeanWrapper.create(instance, conversionService);
final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
final S result = wrapper.getBean();
// Set properties not already set in the constructor
@@ -265,7 +254,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
Object obj = getValueInternal(prop, dbo, evaluator, result);
wrapper.setProperty(prop, obj, useFieldAccessOnly);
wrapper.setProperty(prop, obj);
}
});
@@ -273,9 +262,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
MongoPersistentProperty property = association.getInverse();
Object obj = dbRefResolver.resolveDbRef(inverseProp, new DbRefResolverCallback() {
Object value = dbo.get(property.getName());
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
@Override
public Object resolve(MongoPersistentProperty property) {
@@ -283,7 +274,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
});
wrapper.setProperty(inverseProp, obj);
wrapper.setProperty(property, obj);
}
});
@@ -303,7 +294,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.isTrue(annotation != null, "The referenced property has to be mapped with @DBRef!");
}
return createDBRef(object, annotation);
return createDBRef(object, referingProperty);
}
/**
@@ -374,15 +365,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No mapping metadata found for entity of type " + obj.getClass().getName());
}
final BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(obj, conversionService);
final BeanWrapper<Object> wrapper = BeanWrapper.create(obj, conversionService);
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
boolean fieldAccessOnly = idProperty.usePropertyAccess() ? false : useFieldAccessOnly;
try {
Object id = wrapper.getProperty(idProperty, Object.class, fieldAccessOnly);
Object id = wrapper.getProperty(idProperty, Object.class);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {}
}
@@ -395,11 +384,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
boolean fieldAccessOnly = prop.usePropertyAccess() ? false : useFieldAccessOnly;
Object propertyObj = wrapper.getProperty(prop, prop.getType(), fieldAccessOnly);
Object propertyObj = wrapper.getProperty(prop);
if (null != propertyObj) {
if (!conversions.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(propertyObj, dbo, prop);
} else {
@@ -413,7 +401,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Class<?> type = inverseProp.getType();
Object propertyObj = wrapper.getProperty(inverseProp, type, useFieldAccessOnly);
Object propertyObj = wrapper.getProperty(inverseProp, type);
if (null != propertyObj) {
writePropertyInternal(propertyObj, dbo, inverseProp);
}
@@ -446,13 +434,32 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (prop.isDbReference()) {
DBRef dbRefObj = createDBRef(obj, prop.getDBRef());
DBRef dbRefObj = null;
/*
* If we already have a LazyLoadingProxy, we use its cached DBRef value instead of
* unnecessarily initializing it only to convert it to a DBRef a few instructions later.
*/
if (obj instanceof LazyLoadingProxy) {
dbRefObj = ((LazyLoadingProxy) obj).toDBRef();
}
dbRefObj = dbRefObj != null ? dbRefObj : createDBRef(obj, prop);
if (null != dbRefObj) {
accessor.put(prop, dbRefObj);
return;
}
}
/*
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).initialize();
}
// Lookup potential custom target type
Class<?> basicTargetType = conversions.getCustomWriteTarget(obj.getClass(), null);
@@ -515,7 +522,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
continue;
}
DBRef dbRef = createDBRef(element, property.getDBRef());
DBRef dbRef = createDBRef(element, property);
dbList.add(dbRef);
}
@@ -548,7 +555,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.isSimpleType(key.getClass())) {
String simpleKey = potentiallyEscapeMapKey(key.toString());
dbObject.put(simpleKey, value != null ? createDBRef(value, property.getDBRef()) : null);
dbObject.put(simpleKey, value != null ? createDBRef(value, property) : null);
} else {
throw new MappingException("Cannot use a complex object as a key value.");
@@ -741,7 +748,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
}
protected DBRef createDBRef(Object target, org.springframework.data.mongodb.core.mapping.DBRef dbref) {
protected DBRef createDBRef(Object target, MongoPersistentProperty property) {
Assert.notNull(target);
@@ -750,6 +757,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
MongoPersistentEntity<?> targetEntity = mappingContext.getPersistentEntity(target.getClass());
targetEntity = targetEntity == null ? mappingContext.getPersistentEntity(property) : targetEntity;
if (null == targetEntity) {
throw new MappingException("No mapping metadata found for " + target.getClass());
@@ -761,14 +769,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("No id property found on class " + targetEntity.getType());
}
BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(target, conversionService);
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
Object id = null;
if (target.getClass().equals(idProperty.getType())) {
id = target;
} else {
BeanWrapper<Object> wrapper = BeanWrapper.create(target, conversionService);
id = wrapper.getProperty(idProperty, Object.class);
}
if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
return dbRefResolver.createDbRef(dbref, targetEntity, idMapper.convertId(id));
return dbRefResolver.createDbRef(property == null ? null : property.getDBRef(), targetEntity,
idMapper.convertId(id));
}
protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
@@ -785,7 +800,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param sourceValue must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
Assert.notNull(targetType);
@@ -796,13 +810,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleRead(new HashSet<Object>(), collectionType);
}
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, sourceValue.size());
TypeInformation<?> componentType = targetType.getComponentType();
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
@@ -833,7 +847,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(dbObject);
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
Map<Object, Object> map = CollectionFactory.createMap(mapType, dbObject.keySet().size());
TypeInformation<?> keyType = type.getComponentType();
Class<?> rawKeyType = keyType == null ? null : keyType.getType();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();
Map<Object, Object> map = CollectionFactory.createMap(mapType, rawKeyType, dbObject.keySet().size());
Map<String, Object> sourceMap = dbObject.toMap();
for (Entry<String, Object> entry : sourceMap.entrySet()) {
@@ -843,15 +864,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object key = potentiallyUnescapeMapKey(entry.getKey());
TypeInformation<?> keyTypeInformation = type.getComponentType();
if (keyTypeInformation != null) {
Class<?> keyType = keyTypeInformation.getType();
key = conversionService.convert(key, keyType);
if (rawKeyType != null) {
key = conversionService.convert(key, rawKeyType);
}
Object value = entry.getValue();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
@@ -902,15 +919,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? null : ClassTypeInformation.OBJECT;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj);
return maybeConvertList((BasicDBList) obj, typeHint);
}
if (obj instanceof DBObject) {
DBObject newValueDbo = new BasicDBObject();
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o));
newValueDbo.put(vk, convertToMongoType(o, typeHint));
}
return newValueDbo;
}
@@ -918,17 +937,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (obj instanceof Map) {
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue()));
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
}
return result;
}
if (obj.getClass().isArray()) {
return maybeConvertList(Arrays.asList((Object[]) obj));
return maybeConvertList(Arrays.asList((Object[]) obj), typeHint);
}
if (obj instanceof Collection) {
return maybeConvertList((Collection<?>) obj);
return maybeConvertList((Collection<?>) obj, typeHint);
}
DBObject newDbo = new BasicDBObject();
@@ -941,11 +960,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
}
public BasicDBList maybeConvertList(Iterable<?> source) {
public BasicDBList maybeConvertList(Iterable<?> source, TypeInformation<?> typeInformation) {
BasicDBList newDbl = new BasicDBList();
for (Object element : source) {
newDbl.add(convertToMongoType(element));
newDbl.add(convertToMongoType(element, typeInformation));
}
return newDbl;
}
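To illustrate the LazyLoadingProxy short-circuit added earlier in this file's changes (a sketch under assumed names; Person, Account, mongoTemplate and personId are hypothetical): re-saving an entity whose lazy @DBRef property was never accessed reuses the cached DBRef instead of resolving the proxy.
class Person {
String id;
@DBRef(lazy = true) Account account; // resolved lazily through a LazyLoadingProxy
}
Person person = mongoTemplate.findById(personId, Person.class);
mongoTemplate.save(person); // person.account is still a proxy; its toDBRef() value is written as-is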

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,15 +33,14 @@ import com.mongodb.DBObject;
* Wrapper class to contain useful converters for the usage with Mongo.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
abstract class MongoConverters {
/**
* Private constructor to prevent instantiation.
*/
private MongoConverters() {
}
private MongoConverters() {}
/**
* Simple singleton to convert {@link ObjectId}s to their {@link String} representation.

View File

@@ -17,18 +17,23 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.Map.Entry;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
@@ -47,6 +52,7 @@ import com.mongodb.DBRef;
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class QueryMapper {
@@ -105,21 +111,36 @@ public class QueryMapper {
}
Field field = createPropertyField(entity, key, mappingContext);
Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
Object rawValue = query.get(key);
String newKey = field.getMappedKey();
if (isNestedKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
result.put(newKey, getMappedValue(field, rawValue));
}
result.put(entry.getKey(), entry.getValue());
}
return result;
}
/**
* Extracts the mapped object value for the given field out of rawValue, taking nested {@link Keyword}s into account.
*
* @param field
* @param rawValue
* @return
*/
protected Entry<String, Object> getMappedObjectForField(Field field, Object rawValue) {
String key = field.getMappedKey();
Object value;
if (isNestedKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
value = getMappedKeyword(field, keyword);
} else {
value = getMappedValue(field, rawValue);
}
return createMapEntry(key, value);
}
/**
* @param entity
* @param key
@@ -138,7 +159,7 @@ public class QueryMapper {
* @param entity
* @return
*/
private DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
protected DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
// $or/$nor
if (keyword.isOrOrNor() || keyword.hasIterableValue()) {
@@ -147,7 +168,7 @@ public class QueryMapper {
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(condition instanceof DBObject ? getMappedObject((DBObject) condition, entity)
newConditions.add(isDBObject(condition) ? getMappedObject((DBObject) condition, entity)
: convertSimpleOrDBObject(condition, entity));
}
@@ -164,13 +185,13 @@ public class QueryMapper {
* @param keyword
* @return
*/
private DBObject getMappedKeyword(Field property, Keyword keyword) {
protected DBObject getMappedKeyword(Field property, Keyword keyword) {
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = keyword.getValue();
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property.getProperty())
: getMappedValue(property.with(keyword.getKey()), value);
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property) : getMappedValue(
property.with(keyword.getKey()), value);
return new BasicDBObject(keyword.key, convertedValue);
}
@@ -184,11 +205,11 @@ public class QueryMapper {
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Field documentField, Object value) {
protected Object getMappedValue(Field documentField, Object value) {
if (documentField.isIdField()) {
if (value instanceof DBObject) {
if (isDBObject(value)) {
DBObject valueDbo = (DBObject) value;
DBObject resultDbo = new BasicDBObject(valueDbo.toMap());
@@ -217,7 +238,7 @@ public class QueryMapper {
}
if (isAssociationConversionNecessary(documentField, value)) {
return convertAssociation(value, documentField.getProperty());
return convertAssociation(value, documentField);
}
return convertSimpleOrDBObject(value, documentField.getPropertyEntity());
@@ -233,9 +254,10 @@ public class QueryMapper {
* @param value
* @return
*/
private boolean isAssociationConversionNecessary(Field documentField, Object value) {
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
return documentField.isAssociation() && value != null
&& documentField.getProperty().getActualType().isAssignableFrom(value.getClass());
&& (documentField.getProperty().getActualType().isAssignableFrom(value.getClass()) //
|| documentField.getPropertyEntity().getIdProperty().getActualType().isAssignableFrom(value.getClass()));
}
/**
@@ -245,13 +267,13 @@ public class QueryMapper {
* @param entity
* @return
*/
private Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
protected Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
if (source instanceof BasicDBList) {
return delegateConvertToMongoType(source, entity);
}
if (source instanceof DBObject) {
if (isDBObject(source)) {
return getMappedObject((DBObject) source, entity);
}
@@ -270,6 +292,10 @@ public class QueryMapper {
return converter.convertToMongoType(source);
}
protected Object convertAssociation(Object source, Field field) {
return convertAssociation(source, field.getProperty());
}
/**
* Converts the given source assuming it's actually an association to another object.
*
@@ -277,17 +303,16 @@ public class QueryMapper {
* @param property
* @return
*/
private Object convertAssociation(Object source, MongoPersistentProperty property) {
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || !property.isAssociation() || source == null || source instanceof DBRef
|| !property.isEntity()) {
if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
return source;
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
result.add(element instanceof DBRef ? element : converter.toDBRef(element, property));
result.add(createDbRefFor(element, property));
}
return result;
}
@@ -296,12 +321,54 @@ public class QueryMapper {
BasicDBObject result = new BasicDBObject();
DBObject dbObject = (DBObject) source;
for (String key : dbObject.keySet()) {
Object o = dbObject.get(key);
result.put(key, o instanceof DBRef ? o : converter.toDBRef(o, property));
result.put(key, createDbRefFor(dbObject.get(key), property));
}
return result;
}
return createDbRefFor(source, property);
}
/**
* Checks whether the given value is a {@link DBObject}.
*
* @param value can be {@literal null}.
* @return
*/
protected final boolean isDBObject(Object value) {
return value instanceof DBObject;
}
/**
* Creates a new {@link Entry} for the given {@link Field} with the given value.
*
* @param field must not be {@literal null}.
* @param value can be {@literal null}.
* @return
*/
protected final Entry<String, Object> createMapEntry(Field field, Object value) {
return createMapEntry(field.getMappedKey(), value);
}
/**
* Creates a new {@link Entry} with the given key and value.
*
* @param key must not be {@literal null} or empty.
* @param value can be {@literal null}
* @return
*/
private Entry<String, Object> createMapEntry(String key, Object value) {
Assert.hasText(key, "Key must not be null or empty!");
return Collections.singletonMap(key, value).entrySet().iterator().next();
}
private DBRef createDbRefFor(Object source, MongoPersistentProperty property) {
if (source instanceof DBRef) {
return (DBRef) source;
}
return converter.toDBRef(source, property);
}
@@ -360,7 +427,7 @@ public class QueryMapper {
*
* @author Oliver Gierke
*/
private static class Keyword {
static class Keyword {
private static final String N_OR_PATTERN = "\\$.*or";
@@ -450,7 +517,9 @@ public class QueryMapper {
}
/**
* Returns the underlying {@link MongoPersistentProperty} backing the field.
* Returns the underlying {@link MongoPersistentProperty} backing the field. For path traversals this will be the
* property that represents the value to handle. This means it'll be the leaf property for plain paths or the
* association property in case we refer to an association somewhere in the path.
*
* @return
*/
@@ -484,6 +553,19 @@ public class QueryMapper {
public String getMappedKey() {
return isIdField() ? ID_KEY : name;
}
/**
* Returns whether the field references an association in case it refers to a nested field.
*
* @return
*/
public boolean containsAssociation() {
return false;
}
public Association<MongoPersistentProperty> getAssociation() {
return null;
}
}
/**
@@ -494,10 +576,13 @@ public class QueryMapper {
*/
protected static class MetadataBackedField extends Field {
private static final String INVALID_ASSOCIATION_REFERENCE = "Invalid path reference %s! Associations can only be pointed to directly or via their id property!";
private final MongoPersistentEntity<?> entity;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoPersistentProperty property;
private final PersistentPropertyPath<MongoPersistentProperty> path;
private final Association<MongoPersistentProperty> association;
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
@@ -519,6 +604,7 @@ public class QueryMapper {
this.path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
this.association = findAssociation();
}
/*
@@ -552,7 +638,7 @@ public class QueryMapper {
*/
@Override
public MongoPersistentProperty getProperty() {
return property;
return association == null ? property : association.getInverse();
}
/*
@@ -571,9 +657,34 @@ public class QueryMapper {
*/
@Override
public boolean isAssociation() {
return association != null;
}
MongoPersistentProperty property = getProperty();
return property == null ? false : property.isAssociation();
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getAssociation()
*/
@Override
public Association<MongoPersistentProperty> getAssociation() {
return association;
}
/**
* Finds the association property in the {@link PersistentPropertyPath}.
*
* @return
*/
private final Association<MongoPersistentProperty> findAssociation() {
if (this.path != null) {
for (MongoPersistentProperty p : this.path) {
if (p.isAssociation()) {
return p.getAssociation();
}
}
}
return null;
}
/*
@@ -585,6 +696,10 @@ public class QueryMapper {
return path == null ? name : path.toDotPath(getPropertyConverter());
}
protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
return path;
}
/**
* Returns the {@link PersistentPropertyPath} for the given <code>pathExpression</code>.
*
@@ -594,8 +709,28 @@ public class QueryMapper {
private PersistentPropertyPath<MongoPersistentProperty> getPath(String pathExpression) {
try {
PropertyPath path = PropertyPath.from(pathExpression, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
boolean associationDetected = false;
while (iterator.hasNext()) {
MongoPersistentProperty property = iterator.next();
if (property.isAssociation()) {
associationDetected = true;
continue;
}
if (associationDetected && !property.isIdProperty()) {
throw new MappingException(String.format(INVALID_ASSOCIATION_REFERENCE, pathExpression));
}
}
return propertyPath;
} catch (PropertyReferenceException e) {
return null;
}
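A hedged example of what the association handling above accepts and rejects (Account, User and the field names are hypothetical): an association can be matched directly or via its id property, while any other field behind the @DBRef trips the INVALID_ASSOCIATION_REFERENCE check when the query is mapped.
// Account declares: @DBRef User user;
Query byReference = query(where("user").is(user)); // the value is rendered as a DBRef via convertAssociation(...)
Query byId = query(where("user.id").is(userId)); // id segments behind the association remain valid
Query invalid = query(where("user.name").is("Dave")); // mapping this query throws a MappingException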

View File

@@ -17,23 +17,32 @@ package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.Iterator;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* A subclass of {@link QueryMapper} that retains type information on the mongo types.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class UpdateMapper extends QueryMapper {
private final MongoWriter<?> converter;
private final MongoConverter converter;
/**
* Creates a new {@link UpdateMapper} using the given {@link MongoConverter}.
@@ -59,6 +68,63 @@ public class UpdateMapper extends QueryMapper {
entity.getTypeInformation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper#getMappedObjectForField(org.springframework.data.mongodb.core.convert.QueryMapper.Field, java.lang.Object)
*/
@Override
protected Entry<String, Object> getMappedObjectForField(Field field, Object rawValue) {
if (isDBObject(rawValue)) {
return createMapEntry(field, convertSimpleOrDBObject(rawValue, field.getPropertyEntity()));
}
if (!isUpdateModifier(rawValue)) {
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
}
Object value = null;
if (rawValue instanceof Modifier) {
value = getMappedValue((Modifier) rawValue);
} else if (rawValue instanceof Modifiers) {
DBObject modificationOperations = new BasicDBObject();
for (Modifier modifier : ((Modifiers) rawValue).getModifiers()) {
modificationOperations.putAll(getMappedValue(modifier).toMap());
}
value = modificationOperations;
} else {
throw new IllegalArgumentException(String.format("Unable to map value of type '%s'!", rawValue.getClass()));
}
return createMapEntry(field, value);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper#isAssociationConversionNecessary(org.springframework.data.mongodb.core.convert.QueryMapper.Field, java.lang.Object)
*/
@Override
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
return super.isAssociationConversionNecessary(documentField, value) || documentField.containsAssociation();
}
private boolean isUpdateModifier(Object value) {
return value instanceof Modifier || value instanceof Modifiers;
}
private DBObject getMappedValue(Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
return new BasicDBObject(modifier.getKey(), value);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper#createPropertyField(org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.String, org.springframework.data.mapping.context.MappingContext)
@@ -100,13 +166,62 @@ public class UpdateMapper extends QueryMapper {
this.key = key;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getMappedKey()
*/
@Override
public String getMappedKey() {
return this.getPath() == null ? key : super.getMappedKey();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getPropertyConverter()
*/
@Override
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return new UpdatePropertyConverter(key);
return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
}
/**
* Converter to skip all properties after an association property was rendered.
*
* @author Oliver Gierke
*/
private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
private final MongoPersistentProperty property;
private boolean associationFound;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
*
* @param association must not be {@literal null}.
*/
public AssociationConverter(Association<MongoPersistentProperty> association) {
Assert.notNull(association, "Association must not be null!");
this.property = association.getInverse();
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
if (associationFound) {
return null;
}
if (property.equals(source)) {
associationFound = true;
}
return source.getFieldName();
}
}
/**
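For illustration only, using the same hypothetical types as in the query example earlier: with these UpdateMapper changes, an update that assigns a referenced entity is rendered as a DBRef, and a key that reaches through the association is collapsed onto the association's field name by the AssociationConverter.
Update setReference = new Update().set("user", user); // the value is converted to a DBRef
Update setById = new Update().set("user.id", userId); // mapped onto the "user" field rather than a nested path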

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,44 +16,31 @@
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.util.Assert;
import org.springframework.data.geo.Point;
/**
* Represents a geospatial box value
* Represents a geospatial box value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Box implements Shape {
@Deprecated
public class Box extends org.springframework.data.geo.Box implements Shape {
@Field(order = 10)
private final Point first;
@Field(order = 20)
private final Point second;
public static final String COMMAND = "$box";
public Box(Point lowerLeft, Point upperRight) {
Assert.notNull(lowerLeft);
Assert.notNull(upperRight);
this.first = lowerLeft;
this.second = upperRight;
super(lowerLeft, upperRight);
}
public Box(double[] lowerLeft, double[] upperRight) {
Assert.isTrue(lowerLeft.length == 2, "Point array has to have 2 elements!");
Assert.isTrue(upperRight.length == 2, "Point array has to have 2 elements!");
this.first = new Point(lowerLeft[0], lowerLeft[1]);
this.second = new Point(upperRight[0], upperRight[1]);
}
public Point getLowerLeft() {
return first;
}
public Point getUpperRight() {
return second;
super(lowerLeft, upperRight);
}
/*
@@ -61,46 +48,28 @@ public class Box implements Shape {
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(getLowerLeft().asList());
list.add(getUpperRight().asList());
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
return list;
}
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
}
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return "$box";
}
@Override
public String toString() {
return String.format("Box [%s, %s]", first, second);
}
@Override
public int hashCode() {
int result = 31;
result += 17 * first.hashCode();
result += 17 * second.hashCode();
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (getClass() != obj.getClass()) {
return false;
}
Box that = (Box) obj;
return this.first.equals(that.first) && this.second.equals(that.second);
return COMMAND;
}
}
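As a small aid (values chosen arbitrarily, using the new org.springframework.data.geo.Point), the reworked legacy Box still renders its two corner points as the argument list for the $box command:
List<? extends Object> corners = new Box(new Point(0, 0), new Point(2, 2)).asList();
// corners is [ [0.0, 0.0], [2.0, 2.0] ], i.e. the arguments rendered for "$box"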

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,19 +16,32 @@
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value
* Represents a geospatial circle value.
* <p>
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing its distance concept
* would break the clients that use the old Circle API.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
* removed in the next major release.
*/
@Deprecated
public class Circle implements Shape {
public static final String COMMAND = "$center";
private final Point center;
private final double radius;
@@ -49,7 +62,8 @@ public class Circle implements Shape {
}
/**
* Creates a new {@link Circle} from the given coordinates and radius.
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
* {@link Metrics#NEUTRAL}.
*
* @param centerX
* @param centerY
@@ -82,9 +96,11 @@ public class Circle implements Shape {
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(getCenter().asList());
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
result.add(getRadius());
return result;
}
@@ -93,7 +109,7 @@ public class Circle implements Shape {
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return "$center";
return COMMAND;
}
/*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,11 +18,13 @@ package org.springframework.data.mongodb.core.geo;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class CustomMetric implements Metric {
private final double multiplier;
@Deprecated
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
/**
* Creates a custom {@link Metric} using the given multiplier.
@@ -30,14 +32,6 @@ public class CustomMetric implements Metric {
* @param multiplier
*/
public CustomMetric(double multiplier) {
this.multiplier = multiplier;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Metric#getMultiplier()
*/
public double getMultiplier() {
return multiplier;
super(multiplier);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,17 +15,19 @@
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.util.ObjectUtils;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* Value object to represent distances in a given metric.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Distance {
private final double value;
private final Metric metric;
@Deprecated
public class Distance extends org.springframework.data.geo.Distance {
/**
* Creates a new {@link Distance}.
@@ -36,110 +38,7 @@ public class Distance {
this(value, Metrics.NEUTRAL);
}
/**
* Creates a new {@link Distance} with the given {@link Metric}.
*
* @param value
* @param metric
*/
public Distance(double value, Metric metric) {
this.value = value;
this.metric = metric == null ? Metrics.NEUTRAL : metric;
}
/**
* @return the value
*/
public double getValue() {
return value;
}
/**
* Returns the normalized value regarding the underlying {@link Metric}.
*
* @return
*/
public double getNormalizedValue() {
return value / metric.getMultiplier();
}
/**
* @return the metric
*/
public Metric getMetric() {
return metric;
}
/**
* Adds the given distance to the current one. The resulting {@link Distance} will be in the same metric as the
* current one.
*
* @param other
* @return
*/
public Distance add(Distance other) {
double newNormalizedValue = getNormalizedValue() + other.getNormalizedValue();
return new Distance(newNormalizedValue * metric.getMultiplier(), metric);
}
/**
* Adds the given {@link Distance} to the current one and forces the result to be in a given {@link Metric}.
*
* @param other
* @param metric
* @return
*/
public Distance add(Distance other, Metric metric) {
double newLeft = getNormalizedValue() * metric.getMultiplier();
double newRight = other.getNormalizedValue() * metric.getMultiplier();
return new Distance(newLeft + newRight, metric);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Distance that = (Distance) obj;
return this.value == that.value && ObjectUtils.nullSafeEquals(this.metric, that.metric);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * Double.doubleToLongBits(value);
result += 31 * ObjectUtils.nullSafeHashCode(metric);
return result;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append(value);
if (metric != Metrics.NEUTRAL) {
builder.append(" ").append(metric.toString());
}
return builder.toString();
super(value, metric);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,19 +16,21 @@
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
/**
* Custom {@link Page} to carry the average distance retrieved from the {@link GeoResults} the {@link GeoPage} is set up
* from.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoPage}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class GeoPage<T> extends PageImpl<GeoResult<T>> {
@Deprecated
public class GeoPage<T> extends org.springframework.data.geo.GeoPage<T> {
private static final long serialVersionUID = 23421312312412L;
private final Distance averageDistance;
/**
* Creates a new {@link GeoPage} from the given {@link GeoResults}.
@@ -36,8 +38,7 @@ public class GeoPage<T> extends PageImpl<GeoResult<T>> {
* @param content must not be {@literal null}.
*/
public GeoPage(GeoResults<T> results) {
super(results.getContent());
this.averageDistance = results.getAverageDistance();
super(results);
}
/**
@@ -48,16 +49,6 @@ public class GeoPage<T> extends PageImpl<GeoResult<T>> {
* @param total
*/
public GeoPage(GeoResults<T> results, Pageable pageable, long total) {
super(results.getContent(), pageable, total);
this.averageDistance = results.getAverageDistance();
}
/**
* Returns the average distance of the underlying results.
*
* @return the averageDistance
*/
public Distance getAverageDistance() {
return averageDistance;
super(results, pageable, total);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,17 +15,16 @@
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.util.Assert;
/**
* Value object capturing some arbitrary object plus a distance.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResult}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class GeoResult<T> {
private final T content;
private final Distance distance;
@Deprecated
public class GeoResult<T> extends org.springframework.data.geo.GeoResult<T> {
/**
* Creates a new {@link GeoResult} for the given content and distance.
@@ -34,69 +33,6 @@ public class GeoResult<T> {
* @param distance must not be {@literal null}.
*/
public GeoResult(T content, Distance distance) {
Assert.notNull(content);
Assert.notNull(distance);
this.content = content;
this.distance = distance;
super(content, distance);
}
/**
* Returns the actual content object.
*
* @return the content
*/
public T getContent() {
return content;
}
/**
* Returns the distance the actual content object has from the origin.
*
* @return the distance
*/
public Distance getDistance() {
return distance;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
GeoResult<?> that = (GeoResult<?>) obj;
return this.content.equals(that.content) && this.distance.equals(that.distance);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * distance.hashCode();
result += 31 * content.hashCode();
return result;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("GeoResult [content: %s, distance: %s, ]", content.toString(), distance.toString());
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,23 +15,23 @@
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
/**
* Value object to capture {@link GeoResult}s as well as the average distance they have.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.GeoResults}. This class is scheduled
* to be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class GeoResults<T> implements Iterable<GeoResult<T>> {
private final List<GeoResult<T>> results;
private final Distance averageDistance;
@Deprecated
public class GeoResults<T> extends org.springframework.data.geo.GeoResults<T> {
/**
* Creates a new {@link GeoResults} instance manually calculating the average distance from the distance values of the
@@ -39,12 +39,12 @@ public class GeoResults<T> implements Iterable<GeoResult<T>> {
*
* @param results must not be {@literal null}.
*/
public GeoResults(List<GeoResult<T>> results) {
this(results, (Metric) null);
public GeoResults(List<? extends GeoResult<T>> results) {
super(results);
}
public GeoResults(List<GeoResult<T>> results, Metric metric) {
this(results, calculateAverageDistance(results, metric));
public GeoResults(List<? extends GeoResult<T>> results, Metric metric) {
super(results, metric);
}
/**
@@ -54,92 +54,7 @@ public class GeoResults<T> implements Iterable<GeoResult<T>> {
* @param averageDistance
*/
@PersistenceConstructor
public GeoResults(List<GeoResult<T>> results, Distance averageDistance) {
Assert.notNull(results);
this.results = results;
this.averageDistance = averageDistance;
}
/**
* Returns the average distance of all {@link GeoResult}s in this list.
*
* @return the averageDistance
*/
public Distance getAverageDistance() {
return averageDistance;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<GeoResult<T>> iterator() {
return results.iterator();
}
/**
* Returns the actual content of the {@link GeoResults}.
*
* @return
*/
public List<GeoResult<T>> getContent() {
return Collections.unmodifiableList(results);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
GeoResults<?> that = (GeoResults<?>) obj;
return this.results.equals(that.results) && this.averageDistance == that.averageDistance;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * results.hashCode();
result += 31 * averageDistance.hashCode();
return result;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("GeoResults: [averageDistance: %s, results: %s]", averageDistance.toString(),
StringUtils.collectionToCommaDelimitedString(results));
}
private static Distance calculateAverageDistance(List<? extends GeoResult<?>> results, Metric metric) {
if (results.isEmpty()) {
return new Distance(0, metric);
}
double averageDistance = 0;
for (GeoResult<?> result : results) {
averageDistance += result.getDistance().getValue();
}
return new Distance(averageDistance / results.size(), metric);
public GeoResults(List<? extends GeoResult<T>> results, Distance averageDistance) {
super(results, averageDistance);
}
}
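
A minimal sketch of how the deprecated GeoResults now simply delegates to the Spring Data Commons implementation; the String content and distance values are arbitrary:

GeoResult<String> first = new GeoResult<String>("berlin", new Distance(1.5, Metrics.KILOMETERS));
GeoResult<String> second = new GeoResult<String>("hamburg", new Distance(2.5, Metrics.KILOMETERS));
GeoResults<String> results = new GeoResults<String>(Arrays.asList(first, second), Metrics.KILOMETERS);
results.getAverageDistance(); // Distance of 2.0 KILOMETERS, calculated by the superclass

GeoResult, Distance and Metrics refer to the org.springframework.data.geo types imported above.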

View File

@@ -1,16 +1,27 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public interface Metric {
/**
* Returns the multiplier to calculate metrics values from a base scale.
*
* @return
*/
double getMultiplier();
}
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}

View File

@@ -1,3 +1,18 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -5,11 +20,17 @@ import org.springframework.data.mongodb.core.query.NearQuery;
/**
* Commonly used {@link Metrics} for {@link NearQuery}s.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metrics}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public enum Metrics implements Metric {
KILOMETERS(6378.137), MILES(3963.191), NEUTRAL(1);
KILOMETERS(org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier()), //
MILES(org.springframework.data.geo.Metrics.MILES.getMultiplier()), //
NEUTRAL(org.springframework.data.geo.Metrics.NEUTRAL.getMultiplier()); //
private final double multiplier;
@@ -24,4 +45,4 @@ public enum Metrics implements Metric {
public double getMultiplier() {
return multiplier;
}
}
}
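
A quick sketch of the delegation: the multipliers of the deprecated enum are now sourced from the Spring Data Commons enum, so both report the same value:

double legacy = org.springframework.data.mongodb.core.geo.Metrics.KILOMETERS.getMultiplier();
double commons = org.springframework.data.geo.Metrics.KILOMETERS.getMultiplier();
// legacy == commons - the earth-radius based multiplier formerly hard-coded as 6378.137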

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,85 +19,37 @@ import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.util.Assert;
/**
* Represents a geospatial point value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Point {
@Field(order = 10)
private final double x;
@Field(order = 20)
private final double y;
@Deprecated
public class Point extends org.springframework.data.geo.Point {
@PersistenceConstructor
public Point(double x, double y) {
this.x = x;
this.y = y;
super(x, y);
}
public Point(Point point) {
Assert.notNull(point);
this.x = point.x;
this.y = point.y;
}
public double getX() {
return x;
}
public double getY() {
return y;
public Point(org.springframework.data.geo.Point point) {
super(point);
}
public double[] asArray() {
return new double[] { x, y };
return new double[] { getX(), getY() };
}
public List<Double> asList() {
return Arrays.asList(x, y);
return asList(this);
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
long temp;
temp = Double.doubleToLongBits(x);
result = prime * result + (int) (temp ^ (temp >>> 32));
temp = Double.doubleToLongBits(y);
result = prime * result + (int) (temp ^ (temp >>> 32));
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (getClass() != obj.getClass()) {
return false;
}
Point other = (Point) obj;
if (Double.doubleToLongBits(x) != Double.doubleToLongBits(other.x)) {
return false;
}
if (Double.doubleToLongBits(y) != Double.doubleToLongBits(other.y)) {
return false;
}
return true;
}
@Override
public String toString() {
return String.format("Point [latitude=%f, longitude=%f]", x, y);
public static List<Double> asList(org.springframework.data.geo.Point point) {
return Arrays.asList(point.getX(), point.getY());
}
}
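
A minimal sketch of the convenience methods retained on the deprecated Point; the coordinates are arbitrary:

Point point = new Point(12.5, 52.1); // org.springframework.data.mongodb.core.geo.Point
double[] array = point.asArray();    // [12.5, 52.1]
List<Double> list = Point.asList(new org.springframework.data.geo.Point(12.5, 52.1)); // same tuple via the new static helper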

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,19 +17,22 @@ package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.data.geo.Point;
/**
* Simple value object to represent a {@link Polygon}.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Point}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class Polygon implements Shape, Iterable<Point> {
@Deprecated
public class Polygon extends org.springframework.data.geo.Polygon implements Shape {
private final List<Point> points;
public static final String COMMAND = "$polygon";
/**
* Creates a new {@link Polygon} for the given Points.
@@ -39,31 +42,17 @@ public class Polygon implements Shape, Iterable<Point> {
* @param z
* @param others
*/
public Polygon(Point x, Point y, Point z, Point... others) {
Assert.notNull(x);
Assert.notNull(y);
Assert.notNull(z);
Assert.notNull(others);
this.points = new ArrayList<Point>(3 + others.length);
this.points.addAll(Arrays.asList(x, y, z));
this.points.addAll(Arrays.asList(others));
public <P extends Point> Polygon(P x, P y, P z, P... others) {
super(x, y, z, others);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param points
*/
public List<List<Double>> asList() {
List<List<Double>> result = new ArrayList<List<Double>>();
for (Point point : points) {
result.add(point.asList());
}
return result;
public <P extends Point> Polygon(List<P> points) {
super(points);
}
/*
@@ -71,43 +60,33 @@ public class Polygon implements Shape, Iterable<Point> {
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return "$polygon";
return COMMAND;
}
/*
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<Point> iterator() {
return this.points.iterator();
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Polygon that = (Polygon) obj;
return this.points.equals(that.points);
public List<? extends Object> asList() {
return asList(this);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
/**
* Returns a {@link List} of x,y-coordinate tuples of {@link Point}s from the given {@link Polygon}.
*
* @param polygon
* @return
*/
@Override
public int hashCode() {
return points.hashCode();
public static List<? extends Object> asList(org.springframework.data.geo.Polygon polygon) {
List<Point> points = polygon.getPoints();
List<List<Double>> tuples = new ArrayList<List<Double>>(points.size());
for (Point point : points) {
tuples.add(Arrays.asList(point.getX(), point.getY()));
}
return tuples;
}
}
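
A small sketch of the new static asList(…) helper, assuming three arbitrary Spring Data Commons Points as vertices:

org.springframework.data.geo.Point a = new org.springframework.data.geo.Point(0, 0);
org.springframework.data.geo.Point b = new org.springframework.data.geo.Point(0, 5);
org.springframework.data.geo.Point c = new org.springframework.data.geo.Point(5, 0);
Polygon polygon = new Polygon(a, b, c);
polygon.asList();     // [[0.0, 0.0], [0.0, 5.0], [5.0, 0.0]]
polygon.getCommand(); // "$polygon"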

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,9 +20,13 @@ import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Shape}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
public interface Shape {
@Deprecated
public interface Shape extends org.springframework.data.geo.Shape {
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound

View File

@@ -0,0 +1,161 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial sphere value.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class Sphere implements Shape {
public static final String COMMAND = "$centerSphere";
private final Point center;
private final Distance radius;
/**
* Creates a Sphere around the given center {@link Point} with the given radius.
*
* @param center must not be {@literal null}.
* @param radius must not be {@literal null}.
*/
@PersistenceConstructor
public Sphere(Point center, Distance radius) {
Assert.notNull(center);
Assert.notNull(radius);
Assert.isTrue(radius.getValue() >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a Sphere around the given center {@link Point} with the given radius.
*
* @param center
* @param radius
*/
public Sphere(Point center, double radius) {
this(center, new Distance(radius));
}
/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
public Sphere(Circle circle) {
this(circle.getCenter(), circle.getRadius());
}
/**
* Creates a Sphere from the given {@link Circle}.
*
* @param circle
*/
@Deprecated
public Sphere(org.springframework.data.mongodb.core.geo.Circle circle) {
this(circle.getCenter(), circle.getRadius());
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public org.springframework.data.mongodb.core.geo.Point getCenter() {
return new org.springframework.data.mongodb.core.geo.Point(this.center);
}
/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public Distance getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Sphere [center=%s, radius=%s]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !(obj instanceof Sphere)) {
return false;
}
Sphere that = (Sphere) obj;
return this.center.equals(that.center) && this.radius.equals(that.radius);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius.hashCode();
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
@Override
public List<? extends Object> asList() {
return Arrays.asList(Arrays.asList(center.getX(), center.getY()), this.radius.getValue());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
@Override
public String getCommand() {
return COMMAND;
}
}
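
A short usage sketch of the new Sphere type; the center coordinates and radius are arbitrary:

Sphere sphere = new Sphere(new Point(51.4, 6.9), new Distance(0.5, Metrics.KILOMETERS));
sphere.getCommand(); // "$centerSphere"
sphere.asList();     // [[51.4, 6.9], 0.5] - ready to be used as a $within argument

Point, Distance and Metrics are the org.springframework.data.geo types imported above.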

View File

@@ -1,3 +1,18 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
/**
* Support for MongoDB geo-spatial queries.
*/

View File

@@ -51,10 +51,24 @@ public @interface CompoundIndex {
@Deprecated
IndexDirection direction() default IndexDirection.ASCENDING;
/**
* @see http://docs.mongodb.org/manual/core/index-unique/
* @return
*/
boolean unique() default false;
/**
* If set to true, the index will skip over any document that is missing the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-sparse/
* @return
*/
boolean sparse() default false;
/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @return
*/
boolean dropDups() default false;
/**

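A hypothetical entity using the newly documented CompoundIndex attributes; Person, lastName and age are made-up names:

@Document
@CompoundIndex(def = "{ 'lastName': 1, 'age': -1 }", unique = true, sparse = true)
public class Person {
    String lastName;
    int age;
}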
View File

@@ -41,8 +41,7 @@ public class Index implements IndexDefinition {
private boolean sparse = false;
public Index() {
}
public Index() {}
public Index(String key, Direction direction) {
fieldSpec.put(key, direction);
@@ -83,16 +82,33 @@ public class Index implements IndexDefinition {
return this;
}
/**
* Reject all documents that contain a duplicate value for the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-unique/
* @return
*/
public Index unique() {
this.unique = true;
return this;
}
/**
* Skip over any document that is missing the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-sparse/
* @return
*/
public Index sparse() {
this.sparse = true;
return this;
}
/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @param duplicates
* @return
*/
public Index unique(Duplicates duplicates) {
if (duplicates == Duplicates.DROP) {
this.dropDuplicates = true;

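A small sketch of the documented Index builder methods; mongoTemplate and the Person class are assumed to exist:

mongoTemplate.indexOps(Person.class)
    .ensureIndex(new Index("email", Sort.Direction.ASC).unique().sparse());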
View File

@@ -32,16 +32,42 @@ import java.lang.annotation.Target;
@Retention(RetentionPolicy.RUNTIME)
public @interface Indexed {
/**
* If set to true, rejects all documents that contain a duplicate value for the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-unique/
* @return
*/
boolean unique() default false;
IndexDirection direction() default IndexDirection.ASCENDING;
/**
* If set to true, the index will skip over any document that is missing the indexed field.
*
* @see http://docs.mongodb.org/manual/core/index-sparse/
* @return
*/
boolean sparse() default false;
/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @return
*/
boolean dropDups() default false;
/**
* Index name.
*
* @return
*/
String name() default "";
/**
* Collection name for the index to be created on.
*
* @return
*/
String collection() default "";
/**

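A hypothetical document class showing the annotation attributes discussed above; Account and the field names are made up:

@Document
public class Account {

    @Indexed(unique = true, sparse = true)
    String email;

    @Indexed(name = "account_number_idx", unique = true, dropDups = true)
    String accountNumber;
}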
View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.index;
import java.lang.reflect.Field;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
@@ -119,22 +118,20 @@ public class MongoPersistentEntityIndexCreator implements
}
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
public void doWithPersistentProperty(MongoPersistentProperty property) {
Field field = persistentProperty.getField();
if (property.isAnnotationPresent(Indexed.class)) {
if (field.isAnnotationPresent(Indexed.class)) {
Indexed index = field.getAnnotation(Indexed.class);
Indexed index = property.findAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = persistentProperty.getFieldName();
name = property.getFieldName();
} else {
if (!name.equals(field.getName()) && index.unique() && !index.sparse()) {
if (!name.equals(property.getName()) && index.unique() && !index.sparse()) {
// Names don't match, and sparse is not true. This situation will generate an error on the server.
if (LOGGER.isWarnEnabled()) {
LOGGER.warn("The index name " + name + " doesn't match this property name: " + field.getName()
LOGGER.warn("The index name " + name + " doesn't match this property name: " + property.getName()
+ ". Setting sparse=true on this index will prevent errors when inserting documents.");
}
}
@@ -142,7 +139,7 @@ public class MongoPersistentEntityIndexCreator implements
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
DBObject definition = new BasicDBObject(property.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
@@ -151,13 +148,13 @@ public class MongoPersistentEntityIndexCreator implements
LOGGER.debug("Created property index " + index);
}
} else if (field.isAnnotationPresent(GeoSpatialIndexed.class)) {
} else if (property.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = field.getAnnotation(GeoSpatialIndexed.class);
GeoSpatialIndexed index = property.findAnnotation(GeoSpatialIndexed.class);
GeospatialIndex indexObject = new GeospatialIndex(persistentProperty.getFieldName());
GeospatialIndex indexObject = new GeospatialIndex(property.getFieldName());
indexObject.withMin(index.min()).withMax(index.max());
indexObject.named(StringUtils.hasText(index.name()) ? index.name() : field.getName());
indexObject.named(StringUtils.hasText(index.name()) ? index.name() : property.getName());
indexObject.typed(index.type()).withBucketSize(index.bucketSize())
.withAdditionalField(index.additionalField());

View File

@@ -29,7 +29,6 @@ import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.model.AnnotationBasedPersistentProperty;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
@@ -50,17 +49,14 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
private static final Set<Class<?>> SUPPORTED_ID_TYPES = new HashSet<Class<?>>();
private static final Set<String> SUPPORTED_ID_PROPERTY_NAMES = new HashSet<String>();
private static final Field CAUSE_FIELD;
static {
SUPPORTED_ID_TYPES.add(ObjectId.class);
SUPPORTED_ID_TYPES.add(String.class);
SUPPORTED_ID_TYPES.add(BigInteger.class);
SUPPORTED_ID_PROPERTY_NAMES.add("id");
SUPPORTED_ID_PROPERTY_NAMES.add("_id");
CAUSE_FIELD = ReflectionUtils.findField(Throwable.class, "cause");
}
private final FieldNamingStrategy fieldNamingStrategy;
@@ -86,14 +82,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
}
}
/* (non-Javadoc)
* @see org.springframework.data.mapping.FooBasicPersistentProperty#isAssociation()
*/
@Override
public boolean isAssociation() {
return field.isAnnotationPresent(DBRef.class) || super.isAssociation();
}
/**
* Also considers fields as id that are of supported id type and name.
*
@@ -108,7 +96,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
}
// We need to support a wider range of ID types than just the ones that can be converted to an ObjectId
return SUPPORTED_ID_PROPERTY_NAMES.contains(field.getName());
return SUPPORTED_ID_PROPERTY_NAMES.contains(getName());
}
/*
@@ -163,8 +151,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getFieldOrder()
*/
public int getFieldOrder() {
org.springframework.data.mongodb.core.mapping.Field annotation = getField().getAnnotation(
org.springframework.data.mongodb.core.mapping.Field.class);
org.springframework.data.mongodb.core.mapping.Field annotation = findAnnotation(org.springframework.data.mongodb.core.mapping.Field.class);
return annotation != null ? annotation.order() : Integer.MAX_VALUE;
}
@@ -182,7 +169,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#isDbReference()
*/
public boolean isDbReference() {
return getField().isAnnotationPresent(DBRef.class);
return isAnnotationPresent(DBRef.class);
}
/*
@@ -190,14 +177,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getDBRef()
*/
public DBRef getDBRef() {
return getField().getAnnotation(DBRef.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#usePropertyAccess()
*/
public boolean usePropertyAccess() {
return CAUSE_FIELD.equals(getField());
return findAnnotation(DBRef.class);
}
}

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.mapping.event;
import org.springframework.beans.factory.ObjectFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.auditing.AuditingHandler;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
@@ -25,20 +26,22 @@ import org.springframework.util.Assert;
* Event listener to populate auditing related fields on an entity about to be saved.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class AuditingEventListener implements ApplicationListener<BeforeConvertEvent<Object>> {
private final IsNewAwareAuditingHandler auditingHandler;
private final ObjectFactory<IsNewAwareAuditingHandler> auditingHandlerFactory;
/**
* Creates a new {@link AuditingEventListener} using the given {@link MappingContext} and {@link AuditingHandler}.
* Creates a new {@link AuditingEventListener} using the given {@link MappingContext} and {@link AuditingHandler}
* provided by the given {@link ObjectFactory}.
*
* @param auditingHandler must not be {@literal null}.
* @param auditingHandlerFactory must not be {@literal null}.
*/
public AuditingEventListener(IsNewAwareAuditingHandler auditingHandler) {
public AuditingEventListener(ObjectFactory<IsNewAwareAuditingHandler> auditingHandlerFactory) {
Assert.notNull(auditingHandler, "IsNewAwareAuditingHandler must not be null!");
this.auditingHandler = auditingHandler;
Assert.notNull(auditingHandlerFactory, "IsNewAwareAuditingHandler must not be null!");
this.auditingHandlerFactory = auditingHandlerFactory;
}
/*
@@ -48,6 +51,6 @@ public class AuditingEventListener implements ApplicationListener<BeforeConvertE
public void onApplicationEvent(BeforeConvertEvent<Object> event) {
Object entity = event.getSource();
auditingHandler.markAudited(entity);
auditingHandlerFactory.getObject().markAudited(entity);
}
}
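
A minimal sketch of wiring the listener through the new ObjectFactory-based constructor; auditingHandler is an IsNewAwareAuditingHandler assumed to be configured elsewhere:

AuditingEventListener listener = new AuditingEventListener(new ObjectFactory<IsNewAwareAuditingHandler>() {
    public IsNewAwareAuditingHandler getObject() {
        return auditingHandler; // assumed pre-configured handler, resolved lazily on each event
    }
});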

View File

@@ -25,10 +25,11 @@ import java.util.List;
import java.util.regex.Pattern;
import org.bson.BSON;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -43,6 +44,7 @@ import com.mongodb.DBObject;
* @author Thomas Risberg
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class Criteria implements CriteriaDefinition {
@@ -117,8 +119,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $ne operator
* Creates a criterion using the {@literal $ne} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/ne/
* @param o
* @return
*/
@@ -128,8 +131,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $lt operator
* Creates a criterion using the {@literal $lt} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/lt/
* @param o
* @return
*/
@@ -139,8 +143,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $lte operator
* Creates a criterion using the {@literal $lte} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/lte/
* @param o
* @return
*/
@@ -150,8 +155,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $gt operator
* Creates a criterion using the {@literal $gt} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/gt/
* @param o
* @return
*/
@@ -161,8 +167,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $gte operator
* Creates a criterion using the {@literal $gte} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/gte/
* @param o
* @return
*/
@@ -172,8 +179,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $in operator
* Creates a criterion using the {@literal $in} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/in/
* @param o the values to match against
* @return
*/
@@ -187,8 +195,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $in operator
* Creates a criterion using the {@literal $in} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/in/
* @param c the collection containing the values to match against
* @return
*/
@@ -198,8 +207,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $nin operator
* Creates a criterion using the {@literal $nin} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/nin/
* @param o
* @return
*/
@@ -207,14 +217,22 @@ public class Criteria implements CriteriaDefinition {
return nin(Arrays.asList(o));
}
/**
* Creates a criterion using the {@literal $nin} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/nin/
* @param o
* @return
*/
public Criteria nin(Collection<?> o) {
criteria.put("$nin", o);
return this;
}
/**
* Creates a criterion using the $mod operator
* Creates a criterion using the {@literal $mod} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/mod/
* @param value
* @param remainder
* @return
@@ -228,8 +246,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $all operator
* Creates a criterion using the {@literal $all} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/all/
* @param o
* @return
*/
@@ -237,14 +256,22 @@ public class Criteria implements CriteriaDefinition {
return all(Arrays.asList(o));
}
/**
* Creates a criterion using the {@literal $all} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/all/
* @param o
* @return
*/
public Criteria all(Collection<?> o) {
criteria.put("$all", o);
return this;
}
/**
* Creates a criterion using the $size operator
* Creates a criterion using the {@literal $size} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/size/
* @param s
* @return
*/
@@ -254,8 +281,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $exists operator
* Creates a criterion using the {@literal $exists} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/exists/
* @param b
* @return
*/
@@ -265,8 +293,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $type operator
* Creates a criterion using the {@literal $type} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/type/
* @param t
* @return
*/
@@ -276,22 +305,31 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $not meta operator which affects the clause directly following
* Creates a criterion using the {@literal $not} meta operator which affects the clause directly following
*
* @see http://docs.mongodb.org/manual/reference/operator/query/not/
* @return
*/
public Criteria not() {
return not(null);
}
/**
* Creates a criterion using the {@literal $not} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/not/
* @param value
* @return
*/
private Criteria not(Object value) {
criteria.put("$not", value);
return this;
}
/**
* Creates a criterion using a $regex
* Creates a criterion using a {@literal $regex} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/regex/
* @param re
* @return
*/
@@ -300,8 +338,10 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using a $regex and $options
* Creates a criterion using a {@literal $regex} and {@literal $options} operator.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/regex/
* @see http://docs.mongodb.org/manual/reference/operator/query/regex/#op._S_options
* @param re
* @param options
* @return
@@ -334,51 +374,79 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a geospatial criterion using a $within $center operation. This is only available for Mongo 1.7 and higher.
* Creates a geospatial criterion using a {@literal $within $centerSphere} operation. This is only available for Mongo
* 1.7 and higher.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/geoWithin/
* @see http://docs.mongodb.org/manual/reference/operator/query/centerSphere/
* @param circle must not be {@literal null}
* @return
*/
public Criteria withinSphere(Circle circle) {
Assert.notNull(circle);
criteria.put("$within", new BasicDBObject("$centerSphere", circle.asList()));
return this;
}
public Criteria within(Shape shape) {
Assert.notNull(shape);
criteria.put("$within", new BasicDBObject(shape.getCommand(), shape.asList()));
criteria.put("$within", new GeoCommand(new Sphere(circle)));
return this;
}
/**
* Creates a geospatial criterion using a $near operation
* @see Criteria#withinSphere(Circle)
* @param circle
* @return
* @deprecated As of 1.5, Use {@link #withinSphere(Circle)}. This method is scheduled to be removed in the next major
* release.
*/
@Deprecated
public Criteria withinSphere(org.springframework.data.mongodb.core.geo.Circle circle) {
Assert.notNull(circle);
criteria.put("$within", new GeoCommand(new Sphere(circle)));
return this;
}
/**
* Creates a geospatial criterion using a {@literal $within} operation.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/geoWithin/
* @param shape
* @return
*/
public Criteria within(Shape shape) {
Assert.notNull(shape);
criteria.put("$within", new GeoCommand(shape));
return this;
}
/**
* Creates a geospatial criterion using a {@literal $near} operation.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/near/
* @param point must not be {@literal null}
* @return
*/
public Criteria near(Point point) {
Assert.notNull(point);
criteria.put("$near", point.asList());
criteria.put("$near", point);
return this;
}
/**
* Creates a geospatial criterion using a $nearSphere operation. This is only available for Mongo 1.7 and higher.
* Creates a geospatial criterion using a {@literal $nearSphere} operation. This is only available for Mongo 1.7 and
* higher.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/nearSphere/
* @param point must not be {@literal null}
* @return
*/
public Criteria nearSphere(Point point) {
Assert.notNull(point);
criteria.put("$nearSphere", point.asList());
criteria.put("$nearSphere", point);
return this;
}
/**
* Creates a geospatical criterion using a $maxDistance operation, for use with $near
* Creates a geospatial criterion using a {@literal $maxDistance} operation, for use with {@literal $near}.
*
* @see http://docs.mongodb.org/manual/reference/operator/query/maxDistance/
* @param maxDistance
* @return
*/
@@ -388,8 +456,9 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a criterion using the $elemMatch operator
* Creates a criterion using the {@literal $elemMatch} operator
*
* @see http://docs.mongodb.org/manual/reference/operator/query/elemMatch/
* @param c
* @return
*/

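A brief sketch of how the revised geo criteria are typically used; the field name "location" and the coordinates are arbitrary:

Circle circle = new Circle(new Point(-73.99, 40.73), new Distance(5, Metrics.MILES));
Query byArea = Query.query(Criteria.where("location").withinSphere(circle));
Query nearby = Query.query(Criteria.where("location").near(new Point(-73.99, 40.73)).maxDistance(0.01));

Circle, Point, Distance and Metrics are the org.springframework.data.geo types; withinSphere(…) is wrapped in the new GeoCommand for rendering, while near(…) now stores the Point itself.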
View File

@@ -0,0 +1,86 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import org.springframework.data.geo.Box;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.util.Assert;
/**
* Wrapper around a {@link Shape} to allow appropriate query rendering.
*
* @author Thomas Darimont
* @since 1.5
*/
public class GeoCommand {
private final Shape shape;
private final String command;
/**
* Creates a new {@link GeoCommand}.
*
* @param shape must not be {@literal null}.
*/
public GeoCommand(Shape shape) {
Assert.notNull(shape, "Shape must not be null!");
this.shape = shape;
this.command = getCommand(shape);
}
/**
* @return the shape
*/
public Shape getShape() {
return shape;
}
/**
* @return the command
*/
public String getCommand() {
return command;
}
/**
* Returns the MongoDB command for the given {@link Shape}.
*
* @param shape must not be {@literal null}.
* @return
*/
@SuppressWarnings("deprecation")
private String getCommand(Shape shape) {
Assert.notNull(shape, "Shape must not be null!");
if (shape instanceof Box) {
return org.springframework.data.mongodb.core.geo.Box.COMMAND;
} else if (shape instanceof Circle || shape instanceof org.springframework.data.mongodb.core.geo.Circle) {
return org.springframework.data.mongodb.core.geo.Circle.COMMAND;
} else if (shape instanceof Polygon) {
return org.springframework.data.mongodb.core.geo.Polygon.COMMAND;
} else if (shape instanceof Sphere) {
return org.springframework.data.mongodb.core.geo.Sphere.COMMAND;
}
throw new IllegalArgumentException("Unknown shape: " + shape);
}
}
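
A small sketch of the wrapper in isolation; the Box coordinates are arbitrary:

GeoCommand command = new GeoCommand(new Box(new Point(0, 0), new Point(5, 5)));
command.getCommand(); // "$box"
command.getShape();   // the wrapped Box, converted to its MongoDB representation later on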

View File

@@ -15,12 +15,14 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.geo.CustomMetric;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.geo.CustomMetric;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -31,6 +33,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public final class NearQuery {
@@ -143,10 +146,12 @@ public final class NearQuery {
/**
* Configures the {@link Pageable} to use.
*
* @param pageable
* @param pageable must not be {@literal null}
* @return
*/
public NearQuery with(Pageable pageable) {
Assert.notNull(pageable, "Pageable must not be 'null'.");
this.num = pageable.getOffset() + pageable.getPageSize();
this.skip = pageable.getOffset();
return this;
@@ -311,13 +316,18 @@ public final class NearQuery {
/**
* Adds an actual query to the {@link NearQuery} to restrict the objects considered for the actual near operation.
*
* @param query
* @param query must not be {@literal null}.
* @return
*/
public NearQuery query(Query query) {
Assert.notNull(query, "Cannot apply 'null' query on NearQuery.");
this.query = query;
this.skip = query.getSkip();
this.num = query.getLimit();
if (query.getLimit() != 0) {
this.num = query.getLimit();
}
return this;
}
@@ -333,6 +343,7 @@ public final class NearQuery {
*
* @return
*/
@SuppressWarnings("deprecation")
public DBObject toDBObject() {
BasicDBObject dbObject = new BasicDBObject();
@@ -353,7 +364,8 @@ public final class NearQuery {
dbObject.put("num", num);
}
dbObject.put("near", point.asList());
dbObject.put("near", Arrays.asList(point.getX(), point.getY()));
dbObject.put("spherical", spherical);
return dbObject;

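A short usage sketch reflecting the adjusted query(…) behaviour; the field names and coordinates are arbitrary:

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73), Metrics.MILES)
    .maxDistance(new Distance(10, Metrics.MILES))
    .query(Query.query(Criteria.where("category").is("restaurant")));
// a query without an explicit limit no longer resets 'num'; an explicit limit(…) still takes precedence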
View File

@@ -99,11 +99,23 @@ public class Query {
return this.fieldSpec;
}
/**
* Set number of documents to skip before returning results.
*
* @param skip
* @return
*/
public Query skip(int skip) {
this.skip = skip;
return this;
}
/**
* Limit the number of returned documents to {@code limit}.
*
* @param limit
* @return
*/
public Query limit(int limit) {
this.limit = limit;
return this;
@@ -231,14 +243,27 @@ public class Query {
return dbo;
}
/**
* Get the number of documents to skip.
*
* @return
*/
public int getSkip() {
return this.skip;
}
/**
* Get the maximum number of documents to be returned.
*
* @return
*/
public int getLimit() {
return this.limit;
}
/**
* @return
*/
public String getHint() {
return hint;
}
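
A trivial sketch of the now documented paging methods; the field name and values are arbitrary:

Query query = Query.query(Criteria.where("status").is("ACTIVE")).skip(40).limit(20);
query.getSkip();  // 40
query.getLimit(); // 20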

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,11 +16,18 @@
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -32,6 +39,7 @@ import com.mongodb.DBObject;
* @author Mark Pollack
* @author Oliver Gierke
* @author Becca Gaspard
* @author Christoph Strobl
*/
public class Update {
@@ -39,7 +47,9 @@ public class Update {
LAST, FIRST
}
private HashMap<String, Object> modifierOps = new LinkedHashMap<String, Object>();
private Set<String> keysToUpdate = new HashSet<String>();
private Map<String, Object> modifierOps = new LinkedHashMap<String, Object>();
private Map<String, PushOperatorBuilder> pushCommandBuilders = new LinkedHashMap<String, PushOperatorBuilder>(1);
/**
* Static factory method to create an Update using the provided key
@@ -73,15 +83,22 @@ public class Update {
continue;
}
update.modifierOps.put(key, object.get(key));
Object value = object.get(key);
update.modifierOps.put(key, value);
if (isKeyword(key) && value instanceof DBObject) {
update.keysToUpdate.addAll(((DBObject) value).keySet());
} else {
update.keysToUpdate.add(key);
}
}
return update;
}
/**
* Update using the $set update modifier
* Update using the {@literal $set} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/set/
* @param key
* @param value
* @return
@@ -92,8 +109,9 @@ public class Update {
}
/**
* Update using the $setOnInsert update modifier
* Update using the {@literal $setOnInsert} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/setOnInsert/
* @param key
* @param value
* @return
@@ -104,8 +122,9 @@ public class Update {
}
/**
* Update using the $unset update modifier
* Update using the {@literal $unset} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/unset/
* @param key
* @return
*/
@@ -115,8 +134,9 @@ public class Update {
}
/**
* Update using the $inc update modifier
* Update using the {@literal $inc} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/inc/
* @param key
* @param inc
* @return
@@ -127,8 +147,9 @@ public class Update {
}
/**
* Update using the $push update modifier
* Update using the {@literal $push} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/push/
* @param key
* @param value
* @return
@@ -139,27 +160,59 @@ public class Update {
}
/**
* Update using the $pushAll update modifier
* Update using {@code $push} modifier. <br/>
* Allows creation of {@code $push} command for single or multiple (using {@code $each}) values.
*
* @see http://docs.mongodb.org/manual/reference/operator/update/push/
* @see http://docs.mongodb.org/manual/reference/operator/update/each/
* @param key
* @return {@link PushOperatorBuilder} for given key
*/
public PushOperatorBuilder push(String key) {
if (!pushCommandBuilders.containsKey(key)) {
pushCommandBuilders.put(key, new PushOperatorBuilder(key));
}
return pushCommandBuilders.get(key);
}
/**
* Update using the {@code $pushAll} update modifier. <br>
* <b>Note</b>: In MongoDB 2.4 the usage of {@code $pushAll} has been deprecated in favor of {@code $push $each}.
* {@link #push(String)} returns a builder that can be used to populate the {@code $each} object.
*
* @see http://docs.mongodb.org/manual/reference/operator/update/pushAll/
* @param key
* @param values
* @return
*/
public Update pushAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
DBObject keyValue = new BasicDBObject();
keyValue.put(key, convertedValues);
modifierOps.put("$pushAll", keyValue);
addMultiFieldOperation("$pushAll", key, convertedValues);
return this;
}
/**
* Update using the $addToSet update modifier
* Update using {@code $addToSet} modifier. <br/>
* Allows creation of {@code $addToSet} command for single or multiple (using {@code $each}) values.
*
* @param key
* @return
* @since 1.5
*/
public AddToSetBuilder addToSet(String key) {
return new AddToSetBuilder(key);
}
/**
* Update using the {@literal $addToSet} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/addToSet/
* @param key
* @param value
* @return
*/
@@ -169,8 +222,9 @@ public class Update {
}
/**
* Update using the $pop update modifier
* Update using the {@literal $pop} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/pop/
* @param key
* @param pos
* @return
@@ -181,8 +235,9 @@ public class Update {
}
/**
* Update using the $pull update modifier
* Update using the {@literal $pull} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/pull/
* @param key
* @param value
* @return
@@ -193,26 +248,27 @@ public class Update {
}
/**
* Update using the $pullAll update modifier
* Update using the {@literal $pullAll} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/pullAll/
* @param key
* @param values
* @return
*/
public Update pullAll(String key, Object[] values) {
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
DBObject keyValue = new BasicDBObject();
keyValue.put(key, convertedValues);
modifierOps.put("$pullAll", keyValue);
addFieldOperation("$pullAll", key, convertedValues);
return this;
}
/**
* Update using the $rename update modifier
* Update using the {@literal $rename} update modifier
*
* @see http://docs.mongodb.org/manual/reference/operator/update/rename/
* @param oldName
* @param newName
* @return
@@ -230,8 +286,16 @@ public class Update {
return dbo;
}
protected void addFieldOperation(String operator, String key, Object value) {
Assert.hasText(key, "Key/Path for update must not be null or blank.");
modifierOps.put(operator, new BasicDBObject(key, value));
this.keysToUpdate.add(key);
}
protected void addMultiFieldOperation(String operator, String key, Object value) {
Assert.hasText(key, "Key/Path for update must not be null or blank.");
Object existingValue = this.modifierOps.get(operator);
DBObject keyValueMap;
@@ -248,5 +312,183 @@ public class Update {
}
keyValueMap.put(key, value);
this.keysToUpdate.add(key);
}
/**
* Determine if a given {@code key} will be touched on execution.
*
* @param key
* @return
*/
public boolean modifies(String key) {
return this.keysToUpdate.contains(key);
}
/**
* Inspects given {@code key} for '$'.
*
* @param key
* @return
*/
private static boolean isKeyword(String key) {
return StringUtils.startsWithIgnoreCase(key, "$");
}
/**
* Modifiers holds a distinct collection of {@link Modifier}
*
* @author Christoph Strobl
*/
public static class Modifiers {
private HashMap<String, Modifier> modifiers;
public Modifiers() {
this.modifiers = new LinkedHashMap<String, Modifier>(1);
}
public Collection<Modifier> getModifiers() {
return Collections.unmodifiableCollection(this.modifiers.values());
}
public void addModifier(Modifier modifier) {
this.modifiers.put(modifier.getKey(), modifier);
}
}
/**
* Marker interface of nested commands.
*
* @author Christoph Strobl
*/
public static interface Modifier {
/**
* @return the command to send, e.g. {@code $push}
*/
String getKey();
/**
* @return value to be sent with command
*/
Object getValue();
}
/**
* Implementation of {@link Modifier} representing {@code $each}.
*
* @author Christoph Strobl
*/
private static class Each implements Modifier {
private Object[] values;
public Each(Object... values) {
this.values = extractValues(values);
}
private Object[] extractValues(Object[] values) {
if (values == null || values.length == 0) {
return values;
}
if (values.length == 1 && values[0] instanceof Collection) {
return ((Collection<?>) values[0]).toArray();
}
Object[] convertedValues = new Object[values.length];
for (int i = 0; i < values.length; i++) {
convertedValues[i] = values[i];
}
return convertedValues;
}
@Override
public String getKey() {
return "$each";
}
@Override
public Object getValue() {
return this.values;
}
}
/**
* Builder for creating {@code $push} modifiers
*
* @author Christoph Strobl
*/
public class PushOperatorBuilder {
private final String key;
private final Modifiers modifiers;
PushOperatorBuilder(String key) {
this.key = key;
this.modifiers = new Modifiers();
}
/**
* Propagates {@code $each} to {@code $push}
*
* @param values
* @return
*/
public Update each(Object... values) {
this.modifiers.addModifier(new Each(values));
return Update.this.push(key, this.modifiers);
}
/**
* Propagates {@link #value(Object)} to {@code $push}
*
* @param value
* @return
*/
public Update value(Object value) {
return Update.this.push(key, value);
}
}
/**
* Builder for creating {@code $addToSet} modifier.
*
* @author Christoph Strobl
* @since 1.5
*/
public class AddToSetBuilder {
private final String key;
public AddToSetBuilder(String key) {
this.key = key;
}
/**
* Propagates {@code $each} to {@code $addToSet}
*
* @param values
* @return
*/
public Update each(Object... values) {
return Update.this.addToSet(this.key, new Each(values));
}
/**
* Propagates {@link #value(Object)} to {@code $addToSet}
*
* @param value
* @return
*/
public Update value(Object value) {
return Update.this.addToSet(this.key, value);
}
}
}
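
A brief sketch of the new push(…) builder alongside the documented modifies(…) check; the key and values are arbitrary:

Update update = new Update().push("tags").each("spring-data", "document-db");
update.modifies("tags");     // true - the key is tracked in keysToUpdate
update.modifies("comments"); // false

Using value(…) instead of each(…) issues a plain single-value $push.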

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,6 +32,7 @@ import com.mongodb.gridfs.GridFSFile;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Thomas Darimont
* @author Martin Baumgartner
*/
public interface GridFsOperations extends ResourcePatternResolver {
@@ -44,6 +45,24 @@ public interface GridFsOperations extends ResourcePatternResolver {
*/
GridFSFile store(InputStream content, String filename);
/**
* Stores the given content into a file, applying the given metadata.
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, Object metadata);
/**
* Stores the given content into a file, applying the given metadata.
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
*/
GridFSFile store(InputStream content, DBObject metadata);
/**
* Stores the given content into a file with the given name and content type.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,6 +44,7 @@ import com.mongodb.gridfs.GridFSInputFile;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Thomas Darimont
* @author Martin Baumgartner
*/
public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver {
@@ -89,6 +90,25 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
return store(content, filename, (Object) null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.Object)
*/
@Override
public GridFSFile store(InputStream content, Object metadata) {
return store(content, null, metadata);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, com.mongodb.DBObject)
*/
@Override
public GridFSFile store(InputStream content, DBObject metadata) {
return store(content, null, metadata);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String)
@@ -102,7 +122,6 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.Object)
*/
public GridFSFile store(InputStream content, String filename, Object metadata) {
return store(content, filename, null, metadata);
}
@@ -137,10 +156,12 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
public GridFSFile store(InputStream content, String filename, String contentType, DBObject metadata) {
Assert.notNull(content);
Assert.hasText(filename);
GridFSInputFile file = getGridFs().createFile(content);
file.setFilename(filename);
if (filename != null) {
file.setFilename(filename);
}
if (metadata != null) {
file.setMetaData(metadata);
@@ -232,7 +253,7 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
}
private DBObject getMappedQuery(Query query) {
return query == null ? null : getMappedQuery(query.getQueryObject());
return query == null ? new Query().getQueryObject() : getMappedQuery(query.getQueryObject());
}
private DBObject getMappedQuery(DBObject query) {

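A minimal sketch of the new metadata-only store(…) overloads; gridFsTemplate and the byte array are assumed to exist:

DBObject metadata = new BasicDBObject("category", "invoice");
GridFSFile file = gridFsTemplate.store(new ByteArrayInputStream(bytes), metadata);
// no filename required anymore - the filename is simply left unset on the GridFSInputFile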
View File

@@ -1,13 +1,28 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.annotation.Retention;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
/**
* Annotation to be used for disambiguating method parameters that shall be used to trigger geo near queries. By default
@@ -20,5 +35,4 @@ import org.springframework.data.mongodb.core.geo.Distance;
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
public @interface Near {
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.springframework.data.annotation.QueryAnnotation;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@@ -59,4 +60,12 @@ public @interface Query {
* @return
*/
boolean count() default false;
/**
* Returns whether the query should delete matching documents.
*
* @since 1.5
* @return
*/
boolean delete() default false;
}
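
A hypothetical repository method using the new delete attribute; PersonRepository, Person and lastName are made-up names, and depending on the declared return type the deleted documents or their count is returned:

public interface PersonRepository extends MongoRepository<Person, String> {

    @Query(value = "{ 'lastName' : ?0 }", delete = true)
    List<Person> removeByLastName(String lastName);
}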

View File

@@ -0,0 +1,44 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.beans.factory.xml.NamespaceHandler;
import org.springframework.data.mongodb.config.MongoNamespaceHandler;
import org.springframework.data.repository.config.RepositoryBeanDefinitionParser;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* {@link NamespaceHandler} to register repository configuration.
*
* @author Oliver Gierke
*/
public class MongoRepositoryConfigNamespaceHandler extends MongoNamespaceHandler {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.config.MongoNamespaceHandler#init()
*/
@Override
public void init() {
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
RepositoryBeanDefinitionParser repositoryBeanDefinitionParser = new RepositoryBeanDefinitionParser(extension);
registerBeanDefinitionParser("repositories", repositoryBeanDefinitionParser);
super.init();
}
}

View File

@@ -15,13 +15,19 @@
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.config.BeanNames;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport;
import org.springframework.data.repository.config.RepositoryConfigurationSource;
import org.springframework.data.repository.config.XmlRepositoryConfigurationSource;
import org.w3c.dom.Element;
@@ -35,6 +41,8 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
private static final String MONGO_TEMPLATE_REF = "mongo-template-ref";
private static final String CREATE_QUERY_INDEXES = "create-query-indexes";
private boolean fallbackMappingContextCreated = false;
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModulePrefix()
@@ -52,6 +60,18 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
return MongoRepositoryFactoryBean.class.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.RepositoryConfigurationSource)
*/
@Override
public void postProcess(BeanDefinitionBuilder builder, RepositoryConfigurationSource source) {
if (fallbackMappingContextCreated) {
builder.addPropertyReference("mappingContext", BeanNames.MAPPING_CONTEXT_BEAN_NAME);
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.XmlRepositoryConfigurationSource)
@@ -77,4 +97,21 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
builder.addPropertyReference("mongoOperations", attributes.getString("mongoTemplateRef"));
builder.addPropertyValue("createIndexesForQueryMethods", attributes.getBoolean("createIndexesForQueryMethods"));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#registerBeansForRoot(org.springframework.beans.factory.support.BeanDefinitionRegistry, org.springframework.data.repository.config.RepositoryConfigurationSource)
*/
@Override
public void registerBeansForRoot(BeanDefinitionRegistry registry, RepositoryConfigurationSource configurationSource) {
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT_BEAN_NAME)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(AbstractBeanDefinition.ROLE_INFRASTRUCTURE);
definition.setSource(configurationSource.getSource());
registry.registerBeanDefinition(BeanNames.MAPPING_CONTEXT_BEAN_NAME, definition);
}
}
}
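
With the fallback MappingContext registration above, a Java configuration no longer needs to declare a MongoMappingContext bean itself. A minimal sketch along the lines of the SimpleConfigWithRepositories test fixture later in this changeset (class and package names are assumptions):

import java.net.UnknownHostException;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;
import com.mongodb.MongoClient;

// No explicit MongoMappingContext bean: the repository infrastructure registers a fallback one.
@Configuration
@EnableMongoRepositories(basePackages = "com.example.repositories")
class ApplicationConfig {

    @Bean
    public MongoTemplate mongoTemplate() throws UnknownHostException {
        return new MongoTemplate(new SimpleMongoDbFactory(new MongoClient(), "database"));
    }
}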

View File

@@ -20,13 +20,15 @@ import java.util.List;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.SliceImpl;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.query.ParameterAccessor;
@@ -34,11 +36,14 @@ import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.WriteResult;
/**
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public abstract class AbstractMongoQuery implements RepositoryQuery {
@@ -79,22 +84,28 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(operations.getConverter(), accessor));
if (method.isGeoNearQuery() && method.isPageQuery()) {
Object result = null;
if (isDeleteQuery()) {
result = new DeleteExecution().execute(query);
} else if (method.isGeoNearQuery() && method.isPageQuery()) {
MongoParameterAccessor countAccessor = new MongoParametersParameterAccessor(method, parameters);
Query countQuery = createCountQuery(new ConvertingParameterAccessor(operations.getConverter(), countAccessor));
return new GeoNearExecution(accessor).execute(query, countQuery);
result = new GeoNearExecution(accessor).execute(query, countQuery);
} else if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
result = new GeoNearExecution(accessor).execute(query);
} else if (method.isSliceQuery()) {
result = new SlicedExecution(accessor.getPageable()).execute(query);
} else if (method.isCollectionQuery()) {
return new CollectionExecution(accessor.getPageable()).execute(query);
result = new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
result = new PagedExecution(accessor.getPageable()).execute(query);
} else {
result = new SingleEntityExecution(isCountQuery()).execute(query);
}
Object result = new SingleEntityExecution(isCountQuery()).execute(query);
if (result == null) {
return result;
}
@@ -135,6 +146,14 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
protected abstract boolean isCountQuery();
/**
* Returns whether the query should delete matching documents.
*
* @return
* @since 1.5
*/
protected abstract boolean isDeleteQuery();
private abstract class Execution {
abstract Object execute(Query query);
@@ -171,6 +190,41 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
}
/**
* {@link Execution} for {@link Slice} query methods.
*
* @author Oliver Gierke
* @since 1.5
*/
final class SlicedExecution extends Execution {
private final Pageable pageable;
SlicedExecution(Pageable pageable) {
this.pageable = pageable;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query)
*/
@Override
@SuppressWarnings({ "unchecked", "rawtypes" })
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
int pageSize = pageable.getPageSize();
Pageable slicePageable = new PageRequest(pageable.getPageNumber(), pageSize + 1, pageable.getSort());
List result = operations.find(query.with(slicePageable), metadata.getJavaType(), metadata.getCollectionName());
boolean hasNext = result.size() > pageSize;
return new SliceImpl<Object>(hasNext ? result.subList(0, pageSize) : result, pageable, hasNext);
}
}
/**
* {@link Execution} for pagination queries.
*
@@ -239,6 +293,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*
* @author Oliver Gierke
*/
@SuppressWarnings("deprecation")
final class GeoNearExecution extends Execution {
private final MongoParameterAccessor accessor;
@@ -270,11 +325,12 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
long count = operations.count(countQuery, metadata.getCollectionName());
return new GeoPage<Object>(doExecuteQuery(query), accessor.getPageable(), count);
return new org.springframework.data.mongodb.core.geo.GeoPage<Object>(doExecuteQuery(query),
accessor.getPageable(), count);
}
@SuppressWarnings("unchecked")
private GeoResults<Object> doExecuteQuery(Query query) {
private org.springframework.data.mongodb.core.geo.GeoResults<Object> doExecuteQuery(Query query) {
Point nearLocation = accessor.getGeoNearLocation();
NearQuery nearQuery = NearQuery.near(nearLocation);
@@ -294,7 +350,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return (GeoResults<Object>) operations.geoNear(nearQuery, metadata.getJavaType(), metadata.getCollectionName());
return (org.springframework.data.mongodb.core.geo.GeoResults<Object>) operations.geoNear(nearQuery,
metadata.getJavaType(), metadata.getCollectionName());
}
private boolean isListOfGeoResult() {
@@ -309,4 +366,33 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return componentType == null ? false : GeoResult.class.equals(componentType.getType());
}
}
/**
* {@link Execution} removing documents matching the query.
*
* @since 1.5
*/
final class DeleteExecution extends Execution {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query)
*/
@Override
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return deleteAndConvertResult(query, metadata);
}
private Object deleteAndConvertResult(Query query, MongoEntityMetadata<?> metadata) {
if (method.isCollectionQuery()) {
return operations.findAllAndRemove(query, metadata.getJavaType());
}
WriteResult writeResult = operations.remove(query, metadata.getCollectionName());
return writeResult != null ? writeResult.getN() : 0L;
}
}
}
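
A hedged sketch (repository and domain type assumed) of the kind of query methods the new executions serve: a Slice-returning method handled by SlicedExecution, which fetches pageSize + 1 documents to detect whether a further slice exists, and a derived delete method handled by DeleteExecution:

import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
import org.springframework.data.repository.Repository;

// Hypothetical domain type.
class Person {
    String id, lastname;
}

interface PersonRepository extends Repository<Person, String> {

    // Served by SlicedExecution: no count query is issued, only pageSize + 1 documents are read.
    Slice<Person> findByLastname(String lastname, Pageable pageable);

    // Served by DeleteExecution via the derived delete keyword (PartTree.isDelete()).
    List<Person> deleteByLastname(String lastname);
}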

View File

@@ -23,9 +23,9 @@ import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.util.TypeInformation;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.repository.query.ParameterAccessor;
/**

View File

@@ -20,8 +20,8 @@ import java.util.Arrays;
import java.util.List;
import org.springframework.core.MethodParameter;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.repository.Near;
import org.springframework.data.mongodb.repository.query.MongoParameters.MongoParameter;
import org.springframework.data.repository.query.Parameter;
@@ -64,10 +64,11 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
this.nearIndex = nearIndex;
}
@SuppressWarnings("unchecked")
@SuppressWarnings({ "unchecked", "deprecation" })
private final int getNearIndex(List<Class<?>> parameterTypes) {
for (Class<?> reference : Arrays.asList(Point.class, double[].class)) {
for (Class<?> reference : Arrays.asList(Point.class, org.springframework.data.mongodb.core.geo.Point.class,
double[].class)) {
int nearIndex = parameterTypes.indexOf(reference);
@@ -161,7 +162,7 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
*/
@Override
public boolean isSpecialParameter() {
return super.isSpecialParameter() || getType().equals(Distance.class) || isNearParameter();
return super.isSpecialParameter() || Distance.class.isAssignableFrom(getType()) || isNearParameter();
}
private boolean isNearParameter() {
@@ -174,7 +175,7 @@ public class MongoParameters extends Parameters<MongoParameters, MongoParameter>
}
private boolean isPoint() {
return getType().equals(Point.class) || getType().equals(double[].class);
return Point.class.isAssignableFrom(getType()) || getType().equals(double[].class);
}
private boolean hasNearAnnotation() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.repository.query.ParametersParameterAccessor;
/**

View File

@@ -24,11 +24,11 @@ import java.util.Iterator;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Shape;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;

View File

@@ -15,15 +15,16 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.List;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.data.geo.GeoPage;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.geo.GeoPage;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.Query;
@@ -41,8 +42,8 @@ import org.springframework.util.StringUtils;
*/
public class MongoQueryMethod extends QueryMethod {
@SuppressWarnings("unchecked") private static final List<Class<?>> GEO_NEAR_RESULTS = Arrays.asList(GeoResult.class,
GeoResults.class, GeoPage.class);
@SuppressWarnings("unchecked") private static final List<Class<? extends Serializable>> GEO_NEAR_RESULTS = Arrays
.asList(GeoResult.class, GeoResults.class, GeoPage.class);
private final Method method;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -152,11 +153,15 @@ public class MongoQueryMethod extends QueryMethod {
private boolean isGeoNearQuery(Method method) {
if (GEO_NEAR_RESULTS.contains(method.getReturnType())) {
return true;
Class<?> returnType = method.getReturnType();
for (Class<?> type : GEO_NEAR_RESULTS) {
if (type.isAssignableFrom(returnType)) {
return true;
}
}
if (Iterable.class.isAssignableFrom(method.getReturnType())) {
if (Iterable.class.isAssignableFrom(returnType)) {
TypeInformation<?> from = ClassTypeInformation.fromReturnTypeOf(method);
return GeoResult.class.equals(from.getComponentType().getType());
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2013 the original author or authors.
* Copyright 2002-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import org.springframework.data.repository.query.parser.PartTree;
* {@link RepositoryQuery} implementation for Mongo.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class PartTreeMongoQuery extends AbstractMongoQuery {
@@ -86,4 +87,13 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
protected boolean isCountQuery() {
return tree.isCountProjection();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isDeleteQuery()
*/
@Override
protected boolean isDeleteQuery() {
return tree.isDelete();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,18 +30,32 @@ import com.mongodb.util.JSON;
* Query to use a plain JSON String to create the {@link Query} to actually execute.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class StringBasedMongoQuery extends AbstractMongoQuery {
private static final String COUNT_AND_DELETE = "Manually defined query for %s cannot be both a count and delete query at the same time!";
private static final Pattern PLACEHOLDER = Pattern.compile("\\?(\\d+)");
private static final Logger LOG = LoggerFactory.getLogger(StringBasedMongoQuery.class);
private final String query;
private final String fieldSpec;
private final boolean isCountQuery;
private final boolean isDeleteQuery;
/**
* Creates a new {@link StringBasedMongoQuery}.
* Creates a new {@link StringBasedMongoQuery} for the given {@link MongoQueryMethod} and {@link MongoOperations}.
*
* @param method must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
*/
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
this(method.getAnnotatedQuery(), method, mongoOperations);
}
/**
* Creates a new {@link StringBasedMongoQuery} for the given {@link String}, {@link MongoQueryMethod} and
* {@link MongoOperations}.
*
* @param method must not be {@literal null}.
* @param template must not be {@literal null}.
@@ -53,10 +67,11 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
this.query = query;
this.fieldSpec = method.getFieldSpecification();
this.isCountQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().count() : false;
}
this.isDeleteQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().delete() : false;
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
this(method.getAnnotatedQuery(), method, mongoOperations);
if (isCountQuery && isDeleteQuery) {
throw new IllegalArgumentException(String.format(COUNT_AND_DELETE, method));
}
}
/*
@@ -95,6 +110,15 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return isCountQuery;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isDeleteQuery()
*/
@Override
protected boolean isDeleteQuery() {
return this.isDeleteQuery;
}
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor) {
Matcher matcher = PLACEHOLDER.matcher(input);

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.repository.Repository;
@@ -34,6 +35,7 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
private MongoOperations operations;
private boolean createIndexesForQueryMethods = false;
private boolean mappingContextConfigured = false;
/**
* Configures the {@link MongoOperations} to be used.
@@ -42,7 +44,6 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
*/
public void setMongoOperations(MongoOperations operations) {
this.operations = operations;
setMappingContext(operations.getConverter().getMappingContext());
}
/**
@@ -54,6 +55,17 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
this.createIndexesForQueryMethods = createIndexesForQueryMethods;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport#setMappingContext(org.springframework.data.mapping.context.MappingContext)
*/
@Override
protected void setMappingContext(MappingContext<?, ?> mappingContext) {
super.setMappingContext(mappingContext);
this.mappingContextConfigured = true;
}
/*
* (non-Javadoc)
*
@@ -95,5 +107,9 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
super.afterPropertiesSet();
Assert.notNull(operations, "MongoTemplate must not be null!");
if (!mappingContextConfigured) {
setMappingContext(operations.getConverter().getMappingContext());
}
}
}
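
With the mappingContextConfigured flag above, programmatic wiring only needs a MongoOperations; a sketch under stated assumptions (hypothetical domain type and repository, setters inherited from RepositoryFactoryBeanSupport) of the fallback kicking in when no MappingContext is configured explicitly:

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import com.mongodb.MongoClient;

class ManualWiringSketch {

    // Hypothetical domain type and repository.
    static class Person {
        String id, lastname;
    }

    interface PersonRepository extends MongoRepository<Person, String> {}

    static PersonRepository bootstrap() throws Exception {
        MongoTemplate template = new MongoTemplate(new SimpleMongoDbFactory(new MongoClient(), "database"));

        MongoRepositoryFactoryBean<PersonRepository, Person, String> factoryBean =
                new MongoRepositoryFactoryBean<PersonRepository, Person, String>();
        factoryBean.setMongoOperations(template);             // no setMappingContext(...) call
        factoryBean.setRepositoryInterface(PersonRepository.class);
        factoryBean.afterPropertiesSet();                      // falls back to the converter's MappingContext
        return factoryBean.getObject();
    }
}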

View File

@@ -1 +1 @@
http\://www.springframework.org/schema/data/mongo=org.springframework.data.mongodb.config.MongoNamespaceHandler
http\://www.springframework.org/schema/data/mongo=org.springframework.data.mongodb.repository.config.MongoRepositoryConfigNamespaceHandler

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Sample configuration class in default package.
*
* @see DATAMONGO-877
* @author Oliver Gierke
*/
@Configuration
public class ConfigClassInDefaultPackage extends AbstractMongoConfiguration {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#getDatabaseName()
*/
@Override
protected String getDatabaseName() {
return "default";
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#mongo()
*/
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
}

View File

@@ -1,20 +1,34 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
public interface MongoDocumentWriter {
}
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import org.junit.Test;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
/**
* Unit test for {@link ConfigClassInDefaultPackage}.
*
* @see DATAMONGO-877
* @author Oliver Gierke
*/
public class ConfigClassInDefaultPackageUnitTests {
/**
* @see DATAMONGO-877
*/
@Test
public void loadsConfigClassFromDefaultPackage() {
new AnnotationConfigApplicationContext(ConfigClassInDefaultPackage.class).close();
}
}

View File

@@ -34,6 +34,9 @@ import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
*/
public class AuditingIntegrationTests {
/**
* @see DATAMONGO-577, DATAMONGO-800, DATAMONGO-883
*/
@Test
public void enablesAuditingAndSetsPropertiesAccordingly() throws Exception {
@@ -58,8 +61,13 @@ public class AuditingIntegrationTests {
class Entity {
@CreatedDate DateTime created;
@LastModifiedDate DateTime modified;
@Id Long id;
@CreatedDate DateTime created;
DateTime modified;
@LastModifiedDate
public DateTime getModified() {
return modified;
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,24 +19,26 @@ import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import java.net.UnknownHostException;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.AuditorAware;
import org.springframework.data.mongodb.core.AuditablePerson;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;
import org.springframework.stereotype.Repository;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
@@ -56,16 +58,16 @@ public class AuditingViaJavaConfigRepositoriesTests {
@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorProvider")
@EnableMongoRepositories(basePackageClasses = AuditablePersonRepository.class, considerNestedRepositories = true)
static class Config {
static class Config extends AbstractMongoConfiguration {
@Bean
public MongoOperations mongoTemplate() throws Exception {
return new MongoTemplate(new SimpleMongoDbFactory(new MongoClient(), "database"));
@Override
protected String getDatabaseName() {
return "database";
}
@Bean
public MongoMappingContext mappingContext() {
return new MongoMappingContext();
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
@Bean
@@ -81,21 +83,62 @@ public class AuditingViaJavaConfigRepositoriesTests {
this.auditor = auditablePersonRepository.save(new AuditablePerson("auditor"));
}
/**
* @see DATAMONGO-792, DATAMONGO-883
*/
@Test
public void basicAuditing() {
doReturn(this.auditor).when(this.auditorAware).getCurrentAuditor();
AuditablePerson user = new AuditablePerson("user");
AuditablePerson savedUser = auditablePersonRepository.save(user);
System.out.println(savedUser);
AuditablePerson savedUser = auditablePersonRepository.save(new AuditablePerson("user"));
AuditablePerson createdBy = savedUser.getCreatedBy();
assertThat(createdBy, is(notNullValue()));
assertThat(createdBy.getFirstname(), is(this.auditor.getFirstname()));
assertThat(savedUser.getCreatedAt(), is(notNullValue()));
}
/**
* @see DATAMONGO-843
*/
@Test
@SuppressWarnings("resource")
public void auditingUsesFallbackMappingContextIfNoneConfiguredWithRepositories() {
new AnnotationConfigApplicationContext(SimpleConfigWithRepositories.class);
}
/**
* @see DATAMONGO-843
*/
@Test
@SuppressWarnings("resource")
public void auditingUsesFallbackMappingContextIfNoneConfigured() {
new AnnotationConfigApplicationContext(SimpleConfig.class);
}
@Repository
static interface AuditablePersonRepository extends MongoRepository<AuditablePerson, String> {}
@Configuration
@EnableMongoRepositories
@EnableMongoAuditing
static class SimpleConfigWithRepositories {
@Bean
public MongoTemplate mongoTemplate() throws UnknownHostException {
return new MongoTemplate(new SimpleMongoDbFactory(new MongoClient(), "database"));
}
}
@Configuration
@EnableMongoAuditing
static class SimpleConfig {
@Bean
public MongoTemplate mongoTemplate() throws UnknownHostException {
return new MongoTemplate(new SimpleMongoDbFactory(new MongoClient(), "database"));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -57,6 +57,9 @@ public class MappingMongoConverterParserIntegrationTests {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter.xml"));
}
/**
* @see DATAMONGO-243
*/
@Test
public void allowsDbFactoryRefAttribute() {
@@ -64,6 +67,9 @@ public class MappingMongoConverterParserIntegrationTests {
factory.getBean("converter");
}
/**
* @see DATAMONGO-725
*/
@Test
public void hasCustomTypeMapper() {
@@ -73,6 +79,9 @@ public class MappingMongoConverterParserIntegrationTests {
assertThat(converter.getTypeMapper(), is(customMongoTypeMapper));
}
/**
* @see DATAMONGO-301
*/
@Test
public void scansForConverterAndSetsUpCustomConversionsAccordingly() {
@@ -87,7 +96,7 @@ public class MappingMongoConverterParserIntegrationTests {
@Test
public void activatesAbbreviatingPropertiesCorrectly() {
BeanDefinition definition = factory.getBeanDefinition("abbreviatingConverter.mappingContext");
BeanDefinition definition = factory.getBeanDefinition("abbreviatingConverter.mongoMappingContext");
Object value = definition.getPropertyValues().getPropertyValue("fieldNamingStrategy").getValue();
assertThat(value, is(instanceOf(BeanDefinition.class)));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,7 @@ import org.springframework.core.io.ClassPathResource;
* @see DATAMONGO-36
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Oliver Gierke
*/
public class MappingMongoConverterParserValidationIntegrationTests {
@@ -46,31 +47,43 @@ public class MappingMongoConverterParserValidationIntegrationTests {
reader = new XmlBeanDefinitionReader(factory);
}
/**
* @see DATAMONGO-36
*/
@Test
public void validatingEventListenerCreatedWithDefaultConfig() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-default.xml"));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER), is(not(nullValue())));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER_BEAN_NAME), is(not(nullValue())));
}
/**
* @see DATAMONGO-36
*/
@Test
public void validatingEventListenerCreatedWhenValidationEnabled() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-validation-enabled.xml"));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER), is(not(nullValue())));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER_BEAN_NAME), is(not(nullValue())));
}
/**
* @see DATAMONGO-36
*/
@Test(expected = NoSuchBeanDefinitionException.class)
public void validatingEventListenersIsNotCreatedWhenDisabled() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-validation-disabled.xml"));
factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER);
factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER_BEAN_NAME);
}
/**
* @see DATAMONGO-36
*/
@Test
public void validatingEventListenerCreatedWithCustomTypeMapperConfig() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-custom-typeMapper.xml"));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER), is(not(nullValue())));
assertThat(factory.getBean(BeanNames.VALIDATING_EVENT_LISTENER_BEAN_NAME), is(not(nullValue())));
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,33 +15,32 @@
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.is;
import static org.junit.Assert.assertThat;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
* Integration tests for {@link MongoDbFactory}.
*
* @author Thomas Risbergf
* @author Thomas Risberg
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoDbFactoryNoDatabaseRunningTests {
@Autowired
MongoTemplate mongoTemplate;
@Autowired MongoTemplate mongoTemplate;
/**
* @see DATADOC-139
* @see DATAMONGO-139
*/
@Test
public void startsUpWithoutADatabaseRunning() {

View File

@@ -15,7 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Date;
import org.springframework.data.annotation.CreatedBy;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
@@ -26,9 +29,10 @@ import org.springframework.data.mongodb.core.mapping.DBRef;
*/
public class AuditablePerson {
@Id private String id;
private @Id String id;
private String firstname;
@DBRef @CreatedBy private AuditablePerson createdBy;
private @DBRef @CreatedBy AuditablePerson createdBy;
private @CreatedDate Date createdAt;
public AuditablePerson() {}
@@ -59,4 +63,8 @@ public class AuditablePerson {
public void setCreatedBy(AuditablePerson createdBy) {
this.createdBy = createdBy;
}
public Date getCreatedAt() {
return createdAt;
}
}

View File

@@ -70,7 +70,7 @@ public abstract class DBObjectTestUtils {
}
@SuppressWarnings("unchecked")
private static <T> T getTypedValue(DBObject source, String key, Class<T> type) {
public static <T> T getTypedValue(DBObject source, String key, Class<T> type) {
Object value = source.get(key);
assertThat(value, is(notNullValue()));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.junit.Assert.*;
@@ -27,11 +26,11 @@ import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.dao.DataAccessException;
import org.springframework.data.geo.Point;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.AbstractMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoTypeMapper;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.NearQuery;

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.junit.Assume.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
@@ -25,6 +26,7 @@ import static org.springframework.data.mongodb.core.query.Update.*;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
@@ -35,7 +37,6 @@ import org.bson.types.ObjectId;
import org.hamcrest.CoreMatchers;
import org.joda.time.DateTime;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
@@ -60,6 +61,7 @@ import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
@@ -76,6 +78,7 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
@@ -101,10 +104,14 @@ import com.mongodb.WriteResult;
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoTemplateTests {
private static final org.springframework.data.util.Version TWO_DOT_FOUR = org.springframework.data.util.Version
.parse("2.4");
@Autowired MongoTemplate template;
@Autowired MongoDbFactory factory;
MongoTemplate mappingTemplate;
org.springframework.data.util.Version mongoVersion;
@Rule public ExpectedException thrown = ExpectedException.none();
@@ -135,6 +142,7 @@ public class MongoTemplateTests {
@Before
public void setUp() {
cleanDb();
queryMongoVersionIfNecessary();
}
@After
@@ -142,6 +150,14 @@ public class MongoTemplateTests {
cleanDb();
}
private void queryMongoVersionIfNecessary() {
if (mongoVersion == null) {
CommandResult result = template.executeCommand("{ buildInfo: 1 }");
mongoVersion = org.springframework.data.util.Version.parse(result.get("version").toString());
}
}
protected void cleanDb() {
template.dropCollection(Person.class);
template.dropCollection(PersonWithAList.class);
@@ -168,6 +184,9 @@ public class MongoTemplateTests {
template.dropCollection(BaseDoc.class);
template.dropCollection(ObjectWithEnumValue.class);
template.dropCollection(DocumentWithCollection.class);
template.dropCollection(DocumentWithCollectionOfSimpleType.class);
template.dropCollection(DocumentWithMultipleCollections.class);
template.dropCollection(DocumentWithDBRefCollection.class);
}
@Test
@@ -887,7 +906,7 @@ public class MongoTemplateTests {
l2.add(31);
Query q3 = new Query(Criteria.where("age").in(l1, l2));
template.find(q3, PersonWithIdPropertyOfTypeObjectId.class);
Assert.fail("Should have trown an InvalidDocumentStoreApiUsageException");
fail("Should have trown an InvalidDocumentStoreApiUsageException");
} catch (InvalidMongoDbApiUsageException e) {}
}
@@ -2202,8 +2221,8 @@ public class MongoTemplateTests {
template.findAndModify(query, update, Document.class);
Document retrieved = template.findOne(query, Document.class);
Assert.assertThat(retrieved.model, instanceOf(ModelA.class));
Assert.assertThat(retrieved.model.value(), equalTo("value2"));
assertThat(retrieved.model, instanceOf(ModelA.class));
assertThat(retrieved.model.value(), equalTo("value2"));
}
/**
@@ -2212,16 +2231,16 @@ public class MongoTemplateTests {
@Test
public void updatesShouldRetainTypeInformationEvenForCollections() {
DocumentWithCollection doc = new DocumentWithCollection();
List<Model> models = Arrays.<Model> asList(new ModelA("foo"));
DocumentWithCollection doc = new DocumentWithCollection(models);
doc.id = "4711";
doc.model = new ArrayList<Model>();
doc.model.add(new ModelA("foo"));
template.insert(doc);
Query query = new Query(Criteria.where("id").is(doc.id));
query.addCriteria(where("model.value").is("foo"));
query.addCriteria(where("models.value").is("foo"));
String newModelValue = "bar";
Update update = Update.update("model.$", new ModelA(newModelValue));
Update update = Update.update("models.$", new ModelA(newModelValue));
template.updateFirst(query, update, DocumentWithCollection.class);
Query findQuery = new Query(Criteria.where("id").is(doc.id));
@@ -2229,23 +2248,399 @@ public class MongoTemplateTests {
assertThat(result, is(notNullValue()));
assertThat(result.id, is(doc.id));
assertThat(result.model, is(notNullValue()));
assertThat(result.model, hasSize(1));
assertThat(result.model.get(0).value(), is(newModelValue));
assertThat(result.models, is(notNullValue()));
assertThat(result.models, hasSize(1));
assertThat(result.models.get(0).value(), is(newModelValue));
}
/**
* @see DATAMONGO-812
*/
@Test
public void updateMultiShouldAddValuesCorrectlyWhenUsingPushEachWithComplexTypes() {
assumeThat(mongoVersion.isGreaterThanOrEqualTo(TWO_DOT_FOUR), is(true));
DocumentWithCollection document = new DocumentWithCollection(Collections.<Model> emptyList());
template.save(document);
Query query = query(where("id").is(document.id));
assumeThat(template.findOne(query, DocumentWithCollection.class).models, hasSize(1));
Update update = new Update().push("models").each(new ModelA("model-b"), new ModelA("model-c"));
template.updateMulti(query, update, DocumentWithCollection.class);
assertThat(template.findOne(query, DocumentWithCollection.class).models, hasSize(3));
}
/**
* @see DATAMONGO-812
*/
@Test
public void updateMultiShouldAddValuesCorrectlyWhenUsingPushEachWithSimpleTypes() {
assumeThat(mongoVersion.isGreaterThanOrEqualTo(TWO_DOT_FOUR), is(true));
DocumentWithCollectionOfSimpleType document = new DocumentWithCollectionOfSimpleType();
document.values = Arrays.asList("spring");
template.save(document);
Query query = query(where("id").is(document.id));
assumeThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(1));
Update update = new Update().push("values").each("data", "mongodb");
template.updateMulti(query, update, DocumentWithCollectionOfSimpleType.class);
assertThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(3));
}
/**
* @see DATAMONGO-828
*/
@Test
public void updateFirstShouldDoNothingWhenCalledForEntitiesThatDoNotExist() {
Query q = query(where("id").is(Long.MIN_VALUE));
template.updateFirst(q, Update.update("lastname", "supercalifragilisticexpialidocious"), VersionedPerson.class);
assertThat(template.findOne(q, VersionedPerson.class), nullValue());
}
/**
* @see DATAMONGO-354
*/
@Test
public void testUpdateShouldAllowMultiplePushAll() {
DocumentWithMultipleCollections doc = new DocumentWithMultipleCollections();
doc.id = "1234";
doc.string1 = Arrays.asList("spring");
doc.string2 = Arrays.asList("one");
template.save(doc);
Update update = new Update().pushAll("string1", new Object[] { "data", "mongodb" });
update.pushAll("string2", new String[] { "two", "three" });
Query findQuery = new Query(Criteria.where("id").is(doc.id));
template.updateFirst(findQuery, update, DocumentWithMultipleCollections.class);
DocumentWithMultipleCollections result = template.findOne(findQuery, DocumentWithMultipleCollections.class);
assertThat(result.string1, hasItems("spring", "data", "mongodb"));
assertThat(result.string2, hasItems("one", "two", "three"));
}
/**
* @see DATAMONGO-404
*/
@Test
public void updateWithPullShouldRemoveNestedItemFromDbRefAnnotatedCollection() {
Sample sample1 = new Sample("1", "A");
Sample sample2 = new Sample("2", "B");
template.save(sample1);
template.save(sample2);
DocumentWithDBRefCollection doc = new DocumentWithDBRefCollection();
doc.id = "1";
doc.dbRefAnnotatedList = Arrays.asList( //
sample1, //
sample2 //
);
template.save(doc);
Update update = new Update().pull("dbRefAnnotatedList", doc.dbRefAnnotatedList.get(1));
Query qry = query(where("id").is("1"));
template.updateFirst(qry, update, DocumentWithDBRefCollection.class);
DocumentWithDBRefCollection result = template.findOne(qry, DocumentWithDBRefCollection.class);
assertThat(result, is(notNullValue()));
assertThat(result.dbRefAnnotatedList, hasSize(1));
assertThat(result.dbRefAnnotatedList.get(0), is(notNullValue()));
assertThat(result.dbRefAnnotatedList.get(0).id, is((Object) "1"));
}
/**
* @see DATAMONGO-404
*/
@Test
public void updateWithPullShouldRemoveNestedItemFromDbRefAnnotatedCollectionWhenGivenAnIdValueOfComponentTypeEntity() {
Sample sample1 = new Sample("1", "A");
Sample sample2 = new Sample("2", "B");
template.save(sample1);
template.save(sample2);
DocumentWithDBRefCollection doc = new DocumentWithDBRefCollection();
doc.id = "1";
doc.dbRefAnnotatedList = Arrays.asList( //
sample1, //
sample2 //
);
template.save(doc);
Update update = new Update().pull("dbRefAnnotatedList.id", "2");
Query qry = query(where("id").is("1"));
template.updateFirst(qry, update, DocumentWithDBRefCollection.class);
DocumentWithDBRefCollection result = template.findOne(qry, DocumentWithDBRefCollection.class);
assertThat(result, is(notNullValue()));
assertThat(result.dbRefAnnotatedList, hasSize(1));
assertThat(result.dbRefAnnotatedList.get(0), is(notNullValue()));
assertThat(result.dbRefAnnotatedList.get(0).id, is((Object) "1"));
}
/**
* @see DATAMONGO-852
*/
@Test
public void updateShouldNotBumpVersionNumberIfVersionPropertyIncludedInUpdate() {
VersionedPerson person = new VersionedPerson();
person.firstname = "Dave";
person.lastname = "Matthews";
template.save(person);
assertThat(person.id, is(notNullValue()));
Query qry = query(where("id").is(person.id));
VersionedPerson personAfterFirstSave = template.findOne(qry, VersionedPerson.class);
assertThat(personAfterFirstSave.version, is(0L));
template.updateFirst(qry, Update.update("lastname", "Bubu").set("version", 100L), VersionedPerson.class);
VersionedPerson personAfterUpdateFirst = template.findOne(qry, VersionedPerson.class);
assertThat(personAfterUpdateFirst.version, is(100L));
assertThat(personAfterUpdateFirst.lastname, is("Bubu"));
}
/**
* @see DATAMONGO-468
*/
@Test
public void shouldBeAbleToUpdateDbRefPropertyWithDomainObject() {
Sample sample1 = new Sample("1", "A");
Sample sample2 = new Sample("2", "B");
template.save(sample1);
template.save(sample2);
DocumentWithDBRefCollection doc = new DocumentWithDBRefCollection();
doc.id = "1";
doc.dbRefProperty = sample1;
template.save(doc);
Update update = new Update().set("dbRefProperty", sample2);
Query qry = query(where("id").is("1"));
template.updateFirst(qry, update, DocumentWithDBRefCollection.class);
DocumentWithDBRefCollection updatedDoc = template.findOne(qry, DocumentWithDBRefCollection.class);
assertThat(updatedDoc, is(notNullValue()));
assertThat(updatedDoc.dbRefProperty, is(notNullValue()));
assertThat(updatedDoc.dbRefProperty.id, is(sample2.id));
assertThat(updatedDoc.dbRefProperty.field, is(sample2.field));
}
/**
* @see DATAMONGO-862
*/
@Test
public void testUpdateShouldWorkForPathsOnInterfaceMethods() {
DocumentWithCollection document = new DocumentWithCollection(Arrays.<Model> asList(new ModelA("spring"),
new ModelA("data")));
template.save(document);
Query query = query(where("id").is(document.id).and("models._id").exists(true));
Update update = new Update().set("models.$.value", "mongodb");
template.findAndModify(query, update, DocumentWithCollection.class);
DocumentWithCollection result = template.findOne(query(where("id").is(document.id)), DocumentWithCollection.class);
assertThat(result.models.get(0).value(), is("mongodb"));
}
/**
* @see DATAMONGO-773
*/
@Test
public void testShouldSupportQueryWithIncludedDbRefField() {
Sample sample = new Sample("47111", "foo");
template.save(sample);
DocumentWithDBRefCollection doc = new DocumentWithDBRefCollection();
doc.id = "4711";
doc.dbRefProperty = sample;
template.save(doc);
Query qry = query(where("id").is(doc.id));
qry.fields().include("dbRefProperty");
List<DocumentWithDBRefCollection> result = template.find(qry, DocumentWithDBRefCollection.class);
assertThat(result, is(notNullValue()));
assertThat(result, hasSize(1));
assertThat(result.get(0), is(notNullValue()));
assertThat(result.get(0).dbRefProperty, is(notNullValue()));
assertThat(result.get(0).dbRefProperty.field, is(sample.field));
}
/**
* @see DATAMONGO-566
*/
@Test
public void testFindAllAndRemoveFullyReturnsAndRemovesDocuments() {
Sample spring = new Sample("100", "spring");
Sample data = new Sample("200", "data");
Sample mongodb = new Sample("300", "mongodb");
template.insert(Arrays.asList(spring, data, mongodb), Sample.class);
Query qry = query(where("field").in("spring", "mongodb"));
List<Sample> result = template.findAllAndRemove(qry, Sample.class);
assertThat(result, hasSize(2));
assertThat(
template.getDb().getCollection("sample")
.find(new BasicDBObject("field", new BasicDBObject("$in", Arrays.asList("spring", "mongodb")))).count(),
is(0));
assertThat(template.getDb().getCollection("sample").find(new BasicDBObject("field", "data")).count(), is(1));
}
/**
* @see DATAMONGO-880
*/
@Test
public void savingAndReassigningLazyLoadingProxies() {
template.dropCollection(SomeTemplate.class);
template.dropCollection(SomeMessage.class);
template.dropCollection(SomeContent.class);
SomeContent content = new SomeContent();
content.id = "C1";
content.text = "BUBU";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "T1";
tmpl.content = content; // @DBRef(lazy=true) tmpl.content
template.save(tmpl);
SomeTemplate savedTmpl = template.findById(tmpl.id, SomeTemplate.class);
SomeMessage message = new SomeMessage();
message.id = "M1";
message.dbrefContent = savedTmpl.content; // @DBRef message.dbrefContent
message.normalContent = savedTmpl.content;
template.save(message);
SomeMessage savedMessage = template.findById(message.id, SomeMessage.class);
assertThat(savedMessage.dbrefContent.text, is(content.text));
assertThat(savedMessage.normalContent.text, is(content.text));
}
/**
* @see DATAMONGO-884
*/
@Test
public void callingNonObjectMethodsOnLazyLoadingProxyShouldReturnNullIfUnderlyingDbrefWasDeletedInbetween() {
template.dropCollection(SomeTemplate.class);
template.dropCollection(SomeContent.class);
SomeContent content = new SomeContent();
content.id = "C1";
content.text = "BUBU";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "T1";
tmpl.content = content; // @DBRef(lazy=true) tmpl.content
template.save(tmpl);
SomeTemplate savedTmpl = template.findById(tmpl.id, SomeTemplate.class);
template.remove(content);
assertThat(savedTmpl.getContent().toString(), is("someContent:C1$LazyLoadingProxy"));
assertThat(savedTmpl.getContent(), is(instanceOf(LazyLoadingProxy.class)));
assertThat(savedTmpl.getContent().getText(), is(nullValue()));
}
/**
* @see DATAMONGO-471
*/
@Test
public void updateMultiShouldAddValuesCorrectlyWhenUsingAddToSetWithEach() {
DocumentWithCollectionOfSimpleType document = new DocumentWithCollectionOfSimpleType();
document.values = Arrays.asList("spring");
template.save(document);
Query query = query(where("id").is(document.id));
assumeThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(1));
Update update = new Update().addToSet("values").each("data", "mongodb");
template.updateMulti(query, update, DocumentWithCollectionOfSimpleType.class);
assertThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(3));
}
static class DocumentWithDBRefCollection {
@Id public String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
public List<Sample> dbRefAnnotatedList;
@org.springframework.data.mongodb.core.mapping.DBRef//
public Sample dbRefProperty;
}
static class DocumentWithCollection {
@Id public String id;
public List<Model> model;
@Id String id;
List<Model> models;
DocumentWithCollection(List<Model> models) {
this.models = models;
}
}
static class DocumentWithCollectionOfSimpleType {
@Id String id;
List<String> values;
}
static class DocumentWithMultipleCollections {
@Id String id;
List<String> string1;
List<String> string2;
}
static interface Model {
String value();
String id();
}
static class ModelA implements Model {
@Id String id;
private String value;
ModelA(String value) {
@@ -2256,6 +2651,11 @@ public class MongoTemplateTests {
public String value() {
return this.value;
}
@Override
public String id() {
return id;
}
}
static class Document {
@@ -2279,6 +2679,13 @@ public class MongoTemplateTests {
@Id String id;
String field;
public Sample() {}
public Sample(String id, String field) {
this.id = id;
this.field = field;
}
}
static class TestClass {
@@ -2371,4 +2778,30 @@ public class MongoTemplateTests {
@Id String id;
EnumValue value;
}
public static class SomeTemplate {
String id;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) SomeContent content;
public SomeContent getContent() {
return content;
}
}
public static class SomeContent {
String id;
String text;
public String getText() {
return text;
}
}
static class SomeMessage {
String id;
@org.springframework.data.mongodb.core.mapping.DBRef SomeContent dbrefContent;
SomeContent normalContent;
}
}
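
For orientation, a small sketch of the builder calls exercised in the $each tests above; the rendered update documents shown in the comments are assumptions about the resulting shape:

import org.springframework.data.mongodb.core.query.Update;

class UpdateEachSketch {

    static void examples() {

        // As exercised in updateMultiShouldAddValuesCorrectlyWhenUsingPushEachWithSimpleTypes;
        // assumed resulting document: { "$push" : { "values" : { "$each" : [ "data", "mongodb" ] } } }
        Update pushEach = new Update().push("values").each("data", "mongodb");

        // As exercised in updateMultiShouldAddValuesCorrectlyWhenUsingAddToSetWithEach;
        // assumed resulting document: { "$addToSet" : { "values" : { "$each" : [ "data", "mongodb" ] } } }
        Update addToSetEach = new Update().addToSet("values").each("data", "mongodb");
    }
}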

View File

@@ -25,10 +25,14 @@ import java.util.Collections;
import java.util.regex.Pattern;
import org.bson.types.ObjectId;
import org.hamcrest.core.Is;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.ArgumentMatcher;
import org.mockito.Matchers;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.runners.MockitoJUnitRunner;
@@ -37,6 +41,7 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
@@ -44,12 +49,15 @@ import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
@@ -58,6 +66,7 @@ import com.mongodb.MongoException;
* Unit tests for {@link MongoTemplate}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@@ -68,6 +77,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Mock Mongo mongo;
@Mock DB db;
@Mock DBCollection collection;
@Mock DBCursor cursor;
MongoExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
MappingMongoConverter converter;
@@ -79,11 +89,14 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
when(factory.getDb()).thenReturn(db);
when(factory.getExceptionTranslator()).thenReturn(exceptionTranslator);
when(db.getCollection(Mockito.any(String.class))).thenReturn(collection);
when(collection.find(Mockito.any(DBObject.class))).thenReturn(cursor);
when(cursor.limit(anyInt())).thenReturn(cursor);
when(cursor.sort(Mockito.any(DBObject.class))).thenReturn(cursor);
when(cursor.hint(anyString())).thenReturn(cursor);
this.mappingContext = new MongoMappingContext();
this.converter = new MappingMongoConverter(new DefaultDbRefResolver(factory), mappingContext);
this.template = new MongoTemplate(factory, converter);
}
@Test(expected = IllegalArgumentException.class)
@@ -205,6 +218,46 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertThat(entity.id, is(5));
}
/**
* @see DATAMONGO-868
*/
@Test
public void findAndModifyShouldBumpVersionByOneWhenVersionFieldNotIncludedInUpdate() {
VersionedEntity v = new VersionedEntity();
v.id = 1;
v.version = 0;
ArgumentCaptor<DBObject> captor = ArgumentCaptor.forClass(DBObject.class);
template.findAndModify(new Query(), new Update().set("id", "10"), VersionedEntity.class);
verify(collection, times(1)).findAndModify(Matchers.any(DBObject.class),
org.mockito.Matchers.isNull(DBObject.class), org.mockito.Matchers.isNull(DBObject.class), eq(false),
captor.capture(), eq(false), eq(false));
Assert.assertThat(captor.getValue().get("$inc"), Is.<Object> is(new BasicDBObject("version", 1)));
}
/**
* @see DATAMONGO-868
*/
@Test
public void findAndModifyShouldNotBumpVersionByOneWhenVersionFieldAlreadyIncludedInUpdate() {
VersionedEntity v = new VersionedEntity();
v.id = 1;
v.version = 0;
ArgumentCaptor<DBObject> captor = ArgumentCaptor.forClass(DBObject.class);
template.findAndModify(new Query(), new Update().set("version", 100), VersionedEntity.class);
verify(collection, times(1)).findAndModify(Matchers.any(DBObject.class), isNull(DBObject.class),
isNull(DBObject.class), eq(false), captor.capture(), eq(false), eq(false));
Assert.assertThat(captor.getValue().get("$set"), Is.<Object> is(new BasicDBObject("version", 100)));
Assert.assertThat(captor.getValue().get("$inc"), nullValue());
}
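A small sketch (not part of the diff; class name illustrative) of the two update documents the DATAMONGO-868 tests above capture from findAndModify. Only the entries asserted by the tests are shown; anything else MongoTemplate adds to the update is omitted here:

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class VersionBumpSketch {
    public static void main(String[] args) {
        // version not touched by the caller: an extra { "$inc" : { "version" : 1 } } is expected
        DBObject incremented = new BasicDBObject("$inc", new BasicDBObject("version", 1));
        // version set explicitly by the caller: only { "$set" : { "version" : 100 } }, no $inc
        DBObject explicit = new BasicDBObject("$set", new BasicDBObject("version", 100));
        System.out.println(incremented);
        System.out.println(explicit);
    }
}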
/**
* @see DATAMONGO-533
*/
@@ -235,6 +288,47 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
}));
}
/**
* @see DATAMONGO-566
*/
@Test
public void findAllAndRemoveShouldRetrieveMatchingDocumentsPriorToRemoval() {
BasicQuery query = new BasicQuery("{'foo':'bar'}");
template.findAllAndRemove(query, VersionedEntity.class);
verify(collection, times(1)).find(Matchers.eq(query.getQueryObject()));
}
/**
* @see DATAMONGO-566
*/
@Test
public void findAllAndRemoveShouldRemoveDocumentsReturnedByFindQuery() {
Mockito.when(cursor.hasNext()).thenReturn(true).thenReturn(true).thenReturn(false);
Mockito.when(cursor.next()).thenReturn(new BasicDBObject("_id", Integer.valueOf(0)))
.thenReturn(new BasicDBObject("_id", Integer.valueOf(1)));
ArgumentCaptor<DBObject> queryCaptor = ArgumentCaptor.forClass(DBObject.class);
BasicQuery query = new BasicQuery("{'foo':'bar'}");
template.findAllAndRemove(query, VersionedEntity.class);
verify(collection, times(1)).remove(queryCaptor.capture());
DBObject idField = DBObjectTestUtils.getAsDBObject(queryCaptor.getValue(), "_id");
assertThat((Object[]) idField.get("$in"), is(new Object[] { Integer.valueOf(0), Integer.valueOf(1) }));
}
/**
* @see DATAMONGO-566
*/
@Test
public void findAllAndRemoveShouldNotTriggerRemoveIfFindResultIsEmpty() {
template.findAllAndRemove(new BasicQuery("{'foo':'bar'}"), VersionedEntity.class);
verify(collection, never()).remove(Mockito.any(DBObject.class));
}
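For reference, a sketch (not part of the diff; class name illustrative) of the remove query the DATAMONGO-566 tests above expect: the ids of the previously fetched documents are collected into an $in criterion on _id. The test treats the $in values as an Object[]; whether an array or a list is used internally is an implementation detail:

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class FindAllAndRemoveQuerySketch {
    public static void main(String[] args) {
        DBObject removeQuery = new BasicDBObject("_id",
                new BasicDBObject("$in", new Object[] { Integer.valueOf(0), Integer.valueOf(1) }));
        // roughly: { "_id" : { "$in" : [ 0 , 1 ] } }
        System.out.println(removeQuery);
    }
}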
class AutogenerateableId {
@Id BigInteger id;
@@ -249,6 +343,12 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
}
}
static class VersionedEntity {
@Id Integer id;
@Version Integer version;
}
enum MyConverter implements Converter<AutogenerateableId, String> {
INSTANCE;

View File

@@ -23,8 +23,7 @@ public class PersonWithVersionPropertyOfTypeLong {
String firstName;
int age;
@Version
Long version;
@Version Long version;
@Override
public String toString() {

View File

@@ -189,7 +189,7 @@ public class AggregationTests {
AggregationResults<TagCount> results = mongoTemplate.aggregate(agg, INPUT_COLLECTION, TagCount.class);
assertThat(results, is(notNullValue()));
assertThat(results.getServerUsed(), is("/127.0.0.1:27017"));
assertThat(results.getServerUsed(), endsWith("127.0.0.1:27017"));
List<TagCount> tagCount = results.getMappedResults();
@@ -217,7 +217,7 @@ public class AggregationTests {
AggregationResults<TagCount> results = mongoTemplate.aggregate(aggregation, INPUT_COLLECTION, TagCount.class);
assertThat(results, is(notNullValue()));
assertThat(results.getServerUsed(), is("/127.0.0.1:27017"));
assertThat(results.getServerUsed(), endsWith("127.0.0.1:27017"));
List<TagCount> tagCount = results.getMappedResults();
@@ -241,7 +241,7 @@ public class AggregationTests {
AggregationResults<TagCount> results = mongoTemplate.aggregate(aggregation, INPUT_COLLECTION, TagCount.class);
assertThat(results, is(notNullValue()));
assertThat(results.getServerUsed(), is("/127.0.0.1:27017"));
assertThat(results.getServerUsed(), endsWith("127.0.0.1:27017"));
List<TagCount> tagCount = results.getMappedResults();
@@ -782,6 +782,46 @@ public class AggregationTests {
assertThat(String.valueOf(firstItem.get("_id")), is("u1"));
}
/**
* @see DATAMONGO-840
*/
@Test
public void shouldAggregateOrderDataToAnInvoice() {
mongoTemplate.dropCollection(Order.class);
double taxRate = 0.19;
LineItem product1 = new LineItem("1", "p1", 1.23);
LineItem product2 = new LineItem("2", "p2", 0.87, 2);
LineItem product3 = new LineItem("3", "p3", 5.33);
Order order = new Order("o4711", "c42", new Date()).addItem(product1).addItem(product2).addItem(product3);
mongoTemplate.save(order);
AggregationResults<Invoice> results = mongoTemplate.aggregate(newAggregation(Order.class, //
match(where("id").is(order.getId())), unwind("items"), //
project("id", "customerId", "items") //
.andExpression("items.price * items.quantity").as("lineTotal"), //
group("id") //
.sum("lineTotal").as("netAmount") //
.addToSet("items").as("items"), //
project("id", "items", "netAmount") //
.and("orderId").previousOperation() //
.andExpression("netAmount * [0]", taxRate).as("taxAmount") //
.andExpression("netAmount * (1 + [0])", taxRate).as("totalAmount") //
), Invoice.class);
Invoice invoice = results.getUniqueMappedResult();
assertThat(invoice, is(notNullValue()));
assertThat(invoice.getOrderId(), is(order.getId()));
assertThat(invoice.getNetAmount(), is(closeTo(8.3, 0.000001)));
assertThat(invoice.getTaxAmount(), is(closeTo(1.577, 0.000001)));
assertThat(invoice.getTotalAmount(), is(closeTo(9.877, 0.000001)));
}
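The worked numbers behind the closeTo assertions above, as a standalone sketch (not part of the diff; class name illustrative):

class InvoiceMathSketch {
    public static void main(String[] args) {
        double taxRate = 0.19;
        double net = 1.23 * 1 + 0.87 * 2 + 5.33 * 1; // sum of items.price * items.quantity per line
        double tax = net * taxRate;                  // netAmount * [0]
        double total = net * (1 + taxRate);          // netAmount * (1 + [0])
        System.out.printf("net=%.3f tax=%.3f total=%.3f%n", net, tax, total); // 8.300 1.577 9.877
    }
}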
private void assertLikeStats(LikeStats like, String id, long count) {
assertThat(like, is(notNullValue()));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
@@ -161,4 +162,21 @@ public class AggregationUnitTests {
assertThat(fields.get("aCnt"), is((Object) 1));
assertThat(fields.get("a"), is((Object) "$_id.a"));
}
/**
* @see DATAMONGO-838
*/
@Test
public void expressionBasedFieldsShouldBeReferencableInFollowingOperations() {
DBObject agg = newAggregation( //
project("a").andExpression("b+c").as("foo"), //
group("a").sum("foo").as("foosum") //
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT);
@SuppressWarnings("unchecked")
DBObject secondProjection = ((List<DBObject>) agg.get("pipeline")).get(1);
DBObject fields = getAsDBObject(secondProjection, "$group");
assertThat(fields.get("foosum"), is((Object) new BasicDBObject("$sum", "$foo")));
}
}

View File

@@ -0,0 +1,74 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.List;
/**
* @author Thomas Darimont
*/
public class Invoice {
String orderId;
double taxAmount;
double netAmount;
double totalAmount;
List<LineItem> items;
public String getOrderId() {
return orderId;
}
public void setOrderId(String orderId) {
this.orderId = orderId;
}
public double getTaxAmount() {
return taxAmount;
}
public void setTaxAmount(double taxAmount) {
this.taxAmount = taxAmount;
}
public double getNetAmount() {
return netAmount;
}
public void setNetAmount(double netAmount) {
this.netAmount = netAmount;
}
public double getTotalAmount() {
return totalAmount;
}
public void setTotalAmount(double totalAmount) {
this.totalAmount = totalAmount;
}
public List<LineItem> getItems() {
return items;
}
public void setItems(List<LineItem> items) {
this.items = items;
}
}

View File

@@ -0,0 +1,46 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
/**
* @author Thomas Darimont
*/
public class LineItem {
final String id;
final String caption;
final double price;
int quantity = 1;
@SuppressWarnings("unused")
private LineItem() {
this(null, null, 0.0, 0);
}
public LineItem(String id, String caption, double price) {
this.id = id;
this.caption = caption;
this.price = price;
}
public LineItem(String id, String caption, double price, int quantity) {
this(id, caption, price);
this.quantity = quantity;
}
}

View File

@@ -0,0 +1,70 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Date;
import java.util.List;
/**
* @author Thomas Darimont
*/
public class Order {
final String id;
final String customerId;
final Date orderDate;
final List<LineItem> items;
public Order(String id, String customerId, Date orderDate) {
this(id, customerId, orderDate, new ArrayList<LineItem>());
}
public Order(String id, String customerId, Date orderDate, List<LineItem> items) {
this.id = id;
this.customerId = customerId;
this.orderDate = orderDate;
this.items = items;
}
public Order addItem(LineItem item) {
List<LineItem> newItems = new ArrayList<LineItem>(items != null ? items : Collections.<LineItem> emptyList());
newItems.add(item);
return new Order(id, customerId, orderDate, newItems);
}
public String getId() {
return id;
}
public String getCustomerId() {
return customerId;
}
public Date getOrderDate() {
return orderDate;
}
public List<LineItem> getItems() {
return items;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.mongodb.core.Person;
/**
* Unit tests for {@link SpelExpressionTransformer}.
@@ -174,6 +175,26 @@ public class SpelExpressionTransformerUnitTests {
is("{ \"$multiply\" : [ { \"$add\" : [ 1 , 42 , 1.2345]} , 23]}"));
}
/**
* @see DATAMONGO-840
*/
@Test
public void shouldRenderCompoundExpressionsWithIndexerAndFieldReference() {
Person person = new Person();
person.setAge(10);
assertThat(transform("[0].age + a.c", person), is("{ \"$add\" : [ 10 , \"$a.c\"]}"));
}
/**
* @see DATAMONGO-840
*/
@Test
public void shouldRenderCompoundExpressionsWithOnlyFieldReferences() {
assertThat(transform("a.b + a.c"), is("{ \"$add\" : [ \"$a.b\" , \"$a.c\"]}"));
}
@Test
public void shouldRenderStringFunctions() {

View File

@@ -7,6 +7,7 @@ import java.net.URL;
import java.text.DateFormat;
import java.text.Format;
import java.util.Arrays;
import java.util.Date;
import java.util.Locale;
import java.util.UUID;
@@ -183,6 +184,19 @@ public class CustomConversionsUnitTests {
assertThat(conversions.getCustomWriteTarget(DateTime.class, null), is(equalTo((Class) String.class)));
}
/**
* @see DATAMONGO-881
*/
@Test
public void customConverterOverridesDefault() {
CustomConversions conversions = new CustomConversions(Arrays.asList(CustomDateTimeConverter.INSTANCE));
GenericConversionService conversionService = new DefaultConversionService();
conversions.registerConvertersIn(conversionService);
assertThat(conversionService.convert(new DateTime(), Date.class), is(new Date(0)));
}
enum FormatToStringConverter implements Converter<Format, String> {
INSTANCE;
@@ -227,4 +241,14 @@ public class CustomConversionsUnitTests {
return "";
}
}
enum CustomDateTimeConverter implements Converter<DateTime, Date> {
INSTANCE;
@Override
public Date convert(DateTime source) {
return new Date(0);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -50,7 +50,10 @@ import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* Unit tests for {@link DbRefMappingMongoConverterUnitTests}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public class DbRefMappingMongoConverterUnitTests {
@@ -283,6 +286,149 @@ public class DbRefMappingMongoConverterUnitTests {
assertThat(deserializedResult.dbRefToSerializableTarget.getValue(), is(value));
}
/**
* @see DATAMONGO-884
*/
@Test
public void lazyLoadingProxyForToStringObjectMethodOverridingDbref() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToToStringObjectMethodOverride = new ToStringObjectMethodOverrideLazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToToStringObjectMethodOverride, is(notNullValue()));
assertProxyIsResolved(result.dbRefToToStringObjectMethodOverride, false);
assertThat(result.dbRefToToStringObjectMethodOverride.toString(), is(id + ":" + value));
assertProxyIsResolved(result.dbRefToToStringObjectMethodOverride, true);
}
/**
* @see DATAMONGO-884
*/
@Test
public void callingToStringObjectMethodOnLazyLoadingDbrefShouldNotInitializeProxy() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToPlainObject = new LazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToPlainObject, is(notNullValue()));
assertProxyIsResolved(result.dbRefToPlainObject, false);
// calling Object#toString does not initialize the proxy.
String proxyString = result.dbRefToPlainObject.toString();
assertThat(proxyString, is("lazyDbRefTarget" + ":" + id + "$LazyLoadingProxy"));
assertProxyIsResolved(result.dbRefToPlainObject, false);
// calling another method not declared on object triggers proxy initialization.
assertThat(result.dbRefToPlainObject.getValue(), is(value));
assertProxyIsResolved(result.dbRefToPlainObject, true);
}
/**
* @see DATAMONGO-884
*/
@Test
public void equalsObjectMethodOnLazyLoadingDbrefShouldNotInitializeProxy() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToPlainObject = new LazyDbRefTarget(id, value);
lazyDbRefs.dbRefToToStringObjectMethodOverride = new ToStringObjectMethodOverrideLazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToPlainObject, is(notNullValue()));
assertProxyIsResolved(result.dbRefToPlainObject, false);
assertThat(result.dbRefToPlainObject, is(equalTo(result.dbRefToPlainObject)));
assertThat(result.dbRefToPlainObject, is(not(equalTo(null))));
assertThat(result.dbRefToPlainObject, is(not(equalTo((Object) lazyDbRefs.dbRefToToStringObjectMethodOverride))));
assertProxyIsResolved(result.dbRefToPlainObject, false);
}
/**
* @see DATAMONGO-884
*/
@Test
public void hashcodeObjectMethodOnLazyLoadingDbrefShouldNotInitializeProxy() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToPlainObject = new LazyDbRefTarget(id, value);
lazyDbRefs.dbRefToToStringObjectMethodOverride = new ToStringObjectMethodOverrideLazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToPlainObject, is(notNullValue()));
assertProxyIsResolved(result.dbRefToPlainObject, false);
assertThat(result.dbRefToPlainObject.hashCode(), is(311365444));
assertProxyIsResolved(result.dbRefToPlainObject, false);
}
/**
* @see DATAMONGO-884
*/
@Test
public void lazyLoadingProxyForEqualsAndHashcodeObjectMethodOverridingDbref() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefEqualsAndHashcodeObjectMethodOverride1 = new EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget(
id, value);
lazyDbRefs.dbRefEqualsAndHashcodeObjectMethodOverride2 = new EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget(
id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride1, false);
assertThat(result.dbRefEqualsAndHashcodeObjectMethodOverride1, is(notNullValue()));
result.dbRefEqualsAndHashcodeObjectMethodOverride1.equals(null);
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride1, true);
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride2, false);
assertThat(result.dbRefEqualsAndHashcodeObjectMethodOverride2, is(notNullValue()));
result.dbRefEqualsAndHashcodeObjectMethodOverride2.hashCode();
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride2, true);
}
private Object transport(Object result) {
return SerializationUtils.deserialize(SerializationUtils.serialize(result));
}
@@ -345,6 +491,7 @@ public class DbRefMappingMongoConverterUnitTests {
}
}
@SuppressWarnings("serial")
static class LazyDbRefTargetWithPeristenceConstructor extends LazyDbRefTarget {
boolean persistenceConstructorCalled;
@@ -362,6 +509,7 @@ public class DbRefMappingMongoConverterUnitTests {
}
}
@SuppressWarnings("serial")
static class LazyDbRefTargetWithPeristenceConstructorWithoutDefaultConstructor extends LazyDbRefTarget {
boolean persistenceConstructorCalled;
@@ -387,4 +535,74 @@ public class DbRefMappingMongoConverterUnitTests {
private static final long serialVersionUID = 1L;
}
static class ToStringObjectMethodOverrideLazyDbRefTarget extends LazyDbRefTarget {
private static final long serialVersionUID = 1L;
public ToStringObjectMethodOverrideLazyDbRefTarget() {}
public ToStringObjectMethodOverrideLazyDbRefTarget(String id, String value) {
super(id, value);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return this.id + ":" + this.value;
}
}
static class EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget extends LazyDbRefTarget {
private static final long serialVersionUID = 1L;
public EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget() {}
public EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget(String id, String value) {
super(id, value);
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ((id == null) ? 0 : id.hashCode());
result = prime * result + ((value == null) ? 0 : value.hashCode());
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget other = (EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget) obj;
if (id == null) {
if (other.id != null)
return false;
} else if (!id.equals(other.id))
return false;
if (value == null) {
if (other.value != null)
return false;
} else if (!value.equals(other.value))
return false;
return true;
}
}
static class WithObjectMethodOverrideLazyDbRefs {
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) LazyDbRefTarget dbRefToPlainObject;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) ToStringObjectMethodOverrideLazyDbRefTarget dbRefToToStringObjectMethodOverride;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget dbRefEqualsAndHashcodeObjectMethodOverride2;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget dbRefEqualsAndHashcodeObjectMethodOverride1;
}
}

View File

@@ -0,0 +1,199 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Arrays;
import org.junit.Test;
import org.springframework.data.geo.Box;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.mongodb.core.convert.GeoConverters.BoxToDbObjectConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.CircleToDbObjectConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.DbObjectToBoxConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.DbObjectToCircleConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.DbObjectToLegacyCircleConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.DbObjectToPointConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.DbObjectToPolygonConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.DbObjectToSphereConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.GeoCommandToDbObjectConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.LegacyCircleToDbObjectConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.PointToDbObjectConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.PolygonToDbObjectConverter;
import org.springframework.data.mongodb.core.convert.GeoConverters.SphereToDbObjectConverter;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand;
import com.mongodb.DBObject;
/**
* Unit tests for {@link GeoConverters}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.5
*/
@SuppressWarnings("deprecation")
public class GeoConvertersUnitTests {
/**
* @see DATAMONGO-858
*/
@Test
public void convertsBoxToDbObjectAndBackCorrectly() {
Box box = new Box(new Point(1, 2), new Point(3, 4));
DBObject dbo = BoxToDbObjectConverter.INSTANCE.convert(box);
Box result = DbObjectToBoxConverter.INSTANCE.convert(dbo);
assertThat(result, is(box));
assertThat(result.getClass().equals(org.springframework.data.mongodb.core.geo.Box.class), is(true));
}
/**
* @see DATAMONGO-858
*/
@Test
public void convertsCircleToDbObjectAndBackCorrectlyNeutralDistance() {
Circle circle = new Circle(new Point(1, 2), 3);
DBObject dbo = CircleToDbObjectConverter.INSTANCE.convert(circle);
Circle result = DbObjectToCircleConverter.INSTANCE.convert(dbo);
assertThat(result, is(circle));
}
/**
* @see DATAMONGO-858
*/
@Test
public void convertsCircleToDbObjectAndBackCorrectlyMilesDistance() {
Distance radius = new Distance(3, Metrics.MILES);
Circle circle = new Circle(new Point(1, 2), radius);
DBObject dbo = CircleToDbObjectConverter.INSTANCE.convert(circle);
Circle result = DbObjectToCircleConverter.INSTANCE.convert(dbo);
assertThat(result, is(circle));
assertThat(result.getRadius(), is(radius));
}
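As an assumption (not asserted by this particular test), the DBObject produced for the miles-based circle above should follow the same center/radius/metric shape that the entity-level geo tests further down in this diff spell out explicitly; a sketch with an illustrative class name:

import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class CircleDbObjectSketch {
    public static void main(String[] args) {
        Distance radius = new Distance(3, Metrics.MILES);
        DBObject circleDbo = new BasicDBObject("center", new BasicDBObject("x", 1d).append("y", 2d))
                .append("radius", radius.getNormalizedValue())      // radius stored as normalized value
                .append("metric", radius.getMetric().toString());   // metric rendered alongside it
        System.out.println(circleDbo);
    }
}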
/**
* @see DATAMONGO-858
*/
@Test
public void convertsLegacyCircleToDbObjectAndBackCorrectly() {
org.springframework.data.mongodb.core.geo.Circle circle = new org.springframework.data.mongodb.core.geo.Circle(
new Point(1, 2), 3);
DBObject dbo = LegacyCircleToDbObjectConverter.INSTANCE.convert(circle);
org.springframework.data.mongodb.core.geo.Circle result = DbObjectToLegacyCircleConverter.INSTANCE.convert(dbo);
assertThat(result, is(circle));
}
/**
* @see DATAMONGO-858
*/
@Test
public void convertsPolygonToDbObjectAndBackCorrectly() {
Polygon polygon = new Polygon(new Point(1, 2), new Point(2, 3), new Point(3, 4), new Point(5, 6));
DBObject dbo = PolygonToDbObjectConverter.INSTANCE.convert(polygon);
Polygon result = DbObjectToPolygonConverter.INSTANCE.convert(dbo);
assertThat(result, is(polygon));
assertThat(result.getClass().equals(org.springframework.data.mongodb.core.geo.Polygon.class), is(true));
}
/**
* @see DATAMONGO-858
*/
@Test
public void convertsSphereToDbObjectAndBackCorrectlyWithNeutralDistance() {
Sphere sphere = new Sphere(new Point(1, 2), 3);
DBObject dbo = SphereToDbObjectConverter.INSTANCE.convert(sphere);
Sphere result = DbObjectToSphereConverter.INSTANCE.convert(dbo);
assertThat(result, is(sphere));
assertThat(result.getClass().equals(org.springframework.data.mongodb.core.geo.Sphere.class), is(true));
}
/**
* @see DATAMONGO-858
*/
@Test
public void convertsSphereToDbObjectAndBackCorrectlyWithKilometerDistance() {
Distance radius = new Distance(3, Metrics.KILOMETERS);
Sphere sphere = new Sphere(new Point(1, 2), radius);
DBObject dbo = SphereToDbObjectConverter.INSTANCE.convert(sphere);
Sphere result = DbObjectToSphereConverter.INSTANCE.convert(dbo);
assertThat(result, is(sphere));
assertThat(result.getRadius(), is(radius));
assertThat(result.getClass().equals(org.springframework.data.mongodb.core.geo.Sphere.class), is(true));
}
/**
* @see DATAMONGO-858
*/
@Test
public void convertsPointToListAndBackCorrectly() {
Point point = new Point(1, 2);
DBObject dbo = PointToDbObjectConverter.INSTANCE.convert(point);
Point result = DbObjectToPointConverter.INSTANCE.convert(dbo);
assertThat(result, is(point));
assertThat(result.getClass().equals(org.springframework.data.mongodb.core.geo.Point.class), is(true));
}
/**
* @see DATAMONGO-858
*/
@Test
@SuppressWarnings("unchecked")
public void convertsGeoCommandToDbObjectCorrectly() {
Box box = new Box(new double[] { 1, 2 }, new double[] { 3, 4 });
GeoCommand cmd = new GeoCommand(box);
DBObject dbo = GeoCommandToDbObjectConverter.INSTANCE.convert(cmd);
assertThat(dbo, is(notNullValue()));
DBObject boxObject = (DBObject) dbo.get("$box");
assertThat(boxObject,
is((Object) Arrays.asList(GeoConverters.toList(box.getFirst()), GeoConverters.toList(box.getSecond()))));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import static org.junit.Assert.*;
import org.springframework.aop.framework.Advised;
import org.springframework.cglib.proxy.Factory;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;
import org.springframework.test.util.ReflectionTestUtils;
/**
* Utility class to test proxy handling for lazy loading.
@@ -39,8 +40,8 @@ public class LazyLoadingTestUtils {
public static void assertProxyIsResolved(Object target, boolean expected) {
LazyLoadingInterceptor interceptor = extractInterceptor(target);
assertThat(interceptor.isResolved(), is(expected));
assertThat(interceptor.getResult(), is(expected ? notNullValue() : nullValue()));
assertThat(ReflectionTestUtils.getField(interceptor, "resolved"), is((Object) expected));
assertThat(ReflectionTestUtils.getField(interceptor, "result"), is(expected ? notNullValue() : nullValue()));
}
private static LazyLoadingInterceptor extractInterceptor(Object proxy) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.DBObjectTestUtils.*;
import java.math.BigDecimal;
import java.math.BigInteger;
@@ -27,6 +28,8 @@ import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
@@ -34,11 +37,14 @@ import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.SortedMap;
import java.util.TreeMap;
import org.bson.types.ObjectId;
import org.hamcrest.Matcher;
import org.hamcrest.Matchers;
import org.joda.time.LocalDate;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
@@ -49,15 +55,25 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.geo.Box;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.MappingInstantiationException;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.convert.DBObjectAccessorUnitTests.NestedType;
import org.springframework.data.mongodb.core.convert.DBObjectAccessorUnitTests.ProjectingType;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.PersonPojoStringId;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBList;
@@ -72,6 +88,7 @@ import com.mongodb.util.JSON;
*
* @author Oliver Gierke
* @author Patrik Wasik
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class MappingMongoConverterUnitTests {
@@ -644,9 +661,7 @@ public class MappingMongoConverterUnitTests {
public void readsMapListNestedValuesCorrectly() {
BasicDBList list = new BasicDBList();
BasicDBObject nested = new BasicDBObject();
nested.append("Hello", "World");
list.add(nested);
list.add(new BasicDBObject("Hello", "World"));
DBObject source = new BasicDBObject("mapOfObjects", new BasicDBObject("Foo", list));
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, source);
@@ -1021,12 +1036,12 @@ public class MappingMongoConverterUnitTests {
address.city = "London";
address.street = "Foo";
Object result = converter.convertToMongoType(Collections.singleton(address));
Object result = converter.convertToMongoType(Collections.singleton(address), ClassTypeInformation.OBJECT);
assertThat(result, is(instanceOf(BasicDBList.class)));
Set<?> readResult = converter.read(Set.class, (BasicDBList) result);
assertThat(readResult.size(), is(1));
assertThat(readResult.iterator().next(), is(instanceOf(Map.class)));
assertThat(readResult.iterator().next(), is(instanceOf(Address.class)));
}
/**
@@ -1240,9 +1255,9 @@ public class MappingMongoConverterUnitTests {
DB db = mock(DB.class);
DBRef dbRef = new DBRef(db, "collection", "id");
org.springframework.data.mongodb.core.mapping.DBRef annotation = mock(org.springframework.data.mongodb.core.mapping.DBRef.class);
MongoPersistentProperty property = mock(MongoPersistentProperty.class);
assertThat(converter.createDBRef(dbRef, annotation), is(dbRef));
assertThat(converter.createDBRef(dbRef, property), is(dbRef));
}
/**
@@ -1381,6 +1396,416 @@ public class MappingMongoConverterUnitTests {
assertThat(aValue.get("c"), is((Object) "C"));
}
/**
* @see DATAMONGO-812
*/
@Test
public void convertsListToBasicDBListAndRetainsTypeInformationForComplexObjects() {
Address address = new Address();
address.city = "London";
address.street = "Foo";
Object result = converter.convertToMongoType(Collections.singletonList(address),
ClassTypeInformation.from(Address.class));
assertThat(result, is(instanceOf(BasicDBList.class)));
BasicDBList dbList = (BasicDBList) result;
assertThat(dbList, hasSize(1));
assertThat(getTypedValue(getAsDBObject(dbList, 0), "_class", String.class), equalTo(Address.class.getName()));
}
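A rough sketch (not asserted verbatim by the test above; class name and the _class placeholder are illustrative) of the element written into the BasicDBList: the Address properties plus the _class type hint the test checks against Address.class.getName():

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class TypedListElementSketch {
    public static void main(String[] args) {
        DBObject element = new BasicDBObject("city", "London")
                .append("street", "Foo")
                .append("_class", "...Address"); // placeholder for the fully qualified Address class name
        System.out.println(element);
    }
}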
/**
* @see DATAMONGO-812
*/
@Test
public void convertsListToBasicDBListWithoutTypeInformationForSimpleTypes() {
Object result = converter.convertToMongoType(Collections.singletonList("foo"));
assertThat(result, is(instanceOf(BasicDBList.class)));
BasicDBList dbList = (BasicDBList) result;
assertThat(dbList, hasSize(1));
assertThat(dbList.get(0), instanceOf(String.class));
}
/**
* @see DATAMONGO-812
*/
@Test
public void convertsArrayToBasicDBListAndRetainsTypeInformationForComplexObjects() {
Address address = new Address();
address.city = "London";
address.street = "Foo";
Object result = converter.convertToMongoType(new Address[] { address }, ClassTypeInformation.OBJECT);
assertThat(result, is(instanceOf(BasicDBList.class)));
BasicDBList dbList = (BasicDBList) result;
assertThat(dbList, hasSize(1));
assertThat(getTypedValue(getAsDBObject(dbList, 0), "_class", String.class), equalTo(Address.class.getName()));
}
/**
* @see DATAMONGO-812
*/
@Test
public void convertsArrayToBasicDBListWithoutTypeInformationForSimpleTypes() {
Object result = converter.convertToMongoType(new String[] { "foo" });
assertThat(result, is(instanceOf(BasicDBList.class)));
BasicDBList dbList = (BasicDBList) result;
assertThat(dbList, hasSize(1));
assertThat(dbList.get(0), instanceOf(String.class));
}
/**
* @see DATAMONGO-833
*/
@Test
public void readsEnumSetCorrectly() {
BasicDBList enumSet = new BasicDBList();
enumSet.add("SECOND");
DBObject dbObject = new BasicDBObject("enumSet", enumSet);
ClassWithEnumProperty result = converter.read(ClassWithEnumProperty.class, dbObject);
assertThat(result.enumSet, is(instanceOf(EnumSet.class)));
assertThat(result.enumSet.size(), is(1));
assertThat(result.enumSet, hasItem(SampleEnum.SECOND));
}
/**
* @see DATAMONGO-833
*/
@Test
public void readsEnumMapCorrectly() {
BasicDBObject enumMap = new BasicDBObject("FIRST", "Dave");
ClassWithEnumProperty result = converter.read(ClassWithEnumProperty.class, new BasicDBObject("enumMap", enumMap));
assertThat(result.enumMap, is(instanceOf(EnumMap.class)));
assertThat(result.enumMap.size(), is(1));
assertThat(result.enumMap.get(SampleEnum.FIRST), is("Dave"));
}
/**
* @see DATAMONGO-887
*/
@Test
public void readsTreeMapCorrectly() {
DBObject person = new BasicDBObject("foo", "Dave");
DBObject treeMapOfPerson = new BasicDBObject("key", person);
DBObject document = new BasicDBObject("treeMapOfPersons", treeMapOfPerson);
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, document);
assertThat(result.treeMapOfPersons, is(notNullValue()));
assertThat(result.treeMapOfPersons.get("key"), is(notNullValue()));
assertThat(result.treeMapOfPersons.get("key").firstname, is("Dave"));
}
/**
* @see DATAMONGO-887
*/
@Test
public void writesTreeMapCorrectly() {
Person person = new Person();
person.firstname = "Dave";
ClassWithMapProperty source = new ClassWithMapProperty();
source.treeMapOfPersons = new TreeMap<String, Person>();
source.treeMapOfPersons.put("key", person);
DBObject result = new BasicDBObject();
converter.write(source, result);
DBObject map = getAsDBObject(result, "treeMapOfPersons");
DBObject entry = getAsDBObject(map, "key");
assertThat(entry.get("foo"), is((Object) "Dave"));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldWriteEntityWithGeoBoxCorrectly() {
ClassWithGeoBox object = new ClassWithGeoBox();
object.box = new Box(new Point(1, 2), new Point(3, 4));
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("box"), is(instanceOf(DBObject.class)));
assertThat(dbo.get("box"), is((Object) new BasicDBObject().append("first", toDbObject(object.box.getFirst()))
.append("second", toDbObject(object.box.getSecond()))));
}
private static DBObject toDbObject(Point point) {
return new BasicDBObject("x", point.getX()).append("y", point.getY());
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldReadEntityWithGeoBoxCorrectly() {
ClassWithGeoBox object = new ClassWithGeoBox();
object.box = new Box(new Point(1, 2), new Point(3, 4));
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
ClassWithGeoBox result = converter.read(ClassWithGeoBox.class, dbo);
assertThat(result, is(notNullValue()));
assertThat(result.box, is(object.box));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldWriteEntityWithGeoPolygonCorrectly() {
ClassWithGeoPolygon object = new ClassWithGeoPolygon();
object.polygon = new Polygon(new Point(1, 2), new Point(3, 4), new Point(4, 5));
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("polygon"), is(instanceOf(DBObject.class)));
DBObject polygonDbo = (DBObject) dbo.get("polygon");
@SuppressWarnings("unchecked")
List<DBObject> points = (List<DBObject>) polygonDbo.get("points");
assertThat(points, hasSize(3));
assertThat(points, Matchers.<DBObject> hasItems(toDbObject(object.polygon.getPoints().get(0)),
toDbObject(object.polygon.getPoints().get(1)), toDbObject(object.polygon.getPoints().get(2))));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldReadEntityWithGeoPolygonCorrectly() {
ClassWithGeoPolygon object = new ClassWithGeoPolygon();
object.polygon = new Polygon(new Point(1, 2), new Point(3, 4), new Point(4, 5));
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
ClassWithGeoPolygon result = converter.read(ClassWithGeoPolygon.class, dbo);
assertThat(result, is(notNullValue()));
assertThat(result.polygon, is(object.polygon));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldWriteEntityWithGeoCircleCorrectly() {
ClassWithGeoCircle object = new ClassWithGeoCircle();
Circle circle = new Circle(new Point(1, 2), 3);
Distance radius = circle.getRadius();
object.circle = circle;
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("circle"), is(instanceOf(DBObject.class)));
assertThat(
dbo.get("circle"),
is((Object) new BasicDBObject("center", new BasicDBObject("x", circle.getCenter().getX()).append("y", circle
.getCenter().getY())).append("radius", radius.getNormalizedValue()).append("metric",
radius.getMetric().toString())));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldReadEntityWithGeoCircleCorrectly() {
ClassWithGeoCircle object = new ClassWithGeoCircle();
object.circle = new Circle(new Point(1, 2), 3);
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
ClassWithGeoCircle result = converter.read(ClassWithGeoCircle.class, dbo);
assertThat(result, is(notNullValue()));
assertThat(result.circle, is(object.circle));
}
/**
* @see DATAMONGO-858
*/
@Test
@SuppressWarnings("deprecation")
public void shouldWriteEntityWithGeoLegacyCircleCorrectly() {
ClassWithGeoLegacyCircle object = new ClassWithGeoLegacyCircle();
org.springframework.data.mongodb.core.geo.Circle circle = new org.springframework.data.mongodb.core.geo.Circle(
new Point(1, 2), 3);
object.circle = circle;
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("circle"), is(instanceOf(DBObject.class)));
assertThat(dbo.get("circle"), is((Object) new BasicDBObject("center", new BasicDBObject("x", circle.getCenter()
.getX()).append("y", circle.getCenter().getY())).append("radius", circle.getRadius())));
}
/**
* @see DATAMONGO-858
*/
@Test
@SuppressWarnings("deprecation")
public void shouldReadEntityWithGeoLegacyCircleCorrectly() {
ClassWithGeoLegacyCircle object = new ClassWithGeoLegacyCircle();
object.circle = new org.springframework.data.mongodb.core.geo.Circle(new Point(1, 2), 3);
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
ClassWithGeoLegacyCircle result = converter.read(ClassWithGeoLegacyCircle.class, dbo);
assertThat(result, is(notNullValue()));
assertThat(result.circle, is(object.circle));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldWriteEntityWithGeoSphereCorrectly() {
ClassWithGeoSphere object = new ClassWithGeoSphere();
Sphere sphere = new Sphere(new Point(1, 2), 3);
Distance radius = sphere.getRadius();
object.sphere = sphere;
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("sphere"), is(instanceOf(DBObject.class)));
assertThat(
dbo.get("sphere"),
is((Object) new BasicDBObject("center", new BasicDBObject("x", sphere.getCenter().getX()).append("y", sphere
.getCenter().getY())).append("radius", radius.getNormalizedValue()).append("metric",
radius.getMetric().toString())));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldWriteEntityWithGeoSphereWithMetricDistanceCorrectly() {
ClassWithGeoSphere object = new ClassWithGeoSphere();
Sphere sphere = new Sphere(new Point(1, 2), new Distance(3, Metrics.KILOMETERS));
Distance radius = sphere.getRadius();
object.sphere = sphere;
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("sphere"), is(instanceOf(DBObject.class)));
assertThat(
dbo.get("sphere"),
is((Object) new BasicDBObject("center", new BasicDBObject("x", sphere.getCenter().getX()).append("y", sphere
.getCenter().getY())).append("radius", radius.getNormalizedValue()).append("metric",
radius.getMetric().toString())));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldReadEntityWithGeoSphereCorrectly() {
ClassWithGeoSphere object = new ClassWithGeoSphere();
object.sphere = new Sphere(new Point(1, 2), 3);
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
ClassWithGeoSphere result = converter.read(ClassWithGeoSphere.class, dbo);
assertThat(result, is(notNullValue()));
assertThat(result.sphere, is(object.sphere));
}
/**
* @see DATAMONGO-858
*/
@Test
public void shouldWriteEntityWithGeoShapeCorrectly() {
ClassWithGeoShape object = new ClassWithGeoShape();
Sphere sphere = new Sphere(new Point(1, 2), 3);
Distance radius = sphere.getRadius();
object.shape = sphere;
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
assertThat(dbo, is(notNullValue()));
assertThat(dbo.get("shape"), is(instanceOf(DBObject.class)));
assertThat(
dbo.get("shape"),
is((Object) new BasicDBObject("center", new BasicDBObject("x", sphere.getCenter().getX()).append("y", sphere
.getCenter().getY())).append("radius", radius.getNormalizedValue()).append("metric",
radius.getMetric().toString())));
}
/**
* @see DATAMONGO-858
*/
@Test
@Ignore
public void shouldReadEntityWithGeoShapeCorrectly() {
ClassWithGeoShape object = new ClassWithGeoShape();
Sphere sphere = new Sphere(new Point(1, 2), 3);
object.shape = sphere;
DBObject dbo = new BasicDBObject();
converter.write(object, dbo);
ClassWithGeoShape result = converter.read(ClassWithGeoShape.class, dbo);
assertThat(result, is(notNullValue()));
assertThat(result.shape, is((Shape) sphere));
}
static class GenericType<T> {
T content;
}
@@ -1389,6 +1814,8 @@ public class MappingMongoConverterUnitTests {
SampleEnum sampleEnum;
List<SampleEnum> enums;
EnumSet<SampleEnum> enumSet;
EnumMap<SampleEnum, String> enumMap;
}
static enum SampleEnum {
@@ -1446,6 +1873,7 @@ public class MappingMongoConverterUnitTests {
Map<String, Object> mapOfObjects;
Map<String, String[]> mapOfStrings;
Map<String, Person> mapOfPersons;
TreeMap<String, Person> treeMapOfPersons;
}
static class ClassWithNestedMaps {
@@ -1593,4 +2021,35 @@ public class MappingMongoConverterUnitTests {
return m_property;
}
}
class ClassWithGeoBox {
Box box;
}
class ClassWithGeoCircle {
Circle circle;
}
@SuppressWarnings("deprecation")
class ClassWithGeoLegacyCircle {
org.springframework.data.mongodb.core.geo.Circle circle;
}
class ClassWithGeoSphere {
Sphere sphere;
}
class ClassWithGeoPolygon {
Polygon polygon;
}
class ClassWithGeoShape {
Shape shape;
}
}

Some files were not shown because too many files have changed in this diff.