Compare commits


75 Commits

Author SHA1 Message Date
Oliver Gierke
9cd33763b6 DATAMONGO-357 - Prepare 1.0.0 release.
Updated changelog, changed reference documentation inclusion links to point to SD Commons 1.0.0.RELEASE documentation. Updated dependency information in reference documentation.
2011-12-22 21:04:43 +01:00
Oliver Gierke
de8d2a1c74 DATAMONGO-355 - Upgraded to Spring 3.0.7. 2011-12-22 20:40:26 +01:00
Oliver Gierke
9f940cd2b6 DATAMONGO-257 - Documented TypeMapper abstraction. 2011-12-22 16:23:15 +01:00
Oliver Gierke
0d69baa32c DATAMONGO-350 - Upgraded to Querydsl 2.3.0.
Adapted changes in annotation processor API.
2011-12-22 08:27:11 +01:00
Oliver Gierke
433e5a660e DATAMONGO-260 - Fixed setup of ConversionService.
Removed the removal of the generic ObjectToStringConverter, as it does not break our converter lookup due to the changed algorithm for involving Spring Converters in the conversion process.
2011-12-22 08:25:48 +01:00
Oliver Gierke
ee33ce1571 DATAMONGO-336 - Fixed potential NullPointerException in MongoTemplate.
The execution of MongoTemplate.geoNear(…) potentially caused NullPointerExceptions when the actual query did not return any results. In that case the wrapping return object returns null for the result list and the general statistics, which we did not shield against.
2011-12-21 18:47:50 +01:00
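The DATAMONGO-336 fix above boils down to defensive handling of the driver's nullable results. A minimal sketch of that pattern (class and method names here are hypothetical, not the actual MongoTemplate code):

```java
import java.util.Collections;
import java.util.List;

// Hypothetical sketch of shielding against a null result list, in the spirit
// of the DATAMONGO-336 fix for MongoTemplate.geoNear(…).
public class GeoNearResults {

	// The raw driver response may legitimately be null when nothing matched.
	public static <T> List<T> shield(List<T> rawResults) {
		return rawResults == null ? Collections.<T> emptyList() : rawResults;
	}

	public static void main(String[] args) {
		List<String> safe = shield(null);
		System.out.println(safe.size()); // prints 0 instead of throwing an NPE
	}
}
```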
Mark Pollack
9ac11e967a DATAMONGO-260 - MapReduce fails when used with Long as key-type. 2011-12-21 12:34:12 -05:00
Thomas Risberg
a89a0ac542 DATAMONGO-139 Adding test to verify that MongoTemplate does not eagerly try to connect to MongoDB 2011-12-20 12:44:10 -05:00
Oliver Gierke
0ad0dad124 DATAMONGO-330 - Updated reference documentation regarding custom converters.
Documented classpath scanning feature of custom converters (DATAMONGO-301). Documented converter disambiguation using @ReadingConverter, @WritingConverter (DATACMNS-113, DATAMONGO-342). Fixed some code formatting on the way.
2011-12-20 18:20:59 +01:00
Oliver Gierke
8f2771416e DATACMNS-110 - Narrow repository base-package reference to avoid picking up types from SD Commons. 2011-12-20 16:39:48 +01:00
Oliver Gierke
7af009cc7f DATAMONGO-347 - Added documentation to mention query parameter binding does not work with DBRefs. 2011-12-19 15:12:25 +01:00
Oliver Gierke
35ad949c18 DATAMONGO-326 - Fixed enum handling in $in, $nin and $ne criteria.
Updated Criteria implementation to consistently store collections instead of arrays internally.
2011-12-19 14:22:52 +01:00
Oliver Gierke
8206b5f950 DATAMONGO-342 - Introduced support for @ReadingConverter / @WritingConverter.
CustomConversions now evaluates @ReadingConverter / @WritingConverter when adding Converter implementations. See DATACMNS-113 and the appropriate commit for details. Added unit test to verify StringToBigIntegerConverter does not get added as writing converter.
2011-12-16 14:23:31 +01:00
Oliver Gierke
170081137a DATAMONGO-346 - Fixed id-handling in queries.
In case a query referencing an entity's id needs massaging before being executed (e.g. Strings that can be ObjectIds), the massaging failed if the query had already been built using _id, as we did not detect that key as an id reference because we only compared it to the entity's id property name. We now also compare against its field name.
2011-12-14 20:31:08 +01:00
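The DATAMONGO-346 fix can be sketched as a two-way key check. This is an illustrative stand-in, not the actual QueryMapper code:

```java
// Hypothetical sketch of the DATAMONGO-346 fix: a query key counts as an id
// reference if it matches either the entity's id property name or its mapped
// field name ("_id"), not just the property name alone.
public class IdKeyMatcher {

	public static boolean isIdReference(String queryKey, String idPropertyName, String idFieldName) {
		return queryKey.equals(idPropertyName) || queryKey.equals(idFieldName);
	}

	public static void main(String[] args) {
		// Before the fix only the property name was detected; queries built
		// with "_id" directly were missed.
		System.out.println(isIdReference("_id", "id", "_id")); // true
		System.out.println(isIdReference("id", "id", "_id")); // true
		System.out.println(isIdReference("name", "id", "_id")); // false
	}
}
```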
Oliver Gierke
6862bd8a45 DATAMONGO-343 - Fixed registration of ServerAddressPropertyEditor.
Changed the setter parameters for ServerAddresses to use arrays instead of List. We now register the ServerAddressPropertyEditor to convert a ServerAddress[] and thus don't register a PropertyEditor for List which caused unwanted side effects before.
2011-12-12 18:03:27 +01:00
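The point of the DATAMONGO-343 change is that registering a PropertyEditor for an array type does not affect conversion of the generic List type container-wide. A hypothetical stand-in using String[] instead of ServerAddress[]:

```java
import java.beans.PropertyEditorSupport;

// Hypothetical stand-in for ServerAddressPropertyEditor (DATAMONGO-343): an
// editor registered for an *array* type, so no editor for the generic List
// type leaks into the container and causes side effects elsewhere.
public class HostArrayEditor extends PropertyEditorSupport {

	@Override
	public void setAsText(String text) {
		// "host1:27017, host2:27018" -> { "host1:27017", "host2:27018" }
		setValue(text.split("\\s*,\\s*"));
	}

	public static void main(String[] args) {
		HostArrayEditor editor = new HostArrayEditor();
		editor.setAsText("host1:27017, host2:27018");
		String[] hosts = (String[]) editor.getValue();
		System.out.println(hosts.length); // 2
	}
}
```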
Oliver Gierke
490db7c39f DATAMONGO-341 - Eagerly reject null values in MongoTemplate.geoNear(…). 2011-12-07 18:06:25 +01:00
Oliver Gierke
f3979c3676 DATAMONGO-340 - Prepare 1.0.0.BUILD-SNAPSHOT. 2011-12-07 01:10:16 +01:00
Oliver Gierke
8d18729898 DATAMONGO-340 - Prepare 1.0.0.RC1. 2011-12-07 01:09:58 +01:00
Oliver Gierke
b5e0b2bec2 DATAMONGO-340 - Polished reference documentation.
Added section ids to generate stable URLs for HTML documentation.
2011-12-07 01:09:58 +01:00
Oliver Gierke
9eb47827c1 DATAMONGO-337 - Added Criteria.nin(…) and ….all(…) taking a Collection. 2011-12-07 01:09:57 +01:00
Oliver Gierke
f97ab25411 DATAMONGO-338 - Updated reference documentation regarding new keywords. 2011-12-07 01:09:57 +01:00
Oliver Gierke
6616761f50 DATAMONGO-322 - MongoTemplate refuses to save entities with an unset id that is not auto-generatable.
If an entity is handed into the template to be saved or inserted we now check that the auto-generated ObjectId can actually be applied to the id property after saving the object.
2011-12-07 01:09:57 +01:00
Mark Pollack
89de566893 Add findAndModify to docs, update test to include findAndModify with upsert 2011-12-06 13:04:24 -05:00
Mark Pollack
ea1f090b40 Add docs for index ops and clean up 'bare' references to Mongo, change to MongoDB 2011-12-06 11:09:23 -05:00
Oliver Gierke
b5958fb5cc DATAMONGO-338 - Query parser implementation for Regex, Exists, True and False keywords. 2011-12-06 17:01:56 +01:00
Oliver Gierke
75b7aff80a DATAMONGO-318 - Don't throw exceptions for updates not affecting any documents.
Throwing an exception if an update does not affect any documents doesn't make sense in all cases. Removed throwing an exception by default but made the relevant method (handleAnyWriteResultErrors(…)) protected so that subclasses might override this behavior.
2011-12-06 15:15:13 +01:00
Oliver Gierke
7da0fcdd0c DATAMONGO-199 - Fixed bug in CachingMongoPersistentProperty. 2011-12-06 14:48:25 +01:00
Oliver Gierke
c88b6d89db DATAMONGO-251 - Polishing.
JavaDoc, Formatting. Made dependencies in DefaultIndexOperations final. Reduced dependency to MongoOperations instead of depending on MongoTemplate directly. Added not-null assertion to constructor of DIO.
2011-12-06 14:33:45 +01:00
Oliver Gierke
de1540aadc DATAMONGO-234 - Polishing.
Removed unused imports, corrected whitespace, formatting.
2011-12-06 14:24:51 +01:00
Oliver Gierke
d1b24d6cfb DATAMONGO-332 - Updated reference documentation to list correct dependencies.
Fixed formatting of log output along the way.
2011-12-06 14:06:59 +01:00
Mark Pollack
e85f3d3299 DATAMONGO-251 - Support getting index information on a collection or mapped class. 2011-12-06 02:25:13 -05:00
Mark Pollack
ef6e08d3f4 DATAMONGO-234 - MongoTemplate should support the findAndModify operation to update version fields 2011-12-06 00:26:18 -05:00
Oliver Gierke
21010fbd49 DATACMNS-91 - Reject null parameters in SimpleMongoRepository.
According to the specification in CrudRepository we now reject null values for ids and entities in CRUD methods.
2011-12-02 13:34:44 +01:00
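The DATACMNS-91 contract is fail-fast null rejection: an IllegalArgumentException at the method boundary instead of an obscure NullPointerException later. A minimal sketch; the repository and entity types here are hypothetical placeholders, not the actual SimpleMongoRepository:

```java
// Sketch of the null-rejection contract from DATACMNS-91 as applied to
// SimpleMongoRepository: CRUD methods reject null ids and entities eagerly.
public class NullRejectingRepository {

	public String findOne(String id) {
		if (id == null) {
			throw new IllegalArgumentException("The given id must not be null!");
		}
		return "entity-" + id;
	}

	public static void main(String[] args) {
		NullRejectingRepository repo = new NullRejectingRepository();
		System.out.println(repo.findOne("42")); // entity-42
		try {
			repo.findOne(null);
		} catch (IllegalArgumentException e) {
			System.out.println("rejected: " + e.getMessage());
		}
	}
}
```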
Oliver Gierke
4325d6c9fa Reactivated accidentally disabled unit tests. 2011-12-02 13:02:57 +01:00
Oliver Gierke
bc16ccfded DATACMNS-77 - Using constants from ClassTypeInformation inside MappingMongoConverter. 2011-12-02 11:33:37 +01:00
Oliver Gierke
04f5f9f662 DATACMNS-103 - Adapt changes in BeanWrapper.
Removed obsolete exception handling code.
2011-12-02 10:08:48 +01:00
Oliver Gierke
b1f1b8efaa DATAMONGO-321 - Overhaul of id handling.
Cleaned up the id handling on query mapping and mapping in general. We now only try to convert id values into an ObjectId and otherwise store them as-is, using potentially registered custom converters. BigInteger<->String converters are now registered by default.
2011-12-01 17:50:36 +01:00
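Why a BigInteger<->String converter pair fits id handling: a MongoDB ObjectId is a 24-character hex string, which round-trips through BigInteger. An illustrative sketch, not the actual converter implementation (a real converter would also have to handle leading-zero padding, which this sketch ignores):

```java
import java.math.BigInteger;

// Illustration of the BigInteger<->String conversion registered by default
// in the DATAMONGO-321 overhaul: ObjectId hex strings map to BigIntegers.
public class BigIntegerIdConversion {

	public static BigInteger read(String hexId) {
		return new BigInteger(hexId, 16);
	}

	public static String write(BigInteger id) {
		return id.toString(16);
	}

	public static void main(String[] args) {
		// Sample id taken from the log4j appender docs shown further down.
		String objectId = "4d89341a8ef397e06940d5cd";
		System.out.println(write(read(objectId)).equals(objectId)); // true
	}
}
```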
Oliver Gierke
de300e2643 DATAMONGO-328 - Set required MongoDB version to 0.
The MANIFEST.MF in the current MongoDB driver version is broken in that it does not state package versions. Thus we unfortunately cannot refer to a particular version range but have to use the generic 0 as the required version.
2011-12-01 17:05:28 +01:00
Oliver Gierke
20088b83d9 Removed compiler warnings. 2011-12-01 16:47:40 +01:00
Oliver Gierke
58f200f15e DATAMONGO-335 - Set up hybrid Spring 3.1/3.0.6 build.
Also see DATACMNS-103.
2011-12-01 16:06:38 +01:00
Oliver Gierke
8718700249 DATAMONGO-334 - Switched to use http://repo.springsource.org as repository.
Fixed versions of build plugins along the way.
2011-12-01 16:05:08 +01:00
Oliver Gierke
f4063d1679 DATAMONGO-333 - Default to Object for AbstractMongoEventListener domain type.
In case an extension of AbstractMongoEventListener does not define a type parameter, we now default to Object as the handled domain type, as we would otherwise cause a NullPointerException.
2011-12-01 12:16:27 +01:00
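The DATAMONGO-333 fallback can be sketched with plain reflection: resolve the listener's generic domain type and default to Object when a subclass does not parameterize it. AbstractListener below is a hypothetical stand-in for AbstractMongoEventListener:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

// Sketch of the DATAMONGO-333 fallback: resolve the generic domain type of a
// listener subclass, defaulting to Object for raw (unparameterized) subclasses.
public class GenericDefaulting {

	static abstract class AbstractListener<T> {
		Class<?> getDomainType() {
			Type superType = getClass().getGenericSuperclass();
			if (superType instanceof ParameterizedType) {
				Type arg = ((ParameterizedType) superType).getActualTypeArguments()[0];
				if (arg instanceof Class) {
					return (Class<?>) arg;
				}
			}
			return Object.class; // previously this case led to a NullPointerException
		}
	}

	static class TypedListener extends AbstractListener<String> {}

	@SuppressWarnings("rawtypes")
	static class RawListener extends AbstractListener {}

	public static void main(String[] args) {
		System.out.println(new TypedListener().getDomainType()); // class java.lang.String
		System.out.println(new RawListener().getDomainType()); // class java.lang.Object
	}
}
```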
Oliver Gierke
ef063613c7 DATAMONGO-325 - MongoTemplate now correctly rejects map-reduce JavaScript files that cannot be found.
We now check whether a URL was passed in as map and/or reduce function and throw an exception in case the file either does not exist or cannot be read.
2011-11-30 22:53:56 +01:00
Oliver Gierke
2eda0f1701 DATAMONGO-185 - Expose hints on Query.
Query now exposes a withHint(…) method which will be applied to the DBCursor on query execution. Reduced CursorPreparer's visibility to the package and removed methods exposing it from MongoOperations.
2011-11-30 22:29:59 +01:00
Oliver Gierke
ec7b65e21d DATAMONGO-331 - Fixed typo in WriteConcern enumeration for db-factory element. 2011-11-30 18:27:33 +01:00
Oliver Gierke
c7f7571f3f DATAMONGO-326 - QueryMapper now delegates type conversion to MongoConverter.
QueryMapper now delegates to a MongoConverter instead of a plain ConversionService and invokes optional conversion on it. This optional conversion now removes type information from the created DBObject.
2011-11-30 17:56:44 +01:00
Oliver Gierke
9f71af42e8 DATAMONGO-329 - Fixed Collection and Map value handling for more open properties.
The decision whether a property value was handled as Collection or Map was based on inspecting the property's type which failed for classes using very open property declarations such as:

class MyClass {

  Object something;
}

We now rather inspect the value type instead of the property.
2011-11-30 16:20:25 +01:00
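The DATAMONGO-329 change boils down to dispatching on the value's runtime type rather than the declared property type. A self-contained sketch of that decision (not the actual MappingMongoConverter code):

```java
import java.util.Collection;
import java.util.List;
import java.util.Map;

// Sketch of the DATAMONGO-329 approach: decide Collection-vs-Map handling
// from the value's runtime type, so an "Object something" property whose
// declared type reveals nothing still maps correctly.
public class ValueTypeInspection {

	public static String classify(Object value) {
		if (value instanceof Collection || (value != null && value.getClass().isArray())) {
			return "collection";
		}
		if (value instanceof Map) {
			return "map";
		}
		return "simple-or-entity";
	}

	public static void main(String[] args) {
		Object something = List.of("a", "b"); // declared as Object, like MyClass.something
		System.out.println(classify(something)); // collection
		System.out.println(classify(Map.of("k", "v"))); // map
		System.out.println(classify("plain")); // simple-or-entity
	}
}
```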
Oliver Gierke
92775170e1 DATAMONGO-301 - Allow classpath-scanning for Converters.
<mongo:custom-conversions /> now has a base-package attribute that scans for Converter and GenericConverter beans. Added <tool:exports /> metadata for MappingMongoConverter.
2011-11-30 15:26:18 +01:00
Oliver Gierke
4c7e338770 Adapt refactorings in SD Commons. 2011-11-28 18:07:33 +01:00
Oliver Gierke
e3fff52d17 DATAMONGO-298 - CustomConversions now also considers sub-types of Number as simple.
CustomConversions now delegates to MongoSimpleTypes.HOLDER.isSimpleType(…) instead of maintaining an additional list of Mongo-primitive types. Added DBObject to the list of Mongo-primitive types.
2011-11-24 15:20:49 +01:00
Oliver Gierke
5477ab20b2 DATAMONGO-324 - Added shortcut in MappingMongoConverter to allow reading DBObjects without conversion.
Added check in MappingMongoConverter.read(…) to shortcut object conversion if the requested type is DBObject.
2011-11-24 12:59:47 +01:00
Oliver Gierke
4cf3567f42 DATAMONGO-310 - MappingMongoConverter now creates native Mongo types for Maps and Collections in convertToMongoType(…).
MappingMongoConverter.convertToMongoType(…) not only converts elements of collections and maps but also converts the wrapper into the appropriate MongoDB type (BasicDBList, BasicDBObject).
2011-11-23 13:15:50 +01:00
Oliver Gierke
b26bb62a63 DATAMONGO-305 - Removed synchronization from Query class.
As Query is not intended to be thread-safe at all, we can safely remove the synchronized blocks from sort() and fields().
2011-11-23 12:48:39 +01:00
Oliver Gierke
f156d7b5af DATAMONGO-312 - MappingMongoConverter handles complex enum types correctly.
If an enum implements abstract methods, the Class object obtained from a constant's getClass() does not return true for ….isEnum(). Thus we rather have to check Enum.class.isAssignableFrom(…), as this catches that scenario as well. Also see DATACMNS-99 for a related fix in simple type handling in the core infrastructure.
2011-11-23 11:58:09 +01:00
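The DATAMONGO-312 observation is reproducible with plain Java: a constant with a class body is compiled to an anonymous subclass, for which isEnum() is false while Enum.class.isAssignableFrom(…) still holds:

```java
// Demonstrates the DATAMONGO-312 issue: for an enum constant with a body
// (abstract method implementation), getClass().isEnum() returns false,
// while Enum.class.isAssignableFrom(…) still detects the enum.
public class EnumDetection {

	enum Operation {
		PLUS {
			int apply(int a, int b) {
				return a + b;
			}
		};

		abstract int apply(int a, int b);
	}

	public static void main(String[] args) {
		Class<?> type = Operation.PLUS.getClass(); // anonymous subclass Operation$1
		System.out.println(type.isEnum()); // false
		System.out.println(Enum.class.isAssignableFrom(type)); // true
	}
}
```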
Oliver Gierke
7bf3643902 DATAMONGO-304 - Removed document subpackage from Log4jAppender module. 2011-11-23 11:07:47 +01:00
Oliver Gierke
201ae3e92d Polished unit test. 2011-11-23 10:58:35 +01:00
Oliver Gierke
07556ec58c DATAMONGO-323 - Annotated repository queries consider dynamic sort now.
A Sort parameter handed into a repository query method is now also applied to string-based (i.e. @Query-annotated) queries.
2011-11-23 10:58:22 +01:00
Oliver Gierke
39807b17e1 DATAMONGO-309 - MappingMongoConverter now correctly maps Arrays as Map values.
We now not only convert collection values of Maps into BasicDBLists but arrays as well.
2011-11-23 10:28:40 +01:00
Oliver Gierke
4913fe26ac DATAMONGO-296 - Hook into Querydsl serialization to get predicate parameters converted.
Overrode MongoDbSerializer.asDBObject(…) to delegate to our MongoConverter to potentially convert predicate parameters. Upgraded to Querydsl 2.2.5.
2011-11-23 10:15:46 +01:00
Oliver Gierke
7642a719ff Polishing.
Removed unused imports, removed compiler warnings, polished JavaDoc.
2011-11-23 09:26:19 +01:00
Oliver Gierke
6c1ce576a4 DATACMNS-98 - Reflect refactoring.
MongoQueryMethod now uses RepositoryMetadata.getReturnedDomainType(…) instead of the static method of ClassUtils.
2011-11-21 19:21:22 +01:00
Mark Pollack
c99882201d DATAMONGO-234 - MongoTemplate should support the findAndModify operation to update version fields 2011-11-17 17:38:54 -05:00
Mark Pollack
ce6a64e4a9 DATAMONGO-308 - Add support for upsert methods 2011-11-16 16:25:58 -05:00
Mark Pollack
2fcc323bcd DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods 2011-11-16 14:55:35 -05:00
Mark Pollack
17c7b1d2b5 DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods 2011-11-16 14:30:56 -05:00
Mark Pollack
cfefe46cd4 DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods
DATAMONGO-320 - Remove use of slaveOk boolean option in MongoTemplate as it is deprecated. Replace with ReadPreference
2011-11-16 13:28:13 -05:00
Mark Pollack
64921ddad1 DATAMONGO-319 - WriteConcern not parsed correctly in namespace handlers
DATAMONGO-311 - Update MongoDB driver to v 2.7.x
2011-11-15 16:54:02 -05:00
Mark Pollack
edda1764fe DATAMONGO-311 - Update MongoDB driver to v 2.7.x
Still investigating write_concern compatibility as mentioned in the ticket
2011-11-15 12:44:25 -05:00
Mark Pollack
8113b79109 DATAMONGO-315 - MongoTemplate.findOne(query) methods ignore SortOrder on query 2011-11-14 23:26:16 -05:00
Mark Pollack
9fde4dff3e DATAMONGO-195 - Add description of @Field mapping annotation to reference docs 2011-11-14 22:53:08 -05:00
Mark Pollack
d4b3e2b99d DATAMONGO-306 - NullPointerException if mongo factory created via URI without credentials 2011-11-14 22:48:07 -05:00
Mark Pollack
68a31d75f3 DATAMONGO-313 [Refactoring] - Use MongoOperations interface instead of MongoTemplate class 2011-11-14 22:20:50 -05:00
Mark Pollack
3e15c21419 DATAMONGO-208 - Add support for group() operation on collection in MongoOperations 2011-11-14 22:08:29 -05:00
Mark Pollack
e9f253d34f DATAMONGO-316 - Replica Set configuration via properties file throws ArrayIndexOutOfBoundsException 2011-11-14 16:38:18 -05:00
Thomas Risberg
80aa057acb Preparing for snapshot builds 2011-11-04 09:24:45 -04:00
103 changed files with 4679 additions and 1063 deletions

pom.xml

@@ -3,9 +3,9 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongo-dist</artifactId>
<artifactId>spring-data-mongodb-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.0.0.M5</version>
<version>1.0.0.RELEASE</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>
@@ -272,7 +272,7 @@
<pluginRepository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
<url>http://repo.springsource.org/release</url>
</pluginRepository>
</pluginRepositories>
@@ -282,13 +282,13 @@
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-document/snapshot-site/
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/docs/${project.version}
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
<id>spring-release</id>
<name>Spring Release Repository</name>
<url>s3://maven.springframework.org/release</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>


@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<version>1.0.0.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
@@ -23,7 +23,7 @@
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>


@@ -5,7 +5,7 @@ and connects directly to the MongoDB server using the driver. It has no dependen
To use it, configure a host, port, (optionally) applicationId, and database property in your Log4J configuration:
log4j.appender.stdout=org.springframework.data.document.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost
@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.document.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {


@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<version>1.0.0.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>


@@ -14,7 +14,7 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.log4j;
package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
import java.util.Arrays;


@@ -14,7 +14,7 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.log4j;
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;


@@ -1,6 +1,6 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.springframework.data.document.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost


@@ -6,7 +6,7 @@
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.0.M5</version>
<version>1.0.0.RELEASE</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
@@ -16,8 +16,10 @@
<org.mockito.version>1.8.4</org.mockito.version>
<org.slf4j.version>1.5.10</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version>3.0.6.RELEASE</org.springframework.version>
<data.commons.version>1.2.0.M2</data.commons.version>
<org.springframework.version.30>3.0.7.RELEASE</org.springframework.version.30>
<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
<data.commons.version>1.2.0.RELEASE</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<profiles>
@@ -63,13 +65,13 @@
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/mongodb/docs/${project.version}
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-mongodb/docs/${project.version}
</url>
</site>
<repository>
<id>spring-milestone</id>
<name>Spring Milestone Repository</name>
<url>s3://maven.springframework.org/milestone</url>
<id>spring-release</id>
<name>Spring Release Repository</name>
<url>s3://maven.springframework.org/release</url>
</repository>
<snapshotRepository>
<id>spring-snapshot</id>
@@ -92,42 +94,42 @@
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aop</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
<scope>test</scope>
</dependency>
@@ -289,6 +291,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
@@ -308,19 +311,18 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/Abstract*.java</exclude>
</excludes>
<junitArtifactName>junit:junit</junitArtifactName>
</configuration>
</plugin>
<plugin>
<artifactId>maven-source-plugin</artifactId>
<version>2.1.2</version>
<executions>
<execution>
<id>attach-sources</id>
@@ -375,24 +377,14 @@
<pluginRepository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
<url>http://repo.springsource.org/release</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
</repository>
<repository>
<id>repository.springframework.maven.milestone</id>
<name>Spring Framework Maven Milestone Repository</name>
<url>http://maven.springframework.org/milestone</url>
</repository>
<repository>
<id>repository.springframework.maven.snapshot</id>
<name>Spring Framework Maven Snapshot Repository</name>
<url>http://maven.springframework.org/snapshot</url>
<url>http://repo.springsource.org/release</url>
</repository>
</repositories>
<reporting>


@@ -5,15 +5,15 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<version>1.0.0.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.6.5</mongo.version>
<querydsl.version>2.2.4</querydsl.version>
<mongo.version>2.7.1</mongo.version>
<querydsl.version>2.3.0</querydsl.version>
</properties>
<dependencies>


@@ -18,6 +18,9 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
@@ -36,13 +39,20 @@ import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.type.classreading.MetadataReader;
import org.springframework.core.type.classreading.MetadataReaderFactory;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -133,15 +143,29 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
if (customConvertersElements.size() == 1) {
Element customerConvertersElement = customConvertersElements.get(0);
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<BeanMetadataElement>();
List<Element> converterElements = DomUtils.getChildElementsByTagName(customerConvertersElement, "converter");
if (converterElements != null) {
for (Element listenerElement : converterElements) {
converterBeans.add(parseConverter(listenerElement, parserContext));
}
}
// Scan for Converter and GenericConverter beans in the given base-package
String packageToScan = customerConvertersElement.getAttribute(BASE_PACKAGE);
if (StringUtils.hasText(packageToScan)) {
ClassPathScanningCandidateComponentProvider provider = new ClassPathScanningCandidateComponentProvider(true);
provider.addExcludeFilter(new NegatingFilter(new AssignableTypeFilter(Converter.class), new AssignableTypeFilter(
GenericConverter.class)));
for (BeanDefinition candidate : provider.findCandidateComponents(packageToScan)) {
converterBeans.add(candidate);
}
}
BeanDefinitionBuilder conversionsBuilder = BeanDefinitionBuilder.rootBeanDefinition(CustomConversions.class);
conversionsBuilder.addConstructorArgValue(converterBeans);
@@ -194,4 +218,39 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
"Element <converter> must specify 'ref' or contain a bean definition for the converter", element);
return null;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*
* @author Oliver Gierke
*/
private static class NegatingFilter implements TypeFilter {
private final Set<TypeFilter> delegates;
/**
* Creates a new {@link NegatingFilter} with the given delegates.
*
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {
Assert.notNull(filters);
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
}
/*
* (non-Javadoc)
* @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
*/
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory) throws IOException {
for (TypeFilter delegate : delegates) {
if (delegate.match(metadataReader, metadataReaderFactory)) {
return false;
}
}
return true;
}
}
}


@@ -56,7 +56,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
@@ -65,20 +65,21 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
ParsingUtils.setPropertyValue(element, dbFactoryBuilder, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if(StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", parserContext.extractSource(element));
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
// Defaulting
mongoRef = StringUtils.hasText(mongoRef) ? mongoRef : registerMongoBeanDefinition(element, parserContext);
dbname = StringUtils.hasText(dbname) ? dbname : "db";
dbFactoryBuilder.addConstructorArgValue(new RuntimeBeanReference(mongoRef));
dbFactoryBuilder.addConstructorArgValue(dbname);
@@ -86,6 +87,8 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
ParsingUtils.registerWriteConcernPropertyEditor(parserContext.getRegistry());
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
@@ -128,7 +131,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return getSourceBeanDefinition(userCredentialsBuilder, context, element);
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
*
@@ -136,10 +139,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
* @return
*/
private BeanDefinition getMongoUri(String uri) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
}
}


@@ -16,9 +16,15 @@
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoFactoryBean;
@@ -26,7 +32,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for &lt;mongo&gt; definitions. If no name
* Parser for &lt;mongo&gt; definitions.
*
* @author Mark Pollack
*/
@@ -39,15 +45,35 @@ public class MongoParser extends AbstractSingleBeanDefinitionParser {
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
super.doParse(element, builder);
ParsingUtils.setPropertyValue(element, builder, "port", "port");
ParsingUtils.setPropertyValue(element, builder, "host", "host");
ParsingUtils.setPropertyValue(element, builder, "write-concern", "writeConcern");
ParsingUtils.parseMongoOptions(parserContext, element, builder);
ParsingUtils.parseReplicaSet(parserContext, element, builder);
ParsingUtils.parseMongoOptions(element, builder);
ParsingUtils.parseReplicaSet(element, builder);
registerServerAddressPropertyEditor(parserContext.getRegistry());
ParsingUtils.registerWriteConcernPropertyEditor(parserContext.getRegistry());
}
/**
* Ideally only one bean definition would be registered here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while also registering a 'default' property editor with the
* container as a side effect.
*
* @param registry the BeanDefinitionRegistry to register the ServerAddressPropertyEditor with
*/
private void registerServerAddressPropertyEditor(BeanDefinitionRegistry registry) {
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder
.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
@Override


@@ -16,18 +16,22 @@
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
import com.mongodb.ServerAddress;
abstract class ParsingUtils {
/**
@@ -38,22 +42,11 @@ abstract class ParsingUtils {
* @param mongoBuilder the bean definition builder to populate
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseReplicaSet(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
static boolean parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
String replicaSetString = element.getAttribute("replica-set");
if (StringUtils.hasText(replicaSetString)) {
ManagedList<Object> serverAddresses = new ManagedList<Object>();
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
for (String element2 : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(element2, ":");
BeanDefinitionBuilder defBuilder = BeanDefinitionBuilder.genericBeanDefinition(ServerAddress.class);
defBuilder.addConstructorArgValue(hostAndPort[0]);
defBuilder.addConstructorArgValue(hostAndPort[1]);
serverAddresses.add(defBuilder.getBeanDefinition());
}
if (!serverAddresses.isEmpty()) {
mongoBuilder.addPropertyValue("replicaSetSeeds", serverAddresses);
}
mongoBuilder.addPropertyValue("replicaSetSeeds", replicaSetString);
}
return true;
@@ -64,7 +57,7 @@ abstract class ParsingUtils {
*
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
@@ -122,9 +115,27 @@ abstract class ParsingUtils {
* @param element must not be {@literal null}.
* @return
*/
static AbstractBeanDefinition getSourceBeanDefinition(BeanDefinitionBuilder builder, ParserContext context, Element element) {
static AbstractBeanDefinition getSourceBeanDefinition(BeanDefinitionBuilder builder, ParserContext context,
Element element) {
AbstractBeanDefinition definition = builder.getBeanDefinition();
definition.setSource(context.extractSource(element));
return definition;
}
/**
* Registers a {@link WriteConcernPropertyEditor} in the given {@link BeanDefinitionRegistry}.
*
* @param registry must not be {@literal null}.
*/
static void registerWriteConcernPropertyEditor(BeanDefinitionRegistry registry) {
Assert.notNull(registry);
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder
.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
}


@@ -0,0 +1,58 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.net.UnknownHostException;
import org.springframework.util.StringUtils;
import com.mongodb.ServerAddress;
/**
* Parse a {@link String} to a {@link ServerAddress} array. The format is host1:port1,host2:port2,host3:port3.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String replicaSetString) {
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
ServerAddress[] serverAddresses = new ServerAddress[replicaSetStringArray.length];
for (int i = 0; i < replicaSetStringArray.length; i++) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(replicaSetStringArray[i], ":");
try {
serverAddresses[i] = new ServerAddress(hostAndPort[0], Integer.parseInt(hostAndPort[1]));
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Could not parse port " + hostAndPort[1], e);
} catch (UnknownHostException e) {
throw new IllegalArgumentException("Could not parse host " + hostAndPort[0], e);
}
}
setValue(serverAddresses);
}
}
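The parsing in ServerAddressPropertyEditor can be exercised without the Mongo driver on the classpath. The sketch below substitutes `java.net.InetSocketAddress` for `ServerAddress` (a stand-in, not the actual driver class) but follows the same comma- and colon-splitting logic:

```java
import java.beans.PropertyEditorSupport;
import java.net.InetSocketAddress;

// Stand-in for ServerAddressPropertyEditor: parses "host1:port1,host2:port2"
// into an InetSocketAddress[] so the logic runs with only the JDK.
public class AddressArrayEditor extends PropertyEditorSupport {

	@Override
	public void setAsText(String text) {
		String[] entries = text.split(",");
		InetSocketAddress[] addresses = new InetSocketAddress[entries.length];
		for (int i = 0; i < entries.length; i++) {
			String[] hostAndPort = entries[i].trim().split(":");
			// createUnresolved avoids a DNS lookup while parsing
			addresses[i] = InetSocketAddress.createUnresolved(
					hostAndPort[0], Integer.parseInt(hostAndPort[1]));
		}
		setValue(addresses);
	}
}
```

Like the real editor, a malformed port falls out as a `NumberFormatException`; the production class wraps that into an `IllegalArgumentException` with a friendlier message.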


@@ -0,0 +1,48 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.WriteConcern;
/**
* Parses a string into a {@link WriteConcern}. If the value is a well-known {@link String}, as identified by
* {@link WriteConcern#valueOf(String)}, the corresponding well-known {@link WriteConcern} is used; otherwise the
* string is passed as-is to the {@link WriteConcern} constructor. No other constructor signatures are supported
* when parsing from a string value.
*
* @author Mark Pollack
*/
public class WriteConcernPropertyEditor extends PropertyEditorSupport {
/**
 * Parses the given string into a {@link WriteConcern}.
 */
@Override
public void setAsText(String writeConcernString) {
WriteConcern writeConcern = WriteConcern.valueOf(writeConcernString);
if (writeConcern != null) {
// have a well known string
setValue(writeConcern);
} else {
// pass on the string to the constructor
setValue(new WriteConcern(writeConcernString));
}
}
}
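The resolution strategy above (try a well-known name first, fall back to a pass-through constructor otherwise) can be sketched generically. The names and values below are illustrative placeholders, not the driver's actual constants:

```java
import java.util.Map;

// Generic "named constant, else pass-through" resolution, mirroring the
// WriteConcern.valueOf(...) lookup followed by the constructor fallback.
public class NamedValueResolver {

	// Hypothetical well-known names; the real editor delegates this
	// lookup to WriteConcern.valueOf(String).
	private static final Map<String, String> WELL_KNOWN = Map.of(
			"SAFE", "w=1",
			"NONE", "w=-1",
			"NORMAL", "w=0");

	public static String resolve(String text) {
		String known = WELL_KNOWN.get(text);      // step 1: valueOf-style lookup
		return known != null ? known : "custom:" + text; // step 2: constructor-style fallback
	}
}
```

The fallback branch is what lets users configure write concerns the driver has no named constant for.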


@@ -22,7 +22,7 @@ import com.mongodb.DBCursor;
*
* @author Oliver Gierke
*/
public interface CursorPreparer {
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.


@@ -0,0 +1,162 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* Default implementation of {@link IndexOperations}.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class DefaultIndexOperations implements IndexOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(collectionName, "Collection name must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.dropIndex(name);
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.resetIndexCache();
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
}
@SuppressWarnings("unchecked")
private List<IndexInfo> getIndexData(List<DBObject> dbObjectList) {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
Map<String, Order> keyOrderMap = new LinkedHashMap<String, Order>();
DBObject keyDbObject = (DBObject) ix.get("key");
Iterator<?> entries = keyDbObject.toMap().entrySet().iterator();
while (entries.hasNext()) {
Entry<Object, Integer> thisEntry = (Entry<Object, Integer>) entries.next();
String key = thisEntry.getKey().toString();
int value = thisEntry.getValue();
if (value == 1) {
keyOrderMap.put(key, Order.ASCENDING);
} else {
keyOrderMap.put(key, Order.DESCENDING);
}
}
String name = ix.get("name").toString();
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(keyOrderMap, name, unique, dropDuplicates, sparse));
}
return indexInfoList;
}
});
}
}
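The key-ordering translation inside getIndexData (a value of 1 maps to ASCENDING, anything else to DESCENDING) can be isolated as follows, using plain maps in place of DBObject:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class IndexKeyOrder {

	enum Order { ASCENDING, DESCENDING }

	// Mirrors the loop in DefaultIndexOperations.getIndexData: a raw index
	// key document of {field: 1 | -1} becomes an insertion-ordered map of
	// Order values, preserving the compound-index field order.
	static Map<String, Order> toOrderMap(Map<String, Integer> keyDocument) {
		Map<String, Order> result = new LinkedHashMap<>();
		for (Map.Entry<String, Integer> entry : keyDocument.entrySet()) {
			result.put(entry.getKey(),
					entry.getValue() == 1 ? Order.ASCENDING : Order.DESCENDING);
		}
		return result;
	}
}
```

A `LinkedHashMap` matters here: MongoDB compound indexes are order-sensitive, so a plain `HashMap` would lose information.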


@@ -0,0 +1,64 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
public class FindAndModifyOptions {
boolean returnNew;
boolean upsert;
boolean remove;
/**
* Static factory method to create a FindAndModifyOptions instance
*
* @return a new instance
*/
public static FindAndModifyOptions options() {
return new FindAndModifyOptions();
}
public FindAndModifyOptions returnNew(boolean returnNew) {
this.returnNew = returnNew;
return this;
}
public FindAndModifyOptions upsert(boolean upsert) {
this.upsert = upsert;
return this;
}
public FindAndModifyOptions remove(boolean remove) {
this.remove = remove;
return this;
}
public boolean isReturnNew() {
return returnNew;
}
public boolean isUpsert() {
return upsert;
}
public boolean isRemove() {
return remove;
}
}
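Because every mutator returns `this`, the options compose as a single chained expression. A compact, stand-alone copy of the class above (trimmed to two flags) shows the intended usage:

```java
// Compact copy of the FindAndModifyOptions shown above, runnable on its own
// to demonstrate the fluent, chainable style.
public class FindAndModifyOptions {

	boolean returnNew;
	boolean upsert;

	public static FindAndModifyOptions options() {
		return new FindAndModifyOptions();
	}

	public FindAndModifyOptions returnNew(boolean returnNew) {
		this.returnNew = returnNew;
		return this; // returning this enables chaining
	}

	public FindAndModifyOptions upsert(boolean upsert) {
		this.upsert = upsert;
		return this;
	}

	public boolean isReturnNew() {
		return returnNew;
	}

	public boolean isUpsert() {
		return upsert;
	}
}
```

Typical call-site usage then reads as one expression: `FindAndModifyOptions.options().returnNew(true).upsert(true)`.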


@@ -0,0 +1,62 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface IndexOperations {
/**
* Ensure that an index for the provided {@link IndexDefinition} exists for the collection indicated by the entity
* class. If not, it will be created.
*
* @param indexDefinition must not be {@literal null}.
*/
void ensureIndex(IndexDefinition indexDefinition);
/**
* Drops an index from this collection.
*
* @param name name of index to drop
*/
void dropIndex(String name);
/**
* Drops all indices from this collection.
*/
void dropAllIndexes();
/**
* Clears all indices that have not yet been applied to this collection.
*/
void resetIndexCache();
/**
* Returns the index information on the collection.
*
* @return index information on the collection
*/
List<IndexInfo> getIndexInfo();
}


@@ -0,0 +1,96 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Represents an action taken against the collection. Used by {@link WriteConcernResolver} to determine a custom
* WriteConcern based on this information.
*
* The collectionName and defaultWriteConcern properties are always non-null. The entityClass is null only for
* {@link MongoActionOperation#INSERT_LIST}.
*
* INSERT and SAVE have a null query, REMOVE has a null document, and INSERT_LIST has a null entityClass, document
* and query.
*
* @author Mark Pollack
*
*/
public class MongoAction {
private String collectionName;
private WriteConcern defaultWriteConcern;
private Class<?> entityClass;
private MongoActionOperation mongoActionOperation;
private DBObject query;
private DBObject document;
/**
* Create an instance of a MongoAction
* @param defaultWriteConcern the default write concern
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name
* @param entityClass the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityClass, DBObject document, DBObject query) {
super();
this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation;
this.collectionName = collectionName;
this.entityClass = entityClass;
this.query = query;
this.document = document;
}
public String getCollectionName() {
return collectionName;
}
public WriteConcern getDefaultWriteConcern() {
return defaultWriteConcern;
}
public Class<?> getEntityClass() {
return entityClass;
}
public MongoActionOperation getMongoActionOperation() {
return mongoActionOperation;
}
public DBObject getQuery() {
return query;
}
public DBObject getDocument() {
return document;
}
}


@@ -13,21 +13,21 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.TypeMapper;
package org.springframework.data.mongodb.core;
/**
* Interfaces for components being able to provide a {@link TypeMapper}.
* Enumeration for operations on a collection. Used with {@link MongoAction} to help determine the
* WriteConcern to use for a given mutating operation
*
* @author Oliver Gierke
* @author Mark Pollack
* @see MongoAction
*
*/
public interface TypeKeyAware {
/**
* Returns the {@link TypeMapper}.
*
* @return the {@link TypeMapper} or {@literal null} if none available.
*/
boolean isTypeKey(String key);
public enum MongoActionOperation {
REMOVE,
UPDATE,
INSERT,
INSERT_LIST,
SAVE
}


@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import org.apache.commons.logging.Log;
@@ -24,6 +25,7 @@ import org.springframework.beans.factory.FactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
import com.mongodb.ServerAddress;
@@ -57,12 +59,12 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, PersistenceExceptio
this.mongoOptions = mongoOptions;
}
public void setReplicaSetSeeds(List<ServerAddress> replicaSetSeeds) {
this.replicaSetSeeds = replicaSetSeeds;
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = Arrays.asList(replicaSetSeeds);
}
public void setReplicaPair(List<ServerAddress> replicaPair) {
this.replicaPair = replicaPair;
public void setReplicaPair(ServerAddress[] replicaPair) {
this.replicaPair = Arrays.asList(replicaPair);
}
public void setHost(String host) {


@@ -19,21 +19,24 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import com.mongodb.CommandResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteResult;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
@@ -48,6 +51,7 @@ public interface MongoOperations {
/**
* The collection name used for the specified class by this template.
*
* @param entityClass must not be {@literal null}.
* @return
*/
String getCollectionName(Class<?> entityClass);
@@ -68,7 +72,7 @@ public interface MongoOperations {
* @param command a MongoDB command
*/
CommandResult executeCommand(DBObject command);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
@@ -77,7 +81,7 @@ public interface MongoOperations {
* @param options query options to use
*/
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
@@ -87,18 +91,6 @@ public interface MongoOperations {
* @param dch the handler that will extract results, one document at a time
*/
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler using the
* provided CursorPreparer.
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
* skips and so on).
*/
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer);
/**
* Executes a {@link DbCallback} translating any exceptions as necessary.
@@ -237,6 +229,20 @@ public interface MongoOperations {
*/
void dropCollection(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
* @return index operations on the named collection associated with the given entity class
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -265,11 +271,39 @@ public interface MongoOperations {
* @return the converted collection
*/
<T> List<T> findAll(Class<T> entityClass, String collectionName);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object, which takes into account the initial document structure as well as any finalize functions.
*
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parameterized type of the returned list
* @return The results of the group operation
*/
<T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object, which takes into account the initial document
* structure as well as any finalize functions.
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified all rows
* are considered.
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parameterized type of the returned list
* @return The results of the group operation
*/
<T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -277,11 +311,12 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
/**
* Execute a map-reduce operation that takes additional map-reduce options.
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -289,12 +324,13 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction, MapReduceOptions mapReduceOptions, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
MapReduceOptions mapReduceOptions, Class<T> entityClass);
/**
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of INLINE
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of
* INLINE
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -303,10 +339,12 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
/**
* Execute a map-reduce operation that takes a query and additional map-reduce options
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -315,7 +353,8 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction, MapReduceOptions mapReduceOptions, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
MapReduceOptions mapReduceOptions, Class<T> entityClass);
/**
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}. Will consider entity mapping
@@ -338,23 +377,6 @@ public interface MongoOperations {
*/
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass, String collectionName);
/**
* Ensure that an index for the provided {@link IndexDefinition} exists for the collection indicated by the entity
* class. If not it will be created.
*
* @param indexDefinition
* @param entityClass class that determines the collection to use
*/
void ensureIndex(IndexDefinition indexDefinition, Class<?> entityClass);
/**
* Ensure that an index for the provided {@link IndexDefinition} exists. If not it will be created.
*
* @param index
* @param collectionName
*/
void ensureIndex(IndexDefinition indexDefinition, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type.
@@ -425,26 +447,6 @@ public interface MongoOperations {
*/
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
* @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
* skips and so on).
* @param collectionName name of the collection to retrieve the objects from
*
* @return the List of converted objects.
*/
<T> List<T> find(Query query, Class<T> entityClass, CursorPreparer preparer, String collectionName);
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
@@ -468,6 +470,15 @@ public interface MongoOperations {
*/
<T> T findById(Object id, Class<T> entityClass, String collectionName);
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
@@ -512,7 +523,7 @@ public interface MongoOperations {
* @return
*/
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
*
@@ -521,7 +532,7 @@ public interface MongoOperations {
* @return
*/
long count(Query query, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
@@ -612,6 +623,29 @@ public interface MongoOperations {
*/
void save(Object objectToSave, String collectionName);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult upsert(Query query, Update update, Class<?> entityClass);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult upsert(Query query, Update update, String collectionName);
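As the javadoc above describes, an upsert either updates a matching document or inserts a new one built by combining the query document with the update document. A hedged in-memory sketch of that rule in plain Java (a `List` of maps stands in for the collection; this is an illustration of the semantics, not the driver's implementation):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class UpsertSketch {

    // If a document matches the query, apply the update to it;
    // otherwise insert a new document built from query + update.
    static Map<String, Object> upsert(List<Map<String, Object>> store,
                                      Map<String, Object> query,
                                      Map<String, Object> update) {
        for (Map<String, Object> doc : store) {
            if (doc.entrySet().containsAll(query.entrySet())) {
                doc.putAll(update);              // matched: update in place
                return doc;
            }
        }
        Map<String, Object> created = new LinkedHashMap<>(query);
        created.putAll(update);                  // no match: combine query + update
        store.add(created);
        return created;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> store = new ArrayList<>();
        Map<String, Object> query = new LinkedHashMap<>();
        query.put("name", "spring");
        Map<String, Object> update = new LinkedHashMap<>();
        update.put("stars", 42);
        upsert(store, query, update);            // inserts
        upsert(store, query, update);            // second call updates, no duplicate
        System.out.println(store);
    }
}
```

Running the same upsert twice leaves a single document, which is the property that distinguishes it from a plain insert.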
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
@@ -620,6 +654,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateFirst(Query query, Update update, Class<?> entityClass);
@@ -631,6 +666,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateFirst(Query query, Update update, String collectionName);
@@ -642,6 +678,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateMulti(Query query, Update update, Class<?> entityClass);
@@ -653,6 +690,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateMulti(Query query, Update update, String collectionName);
@@ -662,7 +700,7 @@ public interface MongoOperations {
* @param object
*/
void remove(Object object);
/**
* Removes the given object from the given collection.
*


@@ -96,6 +96,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
*
* Defaults to false
*/
@SuppressWarnings("deprecation")
private boolean slaveOk = MONGO_OPTIONS.slaveOk;
/**
@@ -204,6 +205,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
this.slaveOk = slaveOk;
}
@SuppressWarnings("deprecation")
public void afterPropertiesSet() {
MONGO_OPTIONS.connectionsPerHost = connectionsPerHost;
MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;


@@ -2,6 +2,8 @@
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +21,7 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Collection;
@@ -30,20 +33,6 @@ import java.util.Map;
import java.util.Scanner;
import java.util.Set;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
@@ -72,27 +61,46 @@ import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.jca.cci.core.ConnectionCallback;
import org.springframework.util.Assert;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
/**
* Primary implementation of {@link MongoOperations}.
*
@@ -122,16 +130,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
private WriteConcern writeConcern = null;
private WriteConcernResolver writeConcernResolver = new DefaultWriteConcernResolver();
/*
* WriteResultChecking to be used for write operations if it has been
* specified. Otherwise we should not do any checking.
*/
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
/*
* Flag used to indicate use of slaveOk() for any operations on collections.
/**
* Set the ReadPreference when operating on a collection. See {@link #prepareCollection(DBCollection)}
*/
private boolean slaveOk = false;
private ReadPreference readPreference = null;
private final MongoConverter mongoConverter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -181,11 +191,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param mongoConverter
*/
public MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter) {
Assert.notNull(mongoDbFactory);
this.mongoDbFactory = mongoDbFactory;
this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
this.mapper = new QueryMapper(this.mongoConverter.getConversionService());
this.mapper = new QueryMapper(this.mongoConverter);
// We always have a mapping context in the converter, whether it's a simple one or not
mappingContext = this.mongoConverter.getMappingContext();
@@ -220,12 +231,22 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Configures the {@link WriteConcernResolver} to be used with the template.
*
* @param writeConcernResolver
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver;
}
/**
* Used by {@link #prepareCollection(DBCollection)} to set the {@link ReadPreference} before any operations are
* performed.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
@@ -293,16 +314,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
executeQuery(query, collectionName, dch, null);
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer) {
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a
* {@link DocumentCallbackHandler} using the provided CursorPreparer.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification, must not be {@literal null}.
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param preparer allows for customization of the {@link DBCursor} used when iterating over the result set, (apply
* limits, skips and so on).
*/
protected void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer) {
Assert.notNull(query);
DBObject queryObject = query.getQueryObject();
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + queryObject + " fields: " + fieldsObject + " in collection: "
+ collectionName);
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
}
@@ -399,24 +436,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
});
}
// Indexing methods
public void ensureIndex(IndexDefinition indexDefinition, Class<?> entityClass) {
ensureIndex(indexDefinition, determineCollectionName(entityClass));
public IndexOperations indexOps(String collectionName) {
return new DefaultIndexOperations(this, collectionName);
}
public void ensureIndex(final IndexDefinition indexDefinition, String collectionName) {
execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
}
return null;
}
});
public IndexOperations indexOps(Class<?> entityClass) {
return new DefaultIndexOperations(this, determineCollectionName(entityClass));
}
// Find methods that take a Query to express the query and that return a single object.
@@ -426,7 +451,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> T findOne(Query query, Class<T> entityClass, String collectionName) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass);
if (query.getSortObject() == null) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass);
} else {
query.limit(1);
List<T> results = find(query, entityClass, collectionName);
return (results.isEmpty() ? null : results.get(0));
}
}
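The new `findOne` branch above falls back to a "sort, limit to one, take the first or null" strategy whenever a sort is present, because a plain `findOne` cannot honor ordering. A small plain-Java sketch of that fallback logic (integers stand in for mapped documents; this mirrors the control flow, not the Mongo query machinery):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class FindOneSketch {

    // When a sort is requested: sort, limit the result set to one element,
    // and return the single element or null for an empty result set.
    static Integer findOneSorted(List<Integer> candidates) {
        List<Integer> sorted = new ArrayList<>(candidates);
        Collections.sort(sorted);                        // analogous to query.getSortObject()
        List<Integer> limited = sorted.isEmpty()
                ? sorted
                : sorted.subList(0, 1);                  // analogous to query.limit(1)
        return limited.isEmpty() ? null : limited.get(0);
    }

    public static void main(String[] args) {
        System.out.println(findOneSorted(Arrays.asList(3, 1, 2)));
        System.out.println(findOneSorted(Collections.<Integer>emptyList()));
    }
}
```

The empty-result branch is what shields callers from an `IndexOutOfBoundsException` when no document matches.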
// Find methods that take a Query to express the query and that return a List of objects.
@@ -436,36 +467,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> List<T> find(final Query query, Class<T> entityClass, String collectionName) {
CursorPreparer cursorPreparer = null;
if (query.getSkip() > 0 || query.getLimit() > 0 || query.getSortObject() != null) {
cursorPreparer = new CursorPreparer() {
public DBCursor prepare(DBCursor cursor) {
DBCursor cursorToUse = cursor;
try {
if (query.getSkip() > 0) {
cursorToUse = cursorToUse.skip(query.getSkip());
}
if (query.getLimit() > 0) {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
return cursorToUse;
}
};
}
CursorPreparer cursorPreparer = query == null ? null : new QueryCursorPreparer(query);
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass, cursorPreparer);
}
public <T> List<T> find(Query query, Class<T> entityClass, CursorPreparer preparer, String collectionName) {
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass, preparer);
}
public <T> T findById(Object id, Class<T> entityClass) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityClass);
return findById(id, entityClass, persistentEntity.getCollection());
@@ -482,14 +487,25 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return geoNear(near, entityClass, determineCollectionName(entityClass));
}
@SuppressWarnings("unchecked")
public <T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass, String collectionName) {
if (near == null) {
throw new InvalidDataAccessApiUsageException("NearQuery must not be null!");
}
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException("Entity class must not be null!");
}
String collection = StringUtils.hasText(collectionName) ? collectionName : determineCollectionName(entityClass);
BasicDBObject command = new BasicDBObject("geoNear", collection);
command.putAll(near.toDBObject());
CommandResult commandResult = executeCommand(command);
BasicDBList results = (BasicDBList) commandResult.get("results");
List<Object> results = (List<Object>) commandResult.get("results");
results = results == null ? Collections.emptyList() : results;
DbObjectCallback<GeoResult<T>> callback = new GeoNearResultDbObjectCallback<T>(new ReadDbObjectCallback<T>(
mongoConverter, entityClass), near.getMetric());
List<GeoResult<T>> result = new ArrayList<GeoResult<T>>(results.size());
@@ -498,10 +514,29 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
result.add(callback.doWith((DBObject) element));
}
double averageDistance = (Double) ((DBObject) commandResult.get("stats")).get("avgDistance");
DBObject stats = (DBObject) commandResult.get("stats");
double averageDistance = stats == null ? 0 : (Double) stats.get("avgDistance");
return new GeoResults<T>(result, new Distance(averageDistance, near.getMetric()));
}
public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, determineCollectionName(entityClass));
}
public <T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, collectionName);
}
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass) {
return findAndModify(query, update, options, entityClass, determineCollectionName(entityClass));
}
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName) {
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass, update, options);
}
// Find methods that take a Query to express the query and that return a single object that is also removed from the
// collection in the database.
@@ -522,13 +557,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public long count(final Query query, String collectionName) {
return count(query, null, collectionName);
}
private long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : mapper.getMappedObject(query.getQueryObject(),
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
return execute(collectionName, new CollectionCallback<Long>() {
public Long doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.count(dbObject);
@@ -569,8 +604,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param collection
*/
protected void prepareCollection(DBCollection collection) {
if (this.slaveOk) {
collection.slaveOk();
if (this.readPreference != null) {
collection.setReadPreference(readPreference);
}
}
@@ -581,18 +616,21 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param writeConcern any WriteConcern already configured or null
* @return The prepared WriteConcern or null
*/
protected WriteConcern prepareWriteConcern(WriteConcern writeConcern) {
return writeConcern;
protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
return writeConcernResolver.resolve(mongoAction);
}
protected <T> void doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = insertDBObject(collectionName, dbDoc);
Object id = insertDBObject(collectionName, dbDoc, objectToSave.getClass());
populateIdIfNecessary(objectToSave, id);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
@@ -670,25 +708,30 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doSave(String collectionName, T objectToSave, MongoWriter<T> writer) {
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = saveDBObject(collectionName, dbDoc);
Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());
populateIdIfNecessary(objectToSave, id);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
}
protected Object insertDBObject(String collectionName, final DBObject dbDoc) {
protected Object insertDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("insert DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insert(dbDoc);
} else {
@@ -699,7 +742,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
});
}
protected List<ObjectId> insertDBObjectList(String collectionName, final List<DBObject> dbDocList) {
protected List<ObjectId> insertDBObjectList(final String collectionName, final List<DBObject> dbDocList) {
if (dbDocList.isEmpty()) {
return Collections.emptyList();
}
@@ -709,7 +752,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insert(dbDocList);
} else {
@@ -732,13 +777,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return ids;
}
protected Object saveDBObject(String collectionName, final DBObject dbDoc) {
protected Object saveDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("save DBObject containing fields: " + dbDoc.keySet());
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.save(dbDoc);
} else {
@@ -749,6 +796,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
});
}
public WriteResult upsert(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, true, false);
}
public WriteResult upsert(Query query, Update update, String collectionName) {
return doUpdate(collectionName, query, update, null, true, false);
}
public WriteResult updateFirst(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, false);
}
@@ -787,13 +842,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
WriteResult wr;
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
entityClass, updateObj, queryObj);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
if (multi) {
wr = collection.updateMulti(queryObj, updateObj);
} else {
wr = collection.update(queryObj, updateObj);
}
wr = collection.update(queryObj, updateObj, upsert, multi);
} else {
wr = collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
}
@@ -811,18 +864,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
remove(getIdQueryFor(object), object.getClass());
}
public void remove(Object object, String collection) {
Assert.hasText(collection);
if (object == null) {
return;
}
remove(getIdQueryFor(object), collection);
}
/**
* Returns a {@link Query} for the given entity by its id.
*
@@ -830,9 +883,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @return
*/
private Query getIdQueryFor(Object object) {
Assert.notNull(object);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProp = entity.getIdProperty();
@@ -843,15 +896,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
ConversionService service = mongoConverter.getConversionService();
Object idProperty = null;
try {
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
return new Query(where(idProp.getFieldName()).is(idProperty));
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
return new Query(where(idProp.getFieldName()).is(idProperty));
}
private void assertUpdateableIdIfNotSet(Object entity) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entity.getClass());
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
if (idProperty == null) {
return;
}
ConversionService service = mongoConverter.getConversionService();
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class, true);
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
throw new InvalidDataAccessApiUsageException(String.format(
"Cannot autogenerate id of type %s for entity of type %s!", idProperty.getType().getName(), entity.getClass()
.getName()));
}
}
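`assertUpdateableIdIfNotSet` rejects entities whose id is unset unless the id type is one MongoDB can auto-generate. A hedged stand-alone sketch of that guard (the set of auto-generatable types here is an assumption standing in for `MongoSimpleTypes.AUTOGENERATED_ID_TYPES`; `ObjectId` is omitted only because the driver is not on the classpath of this sketch):

```java
import java.math.BigInteger;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class IdGuardSketch {

    // Assumed stand-in for MongoSimpleTypes.AUTOGENERATED_ID_TYPES
    // (ObjectId would be included in the real set).
    static final Set<Class<?>> AUTOGENERATED =
            new HashSet<>(Arrays.asList(String.class, BigInteger.class));

    // Null id is only acceptable when the database can generate one.
    static void assertUpdateableId(Object idValue, Class<?> idType) {
        if (idValue == null && !AUTOGENERATED.contains(idType)) {
            throw new IllegalStateException(
                    "Cannot autogenerate id of type " + idType.getName());
        }
    }

    public static void main(String[] args) {
        assertUpdateableId(null, String.class);   // fine: auto-generatable
        assertUpdateableId(42L, Long.class);      // fine: id already set
    }
}
```

A null `Long` id, by contrast, would trip the guard, matching the `InvalidDataAccessApiUsageException` raised in the template.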
@@ -860,7 +924,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
doRemove(determineCollectionName(entityClass), query, entityClass);
}
protected <T> void doRemove(String collectionName, final Query query, Class<T> entityClass) {
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
}
@@ -870,7 +934,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject dboq = mapper.getMappedObject(queryObject, entity);
WriteResult wr = null;
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
entityClass, null, queryObject);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("remove using query: " + queryObject + " in collection: " + collection.getName());
}
@@ -959,21 +1025,98 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MapReduceResults<T> mapReduceResult = new MapReduceResults<T>(mappedResults, commandResult);
return mapReduceResult;
}
public <T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass) {
return group(null, inputCollectionName, groupBy, entityClass);
}
public <T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy,
Class<T> entityClass) {
DBObject dbo = groupBy.getGroupByObject();
dbo.put("ns", inputCollectionName);
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", criteria.getCriteriaObject());
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
if (dbo.containsField("initial")) {
Object initialObj = dbo.get("initial");
if (initialObj instanceof String) {
String initialAsString = replaceWithResourceIfNecessary((String) initialObj);
dbo.put("initial", JSON.parse(initialAsString));
}
}
if (dbo.containsField("$reduce")) {
dbo.put("$reduce", replaceWithResourceIfNecessary(dbo.get("$reduce").toString()));
}
if (dbo.containsField("$keyf")) {
dbo.put("$keyf", replaceWithResourceIfNecessary(dbo.get("$keyf").toString()));
}
if (dbo.containsField("finalize")) {
dbo.put("finalize", replaceWithResourceIfNecessary(dbo.get("finalize").toString()));
}
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing Group with DBObject [" + commandObject.toString() + "]");
}
CommandResult commandResult = null;
try {
commandResult = executeCommand(commandObject, getDb().getOptions());
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
}
@SuppressWarnings("unchecked")
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("retval");
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
for (DBObject dbObject : resultSet) {
mappedResults.add(callback.doWith(dbObject));
}
GroupByResults<T> groupByResult = new GroupByResults<T>(mappedResults, commandResult);
return groupByResult;
}
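For orientation, the `group` command document that the method above assembles has a fixed shape. A minimal sketch using plain `Map`s in place of the driver's `DBObject` (the field names `ns`, `cond`, `initial` and `$reduce` come from the code above; the class and method names here are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class GroupCommandSketch {

    // Builds the body of a MongoDB "group" command, mirroring the fields the
    // template populates: collection name, optional condition, initial seed
    // document and JavaScript reduce function, wrapped as { group: { ... } }.
    public static Map<String, Object> groupCommand(String collection, Map<String, Object> cond,
            Map<String, Object> initial, String reduce) {
        Map<String, Object> dbo = new LinkedHashMap<>();
        dbo.put("ns", collection);  // target collection
        dbo.put("cond", cond);      // may be null, as in the code above
        dbo.put("initial", initial);
        dbo.put("$reduce", reduce);
        Map<String, Object> command = new LinkedHashMap<>();
        command.put("group", dbo);
        return command;
    }
}
```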
protected String replaceWithResourceIfNecessary(String function) {
String func = function;
if (this.resourceLoader != null && ResourceUtils.isUrl(function)) {
Resource functionResource = resourceLoader.getResource(func);
if (!functionResource.exists()) {
throw new InvalidDataAccessApiUsageException(String.format("Resource %s not found!", function));
}
try {
return new Scanner(functionResource.getInputStream()).useDelimiter("\\A").next();
} catch (IOException e) {
throw new InvalidDataAccessApiUsageException(String.format("Cannot read map-reduce file %s!", function), e);
}
}
return func;
}
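The core decision in `replaceWithResourceIfNecessary` is whether the function string is a resource location or inline JavaScript. Spring's `ResourceUtils.isUrl` accepts `classpath:` pseudo-URLs as well as regular URLs; a rough self-contained approximation of that check (the helper name is hypothetical):

```java
public class ScriptSourceSketch {

    // Mirrors the gate the method above applies: only strings that look like
    // a URL (including "classpath:" pseudo-URLs) are resolved as resources;
    // anything else is assumed to be embedded JavaScript and kept verbatim.
    public static boolean looksLikeResourceLocation(String function) {
        if (function == null) {
            return false;
        }
        if (function.startsWith("classpath:")) {
            return true;
        }
        try {
            new java.net.URL(function); // throws for non-URL strings
            return true;
        } catch (java.net.MalformedURLException e) {
            return false;
        }
    }
}
```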
@@ -1178,6 +1321,31 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
protected <T> T doFindAndModify(String collectionName, DBObject query, DBObject fields, DBObject sort,
Class<T> entityClass, Update update, FindAndModifyOptions options) {
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (options == null) {
options = new FindAndModifyOptions();
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject updateObj = update.getUpdateObject();
for (String key : updateObj.keySet()) {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " and update: " + updateObj + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mapper.getMappedObject(query, entity), fields, sort,
updateObj, options), new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
/**
* Populates the id property of the saved object, if it's not set already.
*
@@ -1316,7 +1484,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return null;
}
String determineCollectionName(Class<?> entityClass) {
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException(
@@ -1334,30 +1502,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Checks and handles any errors.
* <p/>
* Current implementation logs errors. Future version may make this configurable to log warning, errors or throw
* exception.
*/
protected void handleAnyWriteResultErrors(WriteResult wr, DBObject query, String operation) {
if (WriteResultChecking.NONE == this.writeResultChecking) {
return;
}
String error = wr.getError();
int n = wr.getN();
if (error != null) {
String message = String.format("Execution of %s%s failed: %s", operation, query == null ? "" : "' using '"
+ query.toString() + "' query", error);
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.error(message);
}
} else if (n == 0) {
String message = "Execution of '" + operation + (query == null ? "" : "' using '" + query.toString() + "' query")
+ " did not succeed: 0 documents updated";
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.warn(message);
return;
}
}
}
@@ -1469,6 +1634,29 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private static class FindAndModifyCallback implements CollectionCallback<DBObject> {
private final DBObject query;
private final DBObject fields;
private final DBObject sort;
private final DBObject update;
private final FindAndModifyOptions options;
public FindAndModifyCallback(DBObject query, DBObject fields, DBObject sort, DBObject update,
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.options = options;
}
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.findAndModify(query, fields, sort, options.isRemove(), update, options.isReturnNew(),
options.isUpsert());
}
}
/**
* Simple internal callback to allow operations on a {@link DBObject}.
*
@@ -1510,6 +1698,60 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private class DefaultWriteConcernResolver implements WriteConcernResolver {
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
public QueryCursorPreparer(Query query) {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.CursorPreparer#prepare(com.mongodb.DBCursor)
*/
public DBCursor prepare(DBCursor cursor) {
if (query == null) {
return cursor;
}
if (query.getSkip() <= 0 && query.getLimit() <= 0 && query.getSortObject() == null
&& !StringUtils.hasText(query.getHint())) {
return cursor;
}
DBCursor cursorToUse = cursor;
try {
if (query.getSkip() > 0) {
cursorToUse = cursorToUse.skip(query.getSkip());
}
if (query.getLimit() > 0) {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
return cursorToUse;
}
}
/**
* {@link DbObjectCallback} that assumes a {@link GeoResult} to be created, delegates actual content unmarshalling to
* a delegate and creates a {@link GeoResult} from the result.
@@ -1543,4 +1785,5 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return new GeoResult<T>(doWith, new Distance(distance, metric));
}
}
}


@@ -25,7 +25,9 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -40,15 +42,17 @@ import com.mongodb.DBObject;
public class QueryMapper {
private final ConversionService conversionService;
private final MongoConverter converter;
/**
* Creates a new {@link QueryMapper} with the given {@link MongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public QueryMapper(MongoConverter converter) {
Assert.notNull(converter);
this.conversionService = converter.getConversionService();
this.converter = converter;
}
/**
@@ -60,28 +64,19 @@ public class QueryMapper {
* @return
*/
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
DBObject newDbo = new BasicDBObject();
for (String key : query.keySet()) {
String newKey = key;
Object value = query.get(key);
if (isIdKey(key, entity)) {
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id));
}
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
@@ -103,36 +98,48 @@ public class QueryMapper {
value = newConditions;
} else if (key.equals("$ne")) {
value = convertId(value);
} else if (value instanceof DBObject) {
newDbo.put(newKey, getMappedObject((DBObject) value, entity));
return newDbo;
}
newDbo.put(newKey, converter.convertToMongoType(value));
}
return newDbo;
}
/**
* Returns whether the given key will be considered an id key.
*
* @param key
* @param entity
* @return
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (null != entity && entity.getIdProperty() != null) {
MongoPersistentProperty idProperty = entity.getIdProperty();
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}
return Arrays.asList("id", "_id").contains(key);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link String}.
*
* @param id
* @return
*/
public Object convertId(Object id) {
try {
return conversionService.convert(id, ObjectId.class);
} catch (ConversionException e) {
// Ignore
}
return converter.convertToMongoType(id);
}
}
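The reworked `convertId` first tries an `ObjectId` conversion and falls back to the converter's general Mongo-type conversion. A self-contained approximation of that behavior (a plain string stands in for `org.bson.types.ObjectId`, which only accepts 24-character hex values; the class name is hypothetical):

```java
public class IdConversionSketch {

    // Approximates convertId after this change: a String qualifies as an
    // ObjectId only if it is a 24-character hex value; everything else goes
    // down the generic convertToMongoType path, i.e. stays as-is here.
    public static Object convertId(Object id) {
        if (id instanceof String && ((String) id).matches("[0-9a-fA-F]{24}")) {
            return "ObjectId(" + id + ")"; // stand-in for org.bson.types.ObjectId
        }
        return id; // stand-in for converter.convertToMongoType(id)
    }
}
```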


@@ -82,9 +82,8 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws UnknownHostException
* @see MongoURI
*/
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())));
}
/**
@@ -95,6 +94,10 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
public WriteConcern getWriteConcern() {
return writeConcern;
}
/*
* (non-Javadoc)
@@ -127,4 +130,12 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
public void destroy() throws Exception {
mongo.close();
}
public static String parseChars(char[] chars) {
if (chars == null) {
return null;
} else {
return String.valueOf(chars);
}
}
}


@@ -0,0 +1,37 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* A strategy interface to determine the WriteConcern to use for a given MongoAction.
*
* Implementations should return the passed-in default WriteConcern (a property on MongoAction) if no determination can be made.
*
* @author Mark Pollack
*
*/
public interface WriteConcernResolver {
/**
* Resolves the WriteConcern for the given MongoAction.
*
* @param action describes the context of the Mongo action. Contains a default WriteConcern to use if one should not be resolved.
* @return a WriteConcern based on the passed-in MongoAction value, may be null
*/
WriteConcern resolve(MongoAction action);
}
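A typical implementation of this strategy switches the write concern by collection. The sketch below uses stand-in types (a small enum and a trimmed-down action class) because the real `com.mongodb.WriteConcern` and `MongoAction` carry much more context; all names here are hypothetical:

```java
public class WriteConcernResolverSketch {

    // Stand-in for com.mongodb.WriteConcern.
    public enum Concern { NONE, SAFE, REPLICAS_SAFE }

    // Stand-in for MongoAction: just the two pieces the resolver consults.
    public static class Action {
        public final String collectionName;
        public final Concern defaultConcern;

        public Action(String collectionName, Concern defaultConcern) {
            this.collectionName = collectionName;
            this.defaultConcern = defaultConcern;
        }
    }

    // Stand-in for WriteConcernResolver.
    public interface Resolver {
        Concern resolve(Action action);
    }

    // Per-collection policy: audit data must reach replicas; everything else
    // keeps the default carried on the action, as the javadoc above advises.
    public static final Resolver PER_COLLECTION = action ->
            "auditLog".equals(action.collectionName) ? Concern.REPLICAS_SAFE : action.defaultConcern;
}
```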


@@ -17,16 +17,15 @@
package org.springframework.data.mongodb.core.convert;
import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObjectIdConverter;
/**
@@ -46,10 +45,10 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
}
/**
@@ -80,12 +79,6 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, String.class)) {
conversionService.addConverter(BigIntegerToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, BigInteger.class)) {
conversionService.addConverter(StringToBigIntegerConverter.INSTANCE);
}
conversions.registerConvertersIn(conversionService);
}


@@ -0,0 +1,115 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
/**
* Conversion registration information.
*
* @author Oliver Gierke
*/
class ConverterRegistration {
private final ConvertiblePair convertiblePair;
private final boolean reading;
private final boolean writing;
/**
* Creates a new {@link ConverterRegistration}.
*
* @param convertiblePair must not be {@literal null}.
* @param isReading whether to force to consider the converter for reading.
* @param isWriting whether to force to consider the converter for writing.
*/
public ConverterRegistration(ConvertiblePair convertiblePair, boolean isReading, boolean isWriting) {
Assert.notNull(convertiblePair);
this.convertiblePair = convertiblePair;
this.reading = isReading;
this.writing = isWriting;
}
/**
* Creates a new {@link ConverterRegistration} from the given source and target type and read/write flags.
*
* @param source the source type to be converted from, must not be {@literal null}.
* @param target the target type to be converted to, must not be {@literal null}.
* @param isReading whether to force to consider the converter for reading.
* @param isWriting whether to force to consider the converter for writing.
*/
public ConverterRegistration(Class<?> source, Class<?> target, boolean isReading, boolean isWriting) {
this(new ConvertiblePair(source, target), isReading, isWriting);
}
/**
* Returns whether the converter shall be used for writing.
*
* @return
*/
public boolean isWriting() {
return writing == true || (!reading && isSimpleTargetType());
}
/**
* Returns whether the converter shall be used for reading.
*
* @return
*/
public boolean isReading() {
return reading == true || (!writing && isSimpleSourceType());
}
/**
* Returns the actual conversion pair.
*
* @return
*/
public ConvertiblePair getConvertiblePair() {
return convertiblePair;
}
/**
* Returns whether the source type is a Mongo simple one.
*
* @return
*/
public boolean isSimpleSourceType() {
return isMongoBasicType(convertiblePair.getSourceType());
}
/**
* Returns whether the target type is a Mongo simple one.
*
* @return
*/
public boolean isSimpleTargetType() {
return isMongoBasicType(convertiblePair.getTargetType());
}
/**
* Returns whether the given type is a type that Mongo can handle basically.
*
* @param type
* @return
*/
private static boolean isMongoBasicType(Class<?> type) {
return MongoSimpleTypes.HOLDER.isSimpleType(type);
}
}
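The two accessors above encode a simple disambiguation rule: an explicit annotation wins, and in the absence of one, a converter reading *from* a Mongo simple type counts as a reading converter while one writing *to* a simple type counts as a writing converter. A self-contained restatement of that rule as plain predicates (class and method names hypothetical):

```java
public class ConverterRoleSketch {

    // reading flag wins outright; otherwise an unannotated converter whose
    // source is a Mongo simple type is considered a reading converter.
    public static boolean isReading(boolean annotatedReading, boolean annotatedWriting, boolean simpleSource) {
        return annotatedReading || (!annotatedWriting && simpleSource);
    }

    // writing flag wins outright; otherwise an unannotated converter whose
    // target is a Mongo simple type is considered a writing converter.
    public static boolean isWriting(boolean annotatedReading, boolean annotatedWriting, boolean simpleTarget) {
        return annotatedWriting || (!annotatedReading && simpleTarget);
    }
}
```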


@@ -17,13 +17,13 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Set;
import org.bson.types.ObjectId;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.core.GenericTypeResolver;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
@@ -31,14 +31,16 @@ import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Value object to capture custom conversion. That is essentially a {@link List} of converters and some additional logic
* around them. The converters essentially build up two sets of types: the Mongo basic types {@see #MONGO_TYPES}
@@ -50,9 +52,9 @@ import com.mongodb.DBObject;
*/
public class CustomConversions {
@SuppressWarnings({ "unchecked" })
private static final List<Class<?>> MONGO_TYPES = Arrays.asList(Number.class, Date.class, ObjectId.class,
String.class, DBObject.class);
private static final Log LOG = LogFactory.getLog(CustomConversions.class);
private static final String READ_CONVERTER_NOT_SIMPLE = "Registering converter from %s to %s as reading converter although it doesn't convert from a Mongo supported type! You might wanna check you annotation setup at the converter implementation.";
private static final String WRITE_CONVERTER_NOT_SIMPLE = "Registering converter from %s to %s as writing converter although it doesn't convert to a Mongo supported type! You might wanna check you annotation setup at the converter implementation.";
private final Set<ConvertiblePair> readingPairs;
private final Set<ConvertiblePair> writingPairs;
@@ -85,6 +87,8 @@ public class CustomConversions {
this.converters.add(CustomToStringConverter.INSTANCE);
this.converters.add(BigDecimalToStringConverter.INSTANCE);
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.addAll(converters);
for (Object c : this.converters) {
@@ -155,14 +159,18 @@ public class CustomConversions {
*/
private void registerConversion(Object converter) {
Class<?> type = converter.getClass();
boolean isWriting = type.isAnnotationPresent(WritingConverter.class);
boolean isReading = type.isAnnotationPresent(ReadingConverter.class);
if (converter instanceof GenericConverter) {
GenericConverter genericConverter = (GenericConverter) converter;
for (ConvertiblePair pair : genericConverter.getConvertibleTypes()) {
register(new ConverterRegistration(pair, isReading, isWriting));
}
} else if (converter instanceof Converter) {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), Converter.class);
register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
} else {
throw new IllegalArgumentException("Unsupported Converter type!");
}
@@ -174,15 +182,27 @@ public class CustomConversions {
*
* @param pair
*/
private void register(ConverterRegistration context) {
ConvertiblePair pair = context.getConvertiblePair();
if (context.isReading()) {
readingPairs.add(pair);
if (LOG.isWarnEnabled() && !context.isSimpleSourceType()) {
LOG.warn(String.format(READ_CONVERTER_NOT_SIMPLE, pair.getSourceType(), pair.getTargetType()));
}
}
if (context.isWriting()) {
writingPairs.add(pair);
customSimpleTypes.add(pair.getSourceType());
if (LOG.isWarnEnabled() && !context.isSimpleTargetType()) {
LOG.warn(String.format(WRITE_CONVERTER_NOT_SIMPLE, pair.getSourceType(), pair.getTargetType()));
}
}
}
@@ -273,16 +293,7 @@ public class CustomConversions {
return null;
}
/**
* Returns whether the given type is a type that Mongo can handle basically.
*
* @param type
* @return
*/
private static boolean isMongoBasicType(Class<?> type) {
return MONGO_TYPES.contains(type);
}
@WritingConverter
private enum CustomToStringConverter implements GenericConverter {
INSTANCE;


@@ -17,20 +17,16 @@ package org.springframework.data.mongodb.core.convert;
import java.lang.reflect.Array;
import java.lang.reflect.InvocationTargetException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -74,16 +70,7 @@ import com.mongodb.DBRef;
* @author Oliver Gierke
* @author Jon Brisbin
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
protected static final Log log = LogFactory.getLog(MappingMongoConverter.class);
@@ -101,6 +88,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
@@ -112,7 +100,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext);
this.idMapper = new QueryMapper(this);
}
/**
@@ -128,14 +116,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
mappingContext) : typeMapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.convert.EntityConverter#getMappingContext()
@@ -184,6 +164,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.hasCustomReadTarget(dbo.getClass(), rawType)) {
return conversionService.convert(dbo, rawType);
}
if (DBObject.class.isAssignableFrom(rawType)) {
return (S) dbo;
}
if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo);
@@ -266,13 +250,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
Object obj = getValueInternal(prop, dbo, spelCtx, prop.getSpelExpression());
wrapper.setProperty(prop, obj, useFieldAccessOnly);
}
});
@@ -338,12 +316,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (Map.class.isAssignableFrom(obj.getClass())) {
writeMapInternal((Map<Object, Object>) obj, dbo, ClassTypeInformation.MAP);
return;
}
if (Collection.class.isAssignableFrom(obj.getClass())) {
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) dbo);
return;
}
@@ -367,46 +345,24 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
// Write the ID
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
try {
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {
}
}
// Write the properties
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
if (prop.equals(idProperty)) {
return;
}
Object propertyObj = wrapper.getProperty(prop, prop.getType(), useFieldAccessOnly);
if (null != propertyObj) {
if (!conversions.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(propertyObj, dbo, prop);
@@ -421,14 +377,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Class<?> type = inverseProp.getType();
Object propertyObj = wrapper.getProperty(inverseProp, type, useFieldAccessOnly);
if (null != propertyObj) {
writePropertyInternal(propertyObj, dbo, inverseProp);
}
@@ -444,16 +393,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
String name = prop.getFieldName();
TypeInformation<?> valueType = ClassTypeInformation.from(obj.getClass());
if (valueType.isCollectionLike()) {
DBObject collectionInternal = createCollection(asCollection(obj), prop);
dbo.put(name, collectionInternal);
return;
}
TypeInformation<?> type = prop.getTypeInformation();
if (valueType.isMap()) {
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj, type);
dbo.put(name, mapDbObj);
@@ -589,12 +538,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
String simpleKey = key.toString();
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection || val.getClass().isArray()) {
dbo.put(simpleKey,
writeCollectionInternal(asCollection(val), propertyType.getMapValueType(), new BasicDBList()));
} else {
DBObject newDbo = new BasicDBObject();
writeInternal(val, newDbo, propertyType);
TypeInformation<?> valueTypeInfo = propertyType.isMap() ? propertyType.getMapValueType() : ClassTypeInformation.OBJECT;
writeInternal(val, newDbo, valueTypeInfo);
dbo.put(simpleKey, newDbo);
}
} else {
@@ -653,7 +603,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (customTarget != null) {
return conversionService.convert(value, customTarget);
} else {
return value.getClass().isEnum() ? ((Enum<?>) value).name() : value;
return Enum.class.isAssignableFrom(value.getClass()) ? ((Enum<?>) value).name() : value;
}
}
@@ -676,7 +626,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return conversionService.convert(value, target);
}
if (target.isEnum()) {
if (Enum.class.isAssignableFrom(target)) {
return Enum.valueOf((Class<Enum>) target, value.toString());
}
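The switch from `isEnum()` to `Enum.class.isAssignableFrom(...)` in the hunks above matters for enum constants that declare a body: such constants compile to anonymous subclasses, for which `getClass().isEnum()` returns `false`. A minimal sketch (the `Status` enum is hypothetical):

```java
public class EnumCheckSketch {
    // An enum constant with a body compiles to an anonymous subclass of Status.
    enum Status {
        ACTIVE { @Override public String toString() { return "active"; } },
        INACTIVE
    }

    public static void main(String[] args) {
        Class<?> type = Status.ACTIVE.getClass();
        // The old check misses constant-specific subclasses:
        System.out.println(type.isEnum());
        // The new check from the diff above handles them:
        System.out.println(Enum.class.isAssignableFrom(type));
        // Either way, the value is stored by name:
        System.out.println(((Enum<?>) Status.ACTIVE).name());
    }
}
```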
@@ -691,18 +641,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
MongoPersistentProperty idProperty = targetEntity.getIdProperty();
Object id = null;
BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(target, conversionService);
try {
id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
String collection = dbref.collection();
@@ -748,7 +691,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
&& ((DBObject) sourceValue).keySet().size() == 0) {
// It's empty
return Array.newInstance(prop.getComponentType(), 0);
} else if (prop.isCollection() && sourceValue instanceof BasicDBList) {
} else if (prop.isCollectionLike() && sourceValue instanceof BasicDBList) {
return readCollectionOrArray((TypeInformation<? extends Collection<?>>) prop.getTypeInformation(),
(BasicDBList) sourceValue);
}
@@ -888,46 +831,68 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (obj instanceof Map) {
Map<Object, Object> m = new HashMap<Object, Object>();
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
m.put(entry.getKey(), convertToMongoType(entry.getValue()));
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue()));
}
return m;
return result;
}
if (obj instanceof List) {
List<?> l = (List<?>) obj;
List<Object> newList = new ArrayList<Object>();
for (Object o : l) {
newList.add(convertToMongoType(o));
}
return newList;
return maybeConvertList((List<?>) obj);
}
if (obj.getClass().isArray()) {
return maybeConvertArray((Object[]) obj);
return maybeConvertList(Arrays.asList((Object[]) obj));
}
DBObject newDbo = new BasicDBObject();
this.write(obj, newDbo);
return newDbo;
return removeTypeInfoRecursively(newDbo);
}
public Object[] maybeConvertArray(Object[] src) {
Object[] newArr = new Object[src.length];
for (int i = 0; i < src.length; i++) {
newArr[i] = convertToMongoType(src[i]);
}
return newArr;
}
public BasicDBList maybeConvertList(BasicDBList dbl) {
public BasicDBList maybeConvertList(Iterable<?> source) {
BasicDBList newDbl = new BasicDBList();
Iterator<?> iter = dbl.iterator();
while (iter.hasNext()) {
Object o = iter.next();
newDbl.add(convertToMongoType(o));
for (Object element : source) {
newDbl.add(convertToMongoType(element));
}
return newDbl;
}
/**
* Removes the type information from the conversion result.
*
* @param object the conversion result to strip type information from.
* @return the given object with all type keys removed.
*/
private Object removeTypeInfoRecursively(Object object) {
if (!(object instanceof DBObject)) {
return object;
}
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element);
}
} else {
removeTypeInfoRecursively(value);
}
}
if (keyToRemove != null) {
dbObject.removeField(keyToRemove);
}
return dbObject;
}
}
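The recursion in `removeTypeInfoRecursively(…)` above can be sketched against plain collections. This stand-in assumes a `_class` type key (Spring Data MongoDB's default) and substitutes `Map`/`List` for `DBObject`/`BasicDBList`:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TypeInfoRemoverSketch {

    // Hypothetical stand-in for MongoTypeMapper.isTypeKey(String).
    static boolean isTypeKey(String key) {
        return "_class".equals(key);
    }

    // Mirrors removeTypeInfoRecursively: walk the document, recurse into
    // nested documents and lists, and drop the type key afterwards so the
    // map is not modified while it is being iterated.
    @SuppressWarnings("unchecked")
    static Object removeTypeInfoRecursively(Object object) {
        if (!(object instanceof Map)) {
            return object;
        }
        Map<String, Object> map = (Map<String, Object>) object;
        String keyToRemove = null;
        for (Map.Entry<String, Object> entry : map.entrySet()) {
            if (isTypeKey(entry.getKey())) {
                keyToRemove = entry.getKey();
            }
            Object value = entry.getValue();
            if (value instanceof List) {
                for (Object element : (List<Object>) value) {
                    removeTypeInfoRecursively(element);
                }
            } else {
                removeTypeInfoRecursively(value);
            }
        }
        if (keyToRemove != null) {
            map.remove(keyToRemove);
        }
        return map;
    }

    public static void main(String[] args) {
        Map<String, Object> nested = new HashMap<>();
        nested.put("_class", "com.example.Address");
        nested.put("city", "Berlin");
        Map<String, Object> doc = new HashMap<>();
        doc.put("_class", "com.example.Person");
        doc.put("address", nested);

        removeTypeInfoRecursively(doc);
        System.out.println(doc.containsKey("_class"));
        System.out.println(nested.containsKey("_class"));
    }
}
```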

View File

@@ -20,11 +20,16 @@ import org.springframework.data.convert.TypeMapper;
import com.mongodb.DBObject;
/**
* Combining interface to express Mongo specific {@link TypeMapper} implementations will be {@link TypeKeyAware} as
* well.
* Mongo-specific {@link TypeMapper} exposing that {@link DBObject}s might contain a type key.
*
* @author Oliver Gierke
*/
public interface MongoTypeMapper extends TypeMapper<DBObject>, TypeKeyAware {
public interface MongoTypeMapper extends TypeMapper<DBObject> {
/**
* Returns whether the given key is the type key.
*
* @param key the key to inspect.
* @return whether the given key is the type key.
*/
boolean isTypeKey(String key);
}

View File

@@ -0,0 +1,115 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.Map;
import org.springframework.data.mongodb.core.query.Order;
public class IndexInfo {
private final Map<String, Order> fieldSpec;
private String name;
private boolean unique = false;
private boolean dropDuplicates = false;
private boolean sparse = false;
public IndexInfo(Map<String, Order> fieldSpec, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
super();
this.fieldSpec = fieldSpec;
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
}
public Map<String, Order> getFieldSpec() {
return fieldSpec;
}
public String getName() {
return name;
}
public boolean isUnique() {
return unique;
}
public boolean isDropDuplicates() {
return dropDuplicates;
}
public boolean isSparse() {
return sparse;
}
@Override
public String toString() {
return "IndexInfo [fieldSpec=" + fieldSpec + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + "]";
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
result = prime * result + ((fieldSpec == null) ? 0 : fieldSpec.hashCode());
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
IndexInfo other = (IndexInfo) obj;
if (dropDuplicates != other.dropDuplicates)
return false;
if (fieldSpec == null) {
if (other.fieldSpec != null)
return false;
} else if (!fieldSpec.equals(other.fieldSpec))
return false;
if (name == null) {
if (other.name != null)
return false;
} else if (!name.equals(other.name))
return false;
if (sparse != other.sparse)
return false;
if (unique != other.unique)
return false;
return true;
}
/**
 * Sample raw index info:
 * [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "database.person" , "name" : "_id_"},
 *  { "v" : 1 , "key" : { "age" : -1} , "ns" : "database.person" , "name" : "age_-1" , "unique" : true , "dropDups" : true}]
 */
}

View File

@@ -53,7 +53,7 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
if (this.isIdProperty == null) {
this.isIdProperty = super.isIdProperty();
}
return this.isIdProperty;
}
@@ -67,7 +67,7 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
if (this.fieldName == null) {
this.fieldName = super.getFieldName();
}
return super.getFieldName();
return this.fieldName;
}
}

View File

@@ -19,11 +19,13 @@ import java.math.BigInteger;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.regex.Pattern;
import org.bson.types.CodeWScope;
import org.bson.types.ObjectId;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
@@ -33,19 +35,21 @@ import com.mongodb.DBRef;
*/
public abstract class MongoSimpleTypes {
public static final Set<Class<?>> SUPPORTED_ID_CLASSES;
public static final Set<Class<?>> AUTOGENERATED_ID_TYPES;
static {
Set<Class<?>> classes = new HashSet<Class<?>>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
SUPPORTED_ID_CLASSES = Collections.unmodifiableSet(classes);
AUTOGENERATED_ID_TYPES = Collections.unmodifiableSet(classes);
Set<Class<?>> simpleTypes = new HashSet<Class<?>>();
simpleTypes.add(DBRef.class);
simpleTypes.add(ObjectId.class);
simpleTypes.add(CodeWScope.class);
simpleTypes.add(DBObject.class);
simpleTypes.add(Pattern.class);
MONGO_SIMPLE_TYPES = Collections.unmodifiableSet(simpleTypes);
}

View File

@@ -36,7 +36,8 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
* Creates a new {@link AbstractMongoEventListener}.
*/
public AbstractMongoEventListener() {
this.domainClass = GenericTypeResolver.resolveTypeArgument(this.getClass(), AbstractMongoEventListener.class);
Class<?> typeArgument = GenericTypeResolver.resolveTypeArgument(this.getClass(), AbstractMongoEventListener.class);
this.domainClass = typeArgument == null ? Object.class : typeArgument;
}
/*

View File

@@ -0,0 +1,118 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Collects the parameters required to perform a group operation on a collection. The query condition and the input collection are specified as arguments to the group method, for consistency with other operations, e.g. map-reduce.
*
* @author Mark Pollack
*
*/
public class GroupBy {
private DBObject dboKeys;
private String keyFunction;
private String initial;
private DBObject initialDbObject;
private String reduce;
private String finalize;
public GroupBy(String... keys) {
DBObject dbo = new BasicDBObject();
for (String key : keys) {
dbo.put(key, 1);
}
dboKeys = dbo;
}
// NOTE GroupByCommand does not handle keyfunction.
public GroupBy(String key, boolean isKeyFunction) {
DBObject dbo = new BasicDBObject();
if (isKeyFunction) {
keyFunction = key;
} else {
dbo.put(key, 1);
dboKeys = dbo;
}
}
public static GroupBy keyFunction(String key) {
return new GroupBy(key, true);
}
public static GroupBy key(String... keys) {
return new GroupBy(keys);
}
public GroupBy initialDocument(String initialDocument) {
initial = initialDocument;
return this;
}
public GroupBy initialDocument(DBObject initialDocument) {
initialDbObject = initialDocument;
return this;
}
public GroupBy reduceFunction(String reduceFunction) {
reduce = reduceFunction;
return this;
}
public GroupBy finalizeFunction(String finalizeFunction) {
finalize = finalizeFunction;
return this;
}
public DBObject getGroupByObject() {
// return new GroupCommand(dbCollection, dboKeys, condition, initial, reduce, finalize);
BasicDBObject dbo = new BasicDBObject();
if (dboKeys != null) {
dbo.put("key", dboKeys);
}
if (keyFunction != null) {
dbo.put("$keyf", keyFunction);
}
dbo.put("$reduce", reduce);
if (initialDbObject != null) {
dbo.put("initial", initialDbObject);
}
if (initial != null) {
dbo.put("initial", initial);
}
if (finalize != null) {
dbo.put("finalize", finalize);
}
return dbo;
}
}
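As a sketch of what `getGroupByObject()` assembles for a fluent call such as `GroupBy.key("age").initialDocument("{ count: 0 }").reduceFunction(...)`, the resulting command document looks roughly like this (a plain `Map` stands in for `BasicDBObject`; the field values are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class GroupCommandSketch {
    public static void main(String[] args) {
        // "key" sub-document: each grouping field mapped to 1.
        Map<String, Object> key = new LinkedHashMap<>();
        key.put("age", 1);

        // Command document in the shape getGroupByObject() produces.
        Map<String, Object> group = new LinkedHashMap<>();
        group.put("key", key);
        group.put("$reduce", "function(doc, prev) { prev.count += 1 }");
        group.put("initial", "{ count: 0 }");

        System.out.println(group.keySet());
    }
}
```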

View File

@@ -0,0 +1,96 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import java.util.Iterator;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Collects the results of executing a group operation.
*
* @author Mark Pollack
*
* @param <T> The type the results are mapped onto, accessible via an iterator.
*/
public class GroupByResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private DBObject rawResults;
private double count;
private int keys;
private String serverUsed;
public GroupByResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = mappedResults;
this.rawResults = rawResults;
parseKeys();
parseCount();
parseServerUsed();
}
public double getCount() {
return count;
}
public int getKeys() {
return keys;
}
public String getServerUsed() {
return serverUsed;
}
public Iterator<T> iterator() {
return mappedResults.iterator();
}
public DBObject getRawResults() {
return rawResults;
}
private void parseCount() {
Object object = rawResults.get("count");
if (object instanceof Double) {
count = (Double) object;
}
}
private void parseKeys() {
Object object = rawResults.get("keys");
if (object instanceof Integer) {
keys = (Integer) object;
}
}
private void parseServerUsed() {
//"serverUsed" : "127.0.0.1:27017"
Object object = rawResults.get("serverUsed");
if (object instanceof String) {
serverUsed = (String) object;
}
}
}

View File

@@ -44,10 +44,9 @@ public class MapReduceOptions {
/**
* Static factory method to create a Criteria using the provided key
* Static factory method to create a MapReduceOptions instance
*
* @param key
* @return
* @return a new instance
*/
public static MapReduceOptions options() {
return new MapReduceOptions();

View File

@@ -22,6 +22,12 @@ import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Collects the results of performing a MapReduce operation.
* @author Mark Pollack
*
* @param <T> The type the results are mapped onto, accessible via an iterator.
*/
public class MapReduceResults<T> implements Iterable<T> {
private final List<T> mappedResults;

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.List;
@@ -169,7 +170,7 @@ public class Criteria implements CriteriaDefinition {
throw new InvalidMongoDbApiUsageException("You can only pass in one argument of type "
+ o[1].getClass().getName());
}
criteria.put("$in", o);
criteria.put("$in", Arrays.asList(o));
return this;
}
@@ -180,7 +181,7 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria in(Collection<?> c) {
criteria.put("$in", c.toArray());
criteria.put("$in", c);
return this;
}
@@ -191,6 +192,10 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria nin(Object... o) {
return nin(Arrays.asList(o));
}
public Criteria nin(Collection<?> o) {
criteria.put("$nin", o);
return this;
}
@@ -217,6 +222,10 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria all(Object... o) {
return all(Arrays.asList(o));
}
public Criteria all(Collection<?> o) {
criteria.put("$all", o);
return this;
}
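Storing a `List` instead of a raw array under `$in` (and `$nin`, `$all`) matters because Java arrays are not `Collection`s and are treated differently during serialization. A minimal illustration, with a plain `Map` standing in for the criteria object:

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.Map;

public class InCriteriaSketch {
    public static void main(String[] args) {
        Object[] values = { "spring", "data" };
        Map<String, Object> criteria = new LinkedHashMap<>();

        // Before the change: the raw array was stored; an array is not a Collection.
        criteria.put("$in", values);
        System.out.println(criteria.get("$in") instanceof Collection);

        // After the change: a List is stored instead.
        criteria.put("$in", Arrays.asList(values));
        System.out.println(criteria.get("$in") instanceof Collection);
    }
}
```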

View File

@@ -19,9 +19,11 @@ import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
public class Query {
@@ -30,6 +32,7 @@ public class Query {
private Sort sort;
private int skip;
private int limit;
private String hint;
/**
* Static factory method to create a Query using the provided criteria
@@ -63,10 +66,8 @@ public class Query {
}
public Field fields() {
synchronized (this) {
if (fieldSpec == null) {
this.fieldSpec = new Field();
}
if (fieldSpec == null) {
this.fieldSpec = new Field();
}
return this.fieldSpec;
}
@@ -81,12 +82,23 @@ public class Query {
return this;
}
/**
* Configures the query to use the given hint when being executed.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public Query withHint(String name) {
Assert.hasText(name, "Hint must not be empty or null!");
this.hint = name;
return this;
}
public Sort sort() {
synchronized (this) {
if (this.sort == null) {
this.sort = new Sort();
}
if (this.sort == null) {
this.sort = new Sort();
}
return this.sort;
}
@@ -122,6 +134,10 @@ public class Query {
return this.limit;
}
public String getHint() {
return hint;
}
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
}
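The `withHint(String)` guard introduced above can be sketched without the Spring `Assert` helper; this hypothetical stand-in mirrors the null/empty check:

```java
public class HintGuardSketch {
    static String hint;

    // Mirrors Query.withHint(String): reject null, empty, or blank names.
    static void withHint(String name) {
        if (name == null || name.trim().isEmpty()) {
            throw new IllegalArgumentException("Hint must not be empty or null!");
        }
        hint = name;
    }

    public static void main(String[] args) {
        withHint("age_-1");
        System.out.println(hint);
        try {
            withHint("  ");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected");
        }
    }
}
```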

View File

@@ -15,14 +15,14 @@
*/
package org.springframework.data.mongodb.repository.query;
import static org.springframework.data.mongodb.repository.query.QueryUtils.*;
import static org.springframework.data.mongodb.repository.query.QueryUtils.applyPagination;
import java.util.List;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
@@ -46,21 +46,21 @@ import com.mongodb.DBObject;
public abstract class AbstractMongoQuery implements RepositoryQuery {
private final MongoQueryMethod method;
private final MongoTemplate template;
private final MongoOperations mongoOperations;
/**
* Creates a new {@link AbstractMongoQuery} from the given {@link MongoQueryMethod} and {@link MongoTemplate}.
* Creates a new {@link AbstractMongoQuery} from the given {@link MongoQueryMethod} and {@link MongoOperations}.
*
* @param method
* @param template
*/
public AbstractMongoQuery(MongoQueryMethod method, MongoTemplate template) {
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations template) {
Assert.notNull(template);
Assert.notNull(method);
this.method = method;
this.template = template;
this.mongoOperations = template;
}
/* (non-Javadoc)
@@ -79,7 +79,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
public Object execute(Object[] parameters) {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(template.getConverter(), accessor));
Query query = createQuery(new ConvertingParameterAccessor(mongoOperations.getConverter(), accessor));
if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
@@ -110,7 +110,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
return template.find(query, metadata.getJavaType(), collectionName);
return mongoOperations.find(query, metadata.getJavaType(), collectionName);
}
}
@@ -164,7 +164,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
int count = getCollectionCursor(metadata.getCollectionName(), query.getQueryObject()).count();
List<?> result = template.find(applyPagination(query, pageable), metadata.getJavaType(),
List<?> result = mongoOperations.find(applyPagination(query, pageable), metadata.getJavaType(),
metadata.getCollectionName());
return new PageImpl(result, pageable, count);
@@ -172,7 +172,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private DBCursor getCollectionCursor(String collectionName, final DBObject query) {
return template.execute(collectionName, new CollectionCallback<DBCursor>() {
return mongoOperations.execute(collectionName, new CollectionCallback<DBCursor>() {
public DBCursor doInCollection(DBCollection collection) {
@@ -197,7 +197,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return template.findOne(query, entityInformation.getJavaType());
return mongoOperations.findOne(query, entityInformation.getJavaType());
}
}
@@ -234,7 +234,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
MongoEntityInformation<?,?> entityInformation = method.getEntityInformation();
GeoResults<?> results = template.geoNear(nearQuery, entityInformation.getJavaType(), entityInformation.getCollectionName());
GeoResults<?> results = mongoOperations.geoNear(nearQuery, entityInformation.getJavaType(), entityInformation.getCollectionName());
return isListOfGeoResult() ? results.getContent() : results;
}

View File

@@ -20,13 +20,10 @@ import java.util.Iterator;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.TypeKeyAware;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.repository.query.ParameterAccessor;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
import org.springframework.util.Assert;
/**
* Custom {@link ParameterAccessor} that uses a {@link MongoWriter} to serialize parameters into Mongo format.
@@ -41,9 +38,14 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
/**
* Creates a new {@link ConvertingParameterAccessor} with the given {@link MongoWriter} and delegate.
*
* @param writer
* @param writer must not be {@literal null}.
* @param delegate must not be {@literal null}.
*/
public ConvertingParameterAccessor(MongoWriter<?> writer, MongoParameterAccessor delegate) {
Assert.notNull(writer);
Assert.notNull(delegate);
this.writer = writer;
this.delegate = delegate;
}
@@ -105,49 +107,7 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
* @return
*/
private Object getConvertedValue(Object value) {
if (!(writer instanceof TypeKeyAware)) {
return value;
}
return removeTypeInfoRecursively(writer.convertToMongoType(value), ((TypeKeyAware) writer));
}
/**
* Removes the type information from the conversion result.
*
* @param object
* @return
*/
private Object removeTypeInfoRecursively(Object object, TypeKeyAware typeKeyAware) {
if (!(object instanceof DBObject) || typeKeyAware == null) {
return object;
}
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeKeyAware.isTypeKey(key)) {
keyToRemove = key;
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element, typeKeyAware);
}
} else {
removeTypeInfoRecursively(value, typeKeyAware);
}
}
if (keyToRemove != null) {
dbObject.removeField(keyToRemove);
}
return dbObject;
return writer.convertToMongoType(value);
}
/**

View File

@@ -134,7 +134,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
*/
@Override
protected Query or(Query base, Query query) {
return new OrQuery(new Query[] {base, query});
return new OrQuery(new Query[] { base, query });
}
/*
@@ -192,6 +192,14 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
case LIKE:
String value = parameters.next().toString();
return criteria.regex(toLikeRegex(value));
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
return criteria.exists((Boolean) parameters.next());
case TRUE:
return criteria.is(true);
case FALSE:
return criteria.is(false);
case NEAR:
Distance distance = accessor.getMaxDistance();

View File

@@ -27,7 +27,6 @@ import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.Parameters;
import org.springframework.data.repository.query.QueryMethod;
import org.springframework.data.repository.util.ClassUtils;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
@@ -57,7 +56,7 @@ public class MongoQueryMethod extends QueryMethod {
super(method, metadata);
Assert.notNull(entityInformationCreator, "DefaultEntityInformationCreator must not be null!");
this.method = method;
this.entityInformation = entityInformationCreator.getEntityInformation(ClassUtils.getReturnedDomainClass(method),
this.entityInformation = entityInformationCreator.getEntityInformation(metadata.getReturnedDomainClass(method),
getDomainClass());
}

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Query;
@@ -40,12 +41,12 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
* @param method
* @param template
*/
public PartTreeMongoQuery(MongoQueryMethod method, MongoTemplate template) {
public PartTreeMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
super(method, template);
super(method, mongoOperations);
this.tree = new PartTree(method.getName(), method.getEntityInformation().getJavaType());
this.isGeoNearQuery = method.isGeoNearQuery();
this.context = template.getConverter().getMappingContext();
this.context = mongoOperations.getConverter().getMappingContext();
}
/**

View File

@@ -21,7 +21,7 @@ import java.util.regex.Pattern;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -44,14 +44,14 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param method
* @param template
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoTemplate template) {
super(method, template);
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations) {
super(method, mongoOperations);
this.query = query;
this.fieldSpec = method.getFieldSpecification();
}
public StringBasedMongoQuery(MongoQueryMethod method, MongoTemplate template) {
this(method.getAnnotatedQuery(), method, template);
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
this(method.getAnnotatedQuery(), method, mongoOperations);
}
/*
@@ -74,6 +74,8 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
} else {
query = new BasicQuery(queryString);
}
QueryUtils.applySorting(query, accessor.getSort());
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s", query.getQueryObject()));

View File

@@ -86,7 +86,7 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
}
MongoEntityInformation<?, ?> metadata = query.getQueryMethod().getEntityInformation();
operations.ensureIndex(index, metadata.getCollectionName());
operations.indexOps(metadata.getCollectionName()).ensureIndex(index);
LOG.debug(String.format("Created %s!", index));
}

View File

@@ -16,14 +16,11 @@
package org.springframework.data.mongodb.repository.support;
import java.util.Collections;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import org.springframework.data.mongodb.core.mapping.Document;
@@ -33,8 +30,9 @@ import com.mysema.query.annotations.QueryEmbedded;
import com.mysema.query.annotations.QueryEntities;
import com.mysema.query.annotations.QuerySupertype;
import com.mysema.query.annotations.QueryTransient;
import com.mysema.query.apt.AbstractQuerydslProcessor;
import com.mysema.query.apt.Configuration;
import com.mysema.query.apt.DefaultConfiguration;
import com.mysema.query.apt.Processor;
/**
* Annotation processor to create Querydsl query types for Querydsl-annotated classes.
@@ -44,10 +42,14 @@ import com.mysema.query.apt.Processor;
@SuppressWarnings("restriction")
@SupportedAnnotationTypes({ "com.mysema.query.annotations.*", "org.springframework.data.mongodb.core.mapping.*" })
@SupportedSourceVersion(SourceVersion.RELEASE_6)
public class MongoAnnotationProcessor extends AbstractProcessor {
public class MongoAnnotationProcessor extends AbstractQuerydslProcessor {
/*
* (non-Javadoc)
* @see com.mysema.query.apt.AbstractQuerydslProcessor#createConfiguration(javax.annotation.processing.RoundEnvironment)
*/
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
protected Configuration createConfiguration(RoundEnvironment roundEnv) {
processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE, "Running " + getClass().getSimpleName());
@@ -56,8 +58,6 @@ public class MongoAnnotationProcessor extends AbstractProcessor {
QueryEmbeddable.class, QueryEmbedded.class, QueryTransient.class);
configuration.setUnknownAsEmbedded(true);
Processor processor = new Processor(processingEnv, roundEnv, configuration);
processor.process();
return true;
return configuration;
}
}

View File

@@ -21,8 +21,8 @@ import java.io.Serializable;
import java.lang.reflect.Method;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.query.EntityInformationCreator;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
@@ -37,7 +37,6 @@ import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Factory to create {@link MongoRepository} instances.
@@ -46,7 +45,7 @@ import org.springframework.util.StringUtils;
*/
public class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoTemplate template;
private final MongoOperations mongoOperations;
private final EntityInformationCreator entityInformationCreator;
/**
@@ -55,11 +54,12 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* @param template must not be {@literal null}
* @param mappingContext
*/
public MongoRepositoryFactory(MongoTemplate template) {
public MongoRepositoryFactory(MongoOperations mongoOperations) {
Assert.notNull(template);
this.template = template;
this.entityInformationCreator = new DefaultEntityInformationCreator(template.getConverter().getMappingContext());
Assert.notNull(mongoOperations);
this.mongoOperations = mongoOperations;
this.entityInformationCreator = new DefaultEntityInformationCreator(mongoOperations.getConverter()
.getMappingContext());
}
/*
@@ -84,9 +84,9 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainClass());
if (isQueryDslRepository(repositoryInterface)) {
return new QueryDslMongoRepository(entityInformation, template);
return new QueryDslMongoRepository(entityInformation, mongoOperations);
} else {
return new SimpleMongoRepository(entityInformation, template);
return new SimpleMongoRepository(entityInformation, mongoOperations);
}
}
@@ -110,41 +110,27 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* @author Oliver Gierke
*/
private class MongoQueryLookupStrategy implements QueryLookupStrategy {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryLookupStrategy#resolveQuery(java.lang.reflect.Method, org.springframework.data.repository.core.RepositoryMetadata, org.springframework.data.repository.core.NamedQueries)
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, entityInformationCreator);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new StringBasedMongoQuery(namedQuery, queryMethod, template);
return new StringBasedMongoQuery(namedQuery, queryMethod, mongoOperations);
} else if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, template);
return new StringBasedMongoQuery(queryMethod, mongoOperations);
} else {
return new PartTreeMongoQuery(queryMethod, template);
return new PartTreeMongoQuery(queryMethod, mongoOperations);
}
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.support.RepositoryFactorySupport#validate(org.springframework.data.repository.support.RepositoryMetadata)
*/
@Override
protected void validate(RepositoryMetadata metadata) {
Class<?> idClass = metadata.getIdClass();
if (!MongoSimpleTypes.SUPPORTED_ID_CLASSES.contains(idClass)) {
throw new IllegalArgumentException(String.format("Unsupported id class! Only %s are supported!",
StringUtils.collectionToCommaDelimitedString(MongoSimpleTypes.SUPPORTED_ID_CLASSES)));
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getEntityInformation(java.lang.Class)


@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import java.util.List;
import java.util.regex.Pattern;
import org.apache.commons.collections15.Transformer;
import org.springframework.data.domain.Page;
@@ -25,7 +26,9 @@ import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
@@ -34,6 +37,7 @@ import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityMetadata;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbQuery;
@@ -64,9 +68,9 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @param entityInformation
* @param template
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoTemplate template) {
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
this(entityInformation, template, SimpleEntityPathResolver.INSTANCE);
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
}
/**
@@ -74,17 +78,17 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* and {@link EntityPathResolver}.
*
* @param entityInformation
* @param template
* @param mongoOperations
* @param resolver
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoTemplate template,
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations,
EntityPathResolver resolver) {
super(entityInformation, template);
super(entityInformation, mongoOperations);
Assert.notNull(resolver);
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.serializer = new SpringDataMongodbSerializer(template.getConverter().getMappingContext());
this.serializer = new SpringDataMongodbSerializer(mongoOperations.getConverter());
}
/*
@@ -229,14 +233,17 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
*/
static class SpringDataMongodbSerializer extends MongodbSerializer {
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
*
* @param mappingContext
*/
public SpringDataMongodbSerializer(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.mappingContext = mappingContext;
public SpringDataMongodbSerializer(MongoConverter converter) {
this.mappingContext = converter.getMappingContext();
this.converter = converter;
}
@Override
@@ -247,5 +254,11 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property.getFieldName();
}
@Override
protected DBObject asDBObject(String key, Object value) {
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}
}


@@ -42,7 +42,7 @@ import org.springframework.util.Assert;
*/
public class SimpleMongoRepository<T, ID extends Serializable> implements PagingAndSortingRepository<T, ID> {
private final MongoTemplate template;
private final MongoOperations mongoOperations;
private final MongoEntityInformation<T, ID> entityInformation;
/**
@@ -51,12 +51,12 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
* @param metadata
* @param template
*/
public SimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoTemplate template) {
public SimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoOperations mongoOperations) {
Assert.notNull(template);
Assert.notNull(mongoOperations);
Assert.notNull(metadata);
this.entityInformation = metadata;
this.template = template;
this.mongoOperations = mongoOperations;
}
/*
@@ -66,8 +66,10 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
* org.springframework.data.repository.Repository#save(java.lang.Object)
*/
public T save(T entity) {
Assert.notNull(entity, "Entity must not be null!");
template.save(entity, entityInformation.getCollectionName());
mongoOperations.save(entity, entityInformation.getCollectionName());
return entity;
}
@@ -79,6 +81,8 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public List<T> save(Iterable<? extends T> entities) {
Assert.notNull(entities, "The given Iterable of entities must not be null!");
List<T> result = new ArrayList<T>();
for (T entity : entities) {
@@ -98,7 +102,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public T findOne(ID id) {
Assert.notNull(id, "The given id must not be null!");
return template.findById(id, entityInformation.getJavaType());
return mongoOperations.findById(id, entityInformation.getJavaType());
}
private Query getIdQuery(Object id) {
@@ -119,7 +123,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
public boolean exists(ID id) {
Assert.notNull(id, "The given id must not be null!");
return template.findOne(new Query(Criteria.where("_id").is(id)), Object.class,
return mongoOperations.findOne(new Query(Criteria.where("_id").is(id)), Object.class,
entityInformation.getCollectionName()) != null;
}
@@ -130,7 +134,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public long count() {
return template.getCollection(entityInformation.getCollectionName()).count();
return mongoOperations.getCollection(entityInformation.getCollectionName()).count();
}
/*
@@ -139,7 +143,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public void delete(ID id) {
Assert.notNull(id, "The given id must not be null!");
template.remove(getIdQuery(id), entityInformation.getJavaType());
mongoOperations.remove(getIdQuery(id), entityInformation.getJavaType());
}
/*
@@ -161,6 +165,8 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public void delete(Iterable<? extends T> entities) {
Assert.notNull(entities, "The given Iterable of entities must not be null!");
for (T entity : entities) {
delete(entity);
}
@@ -173,7 +179,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public void deleteAll() {
template.remove(new Query(), entityInformation.getCollectionName());
mongoOperations.remove(new Query(), entityInformation.getCollectionName());
}
/*
@@ -246,7 +252,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
return Collections.emptyList();
}
return template.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/**
@@ -256,7 +262,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
protected MongoOperations getMongoOperations() {
return this.template;
return this.mongoOperations;
}
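The common thread across these repository classes is narrowing the stored dependency from the concrete `MongoTemplate` to the `MongoOperations` interface, so callers can hand in any implementation (including a stub in tests). A minimal sketch of that dependency-on-abstraction pattern, using stand-in names rather than the Spring Data types:

```java
import java.util.Objects;

// Stand-in for the MongoOperations interface the repositories now depend on.
interface Operations {
    void save(Object entity, String collection);
}

// Stand-in for MongoTemplate: one concrete implementation among possibly many.
class Template implements Operations {
    public void save(Object entity, String collection) {
        /* real driver I/O would happen here */
    }
}

// The repository stores the abstraction, not the concrete template.
class SimpleRepository {

    private final Operations operations;

    SimpleRepository(Operations operations) {
        this.operations = Objects.requireNonNull(operations, "operations must not be null");
    }

    <T> T save(T entity, String collection) {
        operations.save(entity, collection);
        return entity;
    }
}
```

Because only the interface is referenced, swapping the backing implementation never touches the repository code — the same property the diff's `getMongoOperations()` accessor preserves.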
/**


@@ -11,7 +11,6 @@
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-2.5.xsd">
<xsd:import namespace="http://www.springframework.org/schema/beans"/>
<xsd:import namespace="http://www.springframework.org/schema/tool"/>
<xsd:import namespace="http://www.springframework.org/schema/context"
@@ -93,7 +92,16 @@ The password to use when connecting to a MongoDB server.
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attributeGroup ref="writeConcern" />
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The default WriteConcern applied to DB objects obtained from the MongoDbFactory.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
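Because the attribute's type is a union of `writeConcernEnumeration` and `xsd:string`, it accepts either a named constant or a free-form value such as a custom getLastError mode, which the tests below exercise with "rack1". A hypothetical configuration fragment (element names per this schema; the `mongo` prefix is assumed to be bound to the Spring Data MongoDB namespace):

```xml
<!-- illustrative fragment; assumes xmlns:mongo is bound to this schema -->
<mongo:db-factory id="first"  dbname="db" write-concern="SAFE"  /> <!-- enumeration value -->
<mongo:db-factory id="second" dbname="db" write-concern="rack1" /> <!-- free-form string -->
```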
@@ -140,9 +148,10 @@ The Mongo URI string.]]></xsd:documentation>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoConverter for getting rich mapping functionality.
]]></xsd:documentation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
@@ -150,14 +159,14 @@ Defines a MongoConverter for getting rich mapping functionality.
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject
]]>
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
@@ -259,6 +268,18 @@ The name of the Mongo object that determines what server to monitor. (by default
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
@@ -268,11 +289,13 @@ The name of the Mongo object that determines what server to monitor. (by default
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
@@ -288,7 +311,19 @@ The Mongo driver options
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The default WriteConcern applied to DB objects obtained from the MongoDbFactory.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[


@@ -15,10 +15,26 @@
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import java.util.Set;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.mapping.Account;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.stereotype.Component;
import com.mongodb.DBObject;
/**
* Integration tests for {@link MappingMongoConverterParser}.
@@ -26,13 +42,48 @@ import org.springframework.core.io.ClassPathResource;
* @author Oliver Gierke
*/
public class MappingMongoConverterParserIntegrationTests {
DefaultListableBeanFactory factory;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter.xml"));
}
@Test
public void allowsDbFactoryRefAttribute() {
ConfigurableListableBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/converter.xml"));
factory.getBeanDefinition("converter");
factory.getBean("converter");
}
@Test
public void scansForConverterAndSetsUpCustomConversionsAccordingly() {
CustomConversions conversions = factory.getBean(CustomConversions.class);
assertThat(conversions.hasCustomWriteTarget(Person.class), is(true));
assertThat(conversions.hasCustomWriteTarget(Account.class), is(true));
}
@Component
public static class SampleConverter implements Converter<Person, DBObject> {
public DBObject convert(Person source) {
return null;
}
}
@Component
public static class SampleConverterFactory implements GenericConverter {
public Set<ConvertiblePair> getConvertibleTypes() {
return Collections.singleton(new ConvertiblePair(Account.class, DBObject.class));
}
public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
return null;
}
}
}
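The scan exercised by `scansForConverterAndSetsUpCustomConversionsAccordingly()` is driven by the `base-package` attribute this changeset adds to `custom-converters`: `@Component`-annotated converters under that package are picked up and registered automatically. A hypothetical namespace configuration (package name illustrative):

```xml
<!-- illustrative fragment; converters annotated with @Component under the
     given package are detected and registered as custom conversions -->
<mongo:mapping-converter>
  <mongo:custom-converters base-package="org.springframework.data.mongodb.config" />
</mongo:mapping-converter>
```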


@@ -0,0 +1,55 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.is;
import static org.junit.Assert.assertThat;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
* Integration tests for {@link MongoDbFactory}.
*
* @author Thomas Risberg
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoDbFactoryNoDatabaseRunningTests {
@Autowired
MongoTemplate mongoTemplate;
/**
* @see DATADOC-139
*/
@Test
public void startsUpWithoutADatabaseRunning() {
assertThat(mongoTemplate.getClass().getName(), is("org.springframework.data.mongodb.core.MongoTemplate"));
}
@Test(expected = DataAccessResourceFailureException.class)
public void failsDataAccessWithoutADatabaseRunning() {
mongoTemplate.getCollectionNames();
}
}


@@ -18,19 +18,25 @@ package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.PropertyValue;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.ConstructorArgumentValues;
import org.springframework.beans.factory.config.ConstructorArgumentValues.ValueHolder;
import org.springframework.beans.factory.parsing.BeanDefinitionParsingException;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionReader;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
/**
* Integration tests for {@link MongoDbFactoryParser}.
@@ -38,19 +44,79 @@ import com.mongodb.MongoURI;
* @author Oliver Gierke
*/
public class MongoDbFactoryParserIntegrationTests {
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
reader = new XmlBeanDefinitionReader(factory);
}
@Test
public void parsesWriteConcern() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/db-factory-bean.xml"));
BeanDefinition definition = factory.getBeanDefinition("first");
List<PropertyValue> values = definition.getPropertyValues().getPropertyValueList();
assertThat(values, hasItem(new PropertyValue("writeConcern", "SAFE")));
public void testWriteConcern() throws Exception {
SimpleMongoDbFactory dbFactory = new SimpleMongoDbFactory(new Mongo("localhost"), "database");
dbFactory.setWriteConcern(WriteConcern.SAFE);
dbFactory.getDb();
assertThat(dbFactory.getWriteConcern(), is(WriteConcern.SAFE));
}
@Test
public void parsesWriteConcern() {
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean.xml");
assertWriteConcern(ctx, WriteConcern.SAFE);
}
@Test
public void parsesCustomWriteConcern() {
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean-custom-write-concern.xml");
assertWriteConcern(ctx, new WriteConcern("rack1"));
}
/**
* @see DATAMONGO-331
*/
@Test
public void readsReplicasWriteConcernCorrectly() {
ApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean-custom-write-concern.xml");
MongoDbFactory factory = ctx.getBean("second", MongoDbFactory.class);
DB db = factory.getDb();
assertThat(db.getWriteConcern(), is(WriteConcern.REPLICAS_SAFE));
}
private void assertWriteConcern(ClassPathXmlApplicationContext ctx, WriteConcern expectedWriteConcern) {
SimpleMongoDbFactory dbFactory = ctx.getBean("first", SimpleMongoDbFactory.class);
DB db = dbFactory.getDb();
assertThat(db.getName(), is("db"));
MyWriteConcern myDbFactoryWriteConcern = new MyWriteConcern(dbFactory.getWriteConcern());
MyWriteConcern myDbWriteConcern = new MyWriteConcern(db.getWriteConcern());
MyWriteConcern myExpectedWriteConcern = new MyWriteConcern(expectedWriteConcern);
assertThat(myDbFactoryWriteConcern, equalTo(myExpectedWriteConcern));
assertThat(myDbWriteConcern, equalTo(myExpectedWriteConcern));
assertThat(myDbWriteConcern, equalTo(myDbFactoryWriteConcern));
}
// This test would fail (hence no @Test annotation): WriteConcern.equals compares _w with == rather than .equals
public void testWriteConcernEquality() {
String s1 = new String("rack1");
String s2 = new String("rack1");
WriteConcern wc1 = new WriteConcern(s1);
WriteConcern wc2 = new WriteConcern(s2);
assertThat(wc1, equalTo(wc2));
}
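The comment above points at a reference-identity pitfall: two distinct `String` instances with the same characters are never `==`, so an `equals` implementation that compares its `w` value with `==` (as the driver's `WriteConcern` did) reports equal concerns as different. The `MyWriteConcern` wrapper below sidesteps this by delegating to the field's own `equals(Object)`. The pitfall and the fix in isolation (class names here are illustrative):

```java
import java.util.Objects;

// == compares references, so it is the wrong tool for a value field.
class IdentityDemo {
    static boolean sameReference(Object a, Object b) {
        return a == b;
    }
}

// A MyWriteConcern-style value object: equality delegates to the field's
// equals(Object) via Objects.equals, which also handles null.
class WValue {
    final Object w;

    WValue(Object w) {
        this.w = w;
    }

    @Override
    public boolean equals(Object obj) {
        return obj instanceof WValue && Objects.equals(w, ((WValue) obj).w);
    }

    @Override
    public int hashCode() {
        return Objects.hashCode(w);
    }
}
```

With this shape, two concerns built from equal but distinct "rack1" strings compare equal, which is exactly what `assertWriteConcern(…)` above relies on.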
@Test
public void createsDbFactoryBean() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/db-factory-bean.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/db-factory-bean.xml"));
factory.getBean("first");
}
@@ -60,7 +126,7 @@ public class MongoDbFactoryParserIntegrationTests {
@Test
public void parsesMaxAutoConnectRetryTimeCorrectly() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/db-factory-bean.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/db-factory-bean.xml"));
Mongo mongo = factory.getBean(Mongo.class);
assertThat(mongo.getMongoOptions().maxAutoConnectRetryTime, is(27L));
}
@@ -71,7 +137,7 @@ public class MongoDbFactoryParserIntegrationTests {
@Test
public void setsUpMongoDbFactoryUsingAMongoUri() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/mongo-uri.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongoDbFactory");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
@@ -80,11 +146,30 @@ public class MongoDbFactoryParserIntegrationTests {
assertThat(argument, is(notNullValue()));
}
/**
* @see DATADOC-306
*/
@Test
public void setsUpMongoDbFactoryUsingAMongoUriWithoutCredentials() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri-no-credentials.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongoDbFactory");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
assertThat(constructorArguments.getArgumentCount(), is(1));
ValueHolder argument = constructorArguments.getArgumentValue(0, MongoURI.class);
assertThat(argument, is(notNullValue()));
MongoDbFactory dbFactory = factory.getBean("mongoDbFactory", MongoDbFactory.class);
DB db = dbFactory.getDb();
assertThat(db.getName(), is("database"));
}
/**
* @see DATADOC-295
*/
@Test(expected = BeanDefinitionParsingException.class)
public void rejectsUriPlusDetailedConfiguration() {
new XmlBeanFactory(new ClassPathResource("namespace/mongo-uri-and-details.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri-and-details.xml"));
}
}


@@ -45,18 +45,35 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
public void testParsingMongoWithReplicaSets() throws Exception {
assertTrue(ctx.containsBean("replicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&replicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("127.0.0.1", replicaSetSeeds.get(0).getHost());
assertEquals(10001, replicaSetSeeds.get(0).getPort());
assertEquals("localhost", replicaSetSeeds.get(1).getHost());
assertEquals(10002, replicaSetSeeds.get(1).getPort());
}
@Test
public void testParsingWithPropertyPlaceHolder() throws Exception {
assertTrue(ctx.containsBean("manyReplicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&manyReplicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("192.168.174.130", replicaSetSeeds.get(0).getHost());
assertEquals(27017, replicaSetSeeds.get(0).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(1).getHost());
assertEquals(27018, replicaSetSeeds.get(1).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(2).getHost());
assertEquals(27019, replicaSetSeeds.get(2).getPort());
}
@Test
@Ignore("CI infrastructure does not yet support replica sets")
public void testMongoWithReplicaSets() {
@@ -67,10 +84,10 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
assertEquals("localhost", servers.get(1).getHost());
assertEquals(10001, servers.get(0).getPort());
assertEquals(10002, servers.get(1).getPort());
MongoTemplate template = new MongoTemplate(mongo, "admin");
CommandResult result = template.executeCommand("{replSetGetStatus : 1}");
assertEquals("blort", result.getString("set"));
}
}


@@ -68,6 +68,7 @@ public class MongoNamespaceTests {
}
@Test
@SuppressWarnings("deprecation")
public void testMongoSingletonWithPropertyPlaceHolders() throws Exception {
assertTrue(ctx.containsBean("mongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&mongo");


@@ -15,18 +15,23 @@
*/
package org.springframework.data.mongodb.config;
import static org.junit.Assert.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.PropertyValue;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionReader;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.support.GenericApplicationContext;
import org.springframework.core.io.ClassPathResource;
import com.mongodb.Mongo;
/**
* Integration tests for {@link MongoParser}.
*
@@ -34,10 +39,19 @@ import org.springframework.core.io.ClassPathResource;
*/
public class MongoParserIntegrationTests {
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
reader = new XmlBeanDefinitionReader(factory);
}
@Test
public void readsMongoAttributesCorrectly() {
ConfigurableListableBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/mongo-bean.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-bean.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongo");
List<PropertyValue> values = definition.getPropertyValues().getPropertyValueList();
@@ -45,4 +59,18 @@ public class MongoParserIntegrationTests {
factory.getBean("mongo");
}
/**
* @see DATAMONGO-343
*/
@Test
public void readsServerAddressesCorrectly() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-bean.xml"));
GenericApplicationContext context = new GenericApplicationContext(factory);
context.refresh();
assertThat(context.getBean("mongo2", Mongo.class), is(notNullValue()));
}
}


@@ -0,0 +1,56 @@
package org.springframework.data.mongodb.config;
import com.mongodb.WriteConcern;
public class MyWriteConcern {
public MyWriteConcern(WriteConcern wc) {
this._w = wc.getWObject();
this._continueOnErrorForInsert = wc.getContinueOnErrorForInsert();
this._fsync = wc.getFsync();
this._j = wc.getJ();
this._wtimeout = wc.getWtimeout();
}
Object _w = 0;
int _wtimeout = 0;
boolean _fsync = false;
boolean _j = false;
boolean _continueOnErrorForInsert = false;
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (_continueOnErrorForInsert ? 1231 : 1237);
result = prime * result + (_fsync ? 1231 : 1237);
result = prime * result + (_j ? 1231 : 1237);
result = prime * result + ((_w == null) ? 0 : _w.hashCode());
result = prime * result + _wtimeout;
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
MyWriteConcern other = (MyWriteConcern) obj;
if (_continueOnErrorForInsert != other._continueOnErrorForInsert)
return false;
if (_fsync != other._fsync)
return false;
if (_j != other._j)
return false;
if (_w == null) {
if (other._w != null)
return false;
} else if (!_w.equals(other._w))
return false;
if (_wtimeout != other._wtimeout)
return false;
return true;
}
}


@@ -30,8 +30,10 @@ import org.springframework.dao.DataAccessException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.AbstractMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.NearQuery;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -290,6 +292,66 @@ public abstract class MongoOperationsUnitTests {
}.assertDataAccessException();
}
/**
* @see DATAMONGO-341
*/
@Test
public void geoNearRejectsNullNearQuery() {
new Execution() {
@Override
public void doWith(MongoOperations operations) {
operations.geoNear(null, Person.class);
}
}.assertDataAccessException();
}
/**
* @see DATAMONGO-341
*/
@Test
public void geoNearRejectsNullNearQueryIfCollectionGiven() {
new Execution() {
@Override
public void doWith(MongoOperations operations) {
operations.geoNear(null, Person.class, "collection");
}
}.assertDataAccessException();
}
/**
* @see DATAMONGO-341
*/
@Test
public void geoNearRejectsNullEntityClass() {
final NearQuery query = NearQuery.near(new Point(10, 20));
new Execution() {
@Override
public void doWith(MongoOperations operations) {
operations.geoNear(query, null);
}
}.assertDataAccessException();
}
/**
* @see DATAMONGO-341
*/
@Test
public void geoNearRejectsNullEntityClassIfCollectionGiven() {
final NearQuery query = NearQuery.near(new Point(10, 20));
new Execution() {
@Override
public void doWith(MongoOperations operations) {
operations.geoNear(query, null, "collection");
}
}.assertDataAccessException();
}
private abstract class Execution {
public void assertDataAccessException() {


@@ -39,7 +39,7 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -47,6 +47,7 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Order;
@@ -61,6 +62,8 @@ import com.mongodb.DBObject;
import com.mongodb.DBRef;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
/**
@@ -88,7 +91,7 @@ public class MongoTemplateTests {
CustomConversions conversions = new CustomConversions(Arrays.asList(DateToDateTimeConverter.INSTANCE,
DateTimeToDateConverter.INSTANCE));
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(PersonWith_idPropertyOfTypeObjectId.class,
PersonWith_idPropertyOfTypeString.class, PersonWithIdPropertyOfTypeObjectId.class,
@@ -101,7 +104,7 @@ public class MongoTemplateTests {
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.setCustomConversions(conversions);
mappingConverter.afterPropertiesSet();
this.mappingTemplate = new MongoTemplate(factory, mappingConverter);
}
@@ -109,14 +112,15 @@ public class MongoTemplateTests {
public void setUp() {
cleanDb();
}
@After
public void cleanUp() {
cleanDb();
}
protected void cleanDb() {
template.dropCollection(template.getCollectionName(Person.class));
template.dropCollection(template.getCollectionName(PersonWithAList.class));
template.dropCollection(template.getCollectionName(PersonWith_idPropertyOfTypeObjectId.class));
template.dropCollection(template.getCollectionName(PersonWith_idPropertyOfTypeString.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeObjectId.class));
@@ -141,7 +145,7 @@ public class MongoTemplateTests {
}
@Test
public void updateFailure() throws Exception {
public void bogusUpdateDoesNotTriggerException() throws Exception {
MongoTemplate mongoTemplate = new MongoTemplate(factory);
mongoTemplate.setWriteResultChecking(WriteResultChecking.EXCEPTION);
@@ -152,10 +156,7 @@ public class MongoTemplateTests {
Query q = new Query(Criteria.where("BOGUS").gt(22));
Update u = new Update().set("firstName", "Sven");
thrown.expect(DataIntegrityViolationException.class);
thrown.expectMessage(endsWith("0 documents updated"));
mongoTemplate.updateFirst(q, u, Person.class);
}
@Test
@@ -168,11 +169,10 @@ public class MongoTemplateTests {
p2.setAge(40);
template.insert(p2);
template.ensureIndex(new Index().on("age", Order.DESCENDING).unique(Duplicates.DROP), Person.class);
template.indexOps(Person.class).ensureIndex(new Index().on("age", Order.DESCENDING).unique(Duplicates.DROP));
DBCollection coll = template.getCollection(template.getCollectionName(Person.class));
List<DBObject> indexInfo = coll.getIndexInfo();
assertThat(indexInfo.size(), is(2));
String indexKey = null;
boolean unique = false;
@@ -187,6 +187,16 @@ public class MongoTemplateTests {
assertThat(indexKey, is("{ \"age\" : -1}"));
assertThat(unique, is(true));
assertThat(dropDupes, is(true));
List<IndexInfo> indexInfoList = template.indexOps(Person.class).getIndexInfo();
System.out.println(indexInfoList);
assertThat(indexInfoList.size(), is(2));
IndexInfo ii = indexInfoList.get(1);
assertThat(ii.isUnique(), is(true));
assertThat(ii.isDropDuplicates(), is(true));
assertThat(ii.isSparse(), is(false));
assertThat(ii.getFieldSpec().containsKey("age"), is(true));
assertThat(ii.getFieldSpec().containsValue(Order.DESCENDING), is(true));
}
@Test
@@ -389,6 +399,56 @@ public class MongoTemplateTests {
assertThat(template.findAll(entityClass).size(), is(count));
}
/**
* @see DATAMONGO-234
*/
@Test
public void testFindAndUpdate() {
template.insert(new Person("Tom", 21));
template.insert(new Person("Dick", 22));
template.insert(new Person("Harry", 23));
Query query = new Query(Criteria.where("firstName").is("Harry"));
Update update = new Update().inc("age", 1);
Person p = template.findAndModify(query, update, Person.class); // returns the old document by default
assertThat(p.getFirstName(), is("Harry"));
assertThat(p.getAge(), is(23));
p = template.findOne(query, Person.class);
assertThat(p.getAge(), is(24));
p = template.findAndModify(query, update, Person.class, "person");
assertThat(p.getAge(), is(24));
p = template.findOne(query, Person.class);
assertThat(p.getAge(), is(25));
p = template.findAndModify(query, update, new FindAndModifyOptions().returnNew(true), Person.class);
assertThat(p.getAge(), is(26));
p = template.findAndModify(query, update, null, Person.class, "person");
assertThat(p.getAge(), is(26));
p = template.findOne(query, Person.class);
assertThat(p.getAge(), is(27));
Query query2 = new Query(Criteria.where("firstName").is("Mary"));
p = template.findAndModify(query2, update, new FindAndModifyOptions().returnNew(true).upsert(true), Person.class);
assertThat(p.getFirstName(), is("Mary"));
assertThat(p.getAge(), is(1));
}
@Test
public void testFindAndUpdateUpsert() {
template.insert(new Person("Tom", 21));
template.insert(new Person("Dick", 22));
Query query = new Query(Criteria.where("firstName").is("Harry"));
Update update = new Update().set("age", 23);
Person p = template.findAndModify(query, update, new FindAndModifyOptions().upsert(true).returnNew(true),
Person.class);
assertThat(p.getFirstName(), is("Harry"));
assertThat(p.getAge(), is(23));
}
@Test
public void testFindAndRemove() throws Exception {
@@ -745,8 +805,32 @@ public class MongoTemplateTests {
}
@Test
public void testUsingSlaveOk() throws Exception {
this.template.execute("slaveOkTest", new CollectionCallback<Object>() {
public void testFindOneWithSort() {
PersonWithAList p = new PersonWithAList();
p.setFirstName("Sven");
p.setAge(22);
template.insert(p);
PersonWithAList p2 = new PersonWithAList();
p2.setFirstName("Erik");
p2.setAge(21);
template.insert(p2);
PersonWithAList p3 = new PersonWithAList();
p3.setFirstName("Mark");
p3.setAge(40);
template.insert(p3);
// test query with a sort
Query q2 = new Query(Criteria.where("age").gt(10));
q2.sort().on("age", Order.DESCENDING);
PersonWithAList p5 = template.findOne(q2, PersonWithAList.class);
assertThat(p5.getFirstName(), is("Mark"));
}
@Test
public void testUsingReadPreference() throws Exception {
this.template.execute("readPref", new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
assertThat(collection.getOptions(), is(0));
assertThat(collection.getDB().getOptions(), is(0));
@@ -754,10 +838,10 @@ public class MongoTemplateTests {
}
});
MongoTemplate slaveTemplate = new MongoTemplate(factory);
slaveTemplate.setSlaveOk(true);
slaveTemplate.execute("slaveOkTest", new CollectionCallback<Object>() {
slaveTemplate.setReadPreference(ReadPreference.SECONDARY);
slaveTemplate.execute("readPref", new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
assertThat(collection.getOptions(), is(4));
assertThat(collection.getReadPreference(), is(ReadPreference.SECONDARY));
assertThat(collection.getDB().getOptions(), is(0));
return null;
}
@@ -793,6 +877,53 @@ public class MongoTemplateTests {
assertThat(result.getFirstName(), is("Carter"));
}
@Test
public void testWriteConcernResolver() {
PersonWithIdPropertyOfTypeObjectId person = new PersonWithIdPropertyOfTypeObjectId();
person.setId(new ObjectId());
person.setFirstName("Dave");
template.setWriteConcern(WriteConcern.NONE);
template.save(person);
WriteResult result = template.updateFirst(query(where("id").is(person.getId())), update("firstName", "Carter"),
PersonWithIdPropertyOfTypeObjectId.class);
WriteConcern lastWriteConcern = result.getLastConcern();
assertThat(lastWriteConcern, equalTo(WriteConcern.NONE));
FsyncSafeWriteConcernResolver resolver = new FsyncSafeWriteConcernResolver();
template.setWriteConcernResolver(resolver);
Query q = query(where("_id").is(person.getId()));
Update u = update("firstName", "Carter");
result = template.updateFirst(q, u, PersonWithIdPropertyOfTypeObjectId.class);
lastWriteConcern = result.getLastConcern();
assertThat(lastWriteConcern, equalTo(WriteConcern.FSYNC_SAFE));
MongoAction lastMongoAction = resolver.getMongoAction();
assertThat(lastMongoAction.getCollectionName(), is("personWithIdPropertyOfTypeObjectId"));
assertThat(lastMongoAction.getDefaultWriteConcern(), equalTo(WriteConcern.NONE));
assertThat(lastMongoAction.getDocument(), notNullValue());
assertThat(lastMongoAction.getEntityClass().toString(), is(PersonWithIdPropertyOfTypeObjectId.class.toString()));
assertThat(lastMongoAction.getMongoActionOperation(), is(MongoActionOperation.UPDATE));
assertThat(lastMongoAction.getQuery(), equalTo(q.getQueryObject()));
assertThat(lastMongoAction.getDocument(), equalTo(u.getUpdateObject()));
}
private class FsyncSafeWriteConcernResolver implements WriteConcernResolver {
private MongoAction mongoAction;
public WriteConcern resolve(MongoAction action) {
this.mongoAction = action;
return WriteConcern.FSYNC_SAFE;
}
public MongoAction getMongoAction() {
return mongoAction;
}
}
/**
* @see DATADOC-246
*/
@@ -827,7 +958,7 @@ public class MongoTemplateTests {
}
});
assertEquals(3, names.size());
//template.remove(new Query(), Person.class);
// template.remove(new Query(), Person.class);
}
/**
@@ -855,7 +986,7 @@ public class MongoTemplateTests {
});
assertEquals(1, names.size());
//template.remove(new Query(), Person.class);
// template.remove(new Query(), Person.class);
}
/**
@@ -863,19 +994,19 @@ public class MongoTemplateTests {
*/
@Test
public void countsDocumentsCorrectly() {
assertThat(template.count(new Query(), Person.class), is(0L));
Person dave = new Person("Dave");
Person carter = new Person("Carter");
template.save(dave);
template.save(carter);
assertThat(template.count(null, Person.class), is(2L));
assertThat(template.count(query(where("firstName").is("Carter")), Person.class), is(1L));
}
/**
* @see DATADOC-183
*/
@@ -883,7 +1014,7 @@ public class MongoTemplateTests {
public void countRejectsNullEntityClass() {
template.count(null, (Class<?>) null);
}
/**
* @see DATADOC-183
*/
@@ -891,7 +1022,7 @@ public class MongoTemplateTests {
public void countRejectsEmptyCollectionName() {
template.count(null, "");
}
/**
* @see DATADOC-183
*/
@@ -902,11 +1033,11 @@ public class MongoTemplateTests {
@Test
public void returnsEntityWhenQueryingForDateTime() {
DateTime dateTime = new DateTime(2011, 3, 3, 12, 0, 0, 0);
TestClass testClass = new TestClass(dateTime);
mappingTemplate.save(testClass);
List<TestClass> testClassList = mappingTemplate.find(new Query(Criteria.where("myDate").is(dateTime.toDate())),
TestClass.class);
assertThat(testClassList.size(), is(1));
@@ -918,18 +1049,43 @@ public class MongoTemplateTests {
*/
@Test
public void removesEntityFromCollection() {
template.remove(new Query(), "mycollection");
Person person = new Person("Dave");
template.save(person, "mycollection");
assertThat(template.findAll(TestClass.class, "mycollection").size(), is(1));
template.remove(person, "mycollection");
assertThat(template.findAll(Person.class, "mycollection").isEmpty(), is(true));
}
/**
* @see DATADOC-349
*/
@Test
public void removesEntityWithAnnotatedIdIfIdNeedsMassaging() {
String id = new ObjectId().toString();
Sample sample = new Sample();
sample.id = id;
template.save(sample);
assertThat(template.findOne(query(where("id").is(id)), Sample.class).id, is(id));
template.remove(sample);
assertThat(template.findOne(query(where("id").is(id)), Sample.class), is(nullValue()));
}
public class Sample {
@Id
String id;
}
public class TestClass {
private DateTime myDate;
@@ -954,7 +1110,7 @@ public class MongoTemplateTests {
}
static enum DateToDateTimeConverter implements Converter<Date, DateTime> {
INSTANCE;
public DateTime convert(Date source) {


@@ -15,21 +15,32 @@
*/
package org.springframework.data.mongodb.core;
import static org.junit.Assert.assertTrue;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.context.support.GenericApplicationContext;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
/**
* Unit tests for {@link MongoTemplate}.
*
@@ -46,9 +57,15 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Mock
DB db;
@Mock
DBCollection collection;
@Before
public void setUp() {
this.template = new MongoTemplate(mongo, "database");
when(mongo.getDB("database")).thenReturn(db);
when(db.getCollection(Mockito.any(String.class))).thenReturn(collection);
}
@Test(expected = IllegalArgumentException.class)
@@ -75,6 +92,62 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertTrue(ReflectionTestUtils.getField(template, "mongoConverter") instanceof MappingMongoConverter);
}
@Test(expected = InvalidDataAccessApiUsageException.class)
public void rejectsNotFoundMapReduceResource() {
template.setApplicationContext(new GenericApplicationContext());
template.mapReduce("foo", "classpath:doesNotExist.js", "function() {}", Person.class);
}
/**
* @see DATAMONGO-322
*/
@Test(expected = InvalidDataAccessApiUsageException.class)
public void rejectsEntityWithNullIdIfNotSupportedIdType() {
Object entity = new NotAutogenerateableId();
template.save(entity);
}
/**
* @see DATAMONGO-322
*/
@Test
public void storesEntityWithSetIdAlthoughNotAutogenerateable() {
NotAutogenerateableId entity = new NotAutogenerateableId();
entity.id = 1;
template.save(entity);
}
/**
* @see DATAMONGO-322
*/
@Test
public void autogeneratesIdForEntityWithAutogeneratableId() {
MongoTemplate template = spy(this.template);
doReturn(new ObjectId()).when(template).saveDBObject(Mockito.any(String.class), Mockito.any(DBObject.class),
Mockito.any(Class.class));
AutogenerateableId entity = new AutogenerateableId();
template.save(entity);
assertThat(entity.id, is(notNullValue()));
}
class AutogenerateableId {
@Id
BigInteger id;
}
class NotAutogenerateableId {
@Id
Integer id;
}
/**
* Mocks out the {@link MongoTemplate#getDb()} method to return the {@link DB} mock instead of executing the actual
* behaviour.


@@ -27,15 +27,28 @@ public class Person {
private Person friend;
private boolean active = true;
public Person() {
this.id = new ObjectId();
}
@Override
public String toString() {
return "Person [id=" + id + ", firstName=" + firstName + ", age=" + age + ", friend=" + friend + "]";
}
public Person(ObjectId id, String firstname) {
this.id = id;
this.firstName = firstname;
}
public Person(String firstname, int age) {
this();
this.firstName = firstname;
this.age = age;
}
public Person(String firstname) {
this();
this.firstName = firstname;
@@ -69,6 +82,13 @@ public class Person {
this.friend = friend;
}
/**
* @return the active
*/
public boolean isActive() {
return active;
}
/*
* (non-Javadoc)
*


@@ -0,0 +1,58 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.mockito.Mockito.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate.QueryCursorPreparer;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBCursor;
/**
* Unit tests for {@link QueryCursorPreparer}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryCursorPreparerUnitTests {
@Mock
MongoDbFactory factory;
@Mock
DBCursor cursor;
/**
* @see DATAMONGO-185
*/
@Test
public void appliesHintsCorrectly() {
Query query = query(where("foo").is("bar")).withHint("hint");
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query);
preparer.prepare(cursor);
verify(cursor).hint("hint");
}
}


@@ -0,0 +1,66 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.mongodb.core.mapping.Person;
/**
* Unit tests for {@link ConverterRegistration}.
*
* @author Oliver Gierke
*/
public class ConverterRegistrationUnitTests {
@Test
public void considersNotExplicitlyReadingDependingOnTypes() {
ConverterRegistration context = new ConverterRegistration(Person.class, String.class, false, false);
assertThat(context.isWriting(), is(true));
assertThat(context.isReading(), is(false));
context = new ConverterRegistration(String.class, Person.class, false, false);
assertThat(context.isWriting(), is(false));
assertThat(context.isReading(), is(true));
context = new ConverterRegistration(String.class, Class.class, false, false);
assertThat(context.isWriting(), is(true));
assertThat(context.isReading(), is(true));
}
@Test
public void forcesReadWriteOnlyIfAnnotated() {
ConverterRegistration context = new ConverterRegistration(String.class, Class.class, false, true);
assertThat(context.isWriting(), is(true));
assertThat(context.isReading(), is(false));
context = new ConverterRegistration(String.class, Class.class, true, false);
assertThat(context.isWriting(), is(false));
assertThat(context.isReading(), is(true));
}
@Test
public void considersConverterForReadAndWriteIfBothAnnotated() {
ConverterRegistration context = new ConverterRegistration(String.class, Class.class, true, true);
assertThat(context.isWriting(), is(true));
assertThat(context.isReading(), is(true));
}
}


@@ -12,6 +12,7 @@ import org.junit.Test;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import com.mongodb.DBRef;
@@ -40,8 +41,8 @@ public class CustomConversionsUnitTests {
@SuppressWarnings("unchecked")
public void considersSubtypesCorrectly() {
CustomConversions conversions = new CustomConversions(Arrays.asList(
NumberToStringConverter.INSTANCE, StringToNumberConverter.INSTANCE));
CustomConversions conversions = new CustomConversions(Arrays.asList(NumberToStringConverter.INSTANCE,
StringToNumberConverter.INSTANCE));
assertThat(conversions.getCustomWriteTarget(Long.class, null), is(typeCompatibleWith(String.class)));
assertThat(conversions.hasCustomReadTarget(String.class, Long.class), is(true));
@@ -55,30 +56,29 @@ public class CustomConversionsUnitTests {
}
/**
* @see DATADOC-240
* @see DATAMONGO-240
*/
@Test
public void considersObjectIdToBeSimpleType() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.isSimpleType(ObjectId.class), is(true));
assertThat(conversions.hasCustomWriteTarget(ObjectId.class), is(false));
}
/**
* @see DATADOC-240
* @see DATAMONGO-240
*/
@Test
public void considersCustomConverterForSimpleType() {
CustomConversions conversions = new CustomConversions(Arrays.asList(new Converter<ObjectId, String>() {
public String convert(ObjectId source) {
return source == null ? null : source.toString();
}
}));
assertThat(conversions.isSimpleType(ObjectId.class), is(true));
assertThat(conversions.hasCustomWriteTarget(ObjectId.class), is(true));
assertThat(conversions.hasCustomReadTarget(ObjectId.class, String.class), is(true));
@@ -87,33 +87,59 @@ public class CustomConversionsUnitTests {
@Test
public void considersDBRefsToBeSimpleTypes() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.isSimpleType(DBRef.class), is(true));
}
@Test
public void populatesConversionServiceCorrectly() {
@SuppressWarnings("deprecation")
GenericConversionService conversionService = ConversionServiceFactory.createDefaultConversionService();
assertThat(conversionService.canConvert(String.class, UUID.class), is(false));
CustomConversions conversions = new CustomConversions(
Arrays.asList(StringToUUIDConverter.INSTANCE));
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToUUIDConverter.INSTANCE));
conversions.registerConvertersIn(conversionService);
assertThat(conversionService.canConvert(String.class, UUID.class), is(true));
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void doesNotConsiderTypeSimpleIfOnlyReadConverterIsRegistered() {
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToUUIDConverter.INSTANCE));
assertThat(conversions.isSimpleType(UUID.class), is(false));
}
/**
* @see DATAMONGO-298
*/
@Test
public void discoversConvertersForSubtypesOfMongoTypes() {
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToIntegerConverter.INSTANCE));
assertThat(conversions.hasCustomReadTarget(String.class, Integer.class), is(true));
assertThat(conversions.hasCustomWriteTarget(String.class, Integer.class), is(true));
}
/**
* @see DATAMONGO-342
*/
@Test
public void doesNotHaveConverterForStringToBigIntegerByDefault() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.hasCustomWriteTarget(String.class), is(false));
assertThat(conversions.getCustomWriteTarget(String.class), is(nullValue()));
conversions = new CustomConversions(Arrays.asList(StringToBigIntegerConverter.INSTANCE));
assertThat(conversions.hasCustomWriteTarget(String.class), is(false));
assertThat(conversions.getCustomWriteTarget(String.class), is(nullValue()));
}
enum UuidToStringConverter implements Converter<UUID, String> {
INSTANCE;
@@ -142,4 +168,11 @@ public class CustomConversionsUnitTests {
return 0L;
}
}
enum StringToIntegerConverter implements Converter<String, Integer> {
INSTANCE;
public Integer convert(String source) {
return 0;
}
}
}


@@ -119,7 +119,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-130
* @see DATAMONGO-130
*/
@Test
public void writesMapTypeCorrectly() {
@@ -133,7 +133,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-130
* @see DATAMONGO-130
*/
@Test
public void readsMapWithCustomKeyTypeCorrectly() {
@@ -146,7 +146,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-128
* @see DATAMONGO-128
*/
@Test
public void usesDocumentsStoredTypeIfSubtypeOfRequest() {
@@ -159,7 +159,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-128
* @see DATAMONGO-128
*/
@Test
public void ignoresDocumentsStoredTypeIfCompletelyDifferentTypeRequested() {
@@ -185,7 +185,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-136
* @see DATAMONGO-136
*/
@Test
public void writesEnumsCorrectly() {
@@ -201,7 +201,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-209
* @see DATAMONGO-209
*/
@Test
public void writesEnumCollectionCorrectly() {
@@ -220,7 +220,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-136
* @see DATAMONGO-136
*/
@Test
public void readsEnumsCorrectly() {
@@ -231,7 +231,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-209
* @see DATAMONGO-209
*/
@Test
public void readsEnumCollectionsCorrectly() {
@@ -248,14 +248,14 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-144
* @see DATAMONGO-144
*/
@Test
public void considersFieldNameWhenWriting() {
Person person = new Person();
person.firstname ="Oliver";
person.firstname = "Oliver";
DBObject result = new BasicDBObject();
converter.write(person, result);
@@ -264,7 +264,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-144
* @see DATAMONGO-144
*/
@Test
public void considersFieldNameWhenReading() {
@@ -276,7 +276,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-145
* @see DATAMONGO-145
*/
@Test
public void writesCollectionWithInterfaceCorrectly() {
@@ -299,7 +299,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-145
* @see DATAMONGO-145
*/
@Test
public void readsCollectionWithInterfaceCorrectly() {
@@ -336,7 +336,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-161
* @see DATAMONGO-161
*/
@Test
public void readsNestedMapsCorrectly() {
@@ -363,7 +363,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATACMNS-42, DATADOC-171
* @see DATACMNS-42, DATAMONGO-171
*/
@Test
public void writesClassWithBigDecimal() {
@@ -381,7 +381,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATACMNS-42, DATADOC-171
* @see DATACMNS-42, DATAMONGO-171
*/
@Test
public void readsClassWithBigDecimal() {
@@ -417,7 +417,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-192
* @see DATAMONGO-192
*/
@Test
public void readsEmptySetsCorrectly() {
@@ -446,7 +446,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-207
* @see DATAMONGO-207
*/
@Test
public void convertsCustomEmptyMapCorrectly() {
@@ -461,7 +461,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-211
* @see DATAMONGO-211
*/
@Test
public void maybeConvertHandlesNullValuesCorrectly() {
@@ -495,7 +495,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-228
* @see DATAMONGO-228
*/
@Test
public void writesNullValuesForMaps() {
@@ -530,7 +530,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void writesMapOfListsCorrectly() {
@@ -554,7 +554,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void readsMapListValuesCorrectly() {
@@ -568,7 +568,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void writesMapsOfObjectsCorrectly() {
@@ -593,7 +593,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void readsMapOfObjectsListValuesCorrectly() {
@@ -607,7 +607,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-245
* @see DATAMONGO-245
*/
@Test
public void readsMapListNestedValuesCorrectly() {
@@ -621,12 +621,12 @@ public class MappingMongoConverterUnitTests {
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, source);
Object firstObjectInFoo = ((List<?>) result.mapOfObjects.get("Foo")).get(0);
assertThat(firstObjectInFoo, is(instanceOf(Map.class)));
assertThat((String)((Map<?,?>) firstObjectInFoo).get("Hello"), is(equalTo("World")));
assertThat((String) ((Map<?, ?>) firstObjectInFoo).get("Hello"), is(equalTo("World")));
}
/**
* @see DATADOC-245
* @see DATAMONGO-245
*/
@Test
public void readsMapDoublyNestedValuesCorrectly() {
@@ -646,7 +646,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-245
* @see DATAMONGO-245
*/
@Test
public void readsMapListDoublyNestedValuesCorrectly() {
@@ -668,7 +668,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void writesListOfMapsCorrectly() {
@@ -676,7 +676,7 @@ public class MappingMongoConverterUnitTests {
Map<String, Locale> map = Collections.singletonMap("Foo", Locale.ENGLISH);
CollectionWrapper wrapper = new CollectionWrapper();
wrapper.listOfMaps = new ArrayList<Map<String,Locale>>();
wrapper.listOfMaps = new ArrayList<Map<String, Locale>>();
wrapper.listOfMaps.add(map);
DBObject result = new BasicDBObject();
@@ -692,7 +692,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void readsListOfMapsCorrectly() {
@@ -713,12 +713,12 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void writesPlainMapOfCollectionsCorrectly() {
Map<String,List<Locale>> map = Collections.singletonMap("Foo", Arrays.asList(Locale.US));
Map<String, List<Locale>> map = Collections.singletonMap("Foo", Arrays.asList(Locale.US));
DBObject result = new BasicDBObject();
converter.write(map, result);
@@ -733,7 +733,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-285
* @see DATAMONGO-285
*/
@Test
@SuppressWarnings({ "unchecked", "rawtypes" })
@@ -760,6 +760,102 @@ public class MappingMongoConverterUnitTests {
assertEquals(list.get(1), listFromMongo.get(1));
}
/**
* @see DATAMONGO-309
*/
@Test
@SuppressWarnings({ "unchecked" })
public void writesArraysAsMapValuesCorrectly() {
ClassWithMapProperty wrapper = new ClassWithMapProperty();
wrapper.mapOfObjects = new HashMap<String, Object>();
wrapper.mapOfObjects.put("foo", new String[] { "bar" });
DBObject result = new BasicDBObject();
converter.write(wrapper, result);
Object mapObject = result.get("mapOfObjects");
assertThat(mapObject, is(BasicDBObject.class));
DBObject map = (DBObject) mapObject;
Object valueObject = map.get("foo");
assertThat(valueObject, is(BasicDBList.class));
List<Object> list = (List<Object>) valueObject;
assertThat(list.size(), is(1));
assertThat(list, hasItem((Object) "bar"));
}
/**
* @see DATAMONGO-324
*/
@Test
public void writesDbObjectCorrectly() {
DBObject dbObject = new BasicDBObject();
dbObject.put("foo", "bar");
DBObject result = new BasicDBObject();
converter.write(dbObject, result);
result.removeField(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY);
assertThat(dbObject, is(result));
}
/**
* @see DATAMONGO-324
*/
@Test
public void readsDbObjectCorrectly() {
DBObject dbObject = new BasicDBObject();
dbObject.put("foo", "bar");
DBObject result = converter.read(DBObject.class, dbObject);
assertThat(result, is(dbObject));
}
/**
* @see DATAMONGO-329
*/
@Test
public void writesMapAsGenericFieldCorrectly() {
Map<String, A<String>> objectToSave = new HashMap<String, A<String>>();
objectToSave.put("test", new A<String>("testValue"));
A<Map<String, A<String>>> a = new A<Map<String, A<String>>>(objectToSave);
DBObject result = new BasicDBObject();
converter.write(a, result);
assertThat((String) result.get(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY), is(A.class.getName()));
assertThat((String) result.get("valueType"), is(HashMap.class.getName()));
DBObject object = (DBObject) result.get("value");
assertThat(object, is(notNullValue()));
DBObject inner = (DBObject) object.get("test");
assertThat(inner, is(notNullValue()));
assertThat((String) inner.get(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY), is(A.class.getName()));
assertThat((String) inner.get("valueType"), is(String.class.getName()));
assertThat((String) inner.get("value"), is("testValue"));
}
@Test
public void writesIntIdCorrectly() {
ClassWithIntId value = new ClassWithIntId();
value.id = 5;
DBObject result = new BasicDBObject();
converter.write(value, result);
assertThat(result.get("_id"), is((Object) 5));
}
class GenericType<T> {
T content;
}
@@ -771,7 +867,19 @@ public class MappingMongoConverterUnitTests {
}
enum SampleEnum {
FIRST, SECOND;
FIRST {
@Override
void method() {
}
},
SECOND {
@Override
void method() {
}
};
abstract void method();
}
class Address {
@@ -779,6 +887,7 @@ public class MappingMongoConverterUnitTests {
String city;
}
interface Contact {
}
@@ -812,7 +921,7 @@ public class MappingMongoConverterUnitTests {
}
class ClassWithNestedMaps {
Map<String, Map<String, Map<String, String>>> nestedMaps;
Map<String, Map<String, Map<String, String>>> nestedMaps;
}
class BirthDateContainer {
@@ -839,6 +948,23 @@ public class MappingMongoConverterUnitTests {
@Id
BigInteger id;
}
class A<T> {
String valueType;
T value;
public A(T value) {
this.valueType = value.getClass().getName();
this.value = value;
}
}
class ClassWithIntId {
@Id
int id;
}
private class LocalDateToDateConverter implements Converter<LocalDate, Date> {

View File

@@ -18,8 +18,8 @@ package org.springframework.data.mongodb.core.geo;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.net.UnknownHostException;
import java.util.Collection;
@@ -73,7 +73,7 @@ public class GeoSpatialTests {
applicationContext = new AnnotationConfigApplicationContext(GeoSpatialAppConfig.class);
template = applicationContext.getBean(MongoTemplate.class);
template.setWriteConcern(WriteConcern.FSYNC_SAFE);
template.ensureIndex(new GeospatialIndex("location"), Venue.class);
template.indexOps(Venue.class).ensureIndex(new GeospatialIndex("location"));
indexCreated();
addVenues();
parser = new SpelExpressionParser();
@@ -134,7 +134,7 @@ public class GeoSpatialTests {
@Test
public void withinBox() {
Box box = new Box(new Point(-73.99756, 40.73083), new Point(-73.988135, 40.741404));
List<Venue> venues = template.find(query(where("location").within(box)), Venue.class);
assertThat(venues.size(), is(4));
@@ -171,23 +171,23 @@ public class GeoSpatialTests {
@Test
public void searchAllData() {
Venue foundVenue = template.findOne(query(where("name").is("Penn Station")), Venue.class);
assertThat(foundVenue, is(notNullValue()));
List<Venue> venues = template.findAll(Venue.class);
assertThat(venues.size(), is(12));
Collection<?> names = (Collection<?>) parser.parseExpression("![name]").getValue(venues);
assertThat(names.size(), is(12));
}
public void indexCreated() {
List<DBObject> indexInfo = getIndexInfo(Venue.class);
LOGGER.debug(indexInfo);
assertThat(indexInfo.size(), is(2));
assertThat(indexInfo.get(1).get("name").toString(), is("location_2d"));
assertThat(indexInfo.get(1).get("ns").toString(), is("database.newyork"));

View File

@@ -69,6 +69,7 @@ public class MappingTests {
MongoCollectionUtils.getPreferredCollectionName(PersonMultiDimArrays.class),
MongoCollectionUtils.getPreferredCollectionName(PersonMultiCollection.class),
MongoCollectionUtils.getPreferredCollectionName(PersonWithDbRef.class),
MongoCollectionUtils.getPreferredCollectionName(PersonWithLongDBRef.class),
MongoCollectionUtils.getPreferredCollectionName(PersonNullProperties.class),
MongoCollectionUtils.getPreferredCollectionName(Account.class),
MongoCollectionUtils.getPreferredCollectionName(PrimitiveId.class),
@@ -353,7 +354,31 @@ public class MappingTests {
Person p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("New Town"));
}
@Test
@SuppressWarnings("rawtypes")
public void testUpsert() {
Address addr = new Address();
addr.setLines(new String[] { "1234 W. 1st Street", "Apt. 12" });
addr.setCity("Anytown");
addr.setPostalCode(12345);
addr.setCountry("USA");
Person p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertNull(p2);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")), update("address", addr), Person.class);
p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("Anytown"));
template.dropCollection(Person.class);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")), update("address", addr), "person");
p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("Anytown"));
}
@Test
public void testOrQuery() {
PersonWithObjectId p1 = new PersonWithObjectId(1, "first", "");
@@ -361,8 +386,6 @@ public class MappingTests {
PersonWithObjectId p2 = new PersonWithObjectId(2, "second", "");
template.save(p2);
Query one = query(where("ssn").is(1));
Query two = query(where("ssn").is(2));
List<PersonWithObjectId> results = template.find(new Query(
new Criteria().orOperator(where("ssn").is(1), where("ssn").is(2))), PersonWithObjectId.class);

View File

@@ -32,7 +32,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
*/
public class AbstractMongoEventListenerUnitTest {
public class AbstractMongoEventListenerUnitTests {
@Test
public void invokesCallbackForEventForPerson() {
@@ -115,6 +115,17 @@ public class AbstractMongoEventListenerUnitTest {
assertThat(personListener.invokedOnAfterLoad, is(false));
assertThat(contactListener.invokedOnAfterLoad, is(true));
}
/**
* @see DATAMONGO-333
*/
@Test
@SuppressWarnings({ "rawtypes", "unchecked" })
public void handlesUntypedImplementations() {
UntypedEventListener listener = new UntypedEventListener();
listener.onApplicationEvent(new MongoMappingEvent(new Object(), new BasicDBObject()));
}
class SamplePersonEventListener extends AbstractMongoEventListener<Person> {
@@ -163,4 +174,9 @@ public class AbstractMongoEventListenerUnitTest {
invokedOnAfterLoad = true;
}
}
@SuppressWarnings("rawtypes")
class UntypedEventListener extends AbstractMongoEventListener {
}
}

View File

@@ -0,0 +1,73 @@
package org.springframework.data.mongodb.core.mapreduce;
public class ContentAndVersion {
private String id;
private String document_id;
private String content;
private String author;
private Long version;
private Long value;
public String getAuthor() {
return author;
}
public void setAuthor(String author) {
this.author = author;
}
public String getDocumentId() {
return document_id;
}
public Long getValue() {
return value;
}
public void setValue(Long value) {
this.value = value;
}
public void setDocumentId(String documentId) {
this.document_id = documentId;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getContent() {
return content;
}
public void setContent(String content) {
this.content = content;
}
public Long getVersion() {
return version;
}
public void setVersion(Long version) {
this.version = version;
}
@Override
public String toString() {
return "ContentAndVersion [id=" + id + ", document_id=" + document_id + ", content=" + content + ", author="
+ author + ", version=" + version + ", value=" + value + "]";
}
}

View File

@@ -0,0 +1,196 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import java.util.Arrays;
import java.util.HashSet;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.mapreduce.GroupBy.*;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class GroupByTests {
@Autowired
MongoDbFactory factory;
@Autowired
ApplicationContext applicationContext;
//@Autowired
//MongoTemplate mongoTemplate;
MongoTemplate mongoTemplate;
@Autowired
@SuppressWarnings("unchecked")
public void setMongo(Mongo mongo) throws Exception {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(XObject.class)));
mappingContext.afterPropertiesSet();
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.afterPropertiesSet();
this.mongoTemplate = new MongoTemplate(factory, mappingConverter);
mongoTemplate.setApplicationContext(applicationContext);
}
@Before
public void setUp() {
cleanDb();
}
@After
public void cleanUp() {
cleanDb();
}
protected void cleanDb() {
mongoTemplate.dropCollection(mongoTemplate.getCollectionName(XObject.class));
mongoTemplate.dropCollection("group_test_collection");
}
@Test
public void singleKeyCreation() {
DBObject gc = new GroupBy("a").getGroupByObject();
//String expected = "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
String expected = "{ \"key\" : { \"a\" : 1} , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@Test
public void multipleKeyCreation() {
DBObject gc = GroupBy.key("a", "b").getGroupByObject();
//String expected = "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1 , \"b\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
String expected = "{ \"key\" : { \"a\" : 1 , \"b\" : 1} , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@Test
public void keyFunctionCreation() {
DBObject gc = GroupBy.keyFunction("classpath:keyFunction.js").getGroupByObject();
String expected = "{ \"$keyf\" : \"classpath:keyFunction.js\" , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@Test
public void SimpleGroup() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.key("x").initialDocument(new BasicDBObject("count", 0)).reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithKeyFunction() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.keyFunction("function(doc) { return { x : doc.x }; }").initialDocument("{ count: 0 }").reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithFunctionsAsResources() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithQueryAndFunctionsAsResources() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group(where("x").gt(0),
"group_test_collection",
keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
assertMapReduceResults(results);
}
private void assertMapReduceResults(GroupByResults<XObject> results) {
DBObject dboRawResults = results.getRawResults();
String expected = "{ \"serverUsed\" : \"127.0.0.1:27017\" , \"retval\" : [ { \"x\" : 1.0 , \"count\" : 2.0} , { \"x\" : 2.0 , \"count\" : 1.0} , { \"x\" : 3.0 , \"count\" : 3.0}] , \"count\" : 6.0 , \"keys\" : 3 , \"ok\" : 1.0}";
Assert.assertEquals(expected, dboRawResults.toString());
int numResults = 0;
for (XObject xObject : results) {
if (xObject.getX() == 1) {
Assert.assertEquals(2, xObject.getCount(), 0.001);
}
if (xObject.getX() == 2) {
Assert.assertEquals(1, xObject.getCount(), 0.001);
}
if (xObject.getX() == 3) {
Assert.assertEquals(3, xObject.getCount(), 0.001);
}
numResults++;
}
Assert.assertEquals(3, numResults);
Assert.assertEquals(6, results.getCount(), 0.001);
Assert.assertEquals(3, results.getKeys());
}
private void createGroupByData() {
DBCollection c = mongoTemplate.getDb().getCollection("group_test_collection");
c.save(new BasicDBObject("x", 1));
c.save(new BasicDBObject("x", 1));
c.save(new BasicDBObject("x", 2));
c.save(new BasicDBObject("x", 3));
c.save(new BasicDBObject("x", 3));
c.save(new BasicDBObject("x", 3));
}
}
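The group() tests above count documents per distinct `x` value via the reduce function `function(doc, prev) { prev.count += 1 }`. As a minimal plain-Java sketch (not Spring Data or MongoDB driver code — the class and method names below are invented for illustration), the same aggregation over the data inserted by `createGroupByData()` looks like this:

```java
import java.util.*;

// Plain-Java sketch of what the group() reduce function computes:
// for each distinct key x, count how many documents carry that value.
class GroupCountSketch {

	static Map<Integer, Integer> countByKey(List<Integer> xs) {
		Map<Integer, Integer> counts = new LinkedHashMap<>();
		for (Integer x : xs) {
			// the equivalent of "prev.count += 1" for the bucket keyed by x
			counts.merge(x, 1, Integer::sum);
		}
		return counts;
	}

	public static void main(String[] args) {
		// Same data as createGroupByData(): x = 1, 1, 2, 3, 3, 3
		System.out.println(countByKey(Arrays.asList(1, 1, 2, 3, 3, 3))); // {1=2, 2=1, 3=3}
	}
}
```

This mirrors why `assertMapReduceResults(…)` expects counts of 2, 1, and 3 for keys 1, 2, and 3 respectively.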

View File

@@ -60,6 +60,7 @@ public class MapReduceTests {
MongoTemplate template;
@Autowired
MongoDbFactory factory;
MongoTemplate mongoTemplate;
@Autowired
@@ -87,6 +88,8 @@ public class MapReduceTests {
protected void cleanDb() {
template.dropCollection(template.getCollectionName(ValueObject.class));
template.dropCollection("jmr2");
template.dropCollection("jmr2_out");
template.dropCollection("jmr1_out");
template.dropCollection("jmr1");
}
@@ -95,11 +98,134 @@ public class MapReduceTests {
@Ignore
public void testForDocs() {
createMapReduceData();
MapReduceResults<ValueObject> results = mongoTemplate.mapReduce("jmr1", mapFunction, reduceFunction, ValueObject.class);
MapReduceResults<ValueObject> results = mongoTemplate.mapReduce("jmr1", mapFunction, reduceFunction,
ValueObject.class);
for (ValueObject valueObject : results) {
System.out.println(valueObject);
}
}
@Test
public void testIssue260() {
createContentAndVersionData();
String map = "function () { emit(this.document_id, this.version); }";
String reduce = "function (key, values) { return Math.max.apply(Math, values); }";
MapReduceResults<ContentAndVersion> results = mongoTemplate.mapReduce("jmr2", map, reduce,
new MapReduceOptions().outputCollection("jmr2_out"), ContentAndVersion.class);
int size = 0;
for (ContentAndVersion cv : results) {
if (cv.getId().equals("Resume")) {
assertEquals(6, cv.getValue().longValue());
}
if (cv.getId().equals("Schema")) {
assertEquals(2, cv.getValue().longValue());
}
if (cv.getId().equals("mongoDB How-To")) {
assertEquals(2, cv.getValue().longValue());
}
size++;
}
assertEquals(3, size);
}
@Test
public void testIssue260Part2() {
createNumberAndVersionData();
String map = "function () { emit(this.number, this.version); }";
String reduce = "function (key, values) { return Math.max.apply(Math, values); }";
MapReduceResults<NumberAndVersion> results =
mongoTemplate.mapReduce("jmr2", map, reduce, new MapReduceOptions().outputCollection("jmr2_out"), NumberAndVersion.class);
int size = 0;
for (NumberAndVersion nv : results) {
if (nv.getId().equals("1")) {
assertEquals(2, nv.getValue().longValue());
}
if (nv.getId().equals("2")) {
assertEquals(6, nv.getValue().longValue());
}
if (nv.getId().equals("3")) {
assertEquals(2, nv.getValue().longValue());
}
size++;
}
assertEquals(3, size);
}
private void createNumberAndVersionData() {
NumberAndVersion nv1 = new NumberAndVersion();
nv1.setNumber(1L);
nv1.setVersion(1L);
template.save(nv1, "jmr2");
NumberAndVersion nv2 = new NumberAndVersion();
nv2.setNumber(1L);
nv2.setVersion(2L);
template.save(nv2, "jmr2");
NumberAndVersion nv3 = new NumberAndVersion();
nv3.setNumber(2L);
nv3.setVersion(6L);
template.save(nv3, "jmr2");
NumberAndVersion nv4 = new NumberAndVersion();
nv4.setNumber(3L);
nv4.setVersion(1L);
template.save(nv4, "jmr2");
NumberAndVersion nv5 = new NumberAndVersion();
nv5.setNumber(3L);
nv5.setVersion(2L);
template.save(nv5, "jmr2");
}
private void createContentAndVersionData() {
/*
{ "_id" : 1, "document_id" : "mongoDB How-To", "author" : "Amos King", "content" : "...", "version" : 1 }
{ "_id" : 2, "document_id" : "mongoDB How-To", "author" : "Amos King", "content" : "...", "version" : 1.1 }
{ "_id" : 3, "document_id" : "Resume", "author" : "Author", "content" : "...", "version" : 6 }
{ "_id" : 4, "document_id" : "Schema", "author" : "Someone Else", "content" : "...", "version" : 0.9 }
{ "_id" : 5, "document_id" : "Schema", "author" : "Someone Else", "content" : "...", "version" : 1 }
*/
ContentAndVersion cv1 = new ContentAndVersion();
cv1.setDocumentId("mongoDB How-To");
cv1.setAuthor("Amos King");
cv1.setContent("...");
cv1.setVersion(1L);
template.save(cv1, "jmr2");
ContentAndVersion cv2 = new ContentAndVersion();
cv2.setDocumentId("mongoDB How-To");
cv2.setAuthor("Amos King");
cv2.setContent("...");
cv2.setVersion(2L);
template.save(cv2, "jmr2");
ContentAndVersion cv3 = new ContentAndVersion();
cv3.setDocumentId("Resume");
cv3.setAuthor("Author");
cv3.setContent("...");
cv3.setVersion(6L);
template.save(cv3, "jmr2");
ContentAndVersion cv4 = new ContentAndVersion();
cv4.setDocumentId("Schema");
cv4.setAuthor("Someone Else");
cv4.setContent("...");
cv4.setVersion(1L);
template.save(cv4, "jmr2");
ContentAndVersion cv5 = new ContentAndVersion();
cv5.setDocumentId("Schema");
cv5.setAuthor("Someone Else");
cv5.setContent("...");
cv5.setVersion(2L);
template.save(cv5, "jmr2");
}
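The `testIssue260` map/reduce pair above emits `(document_id, version)` and reduces with `Math.max.apply(Math, values)`, i.e. it keeps the highest version per document. A minimal plain-Java sketch of that semantics, applied to the versions saved by `createContentAndVersionData()` (the class and method names here are invented for illustration, not driver API):

```java
import java.util.*;

// Plain-Java sketch of the reduce step: maximum emitted version per key.
class MaxVersionSketch {

	static Map<String, Long> maxVersionPerDocument(Map<String, List<Long>> emitted) {
		Map<String, Long> result = new LinkedHashMap<>();
		// reduce: "function (key, values) { return Math.max.apply(Math, values); }"
		emitted.forEach((docId, versions) -> result.put(docId, Collections.max(versions)));
		return result;
	}

	public static void main(String[] args) {
		// Versions saved by createContentAndVersionData(), keyed by document_id
		Map<String, List<Long>> emitted = new LinkedHashMap<>();
		emitted.put("mongoDB How-To", Arrays.asList(1L, 2L));
		emitted.put("Resume", Arrays.asList(6L));
		emitted.put("Schema", Arrays.asList(1L, 2L));
		System.out.println(maxVersionPerDocument(emitted));
		// {mongoDB How-To=2, Resume=6, Schema=2}
	}
}
```

This is why the test asserts 6 for "Resume" and 2 for both "Schema" and "mongoDB How-To".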
@Test
public void testMapReduce() {
performMapReduce(false, false);
@@ -109,7 +235,7 @@ public class MapReduceTests {
public void testMapReduceInline() {
performMapReduce(true, false);
}
@Test
public void testMapReduceWithQuery() {
performMapReduce(false, true);
@@ -121,9 +247,9 @@ public class MapReduceTests {
Map<String, Object> scopeVariables = new HashMap<String, Object>();
scopeVariables.put("exclude", "a");
String mapWithExcludeFunction = "function(){ for ( var i=0; i<this.x.length; i++ ){ if(this.x[i] != exclude) emit( this.x[i] , 1 ); } }";
String mapWithExcludeFunction = "function(){ for ( var i=0; i<this.x.length; i++ ){ if(this.x[i] != exclude) emit( this.x[i] , 1 ); } }";
MapReduceResults<ValueObject> results = mongoTemplate.mapReduce("jmr1", mapWithExcludeFunction, reduceFunction,
new MapReduceOptions().scopeVariables(scopeVariables).outputTypeInline(), ValueObject.class);
Map<String, Float> m = copyToMap(results);
@@ -132,37 +258,40 @@ public class MapReduceTests {
assertEquals(2, m.get("c").intValue());
assertEquals(1, m.get("d").intValue());
}
@Test
@Test
public void testMapReduceExcludeQuery() {
createMapReduceData();
Query query = new Query(where("x").ne(new String[] { "a", "b" }));
MapReduceResults<ValueObject> results = mongoTemplate.mapReduce(query, "jmr1", mapFunction, reduceFunction, ValueObject.class);
MapReduceResults<ValueObject> results = mongoTemplate.mapReduce(query, "jmr1", mapFunction, reduceFunction,
ValueObject.class);
Map<String, Float> m = copyToMap(results);
assertEquals(3, m.size());
assertEquals(1, m.get("b").intValue());
assertEquals(2, m.get("c").intValue());
assertEquals(1, m.get("d").intValue());
}
private void performMapReduce(boolean inline, boolean withQuery) {
createMapReduceData();
MapReduceResults<ValueObject> results;
if (inline) {
if (withQuery) {
results = mongoTemplate.mapReduce(new Query(), "jmr1", "classpath:map.js", "classpath:reduce.js", ValueObject.class);
results = mongoTemplate.mapReduce(new Query(), "jmr1", "classpath:map.js", "classpath:reduce.js",
ValueObject.class);
} else {
results = mongoTemplate.mapReduce("jmr1", mapFunction, reduceFunction, ValueObject.class);
}
} else {
if (withQuery) {
results = mongoTemplate.mapReduce(new Query(), "jmr1", mapFunction, reduceFunction, options().outputCollection("jmr1_out"), ValueObject.class);
results = mongoTemplate.mapReduce(new Query(), "jmr1", mapFunction, reduceFunction,
options().outputCollection("jmr1_out"), ValueObject.class);
} else {
results = mongoTemplate.mapReduce("jmr1", mapFunction, reduceFunction, new MapReduceOptions().outputCollection("jmr1_out"), ValueObject.class);
results = mongoTemplate.mapReduce("jmr1", mapFunction, reduceFunction,
new MapReduceOptions().outputCollection("jmr1_out"), ValueObject.class);
}
}
Map<String, Float> m = copyToMap(results);

View File

@@ -0,0 +1,40 @@
package org.springframework.data.mongodb.core.mapreduce;
public class NumberAndVersion {
private String id;
private Long number;
private Long version;
private Long value;
public Long getValue() {
return value;
}
public void setValue(Long value) {
this.value = value;
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public Long getNumber() {
return number;
}
public void setNumber(Long number) {
this.number = number;
}
public Long getVersion() {
return version;
}
public void setVersion(Long version) {
this.version = version;
}
@Override
public String toString() {
return "NumberAndVersion [id=" + id + ", number=" + number + ", version=" + version + ", value=" + value + "]";
}
}

View File

@@ -0,0 +1,35 @@
package org.springframework.data.mongodb.core.mapreduce;
public class XObject {
private float x;
private float count;
public float getX() {
return x;
}
public void setX(float x) {
this.x = x;
}
public float getCount() {
return count;
}
public void setCount(float count) {
this.count = count;
}
@Override
public String toString() {
return "XObject [x=" + x + ", count=" + count + "]";
}
}

View File

@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.core.query;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.math.BigInteger;
@@ -33,6 +35,7 @@ import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -59,7 +62,7 @@ public class QueryMapperUnitTests {
MappingMongoConverter converter = new MappingMongoConverter(factory, context);
converter.afterPropertiesSet();
mapper = new QueryMapper(converter.getConversionService());
mapper = new QueryMapper(converter);
}
@Test
@@ -89,13 +92,22 @@ public class QueryMapperUnitTests {
assertThat(result.get("_id"), is((Object) "1"));
}
@Test
public void handlesObjectIdCapableBigIntegerIdsCorrectly() {
ObjectId id = new ObjectId();
DBObject dbObject = new BasicDBObject("id", new BigInteger(id.toString(), 16));
DBObject result = mapper.getMappedObject(dbObject, null);
assertThat(result.get("_id"), is((Object) id));
}
/**
* @see DATADOC-278
* @see DATAMONGO-278
*/
@Test
public void translates$NeCorrectly() {
Criteria criteria = Criteria.where("foo").ne(new ObjectId().toString());
Criteria criteria = where("foo").ne(new ObjectId().toString());
DBObject result = mapper.getMappedObject(criteria.getCriteriaObject(), context.getPersistentEntity(Sample.class));
Object object = result.get("_id");
@@ -104,6 +116,50 @@ public class QueryMapperUnitTests {
assertThat(dbObject.get("$ne"), is(ObjectId.class));
}
/**
* @see DATAMONGO-326
*/
@Test
public void handlesEnumsCorrectly() {
Query query = query(where("foo").is(Enum.INSTANCE));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(String.class));
}
@Test
public void handlesEnumsInNotEqualCorrectly() {
Query query = query(where("foo").ne(Enum.INSTANCE));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(DBObject.class));
Object ne = ((DBObject) object).get("$ne");
assertThat(ne, is(String.class));
assertThat(ne.toString(), is(Enum.INSTANCE.name()));
}
@Test
public void handlesEnumsIn$InCorrectly() {
Query query = query(where("foo").in(Enum.INSTANCE));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(DBObject.class));
Object in = ((DBObject) object).get("$in");
assertThat(in, is(BasicDBList.class));
BasicDBList list = (BasicDBList) in;
assertThat(list.size(), is(1));
assertThat(list.get(0), is(String.class));
assertThat(list.get(0).toString(), is(Enum.INSTANCE.name()));
}
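The three enum tests above verify that the QueryMapper renders enum query values as plain strings — the constant's name — whether they appear directly, under `$ne`, or inside an `$in` list. A minimal sketch of that conversion rule, assuming only what the tests assert (the `toMongoValue` helper below is hypothetical, not the mapper's actual API):

```java
// Sketch of the enum handling the QueryMapper tests verify: enums are not
// a native BSON type, so a query value that is an enum is stored as name().
class EnumValueSketch {

	enum Status { ACTIVE, DONE }

	// Hypothetical helper illustrating the conversion rule only.
	static Object toMongoValue(Object value) {
		if (value instanceof Enum) {
			return ((Enum<?>) value).name();
		}
		return value;
	}

	public static void main(String[] args) {
		System.out.println(toMongoValue(Status.ACTIVE)); // ACTIVE
		System.out.println(toMongoValue(42));            // 42
	}
}
```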
class Sample {
@Id
@@ -115,4 +171,8 @@ public class QueryMapperUnitTests {
@Id
private BigInteger id;
}
enum Enum {
INSTANCE;
}
}

View File

@@ -365,4 +365,22 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
Metrics.KILOMETERS));
assertThat(results.getContent().isEmpty(), is(false));
}
/**
* @see DATAMONGO-323
*/
@Test
public void considersSortForAnnotatedQuery() {
List<Person> result = repository.findByAgeLessThan(60, new Sort("firstname"));
assertThat(result.size(), is(7));
assertThat(result.get(0), is(alicia));
assertThat(result.get(1), is(boyd));
assertThat(result.get(2), is(carter));
assertThat(result.get(3), is(dave));
assertThat(result.get(4), is(leroi));
assertThat(result.get(5), is(oliver));
assertThat(result.get(6), is(stefan));
}
}

View File

@@ -72,6 +72,9 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
List<Person> findByFirstnameLike(String firstname);
List<Person> findByFirstnameLikeOrderByLastnameAsc(String firstname, Sort sort);
@Query("{'age' : { '$lt' : ?0 } }")
List<Person> findByAgeLessThan(int age, Sort sort);
/**
* Returns a page of {@link Person}s with a lastname matching the given one (*-wildcards supported).

View File

@@ -3,9 +3,12 @@ package org.springframework.data.mongodb.repository.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionReader;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.repository.AbstractPersonRepositoryIntegrationTests;
import org.springframework.test.context.ContextConfiguration;
@@ -18,10 +21,21 @@ import org.springframework.test.context.ContextConfiguration;
@ContextConfiguration
public class MongoNamespaceIntegrationTests extends AbstractPersonRepositoryIntegrationTests {
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Before
@Override
public void setUp() {
super.setUp();
factory = new DefaultListableBeanFactory();
reader = new XmlBeanDefinitionReader(factory);
}
@Test
public void assertDefaultMappingContextIsWired() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("MongoNamespaceIntegrationTests-context.xml",
reader.loadBeanDefinitions(new ClassPathResource("MongoNamespaceIntegrationTests-context.xml",
getClass()));
BeanDefinition definition = factory.getBeanDefinition("personRepository");
assertThat(definition, is(notNullValue()));

View File

@@ -0,0 +1,79 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import static org.mockito.Mockito.*;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import java.util.Arrays;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.BasicDBList;
/**
* Unit tests for {@link ConvertingParameterAccessor}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class ConvertingParameterAccessorUnitTests {
@Mock
MongoDbFactory factory;
@Mock
MongoParameterAccessor accessor;
MongoMappingContext context;
MappingMongoConverter converter;
@Before
public void setUp() {
context = new MongoMappingContext();
converter = new MappingMongoConverter(factory, context);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullWriter() {
new MappingMongoConverter(null, context);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullContext() {
new MappingMongoConverter(factory, null);
}
@Test
public void convertsCollectionUponAccess() {
when(accessor.getBindableValue(0)).thenReturn(Arrays.asList("Foo"));
ConvertingParameterAccessor parameterAccessor = new ConvertingParameterAccessor(converter, accessor);
Object result = parameterAccessor.getBindableValue(0);
BasicDBList reference = new BasicDBList();
reference.add("Foo");
assertThat(result, is((Object) reference));
}
}
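The test above shows the core idea of `ConvertingParameterAccessor`: it decorates a raw `MongoParameterAccessor` and converts each bound value to its store representation on access, turning a `List` into a `BasicDBList` element by element. A minimal standalone sketch of that decorator pattern, using illustrative names (`RawAccessor`, `ConvertingAccessor`) rather than the actual Spring Data types:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;

// Illustrative sketch only -- not the real Spring Data classes. A wrapper
// converts parameter values as they are read, mirroring how
// ConvertingParameterAccessor converts a List into a BasicDBList.
public class ConvertingAccessorSketch {

    public interface RawAccessor {
        Object getBindableValue(int index);
    }

    public static class ConvertingAccessor implements RawAccessor {
        private final RawAccessor delegate;
        private final Function<Object, Object> converter;

        public ConvertingAccessor(RawAccessor delegate, Function<Object, Object> converter) {
            this.delegate = delegate;
            this.converter = converter;
        }

        @Override
        public Object getBindableValue(int index) {
            Object value = delegate.getBindableValue(index);
            if (value instanceof List) {
                // Collections are converted element-wise before being handed out.
                List<Object> converted = new ArrayList<>();
                for (Object element : (List<?>) value) {
                    converted.add(converter.apply(element));
                }
                return converted;
            }
            return converter.apply(value);
        }
    }

    public static void main(String[] args) {
        RawAccessor raw = index -> Arrays.asList("Foo");
        RawAccessor converting = new ConvertingAccessor(raw, String::valueOf);
        System.out.println(converting.getBindableValue(0)); // [Foo]
    }
}
```

The real implementation delegates the per-element conversion to the `MappingMongoConverter` mocked in the test; the decorator shape is the same.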


@@ -30,7 +30,6 @@ import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
@@ -50,9 +49,6 @@ import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.DefaultRepositoryMetadata;
import org.springframework.data.repository.query.parser.PartTree;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit test for {@link MongoQueryCreator}.
*
@@ -70,16 +66,14 @@ public class MongoQueryCreatorUnitTests {
@Before
public void setUp() throws SecurityException, NoSuchMethodException {
context = new MongoMappingContext();
doAnswer(new Answer<Void>() {
public Void answer(InvocationOnMock invocation) throws Throwable {
DBObject dbObject = (DBObject) invocation.getArguments()[1];
dbObject.put("value", new BasicDBObject("value", "value"));
return null;
doAnswer(new Answer<Object>() {
public Object answer(InvocationOnMock invocation) throws Throwable {
return invocation.getArguments()[0];
}
}).when(converter).write(any(), Mockito.any(DBObject.class));
}).when(converter).convertToMongoType(any());
}
@Test
@@ -157,18 +151,66 @@ public class MongoQueryCreatorUnitTests {
}
/**
* DATADOC-291
* @see DATAMONGO-291
*/
@Test
public void honoursMappingInformationForPropertyPaths() {
PartTree partTree = new PartTree("findByUsername", User.class);
MongoQueryCreator creator = new MongoQueryCreator(partTree, getAccessor(converter, "Oliver"), context);
Query reference = query(where("foo").is("Oliver"));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsExistsClauseCorrectly() {
PartTree tree = new PartTree("findByAgeExists", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, true), context);
Query query = query(where("age").exists(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsRegexClauseCorrectly() {
PartTree tree = new PartTree("findByFirstNameRegex", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, ".*"), context);
Query query = query(where("firstName").regex(".*"));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsTrueClauseCorrectly() {
PartTree tree = new PartTree("findByActiveTrue", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsFalseClauseCorrectly() {
PartTree tree = new PartTree("findByActiveFalse", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(false));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
private void assertBindsDistanceToQuery(Point point, Distance distance, Query reference) throws Exception {
when(converter.convertToMongoType("Dave")).thenReturn("Dave");
@@ -191,9 +233,9 @@ public class MongoQueryCreatorUnitTests {
List<Person> findByLocationNearAndFirstname(Point location, Distance maxDistance, String firstname);
}
class User {
@Field("foo")
String username;
}
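The new DATAMONGO-338 tests above pin down how derived query method names translate into MongoDB criteria: an `Exists` suffix becomes `$exists`, `Regex` becomes `$regex`, and `True`/`False` bind the literal boolean without consuming a parameter. A toy illustration of that keyword-to-criteria mapping (plain maps standing in for the real `PartTree`/`MongoQueryCreator` machinery):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy sketch, not the actual query-derivation code: maps a property,
// a derived-query keyword, and an argument onto a criteria document.
public class KeywordMappingSketch {

    public static Map<String, Object> criteriaFor(String property, String keyword, Object argument) {
        Map<String, Object> criteria = new LinkedHashMap<>();
        switch (keyword) {
            case "Exists":
                // findByAgeExists(true) -> { "age" : { "$exists" : true } }
                criteria.put(property, Map.of("$exists", argument));
                break;
            case "Regex":
                // findByFirstNameRegex(".*") -> { "firstName" : { "$regex" : ".*" } }
                criteria.put(property, Map.of("$regex", argument));
                break;
            case "True":
                // findByActiveTrue() -> { "active" : true }, no argument consumed
                criteria.put(property, true);
                break;
            case "False":
                criteria.put(property, false);
                break;
            default:
                // No keyword: simple equality, as in findByUsername(...)
                criteria.put(property, argument);
        }
        return criteria;
    }

    public static void main(String[] args) {
        System.out.println(criteriaFor("age", "Exists", true)); // {age={$exists=true}}
        System.out.println(criteriaFor("active", "True", null)); // {active=true}
    }
}
```

Note also the `honoursMappingInformationForPropertyPaths` test: before a criterion is emitted, the property name is resolved against the mapping context, so `username` annotated with `@Field("foo")` ends up keyed as `foo`.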


@@ -30,10 +30,8 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.support.MappingMongoEntityInformation;
/**
* Unit test for {@link MongoRepositoryFactory}.
@@ -45,7 +43,7 @@ public class MongoRepositoryFactoryUnitTests {
@Mock
MongoTemplate template;
@Mock
MongoConverter converter;
@@ -55,7 +53,7 @@ public class MongoRepositoryFactoryUnitTests {
@Mock
@SuppressWarnings("rawtypes")
MongoPersistentEntity entity;
@Before
@SuppressWarnings({ "rawtypes", "unchecked" })
public void setUp() {
@@ -63,12 +61,6 @@ public class MongoRepositoryFactoryUnitTests {
when(converter.getMappingContext()).thenReturn((MappingContext) mappingContext);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsInvalidIdType() throws Exception {
MongoRepositoryFactory factory = new MongoRepositoryFactory(template);
factory.getRepository(SampleRepository.class);
}
@Test
@SuppressWarnings("unchecked")
public void usesMappingMongoEntityInformationIfMappingContextSet() {
@@ -80,8 +72,4 @@ public class MongoRepositoryFactoryUnitTests {
MongoEntityInformation<Person, Serializable> entityInformation = factory.getEntityInformation(Person.class);
assertTrue(entityInformation instanceof MappingMongoEntityInformation);
}
private interface SampleRepository extends MongoRepository<Person, Long> {
}
}


@@ -18,11 +18,21 @@ package org.springframework.data.mongodb.repository.support;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.QPerson;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository.SpringDataMongodbSerializer;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mysema.query.types.path.StringPath;
/**
@@ -30,10 +40,20 @@ import com.mysema.query.types.path.StringPath;
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class SpringDataMongodbSerializerUnitTests {
MongoMappingContext context = new MongoMappingContext();
SpringDataMongodbSerializer serializer = new QueryDslMongoRepository.SpringDataMongodbSerializer(context);
@Mock
MongoDbFactory dbFactory;
MongoConverter converter;
SpringDataMongodbSerializer serializer;
@Before
public void setUp() {
MongoMappingContext context = new MongoMappingContext();
converter = new MappingMongoConverter(dbFactory, context);
serializer = new QueryDslMongoRepository.SpringDataMongodbSerializer(converter);
}
@Test
public void uses_idAsKeyForIdProperty() {
@@ -47,4 +67,29 @@ public class SpringDataMongodbSerializerUnitTests {
StringPath path = QPerson.person.address.street;
assertThat(serializer.getKeyForPath(path, path.getMetadata()), is("street"));
}
@Test
public void convertsComplexObjectOnSerializing() {
Address address = new Address();
address.street = "Foo";
address.zipCode = "01234";
DBObject result = serializer.asDBObject("foo", address);
assertThat(result, is(BasicDBObject.class));
BasicDBObject dbObject = (BasicDBObject) result;
Object value = dbObject.get("foo");
assertThat(value, is(notNullValue()));
assertThat(value, is(BasicDBObject.class));
Object reference = converter.convertToMongoType(address);
assertThat(value, is(reference));
}
class Address {
String street;
@Field("zip_code")
String zipCode;
}
}


@@ -0,0 +1 @@
function(doc, prev) { prev.count += 1 }


@@ -0,0 +1 @@
function(doc) { return { x : doc.x }; }
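The two one-line JavaScript resources above are server-side functions for MongoDB's group/map-reduce support: the first is a reduce function with the `group()` signature `(doc, prev)` that increments a running `count`, the second projects a grouping key from `doc.x`. Their combined effect can be simulated locally; a sketch, assuming documents of the shape `{ x : ... }` implied by the key function:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Local simulation of MongoDB's group() with the reduce function
// `function(doc, prev) { prev.count += 1 }` and grouping key `doc.x`.
// Plain Java maps stand in for BSON documents.
public class GroupReduceSketch {

    public static Map<Integer, Integer> countByX(List<Map<String, Integer>> docs) {
        Map<Integer, Integer> counts = new LinkedHashMap<>();
        for (Map<String, Integer> doc : docs) {
            // reduce(doc, prev): prev.count += 1 for the group keyed by doc.x
            counts.merge(doc.get("x"), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map<String, Integer>> docs = Arrays.asList(
                Map.of("x", 1), Map.of("x", 1), Map.of("x", 2));
        System.out.println(countByX(docs)); // {1=2, 2=1}
    }
}
```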


@@ -5,7 +5,9 @@
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:mapping-converter id="converter" db-factory-ref="factory" />
<mongo:mapping-converter id="converter" db-factory-ref="factory">
<mongo:custom-converters base-package="org.springframework.data.mongodb.config" />
</mongo:mapping-converter>
<mongo:db-factory id="factory" />


@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="first" mongo-ref="mongo" write-concern="rack1" />
<mongo:mongo id="mongo">
<mongo:options max-auto-connect-retry-time="27" />
</mongo:mongo>
<mongo:db-factory id="second" write-concern="REPLICAS_SAFE" />
<!-- now part of the namespace
<bean class="org.springframework.beans.factory.config.CustomEditorConfigurer">
<property name="customEditors">
<map>
<entry key="com.mongodb.WriteConcern" value="org.springframework.data.mongodb.config.WriteConcernPropertyEditor"/>
</map>
</property>
</bean>
-->
</beans>


@@ -2,12 +2,23 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="first" write-concern="SAFE" mongo-ref="mongo" />
<mongo:db-factory id="first" mongo-ref="mongo" write-concern="SAFE" />
<mongo:mongo id="mongo">
<mongo:options max-auto-connect-retry-time="27" />
</mongo:mongo>
</beans>
<!-- now part of the namespace
<bean class="org.springframework.beans.factory.config.CustomEditorConfigurer">
<property name="customEditors">
<map>
<entry key="com.mongodb.WriteConcern" value="org.springframework.data.mongodb.config.WriteConcernPropertyEditor"/>
</map>
</property>
</bean>
-->
</beans>


@@ -6,5 +6,7 @@
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:mongo id="mongo" host="localhost" port="42" write-concern="SAFE" />
<mongo:mongo id="mongo2" replica-set="127.0.0.1:4711,127.0.0.1:4712" />
</beans>


@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory uri="mongodb://localhost/database.myCollection"/>
</beans>
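The `uri` form of `<mongo:db-factory>` shown above packs host and database into a single connection string instead of separate `host`/`port`/`dbname` attributes. How such a URI decomposes can be seen with plain `java.net.URI` (a sketch only; the driver's own URI class handles Mongo-specific parts such as credentials and replica-set host lists):

```java
import java.net.URI;

// Decomposing a MongoDB connection string with the generic URI parser.
public class MongoUriSketch {
    public static void main(String[] args) {
        URI uri = URI.create("mongodb://localhost/database.myCollection");
        System.out.println(uri.getScheme()); // mongodb
        System.out.println(uri.getHost());   // localhost
        System.out.println(uri.getPath());   // /database.myCollection
    }
}
```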


@@ -0,0 +1,19 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<mongo:db-factory id="mongoDbFactory"
host="localhost"
port="27117"
dbname="database"/>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
</bean>
</beans>


@@ -7,6 +7,11 @@
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<context:property-placeholder location="classpath:replicaSet.properties"/>
<mongo:mongo id="manyReplicaSetMongo" replica-set="${mongo.hosts}"/>
<mongo:mongo id="replicaSetMongo" replica-set="127.0.0.1:10001,localhost:10002"/>
<!--


@@ -18,7 +18,7 @@
</property>
</bean>
<mongo:repositories base-package="org.springframework.data.**.repository">
<mongo:repositories base-package="org.springframework.data.mongodb.repository">
<repository:exclude-filter type="regex" expression=".*MongoRepository"/>
</mongo:repositories>


@@ -0,0 +1 @@
mongo.hosts=192.168.174.130:27017,192.168.174.130:27018,192.168.174.130:27019


@@ -7,10 +7,7 @@ Import-Package:
Export-Template:
org.springframework.data.mongodb.*;version="${project.version}"
Import-Template:
org.springframework.*;version="${org.springframework.version:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${data.commons.version:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo.version:[=.=,+1.0.0)}",
com.mongodb.*;version="0",
com.mysema.query.*;version="[2.1.1, 3.0.0)";resolution:=optional,
javax.annotation.processing.*;version="0",
javax.tools.*;version="0",
@@ -18,4 +15,7 @@ Import-Template:
org.apache.commons.collections15.*;version="[4.0.0,5.0.0)";resolution:=optional,
org.apache.commons.logging.*;version="[1.1.1, 2.0.0)",
org.bson.*;version="0",
org.springframework.*;version="${org.springframework.version.30:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${data.commons.version:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"


@@ -52,7 +52,7 @@
<xi:include href="introduction/why-sd-doc.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/master/src/docbkx/repositories.xml">
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/1.2.0.RELEASE/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -72,7 +72,7 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/master/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.2.0.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
</part>


@@ -69,13 +69,14 @@
<section>
<title>How the '_id' field is handled in the mapping layer</title>
<para>Mongo requires that you have an '_id' field for all documents. If
you don't provide one the driver will assign a ObjectId with a generated
value. The "_id" field can be of any type the, other than arrays, so
long as it is unique. The driver naturally supports all primitive types
and Dates. When using the <classname>MongoMappingConverter</classname>
there are certain rules that govern how properties from the Java class
is mapped to this '_id' field.</para>
<para>MongoDB requires that you have an '_id' field for all documents.
If you don't provide one, the driver will assign an ObjectId with a
generated value. The "_id" field can be of any type, other than
arrays, so long as it is unique. The driver naturally supports all
primitive types and Dates. When using the
<classname>MongoMappingConverter</classname> there are certain rules
that govern how properties from the Java class are mapped to this '_id'
field.</para>
<para>The following outlines what field will be mapped to the '_id'
document field:</para>
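The naming side of the rule described above can be mirrored in a few lines: a Java property named `id` is stored under Mongo's reserved `_id` key, while all other properties keep their names (or the name given via `@Field`). This standalone helper only illustrates the renaming; the real logic lives in `MongoMappingConverter`:

```java
// Sketch of the '_id' naming rule only -- not the actual converter code.
public class IdMappingSketch {

    public static String documentKeyFor(String propertyName) {
        // The mapping layer stores the Java "id" property as Mongo's "_id".
        return "id".equals(propertyName) ? "_id" : propertyName;
    }

    public static void main(String[] args) {
        System.out.println(documentKeyFor("id"));        // _id
        System.out.println(documentKeyFor("firstName")); // firstName
    }
}
```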
@@ -212,7 +213,7 @@ public class GeoSpatialAppConfig extends AbstractMongoConfiguration {
getUserCredentials()</literal> to provide the username and password
information to connect to the database.</para>
<para>Spring's Mongo namespace enables you to easily enable mapping
<para>Spring's MongoDB namespace lets you easily enable mapping
functionality in XML</para>
<example>
@@ -370,11 +371,18 @@ public class Person {
Language statement to transform a key's value retrieved in the
database before it is used to construct a domain object.</para>
</listitem>
<listitem>
<para><literal>@Field</literal> - applied at the field level;
describes the name of the field as it will be represented in the
MongoDB BSON document, thus allowing the name to be different from
the field name of the class.</para>
</listitem>
</itemizedlist>
<para>The mapping metadata infrastructure is defined in a separate
spring-data-commons project that is technology agnostic. Specific
subclasses are using in the Mongo support to support annotation based
subclasses are used in the MongoDB support to enable annotation-based
metadata. Other strategies are also possible to put in place if there is
demand.</para>
@@ -388,16 +396,24 @@ public class Person&lt;T extends Address&gt; {
@Id
private String id;
@Indexed(unique = true)
private Integer ssn;
@Field("fName")
private String firstName;
@Indexed
private String lastName;
private Integer age;
@Transient
private Integer accountTotal;
@DBRef
private List&lt;Account&gt; accounts;
private T address;

Some files were not shown because too many files have changed in this diff.