Compare commits

58 Commits

Author SHA1 Message Date
Oliver Gierke
8d18729898 DATAMONGO-340 - Prepare 1.0.0.RC1. 2011-12-07 01:09:58 +01:00
Oliver Gierke
b5e0b2bec2 DATAMONGO-340 - Polished reference documentation.
Added section ids to generate stable URLs for HTML documentation.
2011-12-07 01:09:58 +01:00
Oliver Gierke
9eb47827c1 DATAMONGO-337 - Added Criteria.nin(…) and ….all(…) taking a Collection. 2011-12-07 01:09:57 +01:00
Oliver Gierke
f97ab25411 DATAMONGO-338 - Updated reference documentation regarding new keywords. 2011-12-07 01:09:57 +01:00
Oliver Gierke
6616761f50 DATAMONGO-322 - MongoTemplate refuses to save entities with an unset id if it is not auto-generatable.
If an entity is handed to the template to be saved or inserted, we now check that the auto-generated ObjectId can actually be applied to the id property after saving the object.
2011-12-07 01:09:57 +01:00
Mark Pollack
89de566893 Add findAndModify to docs, update test to include findAndModify with upsert 2011-12-06 13:04:24 -05:00
Mark Pollack
ea1f090b40 Add docs for index ops and clean up 'bare' references to Mongo, change to MongoDB 2011-12-06 11:09:23 -05:00
Oliver Gierke
b5958fb5cc DATAMONGO-338 - Query parser implementation for Regex, Exists, True and False keywords. 2011-12-06 17:01:56 +01:00
Oliver Gierke
75b7aff80a DATAMONGO-318 - Don't throw exceptions for updates not affecting any documents.
Throwing an exception if an update does not affect any documents doesn't make sense in all cases. Removed throwing an exception by default but made the relevant method (handleAnyWriteResultErrors(…)) protected so that subclasses might override this behavior.
2011-12-06 15:15:13 +01:00
Oliver Gierke
7da0fcdd0c DATAMONGO-199 - Fixed bug in CachingMongoPersistentProperty. 2011-12-06 14:48:25 +01:00
Oliver Gierke
c88b6d89db DATAMONGO-251 - Polishing.
JavaDoc, Formatting. Made dependencies in DefaultIndexOperations final. Reduced dependency to MongoOperations instead of depending on MongoTemplate directly. Added not-null assertion to constructor of DIO.
2011-12-06 14:33:45 +01:00
Oliver Gierke
de1540aadc DATAMONGO-234 - Polishing.
Removed unused imports, corrected whitespace, formatting.
2011-12-06 14:24:51 +01:00
Oliver Gierke
d1b24d6cfb DATAMONGO-332 - Updated reference documentation to list correct dependencies.
Fixed formatting of log output along the way.
2011-12-06 14:06:59 +01:00
Mark Pollack
e85f3d3299 DATAMONGO-251 - Support getting index information on a collection or mapped class. 2011-12-06 02:25:13 -05:00
Mark Pollack
ef6e08d3f4 DATAMONGO-234 - MongoTemplate should support the findAndModify operation to update version fields 2011-12-06 00:26:18 -05:00
Oliver Gierke
21010fbd49 DATACMNS-91 - Reject null parameters in SimpleMongoRepository.
According to the specification in CrudRepository we now reject null values for ids and entities in CRUD methods.
2011-12-02 13:34:44 +01:00
Oliver Gierke
4325d6c9fa Reactivated accidentally disabled unit tests. 2011-12-02 13:02:57 +01:00
Oliver Gierke
bc16ccfded DATACMNS-77 - Using constants from ClassTypeInformation inside MappingMongoConverter. 2011-12-02 11:33:37 +01:00
Oliver Gierke
04f5f9f662 DATACMNS-103 - Adapt changes in BeanWrapper.
Removed obsolete exception handling code.
2011-12-02 10:08:48 +01:00
Oliver Gierke
b1f1b8efaa DATAMONGO-321 - Overhaul of id handling.
Cleaned up the id handling on query mapping and mapping in general. We now only try to convert id values into an ObjectId and store it as is using potentially registered custom converters. Register BigInteger<->String converters by default now.
2011-12-01 17:50:36 +01:00
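The BigInteger<->String round trip registered by the DATAMONGO-321 commit can be illustrated with a plain-Java sketch. Note this is an illustration only: the helper methods and the hex radix are assumptions chosen to match ObjectId-style hex strings, not the actual registered Converter classes.

```java
import java.math.BigInteger;

// Sketch: round-tripping a BigInteger id through its String form, the kind of
// conversion the newly registered default converters perform. Names and radix
// are illustrative; the real converters live in Spring Data's conversion layer.
public class IdRoundTrip {

    static String write(BigInteger id) {
        return id.toString(16);
    }

    static BigInteger read(String stored) {
        return new BigInteger(stored, 16);
    }

    public static void main(String[] args) {
        BigInteger id = new BigInteger("4d89341a8ef397e06940d5cd", 16);
        System.out.println(read(write(id)).equals(id)); // true
    }
}
```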
Oliver Gierke
de300e2643 DATAMONGO-328 - Set required MongoDB version to 0.
The MANIFEST.MF in current MongoDB driver version is broken in terms of not stating package versions. Thus we unfortunately cannot refer to a particular version range but have to use the generic 0 as required version.
2011-12-01 17:05:28 +01:00
Oliver Gierke
20088b83d9 Removed compiler warnings. 2011-12-01 16:47:40 +01:00
Oliver Gierke
58f200f15e DATAMONGO-335 - Set up hybrid Spring 3.1/3.0.6 build.
Also see DATACMNS-103.
2011-12-01 16:06:38 +01:00
Oliver Gierke
8718700249 DATAMONGO-334 - Switched to use http://repo.springsource.org as repository.
Fixed versions of build plugins along the way.
2011-12-01 16:05:08 +01:00
Oliver Gierke
f4063d1679 DATAMONGO-333 - Default to Object for AbstractMongoEventListener domain type.
In case an extension of AbstractMongoEventListener does not define a parameter type, we now default to Object as the handled domain type, as we would otherwise cause a NullPointerException.
2011-12-01 12:16:27 +01:00
Oliver Gierke
ef063613c7 DATAMONGO-325 - MongoTemplate now correctly rejects map-reduce JavaScript files that cannot be found.
We now check whether a URL was passed in as map and/or reduce function and throw an exception in case the file either does not exist or cannot be read.
2011-11-30 22:53:56 +01:00
Oliver Gierke
2eda0f1701 DATAMONGO-185 - Expose hints on Query.
Query now exposes a withHint(…) method which will be applied to the DBCursor on query execution. Reduced CursorPreparer's visibility to the package and removed methods exposing it from MongoOperations.
2011-11-30 22:29:59 +01:00
Oliver Gierke
ec7b65e21d DATAMONGO-331 - Fixed typo in WriteConcern enumeration for db-factory element. 2011-11-30 18:27:33 +01:00
Oliver Gierke
c7f7571f3f DATAMONGO-326 - QueryMapper now delegates type conversion to MongoConverter.
QueryMapper now delegates to a MongoConverter instead of a plain ConversionService and invokes optional conversion on it. This optional conversion now removes type information from the created DBObject.
2011-11-30 17:56:44 +01:00
Oliver Gierke
9f71af42e8 DATAMONGO-329 - Fixed Collection and Map value handling for more open properties.
The decision whether a property value was handled as Collection or Map was based on inspecting the property's type which failed for classes using very open property declarations such as:

class MyClass {

  Object something;
}

We now rather inspect the value type instead of the property.
2011-11-30 16:20:25 +01:00
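The failure mode the DATAMONGO-329 commit describes is easy to reproduce in isolation: for a property declared as Object, the declared type carries no information, so only the runtime value reveals the Collection or Map. A minimal standalone sketch (class and field names are illustrative):

```java
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

// Sketch: with an 'Object something' property, inspecting the declared type
// cannot tell a Map from anything else; inspecting the value can.
public class OpenProperty {

    static class MyClass {
        Object something;
    }

    public static void main(String[] args) throws Exception {
        MyClass instance = new MyClass();
        instance.something = new HashMap<String, String>();

        Field field = MyClass.class.getDeclaredField("something");
        System.out.println(field.getType() == Object.class);   // true: declared type is useless here
        System.out.println(instance.something instanceof Map); // true: only the value reveals the Map
    }
}
```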
Oliver Gierke
92775170e1 DATAMONGO-301 - Allow classpath-scanning for Converters.
<mongo:custom-conversions /> now has a base-package attribute that scans for Converter and GenericConverter beans. Added <tool:exports /> metadata for MappingMongoConverter.
2011-11-30 15:26:18 +01:00
Oliver Gierke
4c7e338770 Adapt refactorings in SD Commons. 2011-11-28 18:07:33 +01:00
Oliver Gierke
e3fff52d17 DATAMONGO-298 - CustomConversions now also considers sub-types of Number as simple.
CustomConversions now delegates to MongoSimpleTypes.HOLDER.isSimpleType(…) instead of maintaining an additional list of Mongo-primitive types. Added DBObject to the list of Mongo-primitive types.
2011-11-24 15:20:49 +01:00
Oliver Gierke
5477ab20b2 DATAMONGO-324 - Added shortcut in MappingMongoConverter to allow reading DBObjects without conversion.
Added check in MappingMongoConverter.read(…) to shortcut object conversion if the requested type is DBObject.
2011-11-24 12:59:47 +01:00
Oliver Gierke
4cf3567f42 DATAMONGO-310 - MappingMongoConverter now creates native Mongo types for Maps and Collections in convertToMongoType(…).
MappingMongoConverter.convertToMongoType(…) not only converts elements of collections and maps but also converts the wrapper into the appropriate MongoDB type (BasicDBList, BasicDBObject).
2011-11-23 13:15:50 +01:00
Oliver Gierke
b26bb62a63 DATAMONGO-305 - Removed synchronization from Query class.
As Query is not intended to be thread-safe at all, we can safely remove the synchronized blocks from sort() and fields().
2011-11-23 12:48:39 +01:00
Oliver Gierke
f156d7b5af DATAMONGO-312 - MappingMongoConverter handles complex enum types correctly.
If an Enum implements abstract methods, the Class object derived from ${ENUM}.getClass() does not return true for ….isEnum(). Thus we rather check Enum.class.isAssignableFrom(…), as this catches this scenario as well. Also see DATACMNS-99 for a related fix in simple type handling in the core infrastructure.
2011-11-23 11:58:09 +01:00
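The observation in the DATAMONGO-312 commit can be reproduced with any enum that has constant-specific method bodies; this standalone sketch (names invented for illustration) shows why the isAssignableFrom check is needed:

```java
// An enum constant that overrides an abstract method is compiled to an
// anonymous subclass of the enum type, for which Class.isEnum() returns
// false even though the value is still an Enum.
public class EnumTypeCheck {

    enum Op {
        PLUS {
            int apply(int a, int b) { return a + b; }
        };

        abstract int apply(int a, int b);
    }

    public static void main(String[] args) {
        Class<?> type = Op.PLUS.getClass();
        System.out.println(type.isEnum());                     // false
        System.out.println(Enum.class.isAssignableFrom(type)); // true
    }
}
```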
Oliver Gierke
7bf3643902 DATAMONGO-304 - Removed document subpackage from Log4jAppender module. 2011-11-23 11:07:47 +01:00
Oliver Gierke
201ae3e92d Polished unit test. 2011-11-23 10:58:35 +01:00
Oliver Gierke
07556ec58c DATAMONGO-323 - Annotated repository queries consider dynamic sort now.
A Sort parameter handed into a repository query method is now also applied to string-based (i.e. @Query-annotated) queries.
2011-11-23 10:58:22 +01:00
Oliver Gierke
39807b17e1 DATAMONGO-309 - MappingMongoConverter now correctly maps Arrays as Map values.
We now not only convert collection values of Maps into BasicDBLists but arrays as well.
2011-11-23 10:28:40 +01:00
Oliver Gierke
4913fe26ac DATAMONGO-296 - Hook into Querydsl serialization to get predicate parameters converted.
Overrode MongoDbSerializer.asDBObject(…) and delegate to our MongoConverter to potentially convert predicate parameters. Upgraded to Querydsl 2.2.5.
2011-11-23 10:15:46 +01:00
Oliver Gierke
7642a719ff Polishing.
Removed unused imports, removed compiler warnings, polished JavaDoc.
2011-11-23 09:26:19 +01:00
Oliver Gierke
6c1ce576a4 DATACMNS-98 - Reflect refactoring.
MongoQueryMethod now uses RepositoryMetadata.getReturnedDomainType(…) instead of the static method of ClassUtils.
2011-11-21 19:21:22 +01:00
Mark Pollack
c99882201d DATAMONGO-234 - MongoTemplate should support the findAndModify operation to update version fields 2011-11-17 17:38:54 -05:00
Mark Pollack
ce6a64e4a9 DATAMONGO-308 - Add support for upsert methods 2011-11-16 16:25:58 -05:00
Mark Pollack
2fcc323bcd DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods 2011-11-16 14:55:35 -05:00
Mark Pollack
17c7b1d2b5 DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods 2011-11-16 14:30:56 -05:00
Mark Pollack
cfefe46cd4 DATAMONGO-213 - Add WriteConcern to arguments of MongoOperations.update*() methods
DATAMONGO-320 - Remove use of slaveOk boolean option in MongoTemplate as it is deprecated. Replace with ReadPreference
2011-11-16 13:28:13 -05:00
Mark Pollack
64921ddad1 DATAMONGO-319 - WriteConcern not parsed correctly in namespace handlers
DATAMONGO-311 - Update MongoDB driver to v 2.7.x
2011-11-15 16:54:02 -05:00
Mark Pollack
edda1764fe DATAMONGO-311 - Update MongoDB driver to v 2.7.x
Still investigating write_concern compatibility as mentioned in the ticket
2011-11-15 12:44:25 -05:00
Mark Pollack
8113b79109 DATAMONGO-315 - MongoTemplate.findOne(query) methods ignore SortOrder on query 2011-11-14 23:26:16 -05:00
Mark Pollack
9fde4dff3e DATAMONGO-195 - Add description of @Field mapping annotation to reference docs 2011-11-14 22:53:08 -05:00
Mark Pollack
d4b3e2b99d DATAMONGO-306 - NullPointerException if Mongo factory is created via URI without credentials 2011-11-14 22:48:07 -05:00
Mark Pollack
68a31d75f3 DATAMONGO-313 [Refactoring] - Use MongoOperations interface instead of MongoTemplate class 2011-11-14 22:20:50 -05:00
Mark Pollack
3e15c21419 DATAMONGO-208 - Add support for group() operation on collection in MongoOperations 2011-11-14 22:08:29 -05:00
Mark Pollack
e9f253d34f DATAMONGO-316 - Replica Set configuration via properties file throws ArrayIndexOutOfBoundsException 2011-11-14 16:38:18 -05:00
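DATAMONGO-316 above concerns splitting a comma-delimited replica-set string into host:port pairs, where an entry missing its ":" lets an unchecked index access throw ArrayIndexOutOfBoundsException. A hypothetical sketch of the failure mode and a guarded fix (the helper names are invented, not the actual parser code):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: parsing "host1:port1,host2:port2" the way a naive
// namespace parser would. Reading hostAndPort[1] without checking the array
// length is what turns a malformed entry into ArrayIndexOutOfBoundsException.
public class ReplicaSetParsing {

    static List<String[]> parse(String replicaSet) {
        List<String[]> result = new ArrayList<String[]>();
        for (String entry : replicaSet.split(",")) {
            String[] hostAndPort = entry.trim().split(":");
            if (hostAndPort.length != 2) {
                // Guarded: fail with a descriptive message instead of AIOOBE.
                throw new IllegalArgumentException("Expected host:port but got: " + entry);
            }
            result.add(hostAndPort);
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(parse("localhost:27017,node2:27018").size()); // 2
    }
}
```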
Thomas Risberg
80aa057acb Preparing for snapshot builds 2011-11-04 09:24:45 -04:00
92 changed files with 3722 additions and 920 deletions

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongo-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.0.0.M5</version>
<version>1.0.0.RC1</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<version>1.0.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
@@ -23,7 +23,7 @@
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>

@@ -5,7 +5,7 @@ and connects directly to the MongoDB server using the driver. It has no dependen
To use it, configure a host, port, (optionally) applicationId, and database property in your Log4J configuration:
log4j.appender.stdout=org.springframework.data.document.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost
@@ -32,7 +32,7 @@ An example log entry might look like:
{
"_id" : ObjectId("4d89341a8ef397e06940d5cd"),
"applicationId" : "my.application",
"name" : "org.springframework.data.document.mongodb.log4j.AppenderTest",
"name" : "org.springframework.data.mongodb.log4j.AppenderTest",
"level" : "DEBUG",
"timestamp" : ISODate("2011-03-23T16:53:46.778Z"),
"properties" : {

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<version>1.0.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>

@@ -14,7 +14,7 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.log4j;
package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
import java.util.Arrays;

@@ -14,7 +14,7 @@
* limitations under the License.
*/
package org.springframework.data.document.mongodb.log4j;
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;

@@ -1,6 +1,6 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.springframework.data.document.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost

@@ -6,7 +6,7 @@
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.0.M5</version>
<version>1.0.0.RC1</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
@@ -16,8 +16,10 @@
<org.mockito.version>1.8.4</org.mockito.version>
<org.slf4j.version>1.5.10</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version>3.0.6.RELEASE</org.springframework.version>
<data.commons.version>1.2.0.M2</data.commons.version>
<org.springframework.version.30>3.0.6.RELEASE</org.springframework.version.30>
<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
<data.commons.version>1.2.0.BUILD-SNAPSHOT</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<profiles>
@@ -92,42 +94,42 @@
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aop</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-expression</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework.version}</version>
<version>${org.springframework.version.range}</version>
<scope>test</scope>
</dependency>
@@ -289,6 +291,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
@@ -308,19 +311,18 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<configuration>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>
</includes>
<excludes>
<exclude>**/Abstract*.java</exclude>
</excludes>
<junitArtifactName>junit:junit</junitArtifactName>
</configuration>
</plugin>
<plugin>
<artifactId>maven-source-plugin</artifactId>
<version>2.1.2</version>
<executions>
<execution>
<id>attach-sources</id>
@@ -375,24 +377,24 @@
<pluginRepository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
<url>http://repo.springsource.org/release</url>
</pluginRepository>
</pluginRepositories>
<repositories>
<repository>
<id>repository.springframework.maven.release</id>
<name>Spring Framework Maven Release Repository</name>
<url>http://maven.springframework.org/release</url>
<url>http://repo.springsource.org/release</url>
</repository>
<repository>
<id>repository.springframework.maven.milestone</id>
<name>Spring Framework Maven Milestone Repository</name>
<url>http://maven.springframework.org/milestone</url>
<url>http://repo.springsource.org/milestone</url>
</repository>
<repository>
<id>repository.springframework.maven.snapshot</id>
<name>Spring Framework Maven Snapshot Repository</name>
<url>http://maven.springframework.org/snapshot</url>
<url>http://repo.springsource.org/snapshot</url>
</repository>
</repositories>
<reporting>

@@ -5,15 +5,15 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<version>1.0.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.6.5</mongo.version>
<querydsl.version>2.2.4</querydsl.version>
<mongo.version>2.7.1</mongo.version>
<querydsl.version>2.2.5</querydsl.version>
</properties>
<dependencies>

@@ -18,6 +18,9 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
@@ -36,13 +39,20 @@ import org.springframework.beans.factory.support.ManagedSet;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.type.classreading.MetadataReader;
import org.springframework.core.type.classreading.MetadataReaderFactory;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -133,15 +143,29 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
if (customConvertersElements.size() == 1) {
Element customerConvertersElement = customConvertersElements.get(0);
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<BeanMetadataElement>();
List<Element> converterElements = DomUtils.getChildElementsByTagName(customerConvertersElement, "converter");
if (converterElements != null) {
for (Element listenerElement : converterElements) {
converterBeans.add(parseConverter(listenerElement, parserContext));
}
}
// Scan for Converter and GenericConverter beans in the given base-package
String packageToScan = customerConvertersElement.getAttribute(BASE_PACKAGE);
if (StringUtils.hasText(packageToScan)) {
ClassPathScanningCandidateComponentProvider provider = new ClassPathScanningCandidateComponentProvider(true);
provider.addExcludeFilter(new NegatingFilter(new AssignableTypeFilter(Converter.class), new AssignableTypeFilter(
GenericConverter.class)));
for (BeanDefinition candidate : provider.findCandidateComponents(packageToScan)) {
converterBeans.add(candidate);
}
}
BeanDefinitionBuilder conversionsBuilder = BeanDefinitionBuilder.rootBeanDefinition(CustomConversions.class);
conversionsBuilder.addConstructorArgValue(converterBeans);
@@ -194,4 +218,39 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
"Element <converter> must specify 'ref' or contain a bean definition for the converter", element);
return null;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*
* @author Oliver Gierke
*/
private static class NegatingFilter implements TypeFilter {
private final Set<TypeFilter> delegates;
/**
* Creates a new {@link NegatingFilter} with the given delegates.
*
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {
Assert.notNull(filters);
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
}
/*
* (non-Javadoc)
* @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
*/
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory) throws IOException {
for (TypeFilter delegate : delegates) {
if (delegate.match(metadataReader, metadataReaderFactory)) {
return false;
}
}
return true;
}
}
}

@@ -15,15 +15,20 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import static org.springframework.data.mongodb.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.BeanNames.DB_FACTORY;
import static org.springframework.data.mongodb.config.ParsingUtils.getSourceBeanDefinition;
import java.util.Map;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
@@ -60,7 +65,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
@@ -85,6 +90,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
if (userCredentials != null) {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
//Register property editor to parse WriteConcern
registerWriteConcernPropertyEditor(parserContext.getRegistry());
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
@@ -106,6 +115,14 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return BeanDefinitionReaderUtils.registerWithGeneratedName(mongoBuilder.getBeanDefinition(),
parserContext.getRegistry());
}
private void registerWriteConcernPropertyEditor(BeanDefinitionRegistry registry) {
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.WriteConcern", "org.springframework.data.mongodb.config.WriteConcernPropertyEditor");
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
/**
* Returns a {@link BeanDefinition} for a {@link UserCredentials} object.

@@ -16,9 +16,15 @@
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoFactoryBean;
@@ -26,7 +32,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for &lt;mongo;gt; definitions. If no name
* Parser for &lt;mongo;gt; definitions.
*
* @author Mark Pollack
*/
@@ -39,17 +45,40 @@ public class MongoParser extends AbstractSingleBeanDefinitionParser {
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
super.doParse(element, builder);
ParsingUtils.setPropertyValue(element, builder, "port", "port");
ParsingUtils.setPropertyValue(element, builder, "host", "host");
ParsingUtils.setPropertyValue(element, builder, "write-concern", "writeConcern");
ParsingUtils.parseMongoOptions(parserContext, element, builder);
ParsingUtils.parseReplicaSet(parserContext, element, builder);
ParsingUtils.parseMongoOptions(element, builder);
ParsingUtils.parseReplicaSet(element, builder);
registerServerAddressPropertyEditor(parserContext.getRegistry());
registerWriteConcernPropertyEditor(parserContext.getRegistry());
}
/**
* One should only register one bean definition but want to have the convenience of using AbstractSingleBeanDefinitionParser but have the side effect of
* registering a 'default' property editor with the container.
* @param parserContext the ParserContext to
*/
private void registerServerAddressPropertyEditor(BeanDefinitionRegistry registry) {
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("java.util.List", "org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
private void registerWriteConcernPropertyEditor(BeanDefinitionRegistry registry) {
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.WriteConcern", "org.springframework.data.mongodb.config.WriteConcernPropertyEditor");
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {

@@ -19,15 +19,12 @@ package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
import com.mongodb.ServerAddress;
abstract class ParsingUtils {
/**
@@ -38,22 +35,11 @@ abstract class ParsingUtils {
* @param mongoBuilder the bean definition builder to populate
* @return true if parsing actually occured, false otherwise
*/
static boolean parseReplicaSet(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
static boolean parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
String replicaSetString = element.getAttribute("replica-set");
if (StringUtils.hasText(replicaSetString)) {
ManagedList<Object> serverAddresses = new ManagedList<Object>();
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
for (String element2 : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(element2, ":");
BeanDefinitionBuilder defBuilder = BeanDefinitionBuilder.genericBeanDefinition(ServerAddress.class);
defBuilder.addConstructorArgValue(hostAndPort[0]);
defBuilder.addConstructorArgValue(hostAndPort[1]);
serverAddresses.add(defBuilder.getBeanDefinition());
}
if (!serverAddresses.isEmpty()) {
mongoBuilder.addPropertyValue("replicaSetSeeds", serverAddresses);
}
mongoBuilder.addPropertyValue("replicaSetSeeds", replicaSetString);
}
return true;
@@ -64,7 +50,7 @@ abstract class ParsingUtils {
*
* @return true if parsing actually occurred, false otherwise
*/
static boolean parseMongoOptions(ParserContext parserContext, Element element, BeanDefinitionBuilder mongoBuilder) {
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;


@@ -0,0 +1,53 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.List;
import org.springframework.util.StringUtils;
import com.mongodb.ServerAddress;
/**
* Parse a string to a List<ServerAddress>. The format is host1:port1,host2:port2,host3:port3.
* @author Mark Pollack
*
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
/**
* Parse a string to a List<ServerAddress>
*/
public void setAsText(String replicaSetString) {
List<ServerAddress> serverAddresses = new ArrayList<ServerAddress>();
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
for (String element2 : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(element2, ":");
try {
serverAddresses.add(new ServerAddress(hostAndPort[0], Integer.parseInt(hostAndPort[1])));
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Could not parse port " + hostAndPort[1], e);
} catch (UnknownHostException e) {
throw new IllegalArgumentException("Could not parse host " + hostAndPort[0], e);
}
}
setValue(serverAddresses);
}
}
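The property editor's parsing can be exercised in isolation. The sketch below reproduces the same comma/colon splitting with plain JDK classes; the `HostAndPort` holder is a stand-in for `com.mongodb.ServerAddress` so the example runs without the driver on the classpath, and is not part of the library:

```java
import java.util.ArrayList;
import java.util.List;

public class ReplicaSetParseDemo {

	// Stand-in for com.mongodb.ServerAddress so the sketch runs without the driver.
	static final class HostAndPort {
		final String host;
		final int port;

		HostAndPort(String host, int port) {
			this.host = host;
			this.port = port;
		}
	}

	// Mirrors ServerAddressPropertyEditor.setAsText: split the comma-delimited
	// list, then split each entry on ':' into host and port.
	static List<HostAndPort> parse(String replicaSetString) {
		List<HostAndPort> addresses = new ArrayList<HostAndPort>();
		for (String entry : replicaSetString.split(",")) {
			String[] hostAndPort = entry.trim().split(":");
			try {
				addresses.add(new HostAndPort(hostAndPort[0], Integer.parseInt(hostAndPort[1])));
			} catch (NumberFormatException e) {
				throw new IllegalArgumentException("Could not parse port " + hostAndPort[1], e);
			}
		}
		return addresses;
	}

	public static void main(String[] args) {
		List<HostAndPort> seeds = parse("host1:27017,host2:27018");
		System.out.println(seeds.get(0).host + ":" + seeds.get(0).port); // prints host1:27017
	}
}
```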


@@ -0,0 +1,48 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.WriteConcern;
/**
* Parse a string to a {@link WriteConcern}. If it is a well-known {@link String} as identified by
* {@link WriteConcern#valueOf(String)}, use the well-known {@link WriteConcern} value; otherwise pass the string as-is
* to the constructor of the write concern. No other constructor signatures are supported when parsing from a
* string value.
*
* @author Mark Pollack
*/
public class WriteConcernPropertyEditor extends PropertyEditorSupport {
/**
* Parse a string to a {@link WriteConcern}.
*/
@Override
public void setAsText(String writeConcernString) {
WriteConcern writeConcern = WriteConcern.valueOf(writeConcernString);
if (writeConcern != null) {
// have a well known string
setValue(writeConcern);
} else {
// pass on the string to the constructor
setValue(new WriteConcern(writeConcernString));
}
}
}
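The editor implements a common "named constant, else raw value" fallback. A driver-free sketch of that pattern follows; the lookup table is purely illustrative, standing in for `WriteConcern.valueOf(String)` from the MongoDB driver:

```java
import java.util.HashMap;
import java.util.Map;

public class NamedConstantFallbackDemo {

	// Illustrative table; in WriteConcernPropertyEditor the well-known values
	// come from WriteConcern.valueOf(String) in the MongoDB driver.
	static final Map<String, String> WELL_KNOWN = new HashMap<String, String>();
	static {
		WELL_KNOWN.put("SAFE", "w=1");
		WELL_KNOWN.put("NONE", "w=-1");
	}

	// Same shape as setAsText: use the well-known value when the name matches,
	// otherwise fall back to treating the string as a raw mode (here, echoed).
	static String resolve(String text) {
		String known = WELL_KNOWN.get(text);
		return known != null ? known : text;
	}

	public static void main(String[] args) {
		System.out.println(resolve("SAFE"));        // prints w=1
		System.out.println(resolve("myShardMode")); // prints myShardMode
	}
}
```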


@@ -22,7 +22,7 @@ import com.mongodb.DBCursor;
*
* @author Oliver Gierke
*/
public interface CursorPreparer {
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.


@@ -0,0 +1,162 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* Default implementation of {@link IndexOperations}.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class DefaultIndexOperations implements IndexOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(collectionName, "Collection name must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.dropIndex(name);
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.resetIndexCache();
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
}
@SuppressWarnings("unchecked")
private List<IndexInfo> getIndexData(List<DBObject> dbObjectList) {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
Map<String, Order> keyOrderMap = new LinkedHashMap<String, Order>();
DBObject keyDbObject = (DBObject) ix.get("key");
Iterator<?> entries = keyDbObject.toMap().entrySet().iterator();
while (entries.hasNext()) {
Entry<Object, Integer> thisEntry = (Entry<Object, Integer>) entries.next();
String key = thisEntry.getKey().toString();
int value = thisEntry.getValue();
if (value == 1) {
keyOrderMap.put(key, Order.ASCENDING);
} else {
keyOrderMap.put(key, Order.DESCENDING);
}
}
String name = ix.get("name").toString();
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(keyOrderMap, name, unique, dropDuplicates, sparse));
}
return indexInfoList;
}
});
}
}
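The `getIndexData` loop above encodes two conventions worth calling out: a raw key value of 1 means ascending (anything else descending), and optional boolean flags such as `unique` default to false when absent. A driver-free sketch of the same logic, using plain `Map`s in place of `DBObject`:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class IndexInfoParseDemo {

	enum Order { ASCENDING, DESCENDING }

	// Mirrors the key handling: 1 maps to ASCENDING, everything else to DESCENDING.
	static Map<String, Order> toKeyOrder(Map<String, Integer> rawKeys) {
		Map<String, Order> keyOrder = new LinkedHashMap<String, Order>();
		for (Map.Entry<String, Integer> entry : rawKeys.entrySet()) {
			keyOrder.put(entry.getKey(), entry.getValue() == 1 ? Order.ASCENDING : Order.DESCENDING);
		}
		return keyOrder;
	}

	// Mirrors the flag handling: an absent flag is treated as false.
	static boolean flag(Map<String, Object> indexDocument, String name) {
		return indexDocument.containsKey(name) ? (Boolean) indexDocument.get(name) : false;
	}

	public static void main(String[] args) {
		Map<String, Integer> keys = new LinkedHashMap<String, Integer>();
		keys.put("lastName", 1);
		keys.put("age", -1);
		System.out.println(toKeyOrder(keys)); // prints {lastName=ASCENDING, age=DESCENDING}

		Map<String, Object> ix = new LinkedHashMap<String, Object>();
		ix.put("unique", Boolean.TRUE);
		System.out.println(flag(ix, "unique") + " " + flag(ix, "sparse")); // prints true false
	}
}
```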


@@ -0,0 +1,64 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
public class FindAndModifyOptions {
boolean returnNew;
boolean upsert;
boolean remove;
/**
* Static factory method to create a FindAndModifyOptions instance
*
* @return a new instance
*/
public static FindAndModifyOptions options() {
return new FindAndModifyOptions();
}
public FindAndModifyOptions returnNew(boolean returnNew) {
this.returnNew = returnNew;
return this;
}
public FindAndModifyOptions upsert(boolean upsert) {
this.upsert = upsert;
return this;
}
public FindAndModifyOptions remove(boolean remove) {
this.remove = remove;
return this;
}
public boolean isReturnNew() {
return returnNew;
}
public boolean isUpsert() {
return upsert;
}
public boolean isRemove() {
return remove;
}
}
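`FindAndModifyOptions` is a small fluent builder: the static `options()` factory plus chained setters let callers configure a `findAndModify` call in one expression. A trimmed, self-contained copy to show the usage (the real class lives in `org.springframework.data.mongodb.core`):

```java
public class FindAndModifyOptionsDemo {

	// Trimmed copy of FindAndModifyOptions, inlined so the example compiles
	// without spring-data-mongodb on the classpath.
	static class FindAndModifyOptions {
		boolean returnNew;
		boolean upsert;
		boolean remove;

		static FindAndModifyOptions options() {
			return new FindAndModifyOptions();
		}

		FindAndModifyOptions returnNew(boolean returnNew) {
			this.returnNew = returnNew;
			return this;
		}

		FindAndModifyOptions upsert(boolean upsert) {
			this.upsert = upsert;
			return this;
		}
	}

	public static void main(String[] args) {
		// Reads as a single expression at the call site, e.g. as the options
		// argument of findAndModify(query, update, options, entityClass).
		FindAndModifyOptions options = FindAndModifyOptions.options().returnNew(true).upsert(true);
		System.out.println(options.returnNew + " " + options.upsert + " " + options.remove); // prints true true false
	}
}
```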


@@ -0,0 +1,62 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface IndexOperations {
/**
* Ensure that an index for the provided {@link IndexDefinition} exists for the collection indicated by the entity
* class. If not it will be created.
*
* @param indexDefinition must not be {@literal null}.
*/
void ensureIndex(IndexDefinition indexDefinition);
/**
* Drops an index from this collection.
*
* @param name name of index to drop
*/
void dropIndex(String name);
/**
* Drops all indices from this collection.
*/
void dropAllIndexes();
/**
* Clears all indices that have not yet been applied to this collection.
*/
void resetIndexCache();
/**
* Returns the index information on the collection.
*
* @return index information on the collection
*/
List<IndexInfo> getIndexInfo();
}


@@ -0,0 +1,96 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Represents an action taken against the collection. Used by {@link WriteConcernResolver} to determine a custom
* {@link WriteConcern} based on this information.
* <p>
* The collectionName and defaultWriteConcern properties are always non-null. The entityClass is null only for
* {@link MongoActionOperation#INSERT_LIST}. INSERT and SAVE have a null query, REMOVE has a null document, and
* INSERT_LIST has a null entityClass, document, and query.
*
* @author Mark Pollack
*
*/
public class MongoAction {
private String collectionName;
private WriteConcern defaultWriteConcern;
private Class<?> entityClass;
private MongoActionOperation mongoActionOperation;
private DBObject query;
private DBObject document;
/**
* Create an instance of a MongoAction
* @param defaultWriteConcern the default write concern
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name
* @param entityClass the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityClass, DBObject document, DBObject query) {
super();
this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation;
this.collectionName = collectionName;
this.entityClass = entityClass;
this.query = query;
this.document = document;
}
public String getCollectionName() {
return collectionName;
}
public WriteConcern getDefaultWriteConcern() {
return defaultWriteConcern;
}
public Class<?> getEntityClass() {
return entityClass;
}
public MongoActionOperation getMongoActionOperation() {
return mongoActionOperation;
}
public DBObject getQuery() {
return query;
}
public DBObject getDocument() {
return document;
}
}


@@ -13,21 +13,21 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.TypeMapper;
package org.springframework.data.mongodb.core;
/**
* Interfaces for components being able to provide a {@link TypeMapper}.
* Enumeration for operations on a collection. Used with {@link MongoAction} to help determine the
* WriteConcern to use for a given mutating operation
*
* @author Oliver Gierke
* @author Mark Pollack
* @see MongoAction
*
*/
public interface TypeKeyAware {
/**
* Returns the {@link TypeMapper}.
*
* @return the {@link TypeMapper} or {@literal null} if none available.
*/
boolean isTypeKey(String key);
public enum MongoActionOperation {
REMOVE,
UPDATE,
INSERT,
INSERT_LIST,
SAVE
}
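`MongoActionOperation` exists so a `WriteConcernResolver` can pick a `WriteConcern` per operation type. The following standalone sketch shows that idea; the enum is copied locally so the example compiles without the library, plain strings stand in for driver `WriteConcern` values, and the policy itself is hypothetical:

```java
public class WriteConcernPolicyDemo {

	// Local copy of MongoActionOperation so the sketch compiles standalone.
	enum MongoActionOperation { REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE }

	// Hypothetical policy: a WriteConcernResolver-style switch on the operation.
	static String resolve(MongoActionOperation operation) {
		switch (operation) {
			case REMOVE:
			case UPDATE:
				return "SAFE";   // mutations of existing data wait for acknowledgment
			case INSERT_LIST:
				return "NORMAL"; // batch inserts accept a weaker guarantee
			default:
				return "SAFE";
		}
	}

	public static void main(String[] args) {
		System.out.println(resolve(MongoActionOperation.REMOVE));      // prints SAFE
		System.out.println(resolve(MongoActionOperation.INSERT_LIST)); // prints NORMAL
	}
}
```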


@@ -19,21 +19,24 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import com.mongodb.CommandResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteResult;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
@@ -48,6 +51,7 @@ public interface MongoOperations {
/**
* The collection name used for the specified class by this template.
*
* @param entityClass must not be {@literal null}.
* @return
*/
String getCollectionName(Class<?> entityClass);
@@ -68,7 +72,7 @@ public interface MongoOperations {
* @param command a MongoDB command
*/
CommandResult executeCommand(DBObject command);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
@@ -77,7 +81,7 @@ public interface MongoOperations {
* @param options query options to use
*/
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
@@ -87,18 +91,6 @@ public interface MongoOperations {
* @param dch the handler that will extract results, one document at a time
*/
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler using the
* provided CursorPreparer.
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
* skips and so on).
*/
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer);
/**
* Executes a {@link DbCallback} translating any exceptions as necessary.
@@ -237,6 +229,20 @@ public interface MongoOperations {
*/
void dropCollection(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
* @return index operations on the named collection associated with the given entity class
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -265,11 +271,39 @@ public interface MongoOperations {
* @return the converted collection
*/
<T> List<T> findAll(Class<T> entityClass, String collectionName);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes into account the initial document structure as well as any finalize functions.
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parameterized type of the returned list
* @return The results of the group operation
*/
<T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes into account the initial document structure
* as well as any finalize functions.
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parameterized type of the returned list
* @return The results of the group operation
*/
<T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -277,11 +311,12 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
/**
* Execute a map-reduce operation that takes additional map-reduce options.
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -289,12 +324,13 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction, MapReduceOptions mapReduceOptions, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
MapReduceOptions mapReduceOptions, Class<T> entityClass);
/**
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of INLINE
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of
* INLINE
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -303,10 +339,12 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
/**
* Execute a map-reduce operation that takes a query and additional map-reduce options
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -315,7 +353,8 @@ public interface MongoOperations {
* @param entityClass The parameterized type of the returned list
* @return The results of the map reduce operation
*/
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction, MapReduceOptions mapReduceOptions, Class<T> entityClass );
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
MapReduceOptions mapReduceOptions, Class<T> entityClass);
/**
* Returns {@link GeoResult} for all entities matching the given {@link NearQuery}. Will consider entity mapping
@@ -338,23 +377,6 @@ public interface MongoOperations {
*/
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass, String collectionName);
/**
* Ensure that an index for the provided {@link IndexDefinition} exists for the collection indicated by the entity
* class. If not it will be created.
*
* @param indexDefinition
* @param entityClass class that determines the collection to use
*/
void ensureIndex(IndexDefinition indexDefinition, Class<?> entityClass);
/**
* Ensure that an index for the provided {@link IndexDefinition} exists. If not it will be created.
*
* @param index
* @param collectionName
*/
void ensureIndex(IndexDefinition indexDefinition, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type.
@@ -425,26 +447,6 @@ public interface MongoOperations {
*/
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of SimpleMongoConverter will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
* @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
* skips and so on).
* @param collectionName name of the collection to retrieve the objects from
*
* @return the List of converted objects.
*/
<T> List<T> find(Query query, Class<T> entityClass, CursorPreparer preparer, String collectionName);
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
@@ -468,6 +470,15 @@ public interface MongoOperations {
*/
<T> T findById(Object id, Class<T> entityClass, String collectionName);
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
@@ -512,7 +523,7 @@ public interface MongoOperations {
* @return
*/
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
*
@@ -521,7 +532,7 @@ public interface MongoOperations {
* @return
*/
long count(Query query, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
@@ -612,6 +623,29 @@ public interface MongoOperations {
*/
void save(Object objectToSave, String collectionName);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult upsert(Query query, Update update, Class<?> entityClass);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult upsert(Query query, Update update, String collectionName);
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
@@ -620,6 +654,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateFirst(Query query, Update update, Class<?> entityClass);
@@ -631,6 +666,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateFirst(Query query, Update update, String collectionName);
@@ -642,6 +678,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateMulti(Query query, Update update, Class<?> entityClass);
@@ -653,6 +690,7 @@ public interface MongoOperations {
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write.
*/
WriteResult updateMulti(Query query, Update update, String collectionName);
@@ -662,7 +700,7 @@ public interface MongoOperations {
* @param object
*/
void remove(Object object);
/**
* Removes the given object from the given collection.
*


@@ -96,6 +96,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
*
* Defaults to false
*/
@SuppressWarnings("deprecation")
private boolean slaveOk = MONGO_OPTIONS.slaveOk;
/**
@@ -204,6 +205,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
this.slaveOk = slaveOk;
}
@SuppressWarnings("deprecation")
public void afterPropertiesSet() {
MONGO_OPTIONS.connectionsPerHost = connectionsPerHost;
MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;


@@ -2,6 +2,8 @@
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
import org.springframework.data.convert.EntityReader;
import org.springframework.util.ResourceUtils;
@@ -19,6 +21,7 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Collection;
@@ -30,20 +33,6 @@ import java.util.Map;
import java.util.Scanner;
import java.util.Set;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
@@ -72,27 +61,47 @@ import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.jca.cci.core.ConnectionCallback;
import org.springframework.util.Assert;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
/**
* Primary implementation of {@link MongoOperations}.
*
@@ -122,16 +131,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
private WriteConcern writeConcern = null;
private WriteConcernResolver writeConcernResolver = new DefaultWriteConcernResolver();
/*
* WriteResultChecking to be used for write operations if it has been
* specified. Otherwise we should not do any checking.
*/
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
/*
* Flag used to indicate use of slaveOk() for any operations on collections.
/**
* Set the ReadPreference when operating on a collection. See {@link #prepareCollection(DBCollection)}
*/
private boolean slaveOk = false;
private ReadPreference readPreference = null;
private final MongoConverter mongoConverter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
@@ -181,11 +192,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param mongoConverter
*/
public MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter) {
Assert.notNull(mongoDbFactory);
this.mongoDbFactory = mongoDbFactory;
this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
this.mapper = new QueryMapper(this.mongoConverter.getConversionService());
this.mapper = new QueryMapper(this.mongoConverter);
// We always have a mapping context in the converter, whether it's a simple one or not
mappingContext = this.mongoConverter.getMappingContext();
@@ -220,12 +232,22 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* TODO: document properly
* Configures the {@link WriteConcernResolver} to be used with the template.
*
* @param slaveOk
* @param writeConcernResolver
*/
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver;
}
/**
* Used by {@link #prepareCollection(DBCollection)} to set the {@link ReadPreference} before any operations are
* performed.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
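Taken together, the two setters above replace the old boolean slaveOk flag with the driver's ReadPreference and make the WriteConcern resolvable per operation. A configuration sketch only: the "audit" collection name and the FSYNC_SAFE choice are illustrative, the exact ReadPreference constant depends on the driver version, and MongoAction is assumed to expose the collection name:

```java
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;

import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteConcernResolver;

public class TemplateTuning {

	public static void configure(MongoTemplate template) {
		// Replaces the removed setSlaveOk(true): route reads to secondaries.
		template.setReadPreference(ReadPreference.SECONDARY);

		// Pick a WriteConcern per operation by inspecting the MongoAction.
		template.setWriteConcernResolver(new WriteConcernResolver() {
			public WriteConcern resolve(MongoAction action) {
				if ("audit".equals(action.getCollectionName())) {
					return WriteConcern.FSYNC_SAFE; // durable writes for audit data
				}
				return action.getDefaultWriteConcern();
			}
		});
	}
}
```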
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
@@ -293,16 +315,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
executeQuery(query, collectionName, dch, null);
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer) {
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a
* {@link DocumentCallbackHandler} using the provided CursorPreparer.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification, must not be {@literal null}.
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param preparer allows for customization of the {@link DBCursor} used when iterating over the result set (apply
* limits, skips, and so on).
*/
protected void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer) {
Assert.notNull(query);
DBObject queryObject = query.getQueryObject();
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + queryObject + " fields: " + fieldsObject + " in collection: "
+ collectionName);
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
}
@@ -399,24 +437,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
});
}
// Indexing methods
public void ensureIndex(IndexDefinition indexDefinition, Class<?> entityClass) {
ensureIndex(indexDefinition, determineCollectionName(entityClass));
public IndexOperations indexOps(String collectionName) {
return new DefaultIndexOperations(this, collectionName);
}
public void ensureIndex(final IndexDefinition indexDefinition, String collectionName) {
execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
}
return null;
}
});
public IndexOperations indexOps(Class<?> entityClass) {
return new DefaultIndexOperations(this, determineCollectionName(entityClass));
}
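The former ensureIndex(…) methods now live behind the IndexOperations facade obtained via indexOps(…). A usage sketch, assuming the fluent Index builder from the index package; Person is a hypothetical mapped entity:

```java
import org.springframework.data.mongodb.core.IndexOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.query.Order;

IndexOperations indexOps = template.indexOps(Person.class);
indexOps.ensureIndex(new Index().on("lastName", Order.ASCENDING));
```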
// Find methods that take a Query to express the query and that return a single object.
@@ -426,7 +452,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> T findOne(Query query, Class<T> entityClass, String collectionName) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass);
if (query.getSortObject() == null) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass);
} else {
query.limit(1);
List<T> results = find(query, entityClass, collectionName);
return (results.isEmpty() ? null : results.get(0));
}
}
// Find methods that take a Query to express the query and that return a List of objects.
@@ -436,36 +468,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> List<T> find(final Query query, Class<T> entityClass, String collectionName) {
CursorPreparer cursorPreparer = null;
if (query.getSkip() > 0 || query.getLimit() > 0 || query.getSortObject() != null) {
cursorPreparer = new CursorPreparer() {
public DBCursor prepare(DBCursor cursor) {
DBCursor cursorToUse = cursor;
try {
if (query.getSkip() > 0) {
cursorToUse = cursorToUse.skip(query.getSkip());
}
if (query.getLimit() > 0) {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
return cursorToUse;
}
};
}
CursorPreparer cursorPreparer = query == null ? null : new QueryCursorPreparer(query);
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass, cursorPreparer);
}
public <T> List<T> find(Query query, Class<T> entityClass, CursorPreparer preparer, String collectionName) {
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass, preparer);
}
public <T> T findById(Object id, Class<T> entityClass) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityClass);
return findById(id, entityClass, persistentEntity.getCollection());
@@ -502,6 +508,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return new GeoResults<T>(result, new Distance(averageDistance, near.getMetric()));
}
public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, determineCollectionName(entityClass));
}
public <T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, collectionName);
}
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass) {
return findAndModify(query, update, options, entityClass, determineCollectionName(entityClass));
}
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName) {
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass, update, options);
}
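The four overloads above funnel into doFindAndModify(…); the options object controls returnNew, upsert and remove. A sketch of an atomic counter increment, where Counter is a hypothetical entity with name and value fields:

```java
import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

Query query = new Query(Criteria.where("name").is("pageViews"));
Update update = new Update().inc("value", 1);

// Returns the post-update document; upsert(true) creates it on first use.
Counter counter = template.findAndModify(query, update,
		new FindAndModifyOptions().returnNew(true).upsert(true), Counter.class);
```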
// Find methods that take a Query to express the query and that return a single object that is also removed from the
// collection in the database.
@@ -522,13 +546,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public long count(final Query query, String collectionName) {
return count(query, null, collectionName);
}
private long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : mapper.getMappedObject(query.getQueryObject(),
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
return execute(collectionName, new CollectionCallback<Long>() {
public Long doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.count(dbObject);
@@ -569,8 +593,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param collection
*/
protected void prepareCollection(DBCollection collection) {
if (this.slaveOk) {
collection.slaveOk();
if (this.readPreference != null) {
collection.setReadPreference(readPreference);
}
}
@@ -581,18 +605,21 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param writeConcern any WriteConcern already configured or null
* @return The prepared WriteConcern or null
*/
protected WriteConcern prepareWriteConcern(WriteConcern writeConcern) {
return writeConcern;
protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
return writeConcernResolver.resolve(mongoAction);
}
protected <T> void doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = insertDBObject(collectionName, dbDoc);
Object id = insertDBObject(collectionName, dbDoc, objectToSave.getClass());
populateIdIfNecessary(objectToSave, id);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
@@ -670,25 +697,30 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doSave(String collectionName, T objectToSave, MongoWriter<T> writer) {
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = saveDBObject(collectionName, dbDoc);
Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());
populateIdIfNecessary(objectToSave, id);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
}
protected Object insertDBObject(String collectionName, final DBObject dbDoc) {
protected Object insertDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("insert DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insert(dbDoc);
} else {
@@ -699,7 +731,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
});
}
protected List<ObjectId> insertDBObjectList(String collectionName, final List<DBObject> dbDocList) {
protected List<ObjectId> insertDBObjectList(final String collectionName, final List<DBObject> dbDocList) {
if (dbDocList.isEmpty()) {
return Collections.emptyList();
}
@@ -709,7 +741,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insert(dbDocList);
} else {
@@ -732,13 +766,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return ids;
}
protected Object saveDBObject(String collectionName, final DBObject dbDoc) {
protected Object saveDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("save DBObject containing fields: " + dbDoc.keySet());
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.save(dbDoc);
} else {
@@ -749,6 +785,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
});
}
public WriteResult upsert(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, true, false);
}
public WriteResult upsert(Query query, Update update, String collectionName) {
return doUpdate(collectionName, query, update, null, true, false);
}
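A quick sketch of the new upsert(…) entry points; Person and its properties are illustrative:

```java
// Updates the first matching document, or inserts a new one if none matches.
template.upsert(new Query(Criteria.where("ssn").is("1234")),
		new Update().set("firstName", "Joe"), Person.class);
```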
public WriteResult updateFirst(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, false);
}
@@ -787,13 +831,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
WriteResult wr;
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
entityClass, updateObj, queryObj);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
if (multi) {
wr = collection.updateMulti(queryObj, updateObj);
} else {
wr = collection.update(queryObj, updateObj);
}
wr = collection.update(queryObj, updateObj, upsert, multi);
} else {
wr = collection.update(queryObj, updateObj, upsert, multi, writeConcernToUse);
}
@@ -811,18 +853,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
remove(getIdQueryFor(object), object.getClass());
}
public void remove(Object object, String collection) {
Assert.hasText(collection);
if (object == null) {
return;
}
remove(getIdQueryFor(object), collection);
}
/**
* Returns a {@link Query} for the given entity by its id.
*
@@ -830,9 +872,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @return
*/
private Query getIdQueryFor(Object object) {
Assert.notNull(object);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProp = entity.getIdProperty();
@@ -843,15 +885,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
ConversionService service = mongoConverter.getConversionService();
Object idProperty = null;
try {
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
return new Query(where(idProp.getFieldName()).is(idProperty));
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
return new Query(where(idProp.getFieldName()).is(idProperty));
}
private void assertUpdateableIdIfNotSet(Object entity) {
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entity.getClass());
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
if (idProperty == null) {
return;
}
ConversionService service = mongoConverter.getConversionService();
Object idValue = BeanWrapper.create(entity, service).getProperty(idProperty, Object.class, true);
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
throw new InvalidDataAccessApiUsageException(String.format(
"Cannot autogenerate id of type %s for entity of type %s!", idProperty.getType().getName(), entity.getClass()
.getName()));
}
}
@@ -860,7 +913,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
doRemove(determineCollectionName(entityClass), query, entityClass);
}
protected <T> void doRemove(String collectionName, final Query query, Class<T> entityClass) {
protected <T> void doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null");
}
@@ -870,7 +923,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject dboq = mapper.getMappedObject(queryObject, entity);
WriteResult wr = null;
WriteConcern writeConcernToUse = prepareWriteConcern(writeConcern);
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
entityClass, null, queryObject);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("remove using query: " + queryObject + " in collection: " + collection.getName());
}
@@ -959,21 +1014,98 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MapReduceResults<T> mapReduceResult = new MapReduceResults<T>(mappedResults, commandResult);
return mapReduceResult;
}
public <T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass) {
return group(null, inputCollectionName, groupBy, entityClass);
}
public <T> GroupByResults<T> group(Criteria criteria, String inputCollectionName, GroupBy groupBy,
Class<T> entityClass) {
DBObject dbo = groupBy.getGroupByObject();
dbo.put("ns", inputCollectionName);
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", criteria.getCriteriaObject());
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
if (dbo.containsField("initial")) {
Object initialObj = dbo.get("initial");
if (initialObj instanceof String) {
String initialAsString = replaceWithResourceIfNecessary((String) initialObj);
dbo.put("initial", JSON.parse(initialAsString));
}
}
if (dbo.containsField("$reduce")) {
dbo.put("$reduce", replaceWithResourceIfNecessary(dbo.get("$reduce").toString()));
}
if (dbo.containsField("$keyf")) {
dbo.put("$keyf", replaceWithResourceIfNecessary(dbo.get("$keyf").toString()));
}
if (dbo.containsField("finalize")) {
dbo.put("finalize", replaceWithResourceIfNecessary(dbo.get("finalize").toString()));
}
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing Group with DBObject [" + commandObject.toString() + "]");
}
CommandResult commandResult = null;
try {
commandResult = executeCommand(commandObject, getDb().getOptions());
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
}
@SuppressWarnings("unchecked")
Iterable<DBObject> resultSet = (Iterable<DBObject>) commandResult.get("retval");
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
for (DBObject dbObject : resultSet) {
mappedResults.add(callback.doWith(dbObject));
}
GroupByResults<T> groupByResult = new GroupByResults<T>(mappedResults, commandResult);
return groupByResult;
}
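A sketch of the new group support, counting documents per distinct x value. XObject is a hypothetical result type whose fields match the keys emitted by the reduce function, and the collection name is illustrative:

```java
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;

GroupByResults<XObject> results = template.group("group_test_collection",
		GroupBy.key("x")
				.initialDocument("{ count: 0 }")
				.reduceFunction("function(doc, prev) { prev.count += 1 }"),
		XObject.class);
```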
protected String replaceWithResourceIfNecessary(String function) {
String func = function;
if (this.resourceLoader != null) {
if (this.resourceLoader != null && ResourceUtils.isUrl(function)) {
Resource functionResource = resourceLoader.getResource(func);
if (!functionResource.exists()) {
throw new InvalidDataAccessApiUsageException(String.format("Resource %s not found!", function));
}
try {
Resource functionResource = resourceLoader.getResource(func);
if (functionResource.exists()) {
return new Scanner(functionResource.getInputStream()).useDelimiter("\\A").next();
}
} catch (Exception e) {
// ignore - could be embedded JavaScript text
return new Scanner(functionResource.getInputStream()).useDelimiter("\\A").next();
} catch (IOException e) {
throw new InvalidDataAccessApiUsageException(String.format("Cannot read map-reduce file %s!", function), e);
}
}
return func;
}
@@ -1178,6 +1310,31 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
protected <T> T doFindAndModify(String collectionName, DBObject query, DBObject fields, DBObject sort,
Class<T> entityClass, Update update, FindAndModifyOptions options) {
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (options == null) {
options = new FindAndModifyOptions();
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject updateObj = update.getUpdateObject();
for (String key : updateObj.keySet()) {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " and update: " + updateObj + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mapper.getMappedObject(query, entity), fields, sort,
updateObj, options), new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
/**
* Populates the id property of the saved object, if it's not set already.
*
@@ -1316,7 +1473,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return null;
}
private String determineCollectionName(Class<?> entityClass) {
String determineCollectionName(Class<?> entityClass) {
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException(
@@ -1334,30 +1491,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Checks and handles any errors.
* <p/>
* TODO: current implementation logs errors - will be configurable to log warning, errors or throw exception in later
* versions
* Current implementation logs errors. Future version may make this configurable to log warning, errors or throw
* exception.
*/
private void handleAnyWriteResultErrors(WriteResult wr, DBObject query, String operation) {
protected void handleAnyWriteResultErrors(WriteResult wr, DBObject query, String operation) {
if (WriteResultChecking.NONE == this.writeResultChecking) {
return;
}
String error = wr.getError();
int n = wr.getN();
if (error != null) {
String message = "Execution of '" + operation + (query == null ? "" : "' using '" + query.toString() + "' query")
+ " failed: " + error;
String message = String.format("Execution of %s%s failed: %s", operation, query == null ? "" : "' using '"
+ query.toString() + "' query", error);
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.error(message);
}
} else if (n == 0) {
String message = "Execution of '" + operation + (query == null ? "" : "' using '" + query.toString() + "' query")
+ " did not succeed: 0 documents updated";
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
} else {
LOGGER.warn(message);
return;
}
}
}
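Since handleAnyWriteResultErrors(…) is protected now (DATAMONGO-318), a subclass can restore the former strict behavior for writes that affect no documents. A sketch using only API visible in this diff:

```java
import com.mongodb.DBObject;
import com.mongodb.WriteResult;

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;

public class StrictMongoTemplate extends MongoTemplate {

	public StrictMongoTemplate(MongoDbFactory mongoDbFactory) {
		super(mongoDbFactory);
	}

	@Override
	protected void handleAnyWriteResultErrors(WriteResult wr, DBObject query, String operation) {
		super.handleAnyWriteResultErrors(wr, query, operation);
		// Re-introduce the exception for writes that matched nothing.
		if (wr.getError() == null && wr.getN() == 0) {
			throw new IllegalStateException("Execution of " + operation + " affected 0 documents");
		}
	}
}
```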
@@ -1469,6 +1623,29 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private static class FindAndModifyCallback implements CollectionCallback<DBObject> {
private final DBObject query;
private final DBObject fields;
private final DBObject sort;
private final DBObject update;
private final FindAndModifyOptions options;
public FindAndModifyCallback(DBObject query, DBObject fields, DBObject sort, DBObject update,
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.options = options;
}
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.findAndModify(query, fields, sort, options.isRemove(), update, options.isReturnNew(),
options.isUpsert());
}
}
/**
* Simple internal callback to allow operations on a {@link DBObject}.
*
@@ -1510,6 +1687,60 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private class DefaultWriteConcernResolver implements WriteConcernResolver {
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
public QueryCursorPreparer(Query query) {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.CursorPreparer#prepare(com.mongodb.DBCursor)
*/
public DBCursor prepare(DBCursor cursor) {
if (query == null) {
return cursor;
}
if (query.getSkip() <= 0 && query.getLimit() <= 0 && query.getSortObject() == null
&& !StringUtils.hasText(query.getHint())) {
return cursor;
}
DBCursor cursorToUse = cursor;
try {
if (query.getSkip() > 0) {
cursorToUse = cursorToUse.skip(query.getSkip());
}
if (query.getLimit() > 0) {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
}
return cursorToUse;
}
}
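QueryCursorPreparer applies all four cursor options when present. A query exercising them; the index name age_1 is illustrative, and withHint(…) is assumed to be the setter behind the getHint() used above:

```java
Query query = new Query(Criteria.where("age").gte(21)).skip(40).limit(20);
query.sort().on("lastName", Order.ASCENDING);
query.withHint("age_1");

List<Person> page = template.find(query, Person.class);
```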
/**
* {@link DbObjectCallback} that assumes a {@link GeoResult} to be created, delegates actual content unmarshalling to
* a delegate and creates a {@link GeoResult} from the result.
@@ -1543,4 +1774,5 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return new GeoResult<T>(doWith, new Distance(distance, metric));
}
}
}


@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
@@ -25,6 +24,7 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
@@ -40,15 +40,17 @@ import com.mongodb.DBObject;
public class QueryMapper {
private final ConversionService conversionService;
private final MongoConverter converter;
/**
* Creates a new {@link QueryMapper} with the given {@link ConversionService}.
* Creates a new {@link QueryMapper} with the given {@link MongoConverter}.
*
* @param conversionService must not be {@literal null}.
* @param converter must not be {@literal null}.
*/
public QueryMapper(ConversionService conversionService) {
Assert.notNull(conversionService);
this.conversionService = conversionService;
public QueryMapper(MongoConverter converter) {
Assert.notNull(converter);
this.conversionService = converter.getConversionService();
this.converter = converter;
}
/**
@@ -105,7 +107,7 @@ public class QueryMapper {
value = convertId(value);
}
newDbo.put(newKey, value);
newDbo.put(newKey, converter.convertToMongoType(value));
}
return newDbo;
@@ -117,22 +119,14 @@ public class QueryMapper {
* @param id the id value to convert
* @return the converted id value
*/
@SuppressWarnings("unchecked")
public Object convertId(Object id) {
for (Class<?> type : Arrays.asList(ObjectId.class, String.class)) {
if (id.getClass().isAssignableFrom(type)) {
return id;
}
try {
return conversionService.convert(id, type);
} catch (ConversionException e) {
// Ignore
}
try {
return conversionService.convert(id, ObjectId.class);
} catch (ConversionException e) {
// Ignore
}
return id;
return converter.convertToMongoType(id);
}
}


@@ -82,9 +82,8 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws UnknownHostException
* @see MongoURI
*/
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), String.valueOf(uri.getPassword())));
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())));
}
/**
@@ -95,6 +94,10 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
public WriteConcern getWriteConcern() {
return writeConcern;
}
/*
* (non-Javadoc)
@@ -127,4 +130,12 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
public void destroy() throws Exception {
mongo.close();
}
public static String parseChars(char[] chars) {
if (chars == null) {
return null;
} else {
return String.valueOf(chars);
}
}
}
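The parseChars(…) helper above exists because String.valueOf(char[]) throws a NullPointerException for a null array (unlike the Object overload, which renders "null"), so a MongoURI without a password needs the guard. A minimal stand-alone sketch (class name is illustrative):

```java
// Stand-alone sketch of the null-safe conversion: String.valueOf(char[])
// throws a NullPointerException for null input, so the helper guards first.
class ParseCharsDemo {

	static String parseChars(char[] chars) {
		return chars == null ? null : String.valueOf(chars);
	}
}
```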


@@ -0,0 +1,37 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* A strategy interface to determine the {@link WriteConcern} to use for a given {@link MongoAction}.
*
* Implementations should return the default {@link WriteConcern} (a property of {@link MongoAction}) if no other determination can be made.
*
* @author Mark Pollack
*
*/
public interface WriteConcernResolver {
/**
* Resolves the {@link WriteConcern} for the given {@link MongoAction}.
* @param action describes the context of the MongoDB action and carries the default {@link WriteConcern} to fall back to.
* @return the {@link WriteConcern} to use for the given {@link MongoAction}, may be {@literal null}
*/
WriteConcern resolve(MongoAction action);
}
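An implementation typically branches on details of the action and falls back to the action's default. The sketch below is self-contained: Concern, Action, and Resolver are illustrative stand-ins for com.mongodb.WriteConcern, MongoAction, and WriteConcernResolver, not the real types.

```java
// Self-contained sketch of the strategy; the nested types are stand-ins
// for com.mongodb.WriteConcern, MongoAction and WriteConcernResolver.
class ResolverDemo {

	enum Concern { NORMAL, SAFE }

	static class Action {
		final String collectionName;
		final Concern defaultConcern;
		Action(String collectionName, Concern defaultConcern) {
			this.collectionName = collectionName;
			this.defaultConcern = defaultConcern;
		}
	}

	interface Resolver {
		Concern resolve(Action action);
	}

	// Escalate durability for a (hypothetical) "audit" collection only;
	// fall back to the action's default concern otherwise.
	static final Resolver AUDIT_AWARE = new Resolver() {
		public Concern resolve(Action action) {
			return "audit".equals(action.collectionName) ? Concern.SAFE : action.defaultConcern;
		}
	};
}
```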


@@ -23,10 +23,8 @@ import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObjectIdConverter;
/**
@@ -46,6 +44,7 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
@@ -80,12 +79,6 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, String.class)) {
conversionService.addConverter(BigIntegerToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, BigInteger.class)) {
conversionService.addConverter(StringToBigIntegerConverter.INSTANCE);
}
conversions.registerConvertersIn(conversionService);
}


@@ -17,13 +17,11 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.core.GenericTypeResolver;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
@@ -33,12 +31,12 @@ import org.springframework.core.convert.converter.GenericConverter.ConvertiblePa
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Value object to capture custom conversions. Essentially a {@link List} of converters plus some additional logic
* around them. The converters build up two sets of types: the Mongo basic types {@see #MONGO_TYPES}
@@ -50,10 +48,6 @@ import com.mongodb.DBObject;
*/
public class CustomConversions {
@SuppressWarnings({ "unchecked" })
private static final List<Class<?>> MONGO_TYPES = Arrays.asList(Number.class, Date.class, ObjectId.class,
String.class, DBObject.class);
private final Set<ConvertiblePair> readingPairs;
private final Set<ConvertiblePair> writingPairs;
private final Set<Class<?>> customSimpleTypes;
@@ -85,6 +79,8 @@ public class CustomConversions {
this.converters.add(CustomToStringConverter.INSTANCE);
this.converters.add(BigDecimalToStringConverter.INSTANCE);
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.addAll(converters);
for (Object c : this.converters) {
@@ -279,8 +275,8 @@ public class CustomConversions {
* @param type
* @return
*/
private static boolean isMongoBasicType(Class<?> type) {
return MONGO_TYPES.contains(type);
private boolean isMongoBasicType(Class<?> type) {
return MongoSimpleTypes.HOLDER.isSimpleType(type);
}
private enum CustomToStringConverter implements GenericConverter {


@@ -17,20 +17,16 @@ package org.springframework.data.mongodb.core.convert;
import java.lang.reflect.Array;
import java.lang.reflect.InvocationTargetException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -74,16 +70,7 @@ import com.mongodb.DBRef;
* @author Oliver Gierke
* @author Jon Brisbin
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, TypeKeyAware {
@SuppressWarnings("rawtypes")
private static final TypeInformation<Map> MAP_TYPE_INFORMATION = ClassTypeInformation.from(Map.class);
@SuppressWarnings("rawtypes")
private static final TypeInformation<Collection> COLLECTION_TYPE_INFORMATION = ClassTypeInformation
.from(Collection.class);
private static final List<Class<?>> VALID_ID_TYPES = Arrays.asList(new Class<?>[] { ObjectId.class, String.class,
BigInteger.class, byte[].class });
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware {
protected static final Log log = LogFactory.getLog(MappingMongoConverter.class);
@@ -101,6 +88,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
@SuppressWarnings("deprecation")
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
@@ -112,7 +100,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext);
this.idMapper = new QueryMapper(conversionService);
this.idMapper = new QueryMapper(this);
}
/**
@@ -128,14 +116,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
mappingContext) : typeMapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.TypeKeyAware#isTypeKey(java.lang.String)
*/
public boolean isTypeKey(String key) {
return typeMapper.isTypeKey(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.convert.EntityConverter#getMappingContext()
@@ -184,6 +164,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.hasCustomReadTarget(dbo.getClass(), rawType)) {
return conversionService.convert(dbo, rawType);
}
if (DBObject.class.isAssignableFrom(rawType)) {
return (S) dbo;
}
if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo);
@@ -266,13 +250,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
Object obj = getValueInternal(prop, dbo, spelCtx, prop.getSpelExpression());
try {
wrapper.setProperty(prop, obj, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
wrapper.setProperty(prop, obj, useFieldAccessOnly);
}
});
@@ -338,12 +316,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (Map.class.isAssignableFrom(obj.getClass())) {
writeMapInternal((Map<Object, Object>) obj, dbo, MAP_TYPE_INFORMATION);
writeMapInternal((Map<Object, Object>) obj, dbo, ClassTypeInformation.MAP);
return;
}
if (Collection.class.isAssignableFrom(obj.getClass())) {
writeCollectionInternal((Collection<?>) obj, COLLECTION_TYPE_INFORMATION, (BasicDBList) dbo);
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) dbo);
return;
}
@@ -367,46 +345,24 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
// Write the ID
final MongoPersistentProperty idProperty = entity.getIdProperty();
if (!dbo.containsField("_id") && null != idProperty) {
Object idObj = null;
Class<?>[] targetClasses = new Class<?>[] { ObjectId.class, String.class, Object.class };
for (Class<?> targetClass : targetClasses) {
try {
idObj = wrapper.getProperty(idProperty, targetClass, useFieldAccessOnly);
if (null != idObj) {
break;
}
} catch (ConversionException ignored) {
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
}
if (null != idObj) {
dbo.put("_id", idObj);
} else {
if (!VALID_ID_TYPES.contains(idProperty.getType())) {
throw new MappingException("Invalid data type " + idProperty.getType().getName()
+ " for Id property. Should be one of " + VALID_ID_TYPES);
}
try {
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
dbo.put("_id", idMapper.convertId(id));
} catch (ConversionException ignored) {
}
}
// Write the properties
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty prop) {
if (prop.equals(idProperty)) {
return;
}
Object propertyObj;
try {
propertyObj = wrapper.getProperty(prop, prop.getType(), useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
Object propertyObj = wrapper.getProperty(prop, prop.getType(), useFieldAccessOnly);
if (null != propertyObj) {
if (!conversions.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(propertyObj, dbo, prop);
@@ -421,14 +377,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Class<?> type = inverseProp.getType();
Object propertyObj;
try {
propertyObj = wrapper.getProperty(inverseProp, type, useFieldAccessOnly);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
Object propertyObj = wrapper.getProperty(inverseProp, type, useFieldAccessOnly);
if (null != propertyObj) {
writePropertyInternal(propertyObj, dbo, inverseProp);
}
@@ -444,16 +393,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
String name = prop.getFieldName();
TypeInformation<?> valueType = ClassTypeInformation.from(obj.getClass());
TypeInformation<?> type = prop.getTypeInformation();
if (prop.isCollection()) {
if (valueType.isCollectionLike()) {
DBObject collectionInternal = createCollection(asCollection(obj), prop);
dbo.put(name, collectionInternal);
return;
}
TypeInformation<?> type = prop.getTypeInformation();
if (prop.isMap()) {
if (valueType.isMap()) {
BasicDBObject mapDbObj = new BasicDBObject();
writeMapInternal((Map<Object, Object>) obj, mapDbObj, type);
dbo.put(name, mapDbObj);
@@ -589,12 +538,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
String simpleKey = key.toString();
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection) {
} else if (val instanceof Collection || val.getClass().isArray()) {
dbo.put(simpleKey,
writeCollectionInternal((Collection<?>) val, propertyType.getMapValueType(), new BasicDBList()));
writeCollectionInternal(asCollection(val), propertyType.getMapValueType(), new BasicDBList()));
} else {
DBObject newDbo = new BasicDBObject();
writeInternal(val, newDbo, propertyType);
TypeInformation<?> valueTypeInfo = propertyType.isMap() ? propertyType.getMapValueType() : ClassTypeInformation.OBJECT;
writeInternal(val, newDbo, valueTypeInfo);
dbo.put(simpleKey, newDbo);
}
} else {
@@ -653,7 +603,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (customTarget != null) {
return conversionService.convert(value, customTarget);
} else {
return value.getClass().isEnum() ? ((Enum<?>) value).name() : value;
return Enum.class.isAssignableFrom(value.getClass()) ? ((Enum<?>) value).name() : value;
}
}
@@ -676,7 +626,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return conversionService.convert(value, target);
}
if (target.isEnum()) {
if (Enum.class.isAssignableFrom(target)) {
return Enum.valueOf((Class<Enum>) target, value.toString());
}
@@ -691,18 +641,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
MongoPersistentProperty idProperty = targetEntity.getIdProperty();
Object id = null;
BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(target, conversionService);
try {
id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
Object id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
if (null == id) {
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
String collection = dbref.collection();
@@ -748,7 +691,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
&& ((DBObject) sourceValue).keySet().size() == 0) {
// It's empty
return Array.newInstance(prop.getComponentType(), 0);
} else if (prop.isCollection() && sourceValue instanceof BasicDBList) {
} else if (prop.isCollectionLike() && sourceValue instanceof BasicDBList) {
return readCollectionOrArray((TypeInformation<? extends Collection<?>>) prop.getTypeInformation(),
(BasicDBList) sourceValue);
}
@@ -888,46 +831,68 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (obj instanceof Map) {
Map<Object, Object> m = new HashMap<Object, Object>();
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
m.put(entry.getKey(), convertToMongoType(entry.getValue()));
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue()));
}
return m;
return result;
}
if (obj instanceof List) {
List<?> l = (List<?>) obj;
List<Object> newList = new ArrayList<Object>();
for (Object o : l) {
newList.add(convertToMongoType(o));
}
return newList;
return maybeConvertList((List<?>) obj);
}
if (obj.getClass().isArray()) {
return maybeConvertArray((Object[]) obj);
return maybeConvertList(Arrays.asList((Object[]) obj));
}
DBObject newDbo = new BasicDBObject();
this.write(obj, newDbo);
return newDbo;
return removeTypeInfoRecursively(newDbo);
}
public Object[] maybeConvertArray(Object[] src) {
Object[] newArr = new Object[src.length];
for (int i = 0; i < src.length; i++) {
newArr[i] = convertToMongoType(src[i]);
}
return newArr;
}
public BasicDBList maybeConvertList(BasicDBList dbl) {
public BasicDBList maybeConvertList(Iterable<?> source) {
BasicDBList newDbl = new BasicDBList();
Iterator<?> iter = dbl.iterator();
while (iter.hasNext()) {
Object o = iter.next();
newDbl.add(convertToMongoType(o));
for (Object element : source) {
newDbl.add(convertToMongoType(element));
}
return newDbl;
}
/**
* Removes the type information from the conversion result.
*
* @param object
* @return
*/
private Object removeTypeInfoRecursively(Object object) {
if (!(object instanceof DBObject)) {
return object;
}
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element);
}
} else {
removeTypeInfoRecursively(value);
}
}
if (keyToRemove != null) {
dbObject.removeField(keyToRemove);
}
return dbObject;
}
}
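removeTypeInfoRecursively(…) above walks the converted document and strips the type key at every nesting level. A pure-Java sketch of the same traversal, with Map/List standing in for DBObject/BasicDBList and "_class" assumed as the default type key:

```java
import java.util.List;
import java.util.Map;

// Map/List stand in for DBObject/BasicDBList; "_class" is the assumed
// default type key written by the type mapper.
class TypeKeyDemo {

	static Object removeTypeInfoRecursively(Object value) {
		if (value instanceof List) {
			// recurse into list elements
			for (Object element : (List<?>) value) {
				removeTypeInfoRecursively(element);
			}
			return value;
		}
		if (!(value instanceof Map)) {
			return value; // plain values carry no type key
		}
		Map<?, ?> document = (Map<?, ?>) value;
		document.remove("_class");
		for (Object nested : document.values()) {
			removeTypeInfoRecursively(nested);
		}
		return value;
	}
}
```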


@@ -20,11 +20,16 @@ import org.springframework.data.convert.TypeMapper;
import com.mongodb.DBObject;
/**
* Combining interface to express Mongo specific {@link TypeMapper} implementations will be {@link TypeKeyAware} as
* well.
* Mongo-specific {@link TypeMapper} exposing that {@link DBObject}s might contain a type key.
*
* @author Oliver Gierke
*/
public interface MongoTypeMapper extends TypeMapper<DBObject>, TypeKeyAware {
public interface MongoTypeMapper extends TypeMapper<DBObject> {
/**
* Returns whether the given key is the type key.
*
* @param key the key to inspect
* @return whether the given key is the type key
*/
boolean isTypeKey(String key);
}

View File

@@ -0,0 +1,115 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.Map;
import org.springframework.data.mongodb.core.query.Order;
public class IndexInfo {
private final Map<String, Order> fieldSpec;
private String name;
private boolean unique = false;
private boolean dropDuplicates = false;
private boolean sparse = false;
public IndexInfo(Map<String, Order> fieldSpec, String name, boolean unique, boolean dropDuplicates, boolean sparse) {
super();
this.fieldSpec = fieldSpec;
this.name = name;
this.unique = unique;
this.dropDuplicates = dropDuplicates;
this.sparse = sparse;
}
public Map<String, Order> getFieldSpec() {
return fieldSpec;
}
public String getName() {
return name;
}
public boolean isUnique() {
return unique;
}
public boolean isDropDuplicates() {
return dropDuplicates;
}
public boolean isSparse() {
return sparse;
}
@Override
public String toString() {
return "IndexInfo [fieldSpec=" + fieldSpec + ", name=" + name + ", unique=" + unique + ", dropDuplicates="
+ dropDuplicates + ", sparse=" + sparse + "]";
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (dropDuplicates ? 1231 : 1237);
result = prime * result + ((fieldSpec == null) ? 0 : fieldSpec.hashCode());
result = prime * result + ((name == null) ? 0 : name.hashCode());
result = prime * result + (sparse ? 1231 : 1237);
result = prime * result + (unique ? 1231 : 1237);
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
IndexInfo other = (IndexInfo) obj;
if (dropDuplicates != other.dropDuplicates)
return false;
if (fieldSpec == null) {
if (other.fieldSpec != null)
return false;
} else if (!fieldSpec.equals(other.fieldSpec))
return false;
if (name == null) {
if (other.name != null)
return false;
} else if (!name.equals(other.name))
return false;
if (sparse != other.sparse)
return false;
if (unique != other.unique)
return false;
return true;
}
/**
 * Example raw index documents as returned by MongoDB:
 * [{ "v" : 1 , "key" : { "_id" : 1} , "ns" : "database.person" , "name" : "_id_"},
 *  { "v" : 1 , "key" : { "age" : -1} , "ns" : "database.person" , "name" : "age_-1" , "unique" : true , "dropDups" : true}]
 */
}


@@ -53,7 +53,7 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
if (this.isIdProperty == null) {
this.isIdProperty = super.isIdProperty();
}
return this.isIdProperty;
}
@@ -67,7 +67,7 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
if (this.fieldName == null) {
this.fieldName = super.getFieldName();
}
return super.getFieldName();
return this.fieldName;
}
}


@@ -19,11 +19,13 @@ import java.math.BigInteger;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import java.util.regex.Pattern;
import org.bson.types.CodeWScope;
import org.bson.types.ObjectId;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
@@ -33,19 +35,21 @@ import com.mongodb.DBRef;
*/
public abstract class MongoSimpleTypes {
public static final Set<Class<?>> SUPPORTED_ID_CLASSES;
public static final Set<Class<?>> AUTOGENERATED_ID_TYPES;
static {
Set<Class<?>> classes = new HashSet<Class<?>>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
SUPPORTED_ID_CLASSES = Collections.unmodifiableSet(classes);
AUTOGENERATED_ID_TYPES = Collections.unmodifiableSet(classes);
Set<Class<?>> simpleTypes = new HashSet<Class<?>>();
simpleTypes.add(DBRef.class);
simpleTypes.add(ObjectId.class);
simpleTypes.add(CodeWScope.class);
simpleTypes.add(DBObject.class);
simpleTypes.add(Pattern.class);
MONGO_SIMPLE_TYPES = Collections.unmodifiableSet(simpleTypes);
}


@@ -36,7 +36,8 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
* Creates a new {@link AbstractMongoEventListener}.
*/
public AbstractMongoEventListener() {
this.domainClass = GenericTypeResolver.resolveTypeArgument(this.getClass(), AbstractMongoEventListener.class);
Class<?> typeArgument = GenericTypeResolver.resolveTypeArgument(this.getClass(), AbstractMongoEventListener.class);
this.domainClass = typeArgument == null ? Object.class : typeArgument;
}
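The null check matters because GenericTypeResolver returns null when a subclass extends the listener without a concrete type argument. A self-contained sketch of the same resolution using plain reflection (Listener and its subclasses are illustrative stand-ins, not the framework's classes):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

// Plain-reflection sketch of the type-argument fallback; Listener stands
// in for AbstractMongoEventListener.
abstract class Listener<E> {

	final Class<?> domainClass;

	Listener() {
		Type superclass = getClass().getGenericSuperclass();
		if (superclass instanceof ParameterizedType
				&& ((ParameterizedType) superclass).getActualTypeArguments()[0] instanceof Class) {
			this.domainClass = (Class<?>) ((ParameterizedType) superclass).getActualTypeArguments()[0];
		} else {
			this.domainClass = Object.class; // raw subclass: fall back to Object
		}
	}
}

class StringListener extends Listener<String> {}

@SuppressWarnings("rawtypes")
class RawListener extends Listener {}
```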
/*


@@ -0,0 +1,118 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Collects the parameters required to perform a group operation on a collection. The query condition and the input collection are specified on the group method as method arguments
* to be consistent with other operations, e.g. map-reduce.
*
* @author Mark Pollack
*
*/
public class GroupBy {
private DBObject dboKeys;
private String keyFunction;
private String initial;
private DBObject initialDbObject;
private String reduce;
private String finalize;
public GroupBy(String... keys) {
DBObject dbo = new BasicDBObject();
for (String key : keys) {
dbo.put(key, 1);
}
dboKeys = dbo;
}
// NOTE GroupByCommand does not handle keyfunction.
public GroupBy(String key, boolean isKeyFunction) {
DBObject dbo = new BasicDBObject();
if (isKeyFunction) {
keyFunction = key;
} else {
dbo.put(key, 1);
dboKeys = dbo;
}
}
public static GroupBy keyFunction(String key) {
return new GroupBy(key, true);
}
public static GroupBy key(String... keys) {
return new GroupBy(keys);
}
public GroupBy initialDocument(String initialDocument) {
initial = initialDocument;
return this;
}
public GroupBy initialDocument(DBObject initialDocument) {
initialDbObject = initialDocument;
return this;
}
public GroupBy reduceFunction(String reduceFunction) {
reduce = reduceFunction;
return this;
}
public GroupBy finalizeFunction(String finalizeFunction) {
finalize = finalizeFunction;
return this;
}
public DBObject getGroupByObject() {
// return new GroupCommand(dbCollection, dboKeys, condition, initial, reduce, finalize);
BasicDBObject dbo = new BasicDBObject();
if (dboKeys != null) {
dbo.put("key", dboKeys);
}
if (keyFunction != null) {
dbo.put("$keyf", keyFunction);
}
dbo.put("$reduce", reduce);
dbo.put("initial", initialDbObject);
if (initial != null) {
dbo.put("initial", initial);
}
if (finalize != null) {
dbo.put("finalize", finalize);
}
return dbo;
}
}
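getGroupByObject() assembles the command document from whatever was configured. A pure-Java sketch of the same fluent assembly, with LinkedHashMap standing in for BasicDBObject and only the key/initial/reduce subset shown:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LinkedHashMap stands in for BasicDBObject; only the key/initial/reduce
// subset of the builder is sketched here.
class GroupByDemo {

	private final Map<String, Object> keys = new LinkedHashMap<String, Object>();
	private String initial;
	private String reduce;

	GroupByDemo key(String... fields) {
		for (String field : fields) {
			keys.put(field, 1); // 1 marks the field as a grouping key
		}
		return this;
	}

	GroupByDemo initialDocument(String initialDocument) {
		this.initial = initialDocument;
		return this;
	}

	GroupByDemo reduceFunction(String reduceFunction) {
		this.reduce = reduceFunction;
		return this;
	}

	Map<String, Object> getGroupByObject() {
		Map<String, Object> command = new LinkedHashMap<String, Object>();
		if (!keys.isEmpty()) {
			command.put("key", keys);
		}
		command.put("$reduce", reduce);
		if (initial != null) {
			command.put("initial", initial);
		}
		return command;
	}
}
```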


@@ -0,0 +1,96 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import java.util.Iterator;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Collects the results of executing a group operation.
*
* @author Mark Pollack
*
* @param <T> The class onto which the results are mapped, accessible via an iterator.
*/
public class GroupByResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private DBObject rawResults;
private double count;
private int keys;
private String serverUsed;
public GroupByResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = mappedResults;
this.rawResults = rawResults;
parseKeys();
parseCount();
parseServerUsed();
}
public double getCount() {
return count;
}
public int getKeys() {
return keys;
}
public String getServerUsed() {
return serverUsed;
}
public Iterator<T> iterator() {
return mappedResults.iterator();
}
public DBObject getRawResults() {
return rawResults;
}
private void parseCount() {
Object object = rawResults.get("count");
if (object instanceof Double) {
count = (Double) object;
}
}
private void parseKeys() {
Object object = rawResults.get("keys");
if (object instanceof Integer) {
keys = (Integer) object;
}
}
private void parseServerUsed() {
//"serverUsed" : "127.0.0.1:27017"
Object object = rawResults.get("serverUsed");
if (object instanceof String) {
serverUsed = (String) object;
}
}
}


@@ -44,10 +44,9 @@ public class MapReduceOptions {
/**
* Static factory method to create a Criteria using the provided key
* Static factory method to create a MapReduceOptions instance
*
* @param key
* @return
* @return a new instance
*/
public static MapReduceOptions options() {
return new MapReduceOptions();

View File

@@ -22,6 +22,12 @@ import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Collects the results of performing a MapReduce operation.
* @author Mark Pollack
*
* @param <T> The class onto which the results are mapped, accessible via an iterator.
*/
public class MapReduceResults<T> implements Iterable<T> {
private final List<T> mappedResults;


@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.List;
@@ -191,6 +192,10 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria nin(Object... o) {
return nin(Arrays.asList(o));
}
public Criteria nin(Collection<?> o) {
criteria.put("$nin", o);
return this;
}
@@ -217,6 +222,10 @@ public class Criteria implements CriteriaDefinition {
* @return
*/
public Criteria all(Object... o) {
return all(Arrays.asList(o));
}
public Criteria all(Collection<?> o) {
criteria.put("$all", o);
return this;
}
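The new overload pairs follow one pattern: the varargs form wraps its arguments and delegates to the Collection form, so both land under the same operator key. A stand-alone sketch (NinDemo is an illustrative name; the Map stands in for the criteria document):

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.Map;

// Varargs-to-Collection delegation as used by Criteria.nin(...) above.
class NinDemo {

	final Map<String, Object> criteria = new LinkedHashMap<String, Object>();

	NinDemo nin(Object... values) {
		return nin(Arrays.asList(values)); // delegate to the Collection overload
	}

	NinDemo nin(Collection<?> values) {
		criteria.put("$nin", values);
		return this;
	}
}
```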


@@ -19,9 +19,11 @@ import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
public class Query {
@@ -30,6 +32,7 @@ public class Query {
private Sort sort;
private int skip;
private int limit;
private String hint;
/**
* Static factory method to create a Query using the provided criteria
@@ -63,10 +66,8 @@ public class Query {
}
public Field fields() {
synchronized (this) {
if (fieldSpec == null) {
this.fieldSpec = new Field();
}
if (fieldSpec == null) {
this.fieldSpec = new Field();
}
return this.fieldSpec;
}
@@ -81,12 +82,23 @@ public class Query {
return this;
}
/**
* Configures the query to use the given hint when being executed.
*
* @param name must not be {@literal null} or empty.
* @return
*/
public Query withHint(String name) {
Assert.hasText(name, "Hint must not be empty or null!");
this.hint = name;
return this;
}
public Sort sort() {
synchronized (this) {
if (this.sort == null) {
this.sort = new Sort();
}
if (this.sort == null) {
this.sort = new Sort();
}
return this.sort;
}
@@ -122,6 +134,10 @@ public class Query {
return this.limit;
}
public String getHint() {
return hint;
}
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
}
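
The `withHint(…)`/`getHint()` pair added above forms a small fluent contract: reject empty names, store the hint, return `this` for chaining. A minimal stand-alone sketch of that contract — `QuerySketch` is hypothetical, not the real `Query` class, and the empty-check mimics `Assert.hasText(…)`:

```java
// Sketch of the hint contract added to Query: a hint name must contain
// text; it is stored and later exposed via getHint().
class QuerySketch {
	private String hint;

	QuerySketch withHint(String name) {
		// stand-in for org.springframework.util.Assert.hasText(...)
		if (name == null || name.trim().isEmpty()) {
			throw new IllegalArgumentException("Hint must not be empty or null!");
		}
		this.hint = name;
		return this;
	}

	String getHint() {
		return hint;
	}
}
```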


@@ -15,14 +15,14 @@
*/
package org.springframework.data.mongodb.repository.query;
import static org.springframework.data.mongodb.repository.query.QueryUtils.*;
import static org.springframework.data.mongodb.repository.query.QueryUtils.applyPagination;
import java.util.List;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
import org.springframework.data.mongodb.core.geo.GeoResults;
@@ -46,21 +46,21 @@ import com.mongodb.DBObject;
public abstract class AbstractMongoQuery implements RepositoryQuery {
private final MongoQueryMethod method;
private final MongoTemplate template;
private final MongoOperations mongoOperations;
/**
* Creates a new {@link AbstractMongoQuery} from the given {@link MongoQueryMethod} and {@link MongoTemplate}.
* Creates a new {@link AbstractMongoQuery} from the given {@link MongoQueryMethod} and {@link MongoOperations}.
*
* @param method
* @param template
*/
public AbstractMongoQuery(MongoQueryMethod method, MongoTemplate template) {
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations template) {
Assert.notNull(template);
Assert.notNull(method);
this.method = method;
this.template = template;
this.mongoOperations = template;
}
/* (non-Javadoc)
@@ -79,7 +79,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
public Object execute(Object[] parameters) {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(template.getConverter(), accessor));
Query query = createQuery(new ConvertingParameterAccessor(mongoOperations.getConverter(), accessor));
if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
@@ -110,7 +110,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
String collectionName = metadata.getCollectionName();
return template.find(query, metadata.getJavaType(), collectionName);
return mongoOperations.find(query, metadata.getJavaType(), collectionName);
}
}
@@ -164,7 +164,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoEntityInformation<?, ?> metadata = method.getEntityInformation();
int count = getCollectionCursor(metadata.getCollectionName(), query.getQueryObject()).count();
List<?> result = template.find(applyPagination(query, pageable), metadata.getJavaType(),
List<?> result = mongoOperations.find(applyPagination(query, pageable), metadata.getJavaType(),
metadata.getCollectionName());
return new PageImpl(result, pageable, count);
@@ -172,7 +172,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private DBCursor getCollectionCursor(String collectionName, final DBObject query) {
return template.execute(collectionName, new CollectionCallback<DBCursor>() {
return mongoOperations.execute(collectionName, new CollectionCallback<DBCursor>() {
public DBCursor doInCollection(DBCollection collection) {
@@ -197,7 +197,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityInformation<?, ?> entityInformation = method.getEntityInformation();
return template.findOne(query, entityInformation.getJavaType());
return mongoOperations.findOne(query, entityInformation.getJavaType());
}
}
@@ -234,7 +234,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
}
MongoEntityInformation<?,?> entityInformation = method.getEntityInformation();
GeoResults<?> results = template.geoNear(nearQuery, entityInformation.getJavaType(), entityInformation.getCollectionName());
GeoResults<?> results = mongoOperations.geoNear(nearQuery, entityInformation.getJavaType(), entityInformation.getCollectionName());
return isListOfGeoResult() ? results.getContent() : results;
}


@@ -20,13 +20,10 @@ import java.util.Iterator;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.TypeKeyAware;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.repository.query.ParameterAccessor;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
import org.springframework.util.Assert;
/**
* Custom {@link ParameterAccessor} that uses a {@link MongoWriter} to serialize parameters into Mongo format.
@@ -41,9 +38,14 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
/**
* Creates a new {@link ConvertingParameterAccessor} with the given {@link MongoWriter} and delegate.
*
* @param writer
* @param writer must not be {@literal null}.
* @param delegate must not be {@literal null}.
*/
public ConvertingParameterAccessor(MongoWriter<?> writer, MongoParameterAccessor delegate) {
Assert.notNull(writer);
Assert.notNull(delegate);
this.writer = writer;
this.delegate = delegate;
}
@@ -105,49 +107,7 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
* @return
*/
private Object getConvertedValue(Object value) {
if (!(writer instanceof TypeKeyAware)) {
return value;
}
return removeTypeInfoRecursively(writer.convertToMongoType(value), ((TypeKeyAware) writer));
}
/**
* Removes the type information from the conversion result.
*
* @param object
* @return
*/
private Object removeTypeInfoRecursively(Object object, TypeKeyAware typeKeyAware) {
if (!(object instanceof DBObject) || typeKeyAware == null) {
return object;
}
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeKeyAware.isTypeKey(key)) {
keyToRemove = key;
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element, typeKeyAware);
}
} else {
removeTypeInfoRecursively(value, typeKeyAware);
}
}
if (keyToRemove != null) {
dbObject.removeField(keyToRemove);
}
return dbObject;
return writer.convertToMongoType(value);
}
/**


@@ -134,7 +134,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
*/
@Override
protected Query or(Query base, Query query) {
return new OrQuery(new Query[] {base, query});
return new OrQuery(new Query[] { base, query });
}
/*
@@ -192,6 +192,14 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Query> {
case LIKE:
String value = parameters.next().toString();
return criteria.regex(toLikeRegex(value));
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
return criteria.exists((Boolean) parameters.next());
case TRUE:
return criteria.is(true);
case FALSE:
return criteria.is(false);
case NEAR:
Distance distance = accessor.getMaxDistance();


@@ -27,7 +27,6 @@ import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.Parameters;
import org.springframework.data.repository.query.QueryMethod;
import org.springframework.data.repository.util.ClassUtils;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
@@ -57,7 +56,7 @@ public class MongoQueryMethod extends QueryMethod {
super(method, metadata);
Assert.notNull(entityInformationCreator, "DefaultEntityInformationCreator must not be null!");
this.method = method;
this.entityInformation = entityInformationCreator.getEntityInformation(ClassUtils.getReturnedDomainClass(method),
this.entityInformation = entityInformationCreator.getEntityInformation(metadata.getReturnedDomainClass(method),
getDomainClass());
}


@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Query;
@@ -40,12 +41,12 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
* @param method
* @param template
*/
public PartTreeMongoQuery(MongoQueryMethod method, MongoTemplate template) {
public PartTreeMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
super(method, template);
super(method, mongoOperations);
this.tree = new PartTree(method.getName(), method.getEntityInformation().getJavaType());
this.isGeoNearQuery = method.isGeoNearQuery();
this.context = template.getConverter().getMappingContext();
this.context = mongoOperations.getConverter().getMappingContext();
}
/**


@@ -21,7 +21,7 @@ import java.util.regex.Pattern;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -44,14 +44,14 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param method
* @param template
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoTemplate template) {
super(method, template);
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations) {
super(method, mongoOperations);
this.query = query;
this.fieldSpec = method.getFieldSpecification();
}
public StringBasedMongoQuery(MongoQueryMethod method, MongoTemplate template) {
this(method.getAnnotatedQuery(), method, template);
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
this(method.getAnnotatedQuery(), method, mongoOperations);
}
/*
@@ -74,6 +74,8 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
} else {
query = new BasicQuery(queryString);
}
QueryUtils.applySorting(query, accessor.getSort());
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s", query.getQueryObject()));


@@ -86,7 +86,7 @@ class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTr
}
MongoEntityInformation<?, ?> metadata = query.getQueryMethod().getEntityInformation();
operations.ensureIndex(index, metadata.getCollectionName());
operations.indexOps(metadata.getCollectionName()).ensureIndex(index);
LOG.debug(String.format("Created %s!", index));
}


@@ -21,8 +21,8 @@ import java.io.Serializable;
import java.lang.reflect.Method;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.query.EntityInformationCreator;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
@@ -37,7 +37,6 @@ import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Factory to create {@link MongoRepository} instances.
@@ -46,7 +45,7 @@ import org.springframework.util.StringUtils;
*/
public class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoTemplate template;
private final MongoOperations mongoOperations;
private final EntityInformationCreator entityInformationCreator;
/**
@@ -55,11 +54,12 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* @param template must not be {@literal null}
* @param mappingContext
*/
public MongoRepositoryFactory(MongoTemplate template) {
public MongoRepositoryFactory(MongoOperations mongoOperations) {
Assert.notNull(template);
this.template = template;
this.entityInformationCreator = new DefaultEntityInformationCreator(template.getConverter().getMappingContext());
Assert.notNull(mongoOperations);
this.mongoOperations = mongoOperations;
this.entityInformationCreator = new DefaultEntityInformationCreator(mongoOperations.getConverter()
.getMappingContext());
}
/*
@@ -84,9 +84,9 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainClass());
if (isQueryDslRepository(repositoryInterface)) {
return new QueryDslMongoRepository(entityInformation, template);
return new QueryDslMongoRepository(entityInformation, mongoOperations);
} else {
return new SimpleMongoRepository(entityInformation, template);
return new SimpleMongoRepository(entityInformation, mongoOperations);
}
}
@@ -110,41 +110,27 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* @author Oliver Gierke
*/
private class MongoQueryLookupStrategy implements QueryLookupStrategy {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryLookupStrategy#resolveQuery(java.lang.reflect.Method, org.springframework.data.repository.core.RepositoryMetadata, org.springframework.data.repository.core.NamedQueries)
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, entityInformationCreator);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new StringBasedMongoQuery(namedQuery, queryMethod, template);
return new StringBasedMongoQuery(namedQuery, queryMethod, mongoOperations);
} else if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, template);
return new StringBasedMongoQuery(queryMethod, mongoOperations);
} else {
return new PartTreeMongoQuery(queryMethod, template);
return new PartTreeMongoQuery(queryMethod, mongoOperations);
}
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.support.RepositoryFactorySupport#validate(org.springframework.data.repository.support.RepositoryMetadata)
*/
@Override
protected void validate(RepositoryMetadata metadata) {
Class<?> idClass = metadata.getIdClass();
if (!MongoSimpleTypes.SUPPORTED_ID_CLASSES.contains(idClass)) {
throw new IllegalArgumentException(String.format("Unsupported id class! Only %s are supported!",
StringUtils.collectionToCommaDelimitedString(MongoSimpleTypes.SUPPORTED_ID_CLASSES)));
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getEntityInformation(java.lang.Class)


@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import java.util.List;
import java.util.regex.Pattern;
import org.apache.commons.collections15.Transformer;
import org.springframework.data.domain.Page;
@@ -25,7 +26,9 @@ import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
@@ -34,6 +37,7 @@ import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityMetadata;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbQuery;
@@ -64,9 +68,9 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @param entityInformation
* @param template
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoTemplate template) {
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
this(entityInformation, template, SimpleEntityPathResolver.INSTANCE);
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
}
/**
@@ -74,17 +78,17 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* and {@link EntityPathResolver}.
*
* @param entityInformation
* @param template
* @param mongoOperations
* @param resolver
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoTemplate template,
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations,
EntityPathResolver resolver) {
super(entityInformation, template);
super(entityInformation, mongoOperations);
Assert.notNull(resolver);
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.serializer = new SpringDataMongodbSerializer(template.getConverter().getMappingContext());
this.serializer = new SpringDataMongodbSerializer(mongoOperations.getConverter());
}
/*
@@ -229,14 +233,17 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
*/
static class SpringDataMongodbSerializer extends MongodbSerializer {
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
*
* @param mappingContext
*/
public SpringDataMongodbSerializer(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.mappingContext = mappingContext;
public SpringDataMongodbSerializer(MongoConverter converter) {
this.mappingContext = converter.getMappingContext();
this.converter = converter;
}
@Override
@@ -247,5 +254,11 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property.getFieldName();
}
@Override
protected DBObject asDBObject(String key, Object value) {
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}
}


@@ -42,7 +42,7 @@ import org.springframework.util.Assert;
*/
public class SimpleMongoRepository<T, ID extends Serializable> implements PagingAndSortingRepository<T, ID> {
private final MongoTemplate template;
private final MongoOperations mongoOperations;
private final MongoEntityInformation<T, ID> entityInformation;
/**
@@ -51,12 +51,12 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
* @param metadata
* @param template
*/
public SimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoTemplate template) {
public SimpleMongoRepository(MongoEntityInformation<T, ID> metadata, MongoOperations mongoOperations) {
Assert.notNull(template);
Assert.notNull(mongoOperations);
Assert.notNull(metadata);
this.entityInformation = metadata;
this.template = template;
this.mongoOperations = mongoOperations;
}
/*
@@ -66,8 +66,10 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
* org.springframework.data.repository.Repository#save(java.lang.Object)
*/
public T save(T entity) {
Assert.notNull(entity, "Entity must not be null!");
template.save(entity, entityInformation.getCollectionName());
mongoOperations.save(entity, entityInformation.getCollectionName());
return entity;
}
@@ -79,6 +81,8 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public List<T> save(Iterable<? extends T> entities) {
Assert.notNull(entities, "The given Iterable of entities must not be null!");
List<T> result = new ArrayList<T>();
for (T entity : entities) {
@@ -98,7 +102,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public T findOne(ID id) {
Assert.notNull(id, "The given id must not be null!");
return template.findById(id, entityInformation.getJavaType());
return mongoOperations.findById(id, entityInformation.getJavaType());
}
private Query getIdQuery(Object id) {
@@ -119,7 +123,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
public boolean exists(ID id) {
Assert.notNull(id, "The given id must not be null!");
return template.findOne(new Query(Criteria.where("_id").is(id)), Object.class,
return mongoOperations.findOne(new Query(Criteria.where("_id").is(id)), Object.class,
entityInformation.getCollectionName()) != null;
}
@@ -130,7 +134,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public long count() {
return template.getCollection(entityInformation.getCollectionName()).count();
return mongoOperations.getCollection(entityInformation.getCollectionName()).count();
}
/*
@@ -139,7 +143,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public void delete(ID id) {
Assert.notNull(id, "The given id must not be null!");
template.remove(getIdQuery(id), entityInformation.getJavaType());
mongoOperations.remove(getIdQuery(id), entityInformation.getJavaType());
}
/*
@@ -161,6 +165,8 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public void delete(Iterable<? extends T> entities) {
Assert.notNull(entities, "The given Iterable of entities must not be null!");
for (T entity : entities) {
delete(entity);
}
@@ -173,7 +179,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
public void deleteAll() {
template.remove(new Query(), entityInformation.getCollectionName());
mongoOperations.remove(new Query(), entityInformation.getCollectionName());
}
/*
@@ -246,7 +252,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
return Collections.emptyList();
}
return template.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/**
@@ -256,7 +262,7 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements Paging
*/
protected MongoOperations getMongoOperations() {
return this.template;
return this.mongoOperations;
}
/**


@@ -11,7 +11,6 @@
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-2.5.xsd">
<xsd:import namespace="http://www.springframework.org/schema/beans"/>
<xsd:import namespace="http://www.springframework.org/schema/tool"/>
<xsd:import namespace="http://www.springframework.org/schema/context"
@@ -93,7 +92,16 @@ The password to use when connecting to a MongoDB server.
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attributeGroup ref="writeConcern" />
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
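
With the `write-concern` attribute declared above as a union of `writeConcernEnumeration` and `xsd:string`, the factory can be configured with either a named concern or a free-form custom one (the latter is exercised by the `rack1` test below). A sketch, with the `mongo` namespace prefix assumed:

```xml
<!-- Sketch only: namespace prefix and surrounding context are assumed. -->
<mongo:db-factory id="first" write-concern="SAFE"/>
<mongo:db-factory id="second" write-concern="rack1"/>
```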
@@ -140,9 +148,10 @@ The Mongo URI string.]]></xsd:documentation>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoConverter for getting rich mapping functionality.
]]></xsd:documentation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
@@ -150,14 +159,14 @@ Defines a MongoConverter for getting rich mapping functionality.
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject
]]>
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
@@ -259,6 +268,18 @@ The name of the Mongo object that determines what server to monitor. (by default
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
@@ -268,11 +289,13 @@ The name of the Mongo object that determines what server to monitor. (by default
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
@@ -288,7 +311,19 @@ The Mongo driver options
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:ID" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[


@@ -15,10 +15,26 @@
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import java.util.Set;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.mapping.Account;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.stereotype.Component;
import com.mongodb.DBObject;
/**
* Integration tests for {@link MongoParser}.
@@ -26,13 +42,48 @@ import org.springframework.core.io.ClassPathResource;
* @author Oliver Gierke
*/
public class MappingMongoConverterParserIntegrationTests {
DefaultListableBeanFactory factory;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter.xml"));
}
@Test
public void allowsDbFactoryRefAttribute() {
ConfigurableListableBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/converter.xml"));
factory.getBeanDefinition("converter");
factory.getBean("converter");
}
@Test
public void scansForConverterAndSetsUpCustomConversionsAccordingly() {
CustomConversions conversions = factory.getBean(CustomConversions.class);
assertThat(conversions.hasCustomWriteTarget(Person.class), is(true));
assertThat(conversions.hasCustomWriteTarget(Account.class), is(true));
}
@Component
public static class SampleConverter implements Converter<Person, DBObject> {
public DBObject convert(Person source) {
return null;
}
}
@Component
public static class SampleConverterFactory implements GenericConverter {
public Set<ConvertiblePair> getConvertibleTypes() {
return Collections.singleton(new ConvertiblePair(Account.class, DBObject.class));
}
public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
return null;
}
}
}


@@ -18,19 +18,25 @@ package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.PropertyValue;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.ConstructorArgumentValues;
import org.springframework.beans.factory.config.ConstructorArgumentValues.ValueHolder;
import org.springframework.beans.factory.parsing.BeanDefinitionParsingException;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionReader;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
/**
* Integration tests for {@link MongoDbFactoryParser}.
@@ -38,19 +44,79 @@ import com.mongodb.MongoURI;
* @author Oliver Gierke
*/
public class MongoDbFactoryParserIntegrationTests {
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
reader = new XmlBeanDefinitionReader(factory);
}
@Test
public void parsesWriteConcern() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/db-factory-bean.xml"));
BeanDefinition definition = factory.getBeanDefinition("first");
List<PropertyValue> values = definition.getPropertyValues().getPropertyValueList();
assertThat(values, hasItem(new PropertyValue("writeConcern", "SAFE")));
public void testWriteConcern() throws Exception {
SimpleMongoDbFactory dbFactory = new SimpleMongoDbFactory(new Mongo("localhost"), "database");
dbFactory.setWriteConcern(WriteConcern.SAFE);
dbFactory.getDb();
assertThat(dbFactory.getWriteConcern(), is(WriteConcern.SAFE));
}
@Test
public void parsesWriteConcern() {
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean.xml");
assertWriteConcern(ctx, WriteConcern.SAFE);
}
@Test
public void parsesCustomWriteConcern() {
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean-custom-write-concern.xml");
assertWriteConcern(ctx, new WriteConcern("rack1"));
}
/**
* @see DATAMONGO-331
*/
@Test
public void readsReplicasWriteConcernCorrectly() {
ApplicationContext ctx = new ClassPathXmlApplicationContext("namespace/db-factory-bean-custom-write-concern.xml");
MongoDbFactory factory = ctx.getBean("second", MongoDbFactory.class);
DB db = factory.getDb();
assertThat(db.getWriteConcern(), is(WriteConcern.REPLICAS_SAFE));
}
private void assertWriteConcern(ClassPathXmlApplicationContext ctx, WriteConcern expectedWriteConcern) {
SimpleMongoDbFactory dbFactory = ctx.getBean("first", SimpleMongoDbFactory.class);
DB db = dbFactory.getDb();
assertThat(db.getName(), is("db"));
MyWriteConcern myDbFactoryWriteConcern = new MyWriteConcern(dbFactory.getWriteConcern());
MyWriteConcern myDbWriteConcern = new MyWriteConcern(db.getWriteConcern());
MyWriteConcern myExpectedWriteConcern = new MyWriteConcern(expectedWriteConcern);
assertThat(myDbFactoryWriteConcern, equalTo(myExpectedWriteConcern));
assertThat(myDbWriteConcern, equalTo(myExpectedWriteConcern));
assertThat(myDbWriteConcern, equalTo(myDbFactoryWriteConcern));
}
// This test will fail since WriteConcern.equals(…) compares _w with == rather than .equals,
// so two WriteConcerns built from equal but distinct Strings are never equal.
// Deliberately left without @Test so it does not run as part of the suite.
public void testWriteConcernEquality() {
String s1 = new String("rack1");
String s2 = new String("rack1");
WriteConcern wc1 = new WriteConcern(s1);
WriteConcern wc2 = new WriteConcern(s2);
assertThat(wc1, equalTo(wc2));
}
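The comment on the disabled test above points at a reference-versus-value comparison. A minimal self-contained sketch of the underlying `==` vs `.equals(…)` difference on `String` (no MongoDB driver required; the `"rack1"` value is just the one used in the test):

```java
public class StringIdentityDemo {

	public static void main(String[] args) {
		// Two distinct String instances with identical contents.
		String s1 = new String("rack1");
		String s2 = new String("rack1");

		// Reference comparison: different objects, so false.
		System.out.println(s1 == s2); // false
		// Value comparison: same contents, so true.
		System.out.println(s1.equals(s2)); // true
	}
}
```

Any equals implementation that compares such a field with `==` inherits exactly this behavior, which is why the assertion in `testWriteConcernEquality()` fails.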
@Test
public void createsDbFactoryBean() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/db-factory-bean.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/db-factory-bean.xml"));
factory.getBean("first");
}
@@ -60,7 +126,7 @@ public class MongoDbFactoryParserIntegrationTests {
@Test
public void parsesMaxAutoConnectRetryTimeCorrectly() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/db-factory-bean.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/db-factory-bean.xml"));
Mongo mongo = factory.getBean(Mongo.class);
assertThat(mongo.getMongoOptions().maxAutoConnectRetryTime, is(27L));
}
@@ -71,7 +137,7 @@ public class MongoDbFactoryParserIntegrationTests {
@Test
public void setsUpMongoDbFactoryUsingAMongoUri() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/mongo-uri.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongoDbFactory");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
@@ -80,11 +146,30 @@ public class MongoDbFactoryParserIntegrationTests {
assertThat(argument, is(notNullValue()));
}
/**
* @see DATADOC-306
*/
@Test
public void setsUpMongoDbFactoryUsingAMongoUriWithoutCredentials() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri-no-credentials.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongoDbFactory");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
assertThat(constructorArguments.getArgumentCount(), is(1));
ValueHolder argument = constructorArguments.getArgumentValue(0, MongoURI.class);
assertThat(argument, is(notNullValue()));
MongoDbFactory dbFactory = factory.getBean("mongoDbFactory", MongoDbFactory.class);
DB db = dbFactory.getDb();
assertThat(db.getName(), is("database"));
}
/**
* @see DATADOC-295
*/
@Test(expected = BeanDefinitionParsingException.class)
public void rejectsUriPlusDetailedConfiguration() {
new XmlBeanFactory(new ClassPathResource("namespace/mongo-uri-and-details.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri-and-details.xml"));
}
}

View File

@@ -45,18 +45,35 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
public void testParsingMongoWithReplicaSets() throws Exception {
assertTrue(ctx.containsBean("replicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&replicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("127.0.0.1", replicaSetSeeds.get(0).getHost());
assertEquals(10001, replicaSetSeeds.get(0).getPort());
assertEquals("localhost", replicaSetSeeds.get(1).getHost());
assertEquals(10002, replicaSetSeeds.get(1).getPort());
}
@Test
public void testParsingWithPropertyPlaceHolder() throws Exception {
assertTrue(ctx.containsBean("manyReplicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&manyReplicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("192.168.174.130", replicaSetSeeds.get(0).getHost());
assertEquals(27017, replicaSetSeeds.get(0).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(1).getHost());
assertEquals(27018, replicaSetSeeds.get(1).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(2).getHost());
assertEquals(27019, replicaSetSeeds.get(2).getPort());
}
@Test
@Ignore("CI infrastructure does not yet support replica sets")
public void testMongoWithReplicaSets() {
@@ -67,10 +84,10 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
assertEquals("localhost", servers.get(1).getHost());
assertEquals(10001, servers.get(0).getPort());
assertEquals(10002, servers.get(1).getPort());
MongoTemplate template = new MongoTemplate(mongo, "admin");
CommandResult result = template.executeCommand("{replSetGetStatus : 1}");
assertEquals("blort", result.getString("set"));
}
}

View File

@@ -68,6 +68,7 @@ public class MongoNamespaceTests {
}
@Test
@SuppressWarnings("deprecation")
public void testMongoSingletonWithPropertyPlaceHolders() throws Exception {
assertTrue(ctx.containsBean("mongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&mongo");

View File

@@ -20,11 +20,13 @@ import static org.hamcrest.Matchers.*;
import java.util.List;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.PropertyValue;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionReader;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.io.ClassPathResource;
/**
@@ -33,11 +35,20 @@ import org.springframework.core.io.ClassPathResource;
* @author Oliver Gierke
*/
public class MongoParserIntegrationTests {
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
reader = new XmlBeanDefinitionReader(factory);
}
@Test
public void readsMongoAttributesCorrectly() {
ConfigurableListableBeanFactory factory = new XmlBeanFactory(new ClassPathResource("namespace/mongo-bean.xml"));
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-bean.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongo");
List<PropertyValue> values = definition.getPropertyValues().getPropertyValueList();

View File

@@ -0,0 +1,56 @@
package org.springframework.data.mongodb.config;
import com.mongodb.WriteConcern;
public class MyWriteConcern {
public MyWriteConcern(WriteConcern wc) {
this._w = wc.getWObject();
this._continueOnErrorForInsert = wc.getContinueOnErrorForInsert();
this._fsync = wc.getFsync();
this._j = wc.getJ();
this._wtimeout = wc.getWtimeout();
}
Object _w = 0;
int _wtimeout = 0;
boolean _fsync = false;
boolean _j = false;
boolean _continueOnErrorForInsert = false;
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (_continueOnErrorForInsert ? 1231 : 1237);
result = prime * result + (_fsync ? 1231 : 1237);
result = prime * result + (_j ? 1231 : 1237);
result = prime * result + ((_w == null) ? 0 : _w.hashCode());
result = prime * result + _wtimeout;
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
MyWriteConcern other = (MyWriteConcern) obj;
if (_continueOnErrorForInsert != other._continueOnErrorForInsert)
return false;
if (_fsync != other._fsync)
return false;
if (_j != other._j)
return false;
if (_w == null) {
if (other._w != null)
return false;
} else if (!_w.equals(other._w))
return false;
if (_wtimeout != other._wtimeout)
return false;
return true;
}
}
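`MyWriteConcern` implements the equals/hashCode contract with the classic IDE-generated prime/1231/1237 pattern. As an illustrative alternative (not part of the commit, and `ConcernKey` is a hypothetical name), `java.util.Objects` expresses the same contract more compactly since Java 7:

```java
import java.util.Objects;

// Hypothetical value object showing the same equals/hashCode contract
// as MyWriteConcern, written with java.util.Objects helpers.
public class ConcernKey {

	final Object w;
	final int wtimeout;
	final boolean fsync;

	ConcernKey(Object w, int wtimeout, boolean fsync) {
		this.w = w;
		this.wtimeout = wtimeout;
		this.fsync = fsync;
	}

	@Override
	public boolean equals(Object obj) {
		if (this == obj) {
			return true;
		}
		if (!(obj instanceof ConcernKey)) {
			return false;
		}
		ConcernKey other = (ConcernKey) obj;
		// Objects.equals(…) is null-safe and uses .equals, not ==.
		return Objects.equals(w, other.w) && wtimeout == other.wtimeout && fsync == other.fsync;
	}

	@Override
	public int hashCode() {
		return Objects.hash(w, wtimeout, fsync);
	}
}
```

Crucially, `Objects.equals(…)` delegates to `.equals`, which is exactly the comparison the wrapped `WriteConcern` lacks for its `_w` field.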

View File

@@ -39,7 +39,6 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -47,6 +46,7 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Order;
@@ -61,6 +61,8 @@ import com.mongodb.DBObject;
import com.mongodb.DBRef;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
/**
@@ -88,7 +90,7 @@ public class MongoTemplateTests {
CustomConversions conversions = new CustomConversions(Arrays.asList(DateToDateTimeConverter.INSTANCE,
DateTimeToDateConverter.INSTANCE));
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(PersonWith_idPropertyOfTypeObjectId.class,
PersonWith_idPropertyOfTypeString.class, PersonWithIdPropertyOfTypeObjectId.class,
@@ -101,7 +103,7 @@ public class MongoTemplateTests {
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.setCustomConversions(conversions);
mappingConverter.afterPropertiesSet();
this.mappingTemplate = new MongoTemplate(factory, mappingConverter);
}
@@ -109,14 +111,15 @@ public class MongoTemplateTests {
public void setUp() {
cleanDb();
}
@After
public void cleanUp() {
cleanDb();
}
protected void cleanDb() {
template.dropCollection(template.getCollectionName(Person.class));
template.dropCollection(template.getCollectionName(PersonWithAList.class));
template.dropCollection(template.getCollectionName(PersonWith_idPropertyOfTypeObjectId.class));
template.dropCollection(template.getCollectionName(PersonWith_idPropertyOfTypeString.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeObjectId.class));
@@ -141,7 +144,7 @@ public class MongoTemplateTests {
}
@Test
public void updateFailure() throws Exception {
public void bogusUpdateDoesNotTriggerException() throws Exception {
MongoTemplate mongoTemplate = new MongoTemplate(factory);
mongoTemplate.setWriteResultChecking(WriteResultChecking.EXCEPTION);
@@ -152,10 +155,7 @@ public class MongoTemplateTests {
Query q = new Query(Criteria.where("BOGUS").gt(22));
Update u = new Update().set("firstName", "Sven");
thrown.expect(DataIntegrityViolationException.class);
thrown.expectMessage(endsWith("0 documents updated"));
mongoTemplate.updateFirst(q, u, Person.class);
}
@Test
@@ -168,11 +168,10 @@ public class MongoTemplateTests {
p2.setAge(40);
template.insert(p2);
template.ensureIndex(new Index().on("age", Order.DESCENDING).unique(Duplicates.DROP), Person.class);
template.indexOps(Person.class).ensureIndex(new Index().on("age", Order.DESCENDING).unique(Duplicates.DROP));
DBCollection coll = template.getCollection(template.getCollectionName(Person.class));
List<DBObject> indexInfo = coll.getIndexInfo();
assertThat(indexInfo.size(), is(2));
String indexKey = null;
boolean unique = false;
@@ -187,6 +186,16 @@ public class MongoTemplateTests {
assertThat(indexKey, is("{ \"age\" : -1}"));
assertThat(unique, is(true));
assertThat(dropDupes, is(true));
List<IndexInfo> indexInfoList = template.indexOps(Person.class).getIndexInfo();
System.out.println(indexInfoList);
assertThat(indexInfoList.size(), is(2));
IndexInfo ii = indexInfoList.get(1);
assertThat(ii.isUnique(), is(true));
assertThat(ii.isDropDuplicates(), is(true));
assertThat(ii.isSparse(), is(false));
assertThat(ii.getFieldSpec().containsKey("age"), is(true));
assertThat(ii.getFieldSpec().containsValue(Order.DESCENDING), is(true));
}
@Test
@@ -389,6 +398,56 @@ public class MongoTemplateTests {
assertThat(template.findAll(entityClass).size(), is(count));
}
/**
* @see DATAMONGO-234
*/
@Test
public void testFindAndUpdate() {
template.insert(new Person("Tom", 21));
template.insert(new Person("Dick", 22));
template.insert(new Person("Harry", 23));
Query query = new Query(Criteria.where("firstName").is("Harry"));
Update update = new Update().inc("age", 1);
Person p = template.findAndModify(query, update, Person.class); // return old
assertThat(p.getFirstName(), is("Harry"));
assertThat(p.getAge(), is(23));
p = template.findOne(query, Person.class);
assertThat(p.getAge(), is(24));
p = template.findAndModify(query, update, Person.class, "person");
assertThat(p.getAge(), is(24));
p = template.findOne(query, Person.class);
assertThat(p.getAge(), is(25));
p = template.findAndModify(query, update, new FindAndModifyOptions().returnNew(true), Person.class);
assertThat(p.getAge(), is(26));
p = template.findAndModify(query, update, null, Person.class, "person");
assertThat(p.getAge(), is(26));
p = template.findOne(query, Person.class);
assertThat(p.getAge(), is(27));
Query query2 = new Query(Criteria.where("firstName").is("Mary"));
p = template.findAndModify(query2, update, new FindAndModifyOptions().returnNew(true).upsert(true), Person.class);
assertThat(p.getFirstName(), is("Mary"));
assertThat(p.getAge(), is(1));
}
@Test
public void testFindAndUpdateUpsert() {
template.insert(new Person("Tom", 21));
template.insert(new Person("Dick", 22));
Query query = new Query(Criteria.where("firstName").is("Harry"));
Update update = new Update().set("age", 23);
Person p = template.findAndModify(query, update, new FindAndModifyOptions().upsert(true).returnNew(true),
Person.class);
assertThat(p.getFirstName(), is("Harry"));
assertThat(p.getAge(), is(23));
}
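The two tests above exercise the findAndModify return semantics: by default the document as it was *before* the update comes back, while `returnNew(true)` returns the updated document. A minimal sketch of those semantics using a plain map as a stand-in for the collection (no MongoDB needed; names are illustrative):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.UnaryOperator;

// Illustrative stand-in for findAndModify: returns the pre-update value
// by default, or the updated value when returnNew is true.
public class FindAndModifySketch {

	static <K, V> V findAndModify(ConcurrentHashMap<K, V> store, K key, UnaryOperator<V> update,
			boolean returnNew) {
		V old = store.get(key);
		V updated = update.apply(old);
		store.put(key, updated);
		return returnNew ? updated : old;
	}

	public static void main(String[] args) {
		ConcurrentHashMap<String, Integer> ages = new ConcurrentHashMap<>();
		ages.put("Harry", 23);
		// Default behaviour: the value before the increment comes back.
		System.out.println(findAndModify(ages, "Harry", a -> a + 1, false)); // 23
		// returnNew: the incremented value comes back.
		System.out.println(findAndModify(ages, "Harry", a -> a + 1, true)); // 25
	}
}
```

This mirrors why `testFindAndUpdate()` sees age 23 from the first `findAndModify(…)` call but 24 from the subsequent `findOne(…)`.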
@Test
public void testFindAndRemove() throws Exception {
@@ -745,8 +804,32 @@ public class MongoTemplateTests {
}
@Test
public void testUsingSlaveOk() throws Exception {
this.template.execute("slaveOkTest", new CollectionCallback<Object>() {
public void testFindOneWithSort() {
PersonWithAList p = new PersonWithAList();
p.setFirstName("Sven");
p.setAge(22);
template.insert(p);
PersonWithAList p2 = new PersonWithAList();
p2.setFirstName("Erik");
p2.setAge(21);
template.insert(p2);
PersonWithAList p3 = new PersonWithAList();
p3.setFirstName("Mark");
p3.setAge(40);
template.insert(p3);
// test query with a sort
Query q2 = new Query(Criteria.where("age").gt(10));
q2.sort().on("age", Order.DESCENDING);
PersonWithAList p5 = template.findOne(q2, PersonWithAList.class);
assertThat(p5.getFirstName(), is("Mark"));
}
@Test
public void testUsingReadPreference() throws Exception {
this.template.execute("readPref", new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
assertThat(collection.getOptions(), is(0));
assertThat(collection.getDB().getOptions(), is(0));
@@ -754,10 +837,10 @@ public class MongoTemplateTests {
}
});
MongoTemplate slaveTemplate = new MongoTemplate(factory);
slaveTemplate.setSlaveOk(true);
slaveTemplate.execute("slaveOkTest", new CollectionCallback<Object>() {
slaveTemplate.setReadPreference(ReadPreference.SECONDARY);
slaveTemplate.execute("readPref", new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
assertThat(collection.getOptions(), is(4));
assertThat(collection.getReadPreference(), is(ReadPreference.SECONDARY));
assertThat(collection.getDB().getOptions(), is(0));
return null;
}
@@ -793,6 +876,53 @@ public class MongoTemplateTests {
assertThat(result.getFirstName(), is("Carter"));
}
@Test
public void testWriteConcernResolver() {
PersonWithIdPropertyOfTypeObjectId person = new PersonWithIdPropertyOfTypeObjectId();
person.setId(new ObjectId());
person.setFirstName("Dave");
template.setWriteConcern(WriteConcern.NONE);
template.save(person);
WriteResult result = template.updateFirst(query(where("id").is(person.getId())), update("firstName", "Carter"),
PersonWithIdPropertyOfTypeObjectId.class);
WriteConcern lastWriteConcern = result.getLastConcern();
assertThat(lastWriteConcern, equalTo(WriteConcern.NONE));
FsyncSafeWriteConcernResolver resolver = new FsyncSafeWriteConcernResolver();
template.setWriteConcernResolver(resolver);
Query q = query(where("_id").is(person.getId()));
Update u = update("firstName", "Carter");
result = template.updateFirst(q, u, PersonWithIdPropertyOfTypeObjectId.class);
lastWriteConcern = result.getLastConcern();
assertThat(lastWriteConcern, equalTo(WriteConcern.FSYNC_SAFE));
MongoAction lastMongoAction = resolver.getMongoAction();
assertThat(lastMongoAction.getCollectionName(), is("personWithIdPropertyOfTypeObjectId"));
assertThat(lastMongoAction.getDefaultWriteConcern(), equalTo(WriteConcern.NONE));
assertThat(lastMongoAction.getDocument(), notNullValue());
assertThat(lastMongoAction.getEntityClass().toString(), is(PersonWithIdPropertyOfTypeObjectId.class.toString()));
assertThat(lastMongoAction.getMongoActionOperation(), is(MongoActionOperation.UPDATE));
assertThat(lastMongoAction.getQuery(), equalTo(q.getQueryObject()));
assertThat(lastMongoAction.getDocument(), equalTo(u.getUpdateObject()));
}
private class FsyncSafeWriteConcernResolver implements WriteConcernResolver {
private MongoAction mongoAction;
public WriteConcern resolve(MongoAction action) {
this.mongoAction = action;
return WriteConcern.FSYNC_SAFE;
}
public MongoAction getMongoAction() {
return mongoAction;
}
}
/**
* @see DATADOC-246
*/
@@ -827,7 +957,7 @@ public class MongoTemplateTests {
}
});
assertEquals(3, names.size());
//template.remove(new Query(), Person.class);
// template.remove(new Query(), Person.class);
}
/**
@@ -855,7 +985,7 @@ public class MongoTemplateTests {
});
assertEquals(1, names.size());
//template.remove(new Query(), Person.class);
// template.remove(new Query(), Person.class);
}
/**
@@ -863,19 +993,19 @@ public class MongoTemplateTests {
*/
@Test
public void countsDocumentsCorrectly() {
assertThat(template.count(new Query(), Person.class), is(0L));
Person dave = new Person("Dave");
Person carter = new Person("Carter");
template.save(dave);
template.save(carter);
assertThat(template.count(null, Person.class), is(2L));
assertThat(template.count(query(where("firstName").is("Carter")), Person.class), is(1L));
}
/**
* @see DATADOC-183
*/
@@ -883,7 +1013,7 @@ public class MongoTemplateTests {
public void countRejectsNullEntityClass() {
template.count(null, (Class<?>) null);
}
/**
* @see DATADOC-183
*/
@@ -891,7 +1021,7 @@ public class MongoTemplateTests {
public void countRejectsEmptyCollectionName() {
template.count(null, "");
}
/**
* @see DATADOC-183
*/
@@ -902,11 +1032,11 @@ public class MongoTemplateTests {
@Test
public void returnsEntityWhenQueryingForDateTime() {
DateTime dateTime = new DateTime(2011, 3, 3, 12, 0, 0, 0);
TestClass testClass = new TestClass(dateTime);
mappingTemplate.save(testClass);
List<TestClass> testClassList = mappingTemplate.find(new Query(Criteria.where("myDate").is(dateTime.toDate())),
TestClass.class);
assertThat(testClassList.size(), is(1));
@@ -918,18 +1048,18 @@ public class MongoTemplateTests {
*/
@Test
public void removesEntityFromCollection() {
template.remove(new Query(), "mycollection");
Person person = new Person("Dave");
template.save(person, "mycollection");
assertThat(template.findAll(TestClass.class, "mycollection").size(), is(1));
template.remove(person, "mycollection");
assertThat(template.findAll(Person.class, "mycollection").isEmpty(), is(true));
}
public class TestClass {
private DateTime myDate;
@@ -954,7 +1084,7 @@ public class MongoTemplateTests {
}
static enum DateToDateTimeConverter implements Converter<Date, DateTime> {
INSTANCE;
public DateTime convert(Date source) {

View File

@@ -15,21 +15,37 @@
*/
package org.springframework.data.mongodb.core;
import static org.junit.Assert.assertTrue;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.context.support.GenericApplicationContext;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* Unit tests for {@link MongoTemplate}.
*
@@ -46,9 +62,15 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Mock
DB db;
@Mock
DBCollection collection;
@Before
public void setUp() {
this.template = new MongoTemplate(mongo, "database");
when(mongo.getDB("database")).thenReturn(db);
when(db.getCollection(Mockito.any(String.class))).thenReturn(collection);
}
@Test(expected = IllegalArgumentException.class)
@@ -75,6 +97,62 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertTrue(ReflectionTestUtils.getField(template, "mongoConverter") instanceof MappingMongoConverter);
}
@Test(expected = InvalidDataAccessApiUsageException.class)
public void rejectsNotFoundMapReduceResource() {
template.setApplicationContext(new GenericApplicationContext());
template.mapReduce("foo", "classpath:doesNotExist.js", "function() {}", Person.class);
}
/**
* @see DATAMONGO-322
*/
@Test(expected = InvalidDataAccessApiUsageException.class)
public void rejectsEntityWithNullIdIfNotSupportedIdType() {
Object entity = new NotAutogenerateableId();
template.save(entity);
}
/**
* @see DATAMONGO-322
*/
@Test
public void storesEntityWithSetIdAlthoughNotAutogenerateable() {
NotAutogenerateableId entity = new NotAutogenerateableId();
entity.id = 1;
template.save(entity);
}
/**
* @see DATAMONGO-322
*/
@Test
public void autogeneratesIdForEntityWithAutogeneratableId() {
MongoTemplate template = spy(this.template);
doReturn(new ObjectId()).when(template).saveDBObject(Mockito.any(String.class), Mockito.any(DBObject.class),
Mockito.any(Class.class));
AutogenerateableId entity = new AutogenerateableId();
template.save(entity);
assertThat(entity.id, is(notNullValue()));
}
class AutogenerateableId {
@Id
BigInteger id;
}
class NotAutogenerateableId {
@Id
Integer id;
}
/**
* Mocks out the {@link MongoTemplate#getDb()} method to return the {@link DB} mock instead of executing the actual
* behaviour.

View File

@@ -27,15 +27,28 @@ public class Person {
private Person friend;
private boolean active = true;
public Person() {
this.id = new ObjectId();
}
@Override
public String toString() {
return "Person [id=" + id + ", firstName=" + firstName + ", age=" + age + ", friend=" + friend + "]";
}
public Person(ObjectId id, String firstname) {
this.id = id;
this.firstName = firstname;
}
public Person(String firstname, int age) {
this();
this.firstName = firstname;
this.age = age;
}
public Person(String firstname) {
this();
this.firstName = firstname;
@@ -69,6 +82,13 @@ public class Person {
this.friend = friend;
}
/**
* @return the active
*/
public boolean isActive() {
return active;
}
/*
* (non-Javadoc)
*

View File

@@ -0,0 +1,58 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.mockito.Mockito.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate.QueryCursorPreparer;
import org.springframework.data.mongodb.core.query.Query;
import com.mongodb.DBCursor;
/**
* Unit tests for {@link QueryCursorPreparer}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryCursorPreparerUnitTests {
@Mock
MongoDbFactory factory;
@Mock
DBCursor cursor;
/**
* @see DATAMONGO-185
*/
@Test
public void appliesHintsCorrectly() {
Query query = query(where("foo").is("bar")).withHint("hint");
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query);
preparer.prepare(cursor);
verify(cursor).hint("hint");
}
}

View File

@@ -55,7 +55,7 @@ public class CustomConversionsUnitTests {
}
/**
* @see DATADOC-240
* @see DATAMONGO-240
*/
@Test
public void considersObjectIdToBeSimpleType() {
@@ -68,7 +68,7 @@ public class CustomConversionsUnitTests {
}
/**
* @see DATADOC-240
* @see DATAMONGO-240
*/
@Test
public void considersCustomConverterForSimpleType() {
@@ -95,6 +95,7 @@ public class CustomConversionsUnitTests {
@Test
public void populatesConversionServiceCorrectly() {
@SuppressWarnings("deprecation")
GenericConversionService conversionService = ConversionServiceFactory.createDefaultConversionService();
assertThat(conversionService.canConvert(String.class, UUID.class), is(false));
@@ -106,7 +107,7 @@ public class CustomConversionsUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void doesNotConsiderTypeSimpleIfOnlyReadConverterIsRegistered() {
@@ -114,6 +115,18 @@ public class CustomConversionsUnitTests {
assertThat(conversions.isSimpleType(UUID.class), is(false));
}
/**
* @see DATAMONGO-298
*/
@Test
public void discoversConvertersForSubtypesOfMongoTypes() {
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToIntegerConverter.INSTANCE));
assertThat(conversions.hasCustomReadTarget(String.class, Integer.class), is(true));
assertThat(conversions.hasCustomWriteTarget(String.class, Integer.class), is(true));
}
enum UuidToStringConverter implements Converter<UUID, String> {
INSTANCE;
@@ -142,4 +155,11 @@ public class CustomConversionsUnitTests {
return 0L;
}
}
enum StringToIntegerConverter implements Converter<String, Integer> {
INSTANCE;
public Integer convert(String source) {
return 0;
}
}
}

View File

@@ -119,7 +119,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-130
* @see DATAMONGO-130
*/
@Test
public void writesMapTypeCorrectly() {
@@ -133,7 +133,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-130
* @see DATAMONGO-130
*/
@Test
public void readsMapWithCustomKeyTypeCorrectly() {
@@ -146,7 +146,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-128
* @see DATAMONGO-128
*/
@Test
public void usesDocumentsStoredTypeIfSubtypeOfRequest() {
@@ -159,7 +159,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-128
* @see DATAMONGO-128
*/
@Test
public void ignoresDocumentsStoredTypeIfCompletelyDifferentTypeRequested() {
@@ -185,7 +185,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-136
* @see DATAMONGO-136
*/
@Test
public void writesEnumsCorrectly() {
@@ -201,7 +201,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-209
* @see DATAMONGO-209
*/
@Test
public void writesEnumCollectionCorrectly() {
@@ -220,7 +220,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-136
* @see DATAMONGO-136
*/
@Test
public void readsEnumsCorrectly() {
@@ -231,7 +231,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-209
* @see DATAMONGO-209
*/
@Test
public void readsEnumCollectionsCorrectly() {
@@ -248,14 +248,14 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-144
* @see DATAMONGO-144
*/
@Test
public void considersFieldNameWhenWriting() {
Person person = new Person();
person.firstname ="Oliver";
person.firstname = "Oliver";
DBObject result = new BasicDBObject();
converter.write(person, result);
@@ -264,7 +264,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-144
* @see DATAMONGO-144
*/
@Test
public void considersFieldNameWhenReading() {
@@ -276,7 +276,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-145
* @see DATAMONGO-145
*/
@Test
public void writesCollectionWithInterfaceCorrectly() {
@@ -299,7 +299,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-145
* @see DATAMONGO-145
*/
@Test
public void readsCollectionWithInterfaceCorrectly() {
@@ -336,7 +336,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-161
* @see DATAMONGO-161
*/
@Test
public void readsNestedMapsCorrectly() {
@@ -363,7 +363,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATACMNS-42, DATADOC-171
* @see DATACMNS-42, DATAMONGO-171
*/
@Test
public void writesClassWithBigDecimal() {
@@ -381,7 +381,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATACMNS-42, DATADOC-171
* @see DATACMNS-42, DATAMONGO-171
*/
@Test
public void readsClassWithBigDecimal() {
@@ -417,7 +417,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-192
* @see DATAMONGO-192
*/
@Test
public void readsEmptySetsCorrectly() {
@@ -446,7 +446,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-207
* @see DATAMONGO-207
*/
@Test
public void convertsCustomEmptyMapCorrectly() {
@@ -461,7 +461,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-211
* @see DATAMONGO-211
*/
@Test
public void maybeConvertHandlesNullValuesCorrectly() {
@@ -495,7 +495,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-228
* @see DATAMONGO-228
*/
@Test
public void writesNullValuesForMaps() {
@@ -530,7 +530,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void writesMapOfListsCorrectly() {
@@ -554,7 +554,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void readsMapListValuesCorrectly() {
@@ -568,7 +568,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void writesMapsOfObjectsCorrectly() {
@@ -593,7 +593,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-235
* @see DATAMONGO-235
*/
@Test
public void readsMapOfObjectsListValuesCorrectly() {
@@ -607,7 +607,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-245
* @see DATAMONGO-245
*/
@Test
public void readsMapListNestedValuesCorrectly() {
@@ -621,12 +621,12 @@ public class MappingMongoConverterUnitTests {
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, source);
Object firstObjectInFoo = ((List<?>) result.mapOfObjects.get("Foo")).get(0);
assertThat(firstObjectInFoo, is(instanceOf(Map.class)));
assertThat((String)((Map<?,?>) firstObjectInFoo).get("Hello"), is(equalTo("World")));
assertThat((String) ((Map<?, ?>) firstObjectInFoo).get("Hello"), is(equalTo("World")));
}
/**
* @see DATADOC-245
* @see DATAMONGO-245
*/
@Test
public void readsMapDoublyNestedValuesCorrectly() {
@@ -646,7 +646,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-245
* @see DATAMONGO-245
*/
@Test
public void readsMapListDoublyNestedValuesCorrectly() {
@@ -668,7 +668,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void writesListOfMapsCorrectly() {
@@ -676,7 +676,7 @@ public class MappingMongoConverterUnitTests {
Map<String, Locale> map = Collections.singletonMap("Foo", Locale.ENGLISH);
CollectionWrapper wrapper = new CollectionWrapper();
wrapper.listOfMaps = new ArrayList<Map<String,Locale>>();
wrapper.listOfMaps = new ArrayList<Map<String, Locale>>();
wrapper.listOfMaps.add(map);
DBObject result = new BasicDBObject();
@@ -692,7 +692,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void readsListOfMapsCorrectly() {
@@ -713,12 +713,12 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-259
* @see DATAMONGO-259
*/
@Test
public void writesPlainMapOfCollectionsCorrectly() {
Map<String,List<Locale>> map = Collections.singletonMap("Foo", Arrays.asList(Locale.US));
Map<String, List<Locale>> map = Collections.singletonMap("Foo", Arrays.asList(Locale.US));
DBObject result = new BasicDBObject();
converter.write(map, result);
@@ -733,7 +733,7 @@ public class MappingMongoConverterUnitTests {
}
/**
* @see DATADOC-285
* @see DATAMONGO-285
*/
@Test
@SuppressWarnings({ "unchecked", "rawtypes" })
@@ -760,6 +760,102 @@ public class MappingMongoConverterUnitTests {
assertEquals(list.get(1), listFromMongo.get(1));
}
/**
* @see DATAMONGO-309
*/
@Test
@SuppressWarnings({ "unchecked" })
public void writesArraysAsMapValuesCorrectly() {
ClassWithMapProperty wrapper = new ClassWithMapProperty();
wrapper.mapOfObjects = new HashMap<String, Object>();
wrapper.mapOfObjects.put("foo", new String[] { "bar" });
DBObject result = new BasicDBObject();
converter.write(wrapper, result);
Object mapObject = result.get("mapOfObjects");
assertThat(mapObject, is(BasicDBObject.class));
DBObject map = (DBObject) mapObject;
Object valueObject = map.get("foo");
assertThat(valueObject, is(BasicDBList.class));
List<Object> list = (List<Object>) valueObject;
assertThat(list.size(), is(1));
assertThat(list, hasItem((Object) "bar"));
}
/**
* @see DATAMONGO-324
*/
@Test
public void writesDbObjectCorrectly() {
DBObject dbObject = new BasicDBObject();
dbObject.put("foo", "bar");
DBObject result = new BasicDBObject();
converter.write(dbObject, result);
result.removeField(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY);
assertThat(dbObject, is(result));
}
/**
* @see DATAMONGO-324
*/
@Test
public void readsDbObjectCorrectly() {
DBObject dbObject = new BasicDBObject();
dbObject.put("foo", "bar");
DBObject result = converter.read(DBObject.class, dbObject);
assertThat(result, is(dbObject));
}
/**
* @see DATAMONGO-329
*/
@Test
public void writesMapAsGenericFieldCorrectly() {
Map<String, A<String>> objectToSave = new HashMap<String, A<String>>();
objectToSave.put("test", new A<String>("testValue"));
A<Map<String, A<String>>> a = new A<Map<String, A<String>>>(objectToSave);
DBObject result = new BasicDBObject();
converter.write(a, result);
assertThat((String) result.get(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY), is(A.class.getName()));
assertThat((String) result.get("valueType"), is(HashMap.class.getName()));
DBObject object = (DBObject) result.get("value");
assertThat(object, is(notNullValue()));
DBObject inner = (DBObject) object.get("test");
assertThat(inner, is(notNullValue()));
assertThat((String) inner.get(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY), is(A.class.getName()));
assertThat((String) inner.get("valueType"), is(String.class.getName()));
assertThat((String) inner.get("value"), is("testValue"));
}
@Test
public void writesIntIdCorrectly() {
ClassWithIntId value = new ClassWithIntId();
value.id = 5;
DBObject result = new BasicDBObject();
converter.write(value, result);
assertThat(result.get("_id"), is((Object) 5));
}
class GenericType<T> {
T content;
}
@@ -771,7 +867,19 @@ public class MappingMongoConverterUnitTests {
}
enum SampleEnum {
FIRST, SECOND;
FIRST {
@Override
void method() {
}
},
SECOND {
@Override
void method() {
}
};
abstract void method();
}
class Address {
@@ -779,6 +887,7 @@ public class MappingMongoConverterUnitTests {
String city;
}
interface Contact {
}
@@ -812,7 +921,7 @@ public class MappingMongoConverterUnitTests {
}
class ClassWithNestedMaps {
Map<String, Map<String, Map<String, String>>> nestedMaps;
}
class BirthDateContainer {
@@ -839,6 +948,23 @@ public class MappingMongoConverterUnitTests {
@Id
BigInteger id;
}
class A<T> {
String valueType;
T value;
public A(T value) {
this.valueType = value.getClass().getName();
this.value = value;
}
}
class ClassWithIntId {
@Id
int id;
}
private class LocalDateToDateConverter implements Converter<LocalDate, Date> {


@@ -73,7 +73,7 @@ public class GeoSpatialTests {
applicationContext = new AnnotationConfigApplicationContext(GeoSpatialAppConfig.class);
template = applicationContext.getBean(MongoTemplate.class);
template.setWriteConcern(WriteConcern.FSYNC_SAFE);
template.ensureIndex(new GeospatialIndex("location"), Venue.class);
template.indexOps(Venue.class).ensureIndex(new GeospatialIndex("location"));
indexCreated();
addVenues();
parser = new SpelExpressionParser();


@@ -69,6 +69,7 @@ public class MappingTests {
MongoCollectionUtils.getPreferredCollectionName(PersonMultiDimArrays.class),
MongoCollectionUtils.getPreferredCollectionName(PersonMultiCollection.class),
MongoCollectionUtils.getPreferredCollectionName(PersonWithDbRef.class),
MongoCollectionUtils.getPreferredCollectionName(PersonWithLongDBRef.class),
MongoCollectionUtils.getPreferredCollectionName(PersonNullProperties.class),
MongoCollectionUtils.getPreferredCollectionName(Account.class),
MongoCollectionUtils.getPreferredCollectionName(PrimitiveId.class),
@@ -353,7 +354,31 @@ public class MappingTests {
Person p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("New Town"));
}
@Test
@SuppressWarnings("rawtypes")
public void testUpsert() {
Address addr = new Address();
addr.setLines(new String[]{"1234 W. 1st Street", "Apt. 12"});
addr.setCity("Anytown");
addr.setPostalCode(12345);
addr.setCountry("USA");
Person p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertNull(p2);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")), update("address", addr), Person.class);
p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("Anytown"));
template.dropCollection(Person.class);
template.upsert(query(where("ssn").is(1111).and("firstName").is("Query").and("lastName").is("Update")), update("address", addr), "person");
p2 = template.findOne(query(where("ssn").is(1111)), Person.class);
assertThat(p2.getAddress().getCity(), is("Anytown"));
}
@Test
public void testOrQuery() {
PersonWithObjectId p1 = new PersonWithObjectId(1, "first", "");
@@ -361,8 +386,6 @@ public class MappingTests {
PersonWithObjectId p2 = new PersonWithObjectId(2, "second", "");
template.save(p2);
Query one = query(where("ssn").is(1));
Query two = query(where("ssn").is(2));
List<PersonWithObjectId> results = template.find(new Query(
new Criteria().orOperator(where("ssn").is(1), where("ssn").is(2))), PersonWithObjectId.class);


@@ -32,7 +32,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
*/
public class AbstractMongoEventListenerUnitTest {
public class AbstractMongoEventListenerUnitTests {
@Test
public void invokesCallbackForEventForPerson() {
@@ -115,6 +115,17 @@ public class AbstractMongoEventListenerUnitTest {
assertThat(personListener.invokedOnAfterLoad, is(false));
assertThat(contactListener.invokedOnAfterLoad, is(true));
}
/**
* @see DATADOC-333
*/
@Test
@SuppressWarnings({ "rawtypes", "unchecked" })
public void handlesUntypedImplementations() {
UntypedEventListener listener = new UntypedEventListener();
listener.onApplicationEvent(new MongoMappingEvent(new Object(), new BasicDBObject()));
}
class SamplePersonEventListener extends AbstractMongoEventListener<Person> {
@@ -163,4 +174,9 @@ public class AbstractMongoEventListenerUnitTest {
invokedOnAfterLoad = true;
}
}
@SuppressWarnings("rawtypes")
class UntypedEventListener extends AbstractMongoEventListener {
}
}


@@ -0,0 +1,196 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import java.util.Arrays;
import java.util.HashSet;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.mapreduce.GroupBy.*;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class GroupByTests {
@Autowired
MongoDbFactory factory;
@Autowired
ApplicationContext applicationContext;
//@Autowired
//MongoTemplate mongoTemplate;
MongoTemplate mongoTemplate;
@Autowired
@SuppressWarnings("unchecked")
public void setMongo(Mongo mongo) throws Exception {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(XObject.class)));
mappingContext.afterPropertiesSet();
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.afterPropertiesSet();
this.mongoTemplate = new MongoTemplate(factory, mappingConverter);
mongoTemplate.setApplicationContext(applicationContext);
}
@Before
public void setUp() {
cleanDb();
}
@After
public void cleanUp() {
cleanDb();
}
protected void cleanDb() {
mongoTemplate.dropCollection(mongoTemplate.getCollectionName(XObject.class));
mongoTemplate.dropCollection("group_test_collection");
}
@Test
public void singleKeyCreation() {
DBObject gc = new GroupBy("a").getGroupByObject();
//String expected = "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
String expected = "{ \"key\" : { \"a\" : 1} , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@Test
public void multipleKeyCreation() {
DBObject gc = GroupBy.key("a","b").getGroupByObject();
//String expected = "{ \"group\" : { \"ns\" : \"test\" , \"key\" : { \"a\" : 1 , \"b\" : 1} , \"cond\" : null , \"$reduce\" : null , \"initial\" : null }}";
String expected = "{ \"key\" : { \"a\" : 1 , \"b\" : 1} , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@Test
public void keyFunctionCreation() {
DBObject gc = GroupBy.keyFunction("classpath:keyFunction.js").getGroupByObject();
String expected = "{ \"$keyf\" : \"classpath:keyFunction.js\" , \"$reduce\" : null , \"initial\" : null }";
Assert.assertEquals(expected, gc.toString());
}
@Test
public void SimpleGroup() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.key("x").initialDocument(new BasicDBObject("count", 0)).reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithKeyFunction() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.keyFunction("function(doc) { return { x : doc.x }; }").initialDocument("{ count: 0 }").reduceFunction("function(doc, prev) { prev.count += 1 }"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithFunctionsAsResources() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group("group_test_collection",
GroupBy.keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
assertMapReduceResults(results);
}
@Test
public void SimpleGroupWithQueryAndFunctionsAsResources() {
createGroupByData();
GroupByResults<XObject> results;
results = mongoTemplate.group(where("x").gt(0),
"group_test_collection",
keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
assertMapReduceResults(results);
}
private void assertMapReduceResults(GroupByResults<XObject> results) {
DBObject dboRawResults = results.getRawResults();
String expected = "{ \"serverUsed\" : \"127.0.0.1:27017\" , \"retval\" : [ { \"x\" : 1.0 , \"count\" : 2.0} , { \"x\" : 2.0 , \"count\" : 1.0} , { \"x\" : 3.0 , \"count\" : 3.0}] , \"count\" : 6.0 , \"keys\" : 3 , \"ok\" : 1.0}";
Assert.assertEquals(expected, dboRawResults.toString());
int numResults = 0;
for (XObject xObject : results) {
if (xObject.getX() == 1) {
Assert.assertEquals(2, xObject.getCount(), 0.001);
}
if (xObject.getX() == 2) {
Assert.assertEquals(1, xObject.getCount(), 0.001);
}
if (xObject.getX() == 3) {
Assert.assertEquals(3, xObject.getCount(), 0.001);
}
numResults++;
}
Assert.assertEquals(3, numResults);
Assert.assertEquals(6, results.getCount(), 0.001);
Assert.assertEquals(3, results.getKeys());
}
private void createGroupByData() {
DBCollection c = mongoTemplate.getDb().getCollection("group_test_collection");
c.save(new BasicDBObject("x", 1));
c.save(new BasicDBObject("x", 1));
c.save(new BasicDBObject("x", 2));
c.save(new BasicDBObject("x", 3));
c.save(new BasicDBObject("x", 3));
c.save(new BasicDBObject("x", 3));
}
}


@@ -60,6 +60,7 @@ public class MapReduceTests {
MongoTemplate template;
@Autowired
MongoDbFactory factory;
MongoTemplate mongoTemplate;
@Autowired


@@ -0,0 +1,35 @@
package org.springframework.data.mongodb.core.mapreduce;
public class XObject {
private float x;
private float count;
public float getX() {
return x;
}
public void setX(float x) {
this.x = x;
}
public float getCount() {
return count;
}
public void setCount(float count) {
this.count = count;
}
@Override
public String toString() {
return "XObject [x=" + x + " count = " + count + "]";
}
}


@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.core.query;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.math.BigInteger;
@@ -59,7 +61,7 @@ public class QueryMapperUnitTests {
MappingMongoConverter converter = new MappingMongoConverter(factory, context);
converter.afterPropertiesSet();
mapper = new QueryMapper(converter.getConversionService());
mapper = new QueryMapper(converter);
}
@Test
@@ -89,13 +91,22 @@ public class QueryMapperUnitTests {
assertThat(result.get("_id"), is((Object) "1"));
}
@Test
public void handlesObjectIdCapableBigIntegerIdsCorrectly() {
ObjectId id = new ObjectId();
DBObject dbObject = new BasicDBObject("id", new BigInteger(id.toString(), 16));
DBObject result = mapper.getMappedObject(dbObject, null);
assertThat(result.get("_id"), is((Object) id));
}
/**
* @see DATADOC-278
* @see DATAMONGO-278
*/
@Test
public void translates$NeCorrectly() {
Criteria criteria = Criteria.where("foo").ne(new ObjectId().toString());
Criteria criteria = where("foo").ne(new ObjectId().toString());
DBObject result = mapper.getMappedObject(criteria.getCriteriaObject(), context.getPersistentEntity(Sample.class));
Object object = result.get("_id");
@@ -104,6 +115,18 @@ public class QueryMapperUnitTests {
assertThat(dbObject.get("$ne"), is(ObjectId.class));
}
/**
* @see DATAMONGO-326
*/
@Test
public void handlesEnumsCorrectly() {
Query query = query(where("foo").is(Enum.INSTANCE));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(String.class));
}
class Sample {
@Id
@@ -115,4 +138,8 @@ public class QueryMapperUnitTests {
@Id
private BigInteger id;
}
enum Enum {
INSTANCE;
}
}


@@ -365,4 +365,22 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
Metrics.KILOMETERS));
assertThat(results.getContent().isEmpty(), is(false));
}
/**
* @see DATAMONGO-323
*/
@Test
public void considersSortForAnnotatedQuery() {
List<Person> result = repository.findByAgeLessThan(60, new Sort("firstname"));
assertThat(result.size(), is(7));
assertThat(result.get(0), is(alicia));
assertThat(result.get(1), is(boyd));
assertThat(result.get(2), is(carter));
assertThat(result.get(3), is(dave));
assertThat(result.get(4), is(leroi));
assertThat(result.get(5), is(oliver));
assertThat(result.get(6), is(stefan));
}
}


@@ -72,6 +72,9 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
List<Person> findByFirstnameLike(String firstname);
List<Person> findByFirstnameLikeOrderByLastnameAsc(String firstname, Sort sort);
@Query("{'age' : { '$lt' : ?0 } }")
List<Person> findByAgeLessThan(int age, Sort sort);
/**
* Returns a page of {@link Person}s with a lastname matching the given one (*-wildcards supported).


@@ -3,9 +3,12 @@ package org.springframework.data.mongodb.repository.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.xml.XmlBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionReader;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.mongodb.repository.AbstractPersonRepositoryIntegrationTests;
import org.springframework.test.context.ContextConfiguration;
@@ -18,10 +21,21 @@ import org.springframework.test.context.ContextConfiguration;
@ContextConfiguration
public class MongoNamespaceIntegrationTests extends AbstractPersonRepositoryIntegrationTests {
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Before
@Override
public void setUp() {
super.setUp();
factory = new DefaultListableBeanFactory();
reader = new XmlBeanDefinitionReader(factory);
}
@Test
public void assertDefaultMappingContextIsWired() {
XmlBeanFactory factory = new XmlBeanFactory(new ClassPathResource("MongoNamespaceIntegrationTests-context.xml",
reader.loadBeanDefinitions(new ClassPathResource("MongoNamespaceIntegrationTests-context.xml",
getClass()));
BeanDefinition definition = factory.getBeanDefinition("personRepository");
assertThat(definition, is(notNullValue()));


@@ -0,0 +1,79 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import static org.mockito.Mockito.*;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import java.util.Arrays;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.BasicDBList;
/**
* Unit tests for {@link ConvertingParameterAccessor}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class ConvertingParameterAccessorUnitTests {
@Mock
MongoDbFactory factory;
@Mock
MongoParameterAccessor accessor;
MongoMappingContext context;
MappingMongoConverter converter;
@Before
public void setUp() {
context = new MongoMappingContext();
converter = new MappingMongoConverter(factory, context);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullWriter() {
new MappingMongoConverter(null, context);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullContext() {
new MappingMongoConverter(factory, null);
}
@Test
public void convertsCollectionUponAccess() {
when(accessor.getBindableValue(0)).thenReturn(Arrays.asList("Foo"));
ConvertingParameterAccessor parameterAccessor = new ConvertingParameterAccessor(converter, accessor);
Object result = parameterAccessor.getBindableValue(0);
BasicDBList reference = new BasicDBList();
reference.add("Foo");
assertThat(result, is((Object) reference));
}
}

View File

@@ -30,7 +30,6 @@ import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
@@ -50,9 +49,6 @@ import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.DefaultRepositoryMetadata;
import org.springframework.data.repository.query.parser.PartTree;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit test for {@link MongoQueryCreator}.
*
@@ -70,16 +66,14 @@ public class MongoQueryCreatorUnitTests {
@Before
public void setUp() throws SecurityException, NoSuchMethodException {
context = new MongoMappingContext();
doAnswer(new Answer<Void>() {
public Void answer(InvocationOnMock invocation) throws Throwable {
DBObject dbObject = (DBObject) invocation.getArguments()[1];
dbObject.put("value", new BasicDBObject("value", "value"));
return null;
doAnswer(new Answer<Object>() {
public Object answer(InvocationOnMock invocation) throws Throwable {
return invocation.getArguments()[0];
}
}).when(converter).write(any(), Mockito.any(DBObject.class));
}).when(converter).convertToMongoType(any());
}
@Test
@@ -157,18 +151,66 @@ public class MongoQueryCreatorUnitTests {
}
/**
* DATADOC-291
* @see DATAMONGO-291
*/
@Test
public void honoursMappingInformationForPropertyPaths() {
PartTree partTree = new PartTree("findByUsername", User.class);
MongoQueryCreator creator = new MongoQueryCreator(partTree, getAccessor(converter, "Oliver"), context);
Query reference = query(where("foo").is("Oliver"));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsExistsClauseCorrectly() {
PartTree tree = new PartTree("findByAgeExists", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, true), context);
Query query = query(where("age").exists(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsRegexClauseCorrectly() {
PartTree tree = new PartTree("findByFirstNameRegex", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, ".*"), context);
Query query = query(where("firstName").regex(".*"));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsTrueClauseCorrectly() {
PartTree tree = new PartTree("findByActiveTrue", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
/**
* @see DATAMONGO-338
*/
@Test
public void createsFalseClauseCorrectly() {
PartTree tree = new PartTree("findByActiveFalse", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(false));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
}
private void assertBindsDistanceToQuery(Point point, Distance distance, Query reference) throws Exception {
when(converter.convertToMongoType("Dave")).thenReturn("Dave");
@@ -191,9 +233,9 @@ public class MongoQueryCreatorUnitTests {
List<Person> findByLocationNearAndFirstname(Point location, Distance maxDistance, String firstname);
}
class User {
@Field("foo")
String username;
}


@@ -30,10 +30,8 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.mongodb.repository.support.MappingMongoEntityInformation;
/**
* Unit test for {@link MongoRepositoryFactory}.
@@ -45,7 +43,7 @@ public class MongoRepositoryFactoryUnitTests {
@Mock
MongoTemplate template;
@Mock
MongoConverter converter;
@@ -55,7 +53,7 @@ public class MongoRepositoryFactoryUnitTests {
@Mock
@SuppressWarnings("rawtypes")
MongoPersistentEntity entity;
@Before
@SuppressWarnings({ "rawtypes", "unchecked" })
public void setUp() {
@@ -63,12 +61,6 @@ public class MongoRepositoryFactoryUnitTests {
when(converter.getMappingContext()).thenReturn((MappingContext) mappingContext);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsInvalidIdType() throws Exception {
MongoRepositoryFactory factory = new MongoRepositoryFactory(template);
factory.getRepository(SampleRepository.class);
}
@Test
@SuppressWarnings("unchecked")
public void usesMappingMongoEntityInformationIfMappingContextSet() {
@@ -80,8 +72,4 @@ public class MongoRepositoryFactoryUnitTests {
MongoEntityInformation<Person, Serializable> entityInformation = factory.getEntityInformation(Person.class);
assertTrue(entityInformation instanceof MappingMongoEntityInformation);
}
private interface SampleRepository extends MongoRepository<Person, Long> {
}
}


@@ -18,11 +18,21 @@ package org.springframework.data.mongodb.repository.support;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.QPerson;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository.SpringDataMongodbSerializer;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mysema.query.types.path.StringPath;
/**
@@ -30,10 +40,20 @@ import com.mysema.query.types.path.StringPath;
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class SpringDataMongodbSerializerUnitTests {
MongoMappingContext context = new MongoMappingContext();
SpringDataMongodbSerializer serializer = new QueryDslMongoRepository.SpringDataMongodbSerializer(context);
@Mock
MongoDbFactory dbFactory;
MongoConverter converter;
SpringDataMongodbSerializer serializer;
@Before
public void setUp() {
MongoMappingContext context = new MongoMappingContext();
converter = new MappingMongoConverter(dbFactory, context);
serializer = new QueryDslMongoRepository.SpringDataMongodbSerializer(converter);
}
@Test
public void uses_idAsKeyForIdProperty() {
@@ -47,4 +67,29 @@ public class SpringDataMongodbSerializerUnitTests {
StringPath path = QPerson.person.address.street;
assertThat(serializer.getKeyForPath(path, path.getMetadata()), is("street"));
}
@Test
public void convertsComplexObjectOnSerializing() {
Address address = new Address();
address.street = "Foo";
address.zipCode = "01234";
DBObject result = serializer.asDBObject("foo", address);
assertThat(result, is(BasicDBObject.class));
BasicDBObject dbObject = (BasicDBObject) result;
Object value = dbObject.get("foo");
assertThat(value, is(notNullValue()));
assertThat(value, is(BasicDBObject.class));
Object reference = converter.convertToMongoType(address);
assertThat(value, is(reference));
}
class Address {
String street;
@Field("zip_code")
String zipCode;
}
}


@@ -0,0 +1 @@
function(doc, prev) { prev.count += 1 }


@@ -0,0 +1 @@
function(doc) { return { x : doc.x }; }


@@ -5,7 +5,9 @@
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:mapping-converter id="converter" db-factory-ref="factory" />
<mongo:mapping-converter id="converter" db-factory-ref="factory">
<mongo:custom-converters base-package="org.springframework.data.mongodb.config" />
</mongo:mapping-converter>
<mongo:db-factory id="factory" />


@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="first" mongo-ref="mongo" write-concern="rack1" />
<mongo:mongo id="mongo">
<mongo:options max-auto-connect-retry-time="27" />
</mongo:mongo>
<mongo:db-factory id="second" write-concern="REPLICAS_SAFE" />
<!-- now part of the namespace
<bean class="org.springframework.beans.factory.config.CustomEditorConfigurer">
<property name="customEditors">
<map>
<entry key="com.mongodb.WriteConcern" value="org.springframework.data.mongodb.config.WriteConcernPropertyEditor"/>
</map>
</property>
</bean>
-->
</beans>


@@ -2,12 +2,23 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="first" write-concern="SAFE" mongo-ref="mongo" />
<mongo:db-factory id="first" mongo-ref="mongo" write-concern="SAFE" />
<mongo:mongo id="mongo">
<mongo:options max-auto-connect-retry-time="27" />
</mongo:mongo>
</beans>
<!-- now part of the namespace
<bean class="org.springframework.beans.factory.config.CustomEditorConfigurer">
<property name="customEditors">
<map>
<entry key="com.mongodb.WriteConcern" value="org.springframework.data.mongodb.config.WriteConcernPropertyEditor"/>
</map>
</property>
</bean>
-->
</beans>


@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory uri="mongodb://localhost/database.myCollection"/>
</beans>


@@ -7,6 +7,11 @@
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<context:property-placeholder location="classpath:replicaSet.properties"/>
<mongo:mongo id="manyReplicaSetMongo" replica-set="${mongo.hosts}"/>
<mongo:mongo id="replicaSetMongo" replica-set="127.0.0.1:10001,localhost:10002"/>
<!--


@@ -0,0 +1 @@
mongo.hosts=192.168.174.130:27017,192.168.174.130:27018,192.168.174.130:27019


@@ -7,10 +7,7 @@ Import-Package:
Export-Template:
org.springframework.data.mongodb.*;version="${project.version}"
Import-Template:
org.springframework.*;version="${org.springframework.version:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${data.commons.version:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo.version:[=.=,+1.0.0)}",
com.mongodb.*;version="0",
com.mysema.query.*;version="[2.1.1, 3.0.0)";resolution:=optional,
javax.annotation.processing.*;version="0",
javax.tools.*;version="0",
@@ -18,4 +15,7 @@ Import-Template:
org.apache.commons.collections15.*;version="[4.0.0,5.0.0)";resolution:=optional,
org.apache.commons.logging.*;version="[1.1.1, 2.0.0)",
org.bson.*;version="0",
org.springframework.*;version="${org.springframework.version.30:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${data.commons.version:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"


@@ -52,7 +52,7 @@
<xi:include href="introduction/why-sd-doc.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/master/src/docbkx/repositories.xml">
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/1.2.0.RC1/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -72,7 +72,7 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/master/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.2.0.RC1/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
</part>


@@ -69,13 +69,14 @@
<section>
<title>How the '_id' field is handled in the mapping layer</title>
<para>Mongo requires that you have an '_id' field for all documents. If
you don't provide one the driver will assign a ObjectId with a generated
value. The "_id" field can be of any type the, other than arrays, so
long as it is unique. The driver naturally supports all primitive types
and Dates. When using the <classname>MongoMappingConverter</classname>
there are certain rules that govern how properties from the Java class
is mapped to this '_id' field.</para>
<para>MongoDB requires that you have an '_id' field for all documents.
If you don't provide one the driver will assign an ObjectId with a
generated value. The "_id" field can be of any type, other than
arrays, so long as it is unique. The driver naturally supports all
primitive types and Dates. When using the
<classname>MongoMappingConverter</classname> there are certain rules
that govern how properties from the Java class are mapped to this '_id'
field.</para>
<para>The following outlines what field will be mapped to the '_id'
document field:</para>
@@ -212,7 +213,7 @@ public class GeoSpatialAppConfig extends AbstractMongoConfiguration {
getUserCredentials()</literal> to provide the username and password
information to connect to the database.</para>
<para>Spring's Mongo namespace enables you to easily enable mapping
<para>Spring's MongoDB namespace enables you to easily enable mapping
functionality in XML</para>
<example>
@@ -370,11 +371,18 @@ public class Person {
Language statement to transform a key's value retrieved in the
database before it is used to construct a domain object.</para>
</listitem>
<listitem>
<para><literal>@Field</literal> - applied at the field level and
described the name of the field as it will be represented in the
MongoDB BSON document thus allowing the name to be different than
the fieldname of the class.</para>
</listitem>
</itemizedlist>
<para>The mapping metadata infrastructure is defined in a separate
spring-data-commons project that is technology agnostic. Specific
subclasses are using in the Mongo support to support annotation based
subclasses are used in the MongoDB support to support annotation based
metadata. Other strategies are also possible to put in place if there is
demand.</para>
@@ -388,16 +396,24 @@ public class Person&lt;T extends Address&gt; {
@Id
private String id;
@Indexed(unique = true)
private Integer ssn;
@Field("fName")
private String firstName;
@Indexed
private String lastName;
private Integer age;
@Transient
private Integer accountTotal;
@DBRef
private List&lt;Account&gt; accounts;
private T address;


@@ -2,7 +2,7 @@
<!DOCTYPE chapter PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN"
"http://www.oasis-open.org/docbook/xml/4.4/docbookx.dtd">
<chapter id="mongo.repositories">
<title>Mongo repositories</title>
<title>MongoDB repositories</title>
<section id="mongo-repo-intro">
<title>Introduction</title>
@@ -60,7 +60,7 @@
add</para>
<example>
<title>General mongo repository Spring configuration</title>
<title>General MongoDB repository Spring configuration</title>
<programlisting language="xml">&lt;?xml version="1.0" encoding="UTF-8"?&gt;
&lt;beans xmlns="http://www.springframework.org/schema/beans"
@@ -129,7 +129,7 @@ public class PersonRepositoryTests {
<title>Query methods</title>
<para>Most of the data access operations you usually trigger on a
repository result a query being executed against the Mongo databases.
repository result in a query being executed against the MongoDB database.
Defining such a query is just a matter of declaring a method on the
repository interface</para>
@@ -235,6 +235,16 @@ public class PersonRepositoryTests {
regex)</entry>
</row>
<row>
<entry><literal>Regex</literal></entry>
<entry><methodname>findByFirstnameRegex(String
firstname)</methodname></entry>
<entry><code>{"firstname" : {"$regex" : firstname
}}</code></entry>
</row>
<row>
<entry>(No keyword)</entry>
@@ -279,7 +289,34 @@ public class PersonRepositoryTests {
box)</methodname></entry>
<entry><code>{"location" : {"$within" : {"$box" : [ [x1, y1],
x2, y2]}}}</code></entry>
[x2, y2]]}}}</code></entry>
</row>
<row>
<entry><literal>IsTrue</literal>,
<literal>True</literal></entry>
<entry><code>findByActiveIsTrue()</code></entry>
<entry><code>{"active" : true}</code></entry>
</row>
<row>
<entry><literal>IsFalse</literal>,
<literal>False</literal></entry>
<entry><code>findByActiveIsFalse()</code></entry>
<entry><code>{"active" : false}</code></entry>
</row>
<row>
<entry><literal>Exists</literal></entry>
<entry><methodname>findByLocationExists(boolean
exists)</methodname></entry>
<entry><code>{"location" : {"$exists" : exists }}</code></entry>
</row>
</tbody>
</tgroup>
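The keyword rows added above (Regex, IsTrue/IsFalse, Exists) each derive to a small query document. The following is a plain-Java sketch of those document shapes only — it uses no Spring Data types, and the class and method names are illustrative, not part of the framework:

```java
import java.util.Map;

// Illustrative sketch (not Spring Data code) of the query documents the
// new repository keywords derive to, per the table above.
public class KeywordQueries {

    // findByFirstnameRegex(String firstname) -> {"firstname" : {"$regex" : firstname}}
    static Map<String, Object> regex(String field, String pattern) {
        return Map.of(field, Map.of("$regex", pattern));
    }

    // findByActiveIsTrue() -> {"active" : true}
    static Map<String, Object> isTrue(String field) {
        return Map.of(field, true);
    }

    // findByLocationExists(boolean exists) -> {"location" : {"$exists" : exists}}
    static Map<String, Object> exists(String field, boolean exists) {
        return Map.of(field, Map.of("$exists", exists));
    }

    public static void main(String[] args) {
        System.out.println(regex("firstname", "^Jo"));
        System.out.println(isTrue("active"));
        System.out.println(exists("location", true));
    }
}
```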
@@ -346,11 +383,11 @@ Distance distance = new Distance(200, Metrics.KILOMETERS);
</section>
<section>
<title>Mongo JSON based query methods and field restriction</title>
<title>MongoDB JSON based query methods and field restriction</title>
<para>By adding the annotation
<classname>org.springframework.data.mongodb.repository.Query</classname>
repository finder methods you can specify a Mongo JSON query string to
to repository finder methods you can specify a MongoDB JSON query string to
use instead of having the query derived from the method name. For
example</para>
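The JSON string given to the Query annotation uses positional placeholders (?0, ?1, …) that are bound to the method's arguments. The sketch below is a simplified, dependency-free illustration of that substitution idea only — it is not Spring Data's actual parser:

```java
// Simplified illustration of positional placeholder binding in a
// @Query-style JSON string. NOT Spring Data's actual parser; it only
// shows the substitution idea and handles at most 10 parameters.
public class QueryBindingSketch {

    static String bind(String template, Object... args) {
        String result = template;
        for (int i = 0; i < args.length; i++) {
            Object arg = args[i];
            // Quote strings so the result stays valid JSON.
            String value = (arg instanceof String) ? "\"" + arg + "\"" : String.valueOf(arg);
            result = result.replace("?" + i, value);
        }
        return result;
    }

    public static void main(String[] args) {
        // e.g. a method annotated @Query("{ \"firstname\" : ?0, \"age\" : ?1 }")
        System.out.println(bind("{ \"firstname\" : ?0, \"age\" : ?1 }", "Joe", 34));
    }
}
```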
@@ -382,7 +419,7 @@ Distance distance = new Distance(200, Metrics.KILOMETERS);
<section>
<title>Type-safe Query methods</title>
<para>Mongo repository support integrates with the <ulink
<para>MongoDB repository support integrates with the <ulink
url="http://www.querydsl.com/">QueryDSL</ulink> project which provides a
means to perform type-safe queries in Java. To quote from the project
description, "Instead of writing queries as inline strings or


@@ -88,9 +88,10 @@
<para>First you need to set up a running Mongodb server. Refer to the
<ulink url="http://www.mongodb.org/display/DOCS/Quickstart">Mongodb Quick
Start guide</ulink> for an explanation on how to startup a Mongo instance.
Once installed starting Mongo is typically a matter of executing the
following command: <literal>MONGO_HOME/bin/mongod</literal></para>
Start guide</ulink> for an explanation on how to start up a MongoDB
instance. Once installed, starting MongoDB is typically a matter of
executing the following command:
<literal>MONGO_HOME/bin/mongod</literal></para>
<para>To create a Spring project in STS go to File -&gt; New -&gt; Spring
Template Project -&gt; Simple Spring Utility Project --&gt; press Yes when
@@ -203,10 +204,10 @@ public class MongoApp {
<para>This will produce the following output</para>
<programlisting>10:01:32,062 DEBUG apping.MongoPersistentEntityIndexCreator: 80 - Analyzing class class org.spring.example.Person for index information.
10:01:32,265 DEBUG work.data.mongodb.core.MongoTemplate: 631 - insert DBObject containing fields: [_class, age, name] in collection: Person
10:01:32,765 DEBUG work.data.mongodb.core.MongoTemplate:1243 - findOne using query: { "name" : "Joe"} in db.collection: database.Person
10:01:32,265 DEBUG ramework.data.mongodb.core.MongoTemplate: 631 - insert DBObject containing fields: [_class, age, name] in collection: Person
10:01:32,765 DEBUG ramework.data.mongodb.core.MongoTemplate:1243 - findOne using query: { "name" : "Joe"} in db.collection: database.Person
10:01:32,953 INFO org.spring.mongodb.example.MongoApp: 25 - Person [id=4ddbba3c0be56b7e1b210166, name=Joe, age=34]
10:01:32,984 DEBUG work.data.mongodb.core.MongoTemplate: 375 - Dropped collection [database.person]</programlisting>
10:01:32,984 DEBUG ramework.data.mongodb.core.MongoTemplate: 375 - Dropped collection [database.person]</programlisting>
<para>Even in this simple example, there are a few things to take notice
of</para>
@@ -246,15 +247,15 @@ public class MongoApp {
<title>Required Jars</title>
The following jars are required to use Spring Data Mongo
The following jars are required to use Spring Data MongoDB
<itemizedlist>
<listitem>
<para>spring-data-mongodb-1.0.0.M4.jar</para>
<para>spring-data-mongodb-1.0.0.RC1.jar</para>
</listitem>
<listitem>
<para>spring-data-commons-1.2.0.M1.jar</para>
<para>spring-data-commons-1.2.0.RC1.jar</para>
</listitem>
</itemizedlist>
@@ -282,7 +283,7 @@ public class MongoApp {
</listitem>
<listitem>
<para>spring-beans-3.0.7.RELEASE.jar</para>
<para>spring-beans-3.0.6.RELEASE.jar</para>
</listitem>
<listitem>
@@ -301,7 +302,7 @@ public class MongoApp {
</section>
<section>
<section id="mongo.migrate-m2-m3">
<title>Migrating from M2 to M3</title>
<para>There were several API changes introduced in the M3 release. To
@@ -355,7 +356,7 @@ public class MongoApp {
</section>
</section>
<section>
<section id="mongo.examples-repo">
<title>Examples Repository</title>
<para>There is an <ulink
@@ -381,7 +382,7 @@ public class MongoApp {
here</ulink>.</para>
</note></para>
<section>
<section id="mongo.mongo-java-config">
<title>Registering a Mongo instance using Java based metadata</title>
<para>An example of using Java based bean metadata to register an
@@ -415,7 +416,7 @@ public class AppConfig {
compared to instantiating a <classname>com.mongodb.Mongo</classname>
instance directly, the FactoryBean approach does not throw a checked
exception and has the added advantage of also providing the container
with an ExceptionTranslator implementation that translates Mongo
with an ExceptionTranslator implementation that translates MongoDB
exceptions to exceptions in Spring's portable
<classname>DataAccessException</classname> hierarchy for data access
classes annotated with the <literal>@Repository</literal> annotation.
@@ -454,7 +455,7 @@ public class AppConfig {
</example>
</section>
<section>
<section id="mongo.mongo-xml-config">
<title>Registering a Mongo instance using XML based metadata</title>
<para>While you can use Spring's traditional
@@ -526,14 +527,14 @@ public class AppConfig {
</example></para>
</section>
<section>
<section id="mongo.mongo-db-factory">
<title>The MongoDbFactory interface</title>
<para>While com.mongodb.Mongo is the entry point to the MongoDB driver
API, connecting to a specific MongoDB database instance requires
additional information such as the database name and an optional
username and password. With that information you can obtain a
com.mongodb.DB object and access all the functionality of a specific
<para>While <classname>com.mongodb.Mongo</classname> is the entry point
to the MongoDB driver API, connecting to a specific MongoDB database
instance requires additional information such as the database name and
an optional username and password. With that information you can obtain
a com.mongodb.DB object and access all the functionality of a specific
MongoDB database instance. Spring provides the
<classname>org.springframework.data.mongodb.core.MongoDbFactory</classname>
interface shown below to bootstrap connectivity to the database.</para>
@@ -586,7 +587,7 @@ public class AppConfig {
section</link>.</para>
</section>
<section>
<section id="mongo.mongo-db-factory-java">
<title>Registering a MongoDbFactory instance using Java based
metadata</title>
@@ -626,7 +627,7 @@ public class MongoConfiguration {
<para></para>
</section>
<section>
<section id="mongo.mongo-db-factory-xml">
<title>Registering a MongoDbFactory instance using XML based
metadata</title>
@@ -644,8 +645,8 @@ public class MongoConfiguration {
the id attribute is specified.</para>
<para>You can also provide the host and port for the underlying
com.mongodb.Mongo instance as shown below, in addition to username and
password for the database.</para>
<classname>com.mongodb.Mongo</classname> instance as shown below, in
addition to username and password for the database.</para>
<programlisting language="xml">&lt;mongo:db-factory id="anotherMongoDbFactory"
host="localhost"
@@ -705,7 +706,7 @@ public class MongoConfiguration {
thread-safe and can be reused across multiple instances.</para>
</note>
<para>The mapping between Mongo documents and domain classes is done by
<para>The mapping between MongoDB documents and domain classes is done by
delegating to an implementation of the interface
<interfacename>MongoConverter</interfacename>. Spring provides two
implementations, <classname>SimpleMappingConverter</classname> and
@@ -750,17 +751,17 @@ public class MongoConfiguration {
</note></para>
<para>Another central feature of MongoTemplate is exception translation of
exceptions thrown in the Mongo Java driver into Spring's portable Data
exceptions thrown in the MongoDB Java driver into Spring's portable Data
Access Exception hierarchy. Refer to the section on <link
linkend="mongo.exception">exception translation</link> for more
information.</para>
<para>While there are many convenience methods on
<classname>MongoTemplate</classname> to help you easily perform common
tasks if you should need to access the Mongo driver API directly to access
functionality not explicitly exposed by the MongoTemplate you can use one
of several Execute callback methods to access underlying driver APIs. The
execute callbacks will give you a reference to either a
tasks, should you need to access the MongoDB driver API directly to
access functionality not explicitly exposed by the MongoTemplate, you can
use one of several Execute callback methods to access underlying driver
APIs. The execute callbacks will give you a reference to either a
<classname>com.mongodb.Collection</classname> or a
<classname>com.mongodb.DB</classname> object. Please see the section
<ulink url="mongo.executioncallback">Execution Callbacks</ulink> for more
@@ -770,7 +771,7 @@ public class MongoConfiguration {
<classname>MongoTemplate</classname> in the context of the Spring
container.</para>
<section>
<section id="mongo-template.instantiating" label=" ">
<title>Instantiating MongoTemplate</title>
<para>You can use Java to create and register an instance of
@@ -801,8 +802,8 @@ public class AppConfig {
<listitem>
<para><emphasis role="bold">MongoTemplate </emphasis>
<literal>(Mongo mongo, String databaseName)</literal> - takes the
com.mongodb.Mongo object and the default database name to operate
against.</para>
<classname>com.mongodb.Mongo</classname> object and the default
database name to operate against.</para>
</listitem>
<listitem>
@@ -815,8 +816,9 @@ public class AppConfig {
<listitem>
<para><emphasis role="bold">MongoTemplate</emphasis>
<literal>(MongoDbFactory mongoDbFactory)</literal> - takes a
MongoDbFactory object that encapsulated the com.mongodb.Mongo
object, database name, and username and password.</para>
MongoDbFactory object that encapsulates the
<classname>com.mongodb.Mongo</classname> object, database name, and
username and password.</para>
</listitem>
<listitem>
@@ -840,8 +842,8 @@ public class AppConfig {
<para>Other optional properties that you might like to set when creating
a <classname>MongoTemplate</classname> are the default
<classname>WriteResultCheckingPolicy</classname>,
<classname>WriteConcern</classname>, and <classname>SlaveOk</classname>
write option.</para>
<classname>WriteConcern</classname>, and
<classname>ReadPreference</classname>.</para>
<note>
<para>The preferred way to reference the operations on
@@ -849,7 +851,7 @@ public class AppConfig {
<interfacename>MongoOperations</interfacename>.</para>
</note>
<section>
<section id="mongo-template.writeresultchecking">
<title>WriteResultChecking Policy</title>
<para>When in development it is very handy to either log or throw an
@@ -864,29 +866,64 @@ public class AppConfig {
use a <literal>WriteResultChecking</literal> value of NONE.</para>
</section>
<section>
<section id="mongo-template.writeconcern">
<title>WriteConcern</title>
<para>You can set the <classname>com.mongodb.WriteConcern</classname>
property that the <classname>MongoTemplate</classname> will use for
write operations if it has not yet been specified via the driver at a
higher level such as com.mongodb.Mongo. If MongoTemplate's
<classname>WriteConcern</classname> property is not set it will
default to the one set in the MongoDB driver's DB or Collection
setting.</para>
higher level such as <classname>com.mongodb.Mongo</classname>. If
MongoTemplate's <classname>WriteConcern</classname> property is not
set it will default to the one set in the MongoDB driver's DB or
Collection setting.</para>
</section>
<note>
<para>Support for setting the <classname>WriteConcern</classname> to different
values when saving an object will be provided in a future release.
This will most likely be handled using mapping metadata provided
either in the form of annotations on the domain object or by an
external fluent DSL.</para>
</note>
<section id="mongo-template.writeconcernresolver">
<title>WriteConcernResolver</title>
<para>For more advanced cases where you want to set different
<classname>WriteConcern</classname> values on a per-operation basis
(for remove, update, insert and save operations), a strategy interface
called <interfacename>WriteConcernResolver</interfacename> can be
configured on <classname>MongoTemplate</classname>. Since
<classname>MongoTemplate</classname> is used to persist POJOs, the
<interfacename>WriteConcernResolver</interfacename> lets you create a
policy that can map a specific POJO class to a
<classname>WriteConcern</classname> value. The
<interfacename>WriteConcernResolver</interfacename> interface is shown
below.</para>
<programlisting language="java">public interface WriteConcernResolver {
WriteConcern resolve(MongoAction action);
}</programlisting>
<para>The passed-in argument, <classname>MongoAction</classname>, is what
you use to determine the <classname>WriteConcern</classname> value to be
used, or whether to fall back to the Template's own value as a default.
<classname>MongoAction</classname> contains the collection name being
written to, the <classname>java.lang.Class</classname> of the POJO,
the converted <classname>DBObject</classname>, as well as the
operation as an enumeration
(<classname>MongoActionOperation</classname>: REMOVE, UPDATE, INSERT,
INSERT_LIST, SAVE) and a few other pieces of contextual information.
For example,</para>
<programlisting language="java">private class MyAppWriteConcernResolver implements WriteConcernResolver {
public WriteConcern resolve(MongoAction action) {
if (action.getEntityClass().getSimpleName().contains("Audit")) {
return WriteConcern.NONE;
} else if (action.getEntityClass().getSimpleName().contains("Metadata")) {
return WriteConcern.JOURNAL_SAFE;
}
return action.getDefaultWriteConcern();
}
}</programlisting>
</section>
</section>
</section>
<section>
<section id="mongo-template.save-update-remove">
<title>Saving, Updating, and Removing Documents</title>
<para><classname>MongoTemplate</classname> provides a simple way for you
@@ -1011,14 +1048,15 @@ DEBUG work.data.mongodb.core.MongoTemplate: 376 - Dropped collection [database.p
<para>The query syntax used in the example is explained in more detail in
the section <link linkend="mongo.query">Querying Documents</link>.</para>
<section>
<section id="mongo-template.id-handling">
<title>How the '_id' field is handled in the mapping layer</title>
<para>Mongo requires that you have an '_id' field for all documents. If
you don't provide one the driver will assign a ObjectId with a generated
value. When using the <classname>MongoMappingConverter</classname> there
are certain rules that govern how properties from the Java class is
mapped to this '_id' field.</para>
<para>MongoDB requires that you have an '_id' field for all documents.
If you don't provide one the driver will assign an
<classname>ObjectId</classname> with a generated value. When using the
<classname>MongoMappingConverter</classname> there are certain rules
that govern how properties from the Java class are mapped to this '_id'
field.</para>
<para>The following outlines what property will be mapped to the '_id'
document field:</para>
@@ -1046,17 +1084,20 @@ DEBUG work.data.mongodb.core.MongoTemplate: 376 - Dropped collection [database.p
<itemizedlist>
<listitem>
<para>An id property or field declared as a String in the Java class
will be converted to and stored as an ObjectId if possible using a
Spring Converter&lt;String, ObjectId&gt;. Valid conversion rules are
delegated to the Mongo Java driver. If it cannot be converted to an
ObjectId, then the value will be stored as a string in the
database.</para>
will be converted to and stored as an
<classname>ObjectId</classname> if possible using a Spring
<interfacename>Converter&lt;String, ObjectId&gt;</interfacename>.
Valid conversion rules are delegated to the MongoDB Java driver. If it
cannot be converted to an ObjectId, then the value will be stored as
a string in the database.</para>
</listitem>
<listitem>
<para>An id property or field declared as BigInteger in the Java
class will be converted to and stored as an ObjectId using a Spring
Converter&lt;BigInteger, ObjectId&gt;.</para>
<para>An id property or field declared as
<classname>BigInteger</classname> in the Java class will be
converted to and stored as an <classname>ObjectId</classname> using
a Spring <interfacename>Converter&lt;BigInteger,
ObjectId&gt;</interfacename>.</para>
</listitem>
</itemizedlist>
@@ -1072,15 +1113,16 @@ DEBUG work.data.mongodb.core.MongoTemplate: 376 - Dropped collection [database.p
domain classes.</para>
</section>
<section>
<section id="mongo-template.save-insert">
<title>Methods for saving and inserting documents</title>
<para>There are several convenient methods on
<classname>MongoTemplate</classname> for saving and inserting your
objects. To have more fine grained control over the conversion process
you can register Spring converters with the MappingMongoConverter, for
example Converter&lt;Person, DBObject&gt; and Converter&lt;DBObject,
Person&gt;.</para>
you can register Spring converters with the
<classname>MappingMongoConverter</classname>, for example
<interfacename>Converter&lt;Person, DBObject&gt;</interfacename> and
<interfacename>Converter&lt;DBObject, Person&gt;</interfacename>.</para>
<note>
<para>The difference between insert and save operations is that a save
@@ -1111,13 +1153,12 @@ DEBUG work.data.mongodb.core.MongoTemplate: 376 - Dropped collection [database.p
<programlisting language="java">import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
...
Person p = new Person("Bob", 33);
mongoTemplate.insert(p);
Person p = new Person("Bob", 33);
mongoTemplate.insert(p);
Person qp = mongoTemplate.findOne(query(where("age").is(33)), Person.class);
</programlisting>
Person qp = mongoTemplate.findOne(query(where("age").is(33)), Person.class); </programlisting>
</example>
<para>The insert/save operations available to you are listed
@@ -1153,7 +1194,7 @@ import static org.springframework.data.mongodb.core.query.Criteria.query;
</listitem>
</itemizedlist></para>
<section>
<section id="mongo-template.save-insert.collection">
<title>Which collection will my documents be saved into?</title>
<para>There are two ways to manage the collection name that is used
@@ -1166,7 +1207,7 @@ import static org.springframework.data.mongodb.core.query.Criteria.query;
parameter for the selected MongoTemplate method calls.</para>
</section>
<section>
<section id="mongo-template.save-insert.individual">
<title>Inserting or saving individual objects</title>
<para>The MongoDB driver supports inserting a collection of documents
@@ -1195,7 +1236,7 @@ import static org.springframework.data.mongodb.core.query.Criteria.query;
</itemizedlist></para>
</section>
<section>
<section id="mongo-template.save-insert.batch">
<title>Inserting several objects in a batch</title>
<para>The MongoDB driver supports inserting a collection of documents
@@ -1248,7 +1289,7 @@ import static org.springframework.data.mongodb.core.query.Update;
<classname>Update</classname> object to provide a fluent style for the
API.</para>
<section>
<section id="mongodb-template-update.methods">
<title>Methods for executing updates for documents</title>
<para><itemizedlist>
@@ -1264,11 +1305,9 @@ import static org.springframework.data.mongodb.core.query.Update;
updated document.</para>
</listitem>
</itemizedlist></para>
<para></para>
</section>
<section>
<section id="mongodb-template-update.update">
<title>Methods for the Update class</title>
<para>The Update class can be used with a little 'syntax sugar' as its
@@ -1348,7 +1387,67 @@ import static org.springframework.data.mongodb.core.query.Update;
</section>
</section>
<section id="mongo-template.upserts">
<title>Upserting documents in a collection</title>
<para>Related to performing an <methodname>updateFirst</methodname>
operation, you can also perform an upsert operation which will perform
an insert if no document is found that matches the query. The document
that is inserted is a combination of the query document and the update
document. Here is an example</para>
<programlisting>template.upsert(query(where("ssn").is(1111).and("firstName").is("Joe").and("Fraizer").is("Update")), update("address", addr), Person.class);</programlisting>
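As described above, the document an upsert inserts combines the query document with the update document. The following is a simplified, dependency-free illustration of that composition, assuming plain equality criteria and $set-style values; the class and method names are illustrative, not Spring Data API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified illustration (not Spring Data code) of how an upsert
// composes the document it inserts when no match is found: equality
// criteria from the query are merged with the update's $set values.
public class UpsertSketch {

    static Map<String, Object> upsertDocument(Map<String, Object> equalityCriteria,
                                              Map<String, Object> setValues) {
        Map<String, Object> doc = new LinkedHashMap<>(equalityCriteria);
        doc.putAll(setValues); // update values win on key collisions
        return doc;
    }

    public static void main(String[] args) {
        Map<String, Object> query = Map.of("ssn", 1111, "firstName", "Joe");
        Map<String, Object> set = Map.of("address", "Broadway 1");
        System.out.println(upsertDocument(query, set));
    }
}
```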
</section>
<section>
<title>Finding and Upserting documents in a collection</title>
<para>The findAndModify method on DBCollection can update a document and
return either the old or newly updated document in a single operation.
MongoTemplate provides a findAndModify method that takes Query and
Update classes and converts from DBObject to your POJOs. Here are the
methods</para>
<programlisting language="java"> &lt;T&gt; T findAndModify(Query query, Update update, Class&lt;T&gt; entityClass);
&lt;T&gt; T findAndModify(Query query, Update update, Class&lt;T&gt; entityClass, String collectionName);
&lt;T&gt; T findAndModify(Query query, Update update, FindAndModifyOptions options, Class&lt;T&gt; entityClass);
&lt;T&gt; T findAndModify(Query query, Update update, FindAndModifyOptions options, Class&lt;T&gt; entityClass, String collectionName);</programlisting>
<para>As an example usage, we will insert a few Person objects into the
collection and perform a simple findAndModify operation</para>
<programlisting language="java">mongoTemplate.insert(new Person("Tom", 21));
mongoTemplate.insert(new Person("Dick", 22));
mongoTemplate.insert(new Person("Harry", 23));
Query query = new Query(Criteria.where("firstName").is("Harry"));
Update update = new Update().inc("age", 1);
Person p = mongoTemplate.findAndModify(query, update, Person.class); // returns the old Person object
assertThat(p.getFirstName(), is("Harry"));
assertThat(p.getAge(), is(23));
p = mongoTemplate.findOne(query, Person.class);
assertThat(p.getAge(), is(24));
// Now return the newly updated document when updating
p = template.findAndModify(query, update, new FindAndModifyOptions().returnNew(true), Person.class);
assertThat(p.getAge(), is(25));</programlisting>
<para>The <classname>FindAndModifyOptions</classname> lets you set the
options of returnNew, upsert, and remove. An example extending the
previous code snippet is shown below</para>
<programlisting language="java">Query query2 = new Query(Criteria.where("firstName").is("Mary"));
p = mongoTemplate.findAndModify(query2, update, new FindAndModifyOptions().returnNew(true).upsert(true), Person.class);
assertThat(p.getFirstName(), is("Mary"));
assertThat(p.getAge(), is(1));</programlisting>
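The returnNew option above can be sketched in plain Java; this is a simplified, hypothetical model of the old-versus-new return behavior, not MongoTemplate's implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified model of findAndModify semantics (hypothetical names): apply an
// $inc-style update to a document and return either the pre-update or the
// post-update view, depending on the returnNew option.
public class FindAndModifySketch {

    public static Map<String, Object> incAndReturn(Map<String, Object> document,
                                                   String field, int by, boolean returnNew) {
        Map<String, Object> before = new LinkedHashMap<>(document); // snapshot the old state
        int current = (Integer) document.getOrDefault(field, 0);
        document.put(field, current + by);                          // mutate the stored document
        return returnNew ? document : before;
    }

    public static void main(String[] args) {
        Map<String, Object> harry = new LinkedHashMap<>();
        harry.put("firstName", "Harry");
        harry.put("age", 23);

        Map<String, Object> old = incAndReturn(harry, "age", 1, false);
        System.out.println(old.get("age"));   // 23: the old view; the stored document is now 24

        Map<String, Object> fresh = incAndReturn(harry, "age", 1, true);
        System.out.println(fresh.get("age")); // 25: the newly updated view
    }
}
```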
</section>
<section id="mongo-template.delete">
<title>Methods for removing documents</title>
<para>You can use several overloaded methods to remove an object from
@@ -1430,7 +1529,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
<classname>Criteria</classname> object to provide a fluent style for the
API.</para>
<section>
<section id="mongodb-template-query.criteria">
<title>Methods for the Criteria class</title>
<para>
@@ -1451,8 +1550,8 @@ import static org.springframework.data.mongodb.core.query.Query.query;
</listitem>
<listitem>
<para><literal>Criteria</literal> <emphasis role="bold">andOperator
</emphasis> <literal>(Criteria...
<para><literal>Criteria</literal> <emphasis
role="bold">andOperator </emphasis> <literal>(Criteria...
criteria)</literal> Creates an and query using the
<literal>$and</literal> operator for all of the provided
criteria (requires MongoDB 2.0 or later)</para>
@@ -1535,8 +1634,8 @@ import static org.springframework.data.mongodb.core.query.Query.query;
</listitem>
<listitem>
<para><literal>Criteria</literal> <emphasis role="bold">norOperator
</emphasis> <literal>(Criteria...
<para><literal>Criteria</literal> <emphasis
role="bold">norOperator </emphasis> <literal>(Criteria...
criteria)</literal> Creates a nor query using the
<literal>$nor</literal> operator for all of the provided
criteria</para>
@@ -1550,8 +1649,8 @@ import static org.springframework.data.mongodb.core.query.Query.query;
</listitem>
<listitem>
<para><literal>Criteria</literal> <emphasis role="bold">orOperator
</emphasis> <literal>(Criteria...
<para><literal>Criteria</literal> <emphasis
role="bold">orOperator </emphasis> <literal>(Criteria...
criteria)</literal> Creates an or query using the
<literal>$or</literal> operator for all of the provided
criteria</para>
@@ -1579,7 +1678,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
</section>
<para>There are also methods on the Criteria class for geospatial
queries. Here is al isting but look at the section on <link
queries. Here is a listing but look at the section on <link
linkend="mongo.geospatial">GeoSpatial Queries</link> to see them in
action.</para>
@@ -1594,7 +1693,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
<para><literal>Criteria</literal> <emphasis
role="bold">withinCenterSphere </emphasis> <literal>(Circle circle)
</literal>Creates a geospatial criterion using <literal>$within
$center</literal> operators. This is only available for Mongo 1.7
$center</literal> operators. This is only available for MongoDB 1.7
and higher.</para>
</listitem>
@@ -1615,7 +1714,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
<para><literal>Criteria</literal> <emphasis role="bold">nearSphere
</emphasis> <literal>(Point point) </literal>Creates a geospatial
criterion using <literal>$nearSphere$center</literal> operations.
This is only available for Mongo 1.7 and higher.</para>
This is only available for MongoDB 1.7 and higher.</para>
</listitem>
<listitem>
@@ -1629,7 +1728,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
<para>The <classname>Query</classname> class has some additional methods
used to provide options for the query.</para>
<section>
<section id="mongodb-template-query.query">
<title>Methods for the Query class</title>
<para>
@@ -1670,7 +1769,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
</section>
</section>
<section>
<section id="mongo-template.querying">
<title>Methods for querying for documents</title>
<para>The query methods need to specify the target type T that will be
@@ -1847,22 +1946,24 @@ GeoResults&lt;Restaurant&gt; = operations.geoNear(query, Restaurant.class);</pro
</section>
<section id="mongo.mapreduce">
<title>Map-Reduce</title>
<title>Map-Reduce Operations</title>
<para>You can query MongoDB using Map-Reduce which is useful for batch
processing, data aggregation, and for when the query language doesn't
fulfill your needs. Spring provides integration with MongoDB's map reduce
by providing methods on MongoOperations to simplify the creation and
execution of Map-Reduce operations. It also integrates with Spring's
<ulink
fulfill your needs.</para>
<para>Spring provides integration with MongoDB's map reduce by providing
methods on MongoOperations to simplify the creation and execution of
Map-Reduce operations. It can convert the results of a Map-Reduce
operation to a POJO and also integrates with Spring's <ulink
url="http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/resources.html">Resource
abstraction</ulink>. This will let you place your JavaScript
files on the file system, classpath, http server or any other Spring
Resource implementation and then reference the JavaScript resources via an
easy URI style syntax, e.g. 'classpath:reduce.js'. Externalizing
JavaScript code in files is preferable to embedding them as Java strings
in your code. You can still pass JavaScript code as Java strings if you
prefer.</para>
JavaScript code in files is often preferable to embedding it as Java
strings in your code. Note that you can still pass JavaScript code as Java
strings if you prefer.</para>
<section id="mongo.mapreduce.example" lang="">
<title>Example Usage</title>
@@ -1919,7 +2020,7 @@ public class ValueObject {
private String id;
private float value;
public String getId() {
return id;
}
@@ -1936,9 +2037,7 @@ public class ValueObject {
public String toString() {
return "ValueObject [id=" + id + ", value=" + value + "]";
}
}
</programlisting> By default the output type of INLINE is used so you don't
}</programlisting> By default the output type of INLINE is used so you don't
have to specify an output collection. To specify additional map-reduce
options use an overloaded method that takes an additional
<classname>MapReduceOptions</classname> argument. The class
@@ -1946,28 +2045,152 @@ public class ValueObject {
additional options can be done in a very compact syntax. Here is an example
that sets the output collection to "jmr1_out". Note that setting only
the output collection assumes a default output type of REPLACE.
<programlisting language="java">
MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce("jmr1", "classpath:map.js", "classpath:reduce.js",
new MapReduceOptions().outputCollection("jmr1_out"), ValueObject.class);
</programlisting> There is also a static import <literal>import static
<programlisting language="java">MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce("jmr1", "classpath:map.js", "classpath:reduce.js",
new MapReduceOptions().outputCollection("jmr1_out"), ValueObject.class);</programlisting>
There is also a static import <literal>import static
org.springframework.data.mongodb.core.mapreduce.MapReduceOptions.options;</literal>
that can be used to make the syntax slightly more compact
<programlisting language="java">
MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce("jmr1", "classpath:map.js", "classpath:reduce.js",
options().outputCollection("jmr1_out"), ValueObject.class);
</programlisting> You can also specify a query to reduce the set of data that
will be used to feed into the map-reduce operation. This will remove the
document that contains [a,b] from consideration for map-reduce
operations. <programlisting language="java">
Query query = new Query(where("x").ne(new String[] { "a", "b" }));
<programlisting language="java">MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce("jmr1", "classpath:map.js", "classpath:reduce.js",
options().outputCollection("jmr1_out"), ValueObject.class);</programlisting>
You can also specify a query to reduce the set of data that will be used
to feed into the map-reduce operation. This will remove the document
that contains [a,b] from consideration for map-reduce operations.
<programlisting language="java">Query query = new Query(where("x").ne(new String[] { "a", "b" }));
MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce(query, "jmr1", "classpath:map.js", "classpath:reduce.js",
options().outputCollection("jmr1_out"), ValueObject.class);
</programlisting> Note that you can specify additional limit and sort values
as well on the query but not skip values.</para>
options().outputCollection("jmr1_out"), ValueObject.class);</programlisting>
Note that you can specify additional limit and sort values as well on
the query but not skip values.</para>
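To make the data flow concrete, here is a plain-Java model of a typical counting map/reduce pair (the data and names are hypothetical; in the examples above the real work happens in the map.js and reduce.js functions executed by the server):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Plain-Java model of the map/reduce flow: the "map" phase emits a (value, 1)
// pair per entry, and the "reduce" phase sums the emitted counts per key,
// which is what a typical counting map.js/reduce.js pair computes.
public class MapReduceSketch {

    public static Map<String, Integer> countValues(List<List<String>> documents) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (List<String> doc : documents) {          // "map": emit (value, 1) per entry
            for (String value : doc) {
                counts.merge(value, 1, Integer::sum); // "reduce": sum the counts per key
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<List<String>> docs = Arrays.asList(
                Arrays.asList("a", "b"),
                Arrays.asList("b", "c"),
                Arrays.asList("c", "c"));
        System.out.println(countValues(docs)); // {a=1, b=2, c=3}
    }
}
```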
</section>
</section>
<section>
<section id="mongo.group">
<title>Group Operations</title>
<para>As an alternative to using Map-Reduce to perform data aggregation,
you can use the <ulink
url="http://www.mongodb.org/display/DOCS/Aggregation#Aggregation-Group"><literal>group</literal>
operation</ulink>, which is similar to SQL's group by query style,
so it may feel more approachable than using Map-Reduce. Using the group
operation does have some limitations, for example it is not supported in
a sharded environment and it returns the full result set in a single BSON
object, so the result should be small, less than 10,000 keys.</para>
<para>Spring provides integration with MongoDB's group operation by
providing methods on MongoOperations to simplify the creation and
execution of group operations. It can convert the results of the group
operation to a POJO and also integrates with Spring's <ulink
url="http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/resources.html">Resource
abstraction</ulink>. This will let you place your JavaScript
files on the file system, classpath, http server or any other Spring
Resource implementation and then reference the JavaScript resources via an
easy URI style syntax, e.g. 'classpath:reduce.js'. Externalizing
JavaScript code in files is often preferable to embedding it as Java
strings in your code. Note that you can still pass JavaScript code as Java
strings if you prefer.</para>
<section id="mongo.group.example">
<title>Example Usage</title>
<para>In order to understand how group operations work, the following
example is used, which is somewhat artificial. For a more realistic
example consult the book 'MongoDB - The Definitive Guide'. A collection
named "group_test_collection" is created with the following rows.</para>
<programlisting>{ "_id" : ObjectId("4ec1d25d41421e2015da64f1"), "x" : 1 }
{ "_id" : ObjectId("4ec1d25d41421e2015da64f2"), "x" : 1 }
{ "_id" : ObjectId("4ec1d25d41421e2015da64f3"), "x" : 2 }
{ "_id" : ObjectId("4ec1d25d41421e2015da64f4"), "x" : 3 }
{ "_id" : ObjectId("4ec1d25d41421e2015da64f5"), "x" : 3 }
{ "_id" : ObjectId("4ec1d25d41421e2015da64f6"), "x" : 3 }</programlisting>
<para>We would like to group by the only field in each row, the 'x'
field, and aggregate the number of times each specific value of 'x'
occurs. To do this we need to create an initial document that contains
our count variable and also a reduce function which will increment it
each time it is encountered. The Java code to execute the group
operation is shown below.</para>
<programlisting language="java">GroupByResults&lt;XObject&gt; results = mongoTemplate.group("group_test_collection",
GroupBy.key("x").initialDocument("{ count: 0 }").reduceFunction("function(doc, prev) { prev.count += 1 }"),
XObject.class);</programlisting>
<para>The first argument is the name of the collection to run the group
operation over, the second is a fluent API that specifies properties of
the group operation via a <classname>GroupBy</classname> class. In this
example we are using just the <methodname>initialDocument</methodname>
and <methodname>reduceFunction</methodname> methods. You can also
specify a key-function, as well as a finalizer as part of the fluent
API. If you have multiple keys to group by, you can pass in a comma
separated list of keys.</para>
<para>The raw result of the group operation is a JSON document that
looks like this</para>
<programlisting>{
"retval" : [ { "x" : 1.0 , "count" : 2.0} ,
{ "x" : 2.0 , "count" : 1.0} ,
{ "x" : 3.0 , "count" : 3.0} ] ,
"count" : 6.0 ,
"keys" : 3 ,
"ok" : 1.0
}</programlisting>
<para>The document under the "retval" field is mapped onto the third
argument in the group method, in this case XObject which is shown
below.</para>
<programlisting language="java">public class XObject {
private float x;
private float count;
public float getX() {
return x;
}
public void setX(float x) {
this.x = x;
}
public float getCount() {
return count;
}
public void setCount(float count) {
this.count = count;
}
@Override
public String toString() {
return "XObject [x=" + x + " count = " + count + "]";
}
}</programlisting>
<para>You can also obtain the raw result as a
<classname>DBObject</classname> by calling the method
<methodname>getRawResults</methodname> on the
<classname>GroupByResults</classname> class.</para>
<para>There is an additional method overload of the group method on
<interfacename>MongoOperations</interfacename> which lets you specify a
<classname>Criteria</classname> object for selecting a subset of the
rows. An example which uses a <classname>Criteria</classname> object,
with some syntax sugar using static imports, as well as referencing
key-function and reduce-function JavaScript files via Spring Resource
strings, is shown below.</para>
<programlisting>import static org.springframework.data.mongodb.core.mapreduce.GroupBy.keyFunction;
import static org.springframework.data.mongodb.core.query.Criteria.where;
GroupByResults&lt;XObject&gt; results = mongoTemplate.group(where("x").gt(0),
"group_test_collection",
keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);</programlisting>
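The semantics of the initial document and reduce function used in this section can be modeled in plain Java (a hypothetical sketch of what the server computes, not code that Spring or MongoDB executes):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Plain-Java model of the group operation above: group documents by their "x"
// key, seed each group with the initial document { count: 0 }, and apply the
// reduce function "prev.count += 1" once per document in the group.
public class GroupSketch {

    public static Map<Integer, Integer> groupCount(List<Integer> xValues) {
        Map<Integer, Integer> counts = new LinkedHashMap<>();
        for (int x : xValues) {
            counts.putIfAbsent(x, 0);          // initialDocument: { count: 0 }
            counts.put(x, counts.get(x) + 1);  // reduceFunction: prev.count += 1
        }
        return counts;
    }

    public static void main(String[] args) {
        // The x values of the six sample documents in "group_test_collection".
        List<Integer> rows = Arrays.asList(1, 1, 2, 3, 3, 3);
        System.out.println(groupCount(rows)); // {1=2, 2=1, 3=3}
    }
}
```

The result matches the "retval" array shown earlier: x=1 occurs twice, x=2 once, and x=3 three times.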
</section>
</section>
<section id="mongo.custom-converters">
<title>Overriding default mapping with custom converters</title>
<para>In order to have more fine grained control over the mapping process
@@ -1990,7 +2213,7 @@ MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce(query, "
url="http://static.springsource.org/spring/docs/3.0.x/reference/validation.html#core-convert">here</ulink>.</para>
</note>
<section>
<section id="mongo.custom-converters.writer">
<title>Saving using a registered Spring Converter</title>
<para>An example implementation of the
@@ -2012,11 +2235,10 @@ public class PersonWriteConverter implements Converter&lt;Person, DBObject&gt; {
dbo.put("age", source.getAge());
return dbo;
}
}</programlisting>
</section>
<section>
<section id="mongo.custom-converters.reader">
<title>Reading using a Spring Converter</title>
<para>An example implemention of a Converter that converts from a
@@ -2029,11 +2251,10 @@ public class PersonWriteConverter implements Converter&lt;Person, DBObject&gt; {
p.setAge((Integer) source.get("age"));
return p;
}
}</programlisting>
</section>
<section>
<section id="mongo.custom-converters.xml">
<title>Registering Spring Converters with the MongoConverter</title>
<para>The mongo XSD namespace provides a convenience way to register
@@ -2061,13 +2282,36 @@ public class PersonWriteConverter implements Converter&lt;Person, DBObject&gt; {
</section>
</section>
<section>
<section id="mongo-template.index-and-collections">
<title>Index and Collection management</title>
<para>MongoTemplate provides a few methods for managing indexes and
collections.</para>
<para><classname>MongoTemplate</classname> provides a few methods for
managing indexes and collections. These are collected into a helper
interface called <interfacename>IndexOperations</interfacename>. You
access these operations by calling the method
<methodname>indexOps</methodname> and pass in either the collection name
or the <literal>java.lang.Class</literal> of your entity (the collection
name will be derived from the .class either by name or via annotation
metadata).</para>
<section>
<para>The <interfacename>IndexOperations</interfacename> interface is
shown below</para>
<programlisting language="java">public interface IndexOperations {
void ensureIndex(IndexDefinition indexDefinition);
void dropIndex(String name);
void dropAllIndexes();
void resetIndexCache();
List&lt;IndexInfo&gt; getIndexInfo();
}</programlisting>
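The by-name fallback for deriving the collection name from the entity class can be sketched as follows; this is a simplification (the real resolution consults annotation metadata first, and the helper names here are hypothetical):

```java
// Sketch of the by-name fallback for deriving a collection name from an
// entity class: the simple class name with a lower-cased first letter.
// A simplification; annotation metadata takes precedence in the real mapping.
public class CollectionNameSketch {

    static class Person { }

    public static String collectionNameOf(Class<?> entityClass) {
        String simple = entityClass.getSimpleName();
        return Character.toLowerCase(simple.charAt(0)) + simple.substring(1);
    }

    public static void main(String[] args) {
        System.out.println(collectionNameOf(Person.class)); // person
    }
}
```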
<section id="mongo-template.index-and-collections.index">
<title>Methods for creating an Index</title>
<para>We can create an index on a collection to improve query
@@ -2076,7 +2320,7 @@ public class PersonWriteConverter implements Converter&lt;Person, DBObject&gt; {
<example>
<title>Creating an index using the MongoTemplate</title>
<programlisting language="java">mongoTemplate.ensureIndex(new Index().on("name",Order.ASCENDING), Person.class); </programlisting>
<programlisting language="java">mongoTemplate.indexOps(Person.class).ensureIndex(new Index().on("name",Order.ASCENDING)); </programlisting>
</example>
<para><itemizedlist>
@@ -2093,10 +2337,28 @@ public class PersonWriteConverter implements Converter&lt;Person, DBObject&gt; {
the Venue class defined in a previous section, you would declare a
geospatial index as shown below
<programlisting language="java">mongoTemplate.ensureIndex(new GeospatialIndex("location"), Venue.class);</programlisting>
<programlisting language="java">mongoTemplate.indexOps(Venue.class).ensureIndex(new GeospatialIndex("location"));</programlisting>
</section>
<section>
<section id="mongo-template.index-and-collections.access">
<title>Accessing index information</title>
<para>The IndexOperations interface has the method getIndexInfo that
returns a list of IndexInfo objects. This contains all the indexes
defined on the collection. Here is an example that defines an index on
the Person class that has an age property.</para>
<programlisting language="java">template.indexOps(Person.class).ensureIndex(new Index().on("age", Order.DESCENDING).unique(Duplicates.DROP));
List&lt;IndexInfo&gt; indexInfoList = template.indexOps(Person.class).getIndexInfo();
// Contains
// [IndexInfo [fieldSpec={_id=ASCENDING}, name=_id_, unique=false, dropDuplicates=false, sparse=false],
// IndexInfo [fieldSpec={age=DESCENDING}, name=age_-1, unique=true, dropDuplicates=true, sparse=false]]
</programlisting>
</section>
<section id="mongo-template.index-and-collections.collection">
<title>Methods for working with a Collection</title>
<para>It's time to look at some code examples showing how to use the
@@ -2143,15 +2405,16 @@ mongoTemplate.dropCollection("MyNewCollection"); </programlisting>
</section>
</section>
<section>
<section id="mongo-template.commands">
<title>Executing Commands</title>
<para>You can also get at the Mongo driver's <classname>DB.command(
)</classname> method using the executeCommand methods on MongoTemplate.
These will also perform exception translation into Spring's Data Access
Exception hierarchy.</para>
<para>You can also get at the MongoDB driver's <classname>DB.command(
)</classname> method using the <methodname>executeCommand(…)</methodname>
methods on <classname>MongoTemplate</classname>. These will also perform
exception translation into Spring's
<classname>DataAccessException</classname> hierarchy.</para>
<section>
<section id="mongo-template.commands.execution">
<title>Methods for executing commands</title>
<para><itemizedlist>
@@ -2255,13 +2518,13 @@ mongoTemplate.dropCollection("MyNewCollection"); </programlisting>
</itemizedlist>
</section>
<section id="mongo.exception" label="">
<section id="mongo.exception">
<title>Exception Translation</title>
<para>The Spring framework provides exception translation for a wide
variety of database and mapping technologies. This has traditionally been
for JDBC and JPA. The Spring support for Mongo extends this feature to the
MongoDB Database by providing an implementation of the
for JDBC and JPA. The Spring support for MongoDB extends this feature to
the MongoDB Database by providing an implementation of the
<classname>org.springframework.dao.support.PersistenceExceptionTranslator</classname>
interface.</para>
@@ -2278,15 +2541,16 @@ mongoTemplate.dropCollection("MyNewCollection"); </programlisting>
MongoDB driver inherit from the MongoException class. The inner exception
and message are preserved so no information is lost.</para>
<para>Some of the mappings performed by the MongoExceptionTranslator are:
com.mongodb.Network to DataAccessResourceFailureException and
MongoException error codes 1003, 12001, 12010, 12011, 12012 to
InvalidDataAccessApiUsageException. Look into the implementation for more
details on the mapping.</para>
<para>Some of the mappings performed by the
<classname>MongoExceptionTranslator</classname> are: com.mongodb.Network
to DataAccessResourceFailureException and
<classname>MongoException</classname> error codes 1003, 12001, 12010,
12011, 12012 to <classname>InvalidDataAccessApiUsageException</classname>.
Look into the implementation for more details on the mapping.</para>
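A simplified sketch of such a code-based mapping is shown below; this is illustrative only and does not reproduce the actual MongoExceptionTranslator (the fallback name in particular is invented for the example):

```java
// Illustrative sketch of mapping MongoDB error codes to Spring data access
// exception names, as described above. Not the real MongoExceptionTranslator;
// the fallback branch is a hypothetical placeholder.
public class ExceptionTranslationSketch {

    public static String translate(int errorCode) {
        switch (errorCode) {
            case 1003:
            case 12001:
            case 12010:
            case 12011:
            case 12012:
                return "InvalidDataAccessApiUsageException";
            default:
                return "UncategorizedDataAccessException"; // fallback name, illustrative only
        }
    }

    public static void main(String[] args) {
        System.out.println(translate(12001)); // InvalidDataAccessApiUsageException
    }
}
```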
</section>
<section id="mongo.executioncallback">
<title>Execution Callback</title>
<title>Execution callbacks</title>
<para>One common design feature of all Spring template classes is that all
functionality is routed into one of the template's execute callback
@@ -2295,7 +2559,8 @@ mongoTemplate.dropCollection("MyNewCollection"); </programlisting>
greater need in the case of JDBC and JMS than with MongoDB, it still
offers a single spot for exception translation and logging to occur. As
such, using these execute callbacks is the preferred way to access the
Mongo driver's DB and Collection objects to perform uncommon operations
MongoDB driver's <classname>DB</classname> and
<classname>DBCollection</classname> objects to perform uncommon operations
that were not exposed as methods on
<classname>MongoTemplate</classname>.</para>
@@ -2341,19 +2606,20 @@ mongoTemplate.dropCollection("MyNewCollection"); </programlisting>
</listitem>
</itemizedlist></para>
<para>Here is an example that uses the CollectionCallback to return
information about an index.</para>
<para>Here is an example that uses the
<interfacename>CollectionCallback</interfacename> to return information
about an index</para>
<programlisting language="java">boolean hasIndex = template.execute("geolocation", new CollectionCallback&lt;Boolean&gt;() {
  public Boolean doInCollection(DBCollection collection) throws MongoException, DataAccessException {
    List&lt;DBObject&gt; indexes = collection.getIndexInfo();
    for (DBObject dbo : indexes) {
      if ("location_2d".equals(dbo.get("name"))) {
        return true;
      }
    }
    return false;
  }
});</programlisting>
</section>
</chapter>


@@ -1,6 +1,67 @@
Spring Data Document Changelog
=============================================
Changes in version 1.0.0.RC1 MongoDB (2011-12-6)
------------------------------------------------
** Bug
* [DATAMONGO-199] - Synchronisation during performance tests
* [DATAMONGO-298] - Spring custom converters do not work for subclasses of java.lang.Number
* [DATAMONGO-306] - NullPointerException if mongo factory created via URI without credentials
* [DATAMONGO-309] - POJO containing a List of Maps not persisting properly
* [DATAMONGO-312] - Cannot retrieve persisted Enum implementing an abstract method
* [DATAMONGO-315] - MongoTemplate.findOne(query) methods ignore SortOrder on query
* [DATAMONGO-316] - Replica Set configuration via properties file throws ArrayIndexOutOfBoundsException
* [DATAMONGO-318] - Distinguishing write errors and writes with zero documents affected
* [DATAMONGO-321] - An ID field of type integer is always saved as zero if not set by the user before calling save. Throw exception to indicate an int field will not be autopopulated.
* [DATAMONGO-322] - Throw exception in a save operation if the POJO's ID field is null and field type is not String, BigInteger or ObjectId.
* [DATAMONGO-325] - MongoTemplate fails to correctly report a js file not found on classpath while calling mapReduce
* [DATAMONGO-328] - Fix the import statement in mongodb manifest
* [DATAMONGO-329] - Map value not converted correctly
* [DATAMONGO-333] - AbstractMongoEventListener throws NullPointerException if used without generic parameter
** Improvement
* [DATAMONGO-26] - Investigate performance of POJO serialization.
* [DATAMONGO-174] - Add additional constructor to MongoTemplate that take com.mongodb.Mongo, database name, user credentials and MongoConverter.
* [DATAMONGO-208] - Add support for group() operation on collection in MongoOperations
* [DATAMONGO-213] - Provide additional options for setting WriteConcern on a per operation basis
* [DATAMONGO-234] - MongoTemplate should support the findAndModify operation to update version fields
* [DATAMONGO-292] - Several mongo for different database names
* [DATAMONGO-301] - Allow converters to be included through scanning
* [DATAMONGO-305] - Remove synchronized(this) from sort() and fields() methods in the Query class
* [DATAMONGO-310] - Allow Collections as parameters in @Query
* [DATAMONGO-320] - Remove use of slaveOk boolean option in MongoTemplate as it is deprecated. Replace with ReadPreference
* [DATAMONGO-323] - Using @Query and a Sort parameter on the same method should produce sorted results
* [DATAMONGO-324] - Support for JSON in mongo template
* [DATAMONGO-337] - The "nin" and "all" methods on Criteria should take a collection like the "in" method.
* [DATAMONGO-338] - Add query derivation implementations for newly introduced Regex, Exists, True and False keywords
** New Feature
* [DATAMONGO-185] - Add hint to Query
* [DATAMONGO-251] - Support getting index information on a collection or mapped class.
* [DATAMONGO-308] - Add support for upsert methods
** Refactoring
* [DATAMONGO-304] - Change package name for Class MongoLog4jAppender
* [DATAMONGO-313] - Use MongoOperations interface instead of MongoTemplate class
** Task
* [DATAMONGO-195] - Add description of @Field mapping annotation to reference docs
* [DATAMONGO-262] - Ensure Cloud Foundry Runtime works with RC1
* [DATAMONGO-263] - Ensure Cloud Foundry Examples work with RC1
* [DATAMONGO-311] - Update MongoDB driver to v 2.7.x
* [DATAMONGO-332] - Update reference documentation to list correct necessary dependencies
* [DATAMONGO-334] - Use repository URLs pointing to Artifactory
* [DATAMONGO-335] - Create hybrid Spring 3.0.6 / 3.1 build
Changes in version 1.0.0.M5 MongoDB (2011-10-24)
------------------------------------------------