Compare commits

..

61 Commits

Author SHA1 Message Date
Spring Buildmaster
22d5d4c019 DATAMONGO-513 - Release 1.1.0.RC1. 2012-08-24 02:24:06 -07:00
Oliver Gierke
7ac1e7b6e1 DATAMONGO-513 - Prepare changelog for 1.1.0.RC1. 2012-08-24 11:16:54 +02:00
Oliver Gierke
e86ab783f3 DATAMONGO-513 - Update to Spring Data Commons 1.4.0.RC1. 2012-08-23 20:12:41 +02:00
Oliver Gierke
6fe3e67ecb DATAMONGO-517 - Fixed complex keyword handling.
Introduced an intermediate getMappedKeyword(Keyword keyword, MongoPersistentProperty property) to correctly return a DBObject for the keyword plus the converted value. A few refactorings and improvements in the QueryMapper implementation (Keyword value object etc.).
2012-08-21 12:25:30 +02:00
Oliver Gierke
3b78034c55 DATAMONGO-519 - Make Spring 3.1.2.RELEASE default Spring dependency version.
We move away from Maven version ranges as they complicate the build and dependency resolution process and make the build non-reproducible. Users stuck on a 3.0.x version of Spring now have to declare the Spring dependencies manually in the 3.0.x version they need. Note that at least Spring 3.0.7 is currently required.
2012-08-20 17:19:37 +02:00
Oliver Gierke
a06a69797f DATACMNS-214 - Adapted API change in Spring Data Commons.
Plus additional cleanups.
2012-08-16 14:10:02 +02:00
Oliver Gierke
8b1557e38c DATAMONGO-506 - Added test case to show BasicQuery is working for nested properties. 2012-08-15 19:40:02 +02:00
Oliver Gierke
fcdc6d0df2 DATAMONGO-511 - QueryMapper now maps associations correctly.
Complete overhaul of the QueryMapper to better handle complex scenarios like property paths and association references.
2012-08-15 18:30:29 +02:00
Oliver Gierke
83b6cd7f05 DATAMONGO-510 - Criteria now only uses BasicDBList internally. 2012-08-15 18:30:15 +02:00
Oliver Gierke
38a9a6d51d DATAMONGO-509 - SimpleMongoRepository.exists(…) now avoids loading unnecessary data.
We explicitly exclude the entity's attributes via a field spec to avoid unnecessary object marshaling just to find out whether the entity exists or not.
2012-08-15 18:16:03 +02:00
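A minimal sketch of the idea, assuming the standard Query/Criteria field-spec API; the helper method, its parameters, and the use of "_id" are invented for illustration.

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class ExistsSketch {

	// Restrict the returned fields to the id so no full document is marshaled
	// just to answer an existence check.
	static <T> boolean exists(MongoOperations operations, Object id, Class<T> domainType) {
		Query query = new Query(where("_id").is(id));
		query.fields().include("_id"); // field spec: pull back the id only
		return operations.findOne(query, domainType) != null;
	}
}
```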
Oliver Gierke
5e2f16c678 DATAMONGO-508 - Eagerly return DBRef creation if the given value already is a DBRef. 2012-08-15 16:38:56 +02:00
Oliver Gierke
d7ae95a779 DATAMONGO-505 - Fixed handling of parameter binding of associations and collection values.
Instead of converting given association values as-is, ConvertingParameterAccessor now converts each collection value into a DBRef individually.
2012-08-15 15:16:44 +02:00
Oliver Gierke
8fbdf9afbd DATACMNS-212 - Apply refactorings in Spring Data Commons. 2012-08-13 14:27:19 +02:00
Oliver Gierke
f5a4d78e62 DATAMONGO-476 - @EnableMongoRepositories is now inherited into sub-classes. 2012-08-13 11:37:59 +02:00
Oliver Gierke
1f4264e6a7 DATAMONGO-472 - MongoQueryCreator now correctly translates Not keyword.
We're now translating a negating property reference into a ne(…) call instead of a not().is(…).
2012-08-10 19:58:36 +02:00
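For illustration, the two criteria shapes involved; the property name and value are invented, and a derived method such as findByUsernameNot(…) now produces the second form.

```java
import org.springframework.data.mongodb.core.query.Criteria;

class NotKeywordSketch {

	// Old translation of e.g. findByUsernameNot("dave"): a negated equality.
	static Criteria oldStyle(String value) {
		return Criteria.where("username").not().is(value);
	}

	// New translation: a plain $ne comparison.
	static Criteria newStyle(String value) {
		return Criteria.where("username").ne(value);
	}
}
```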
Oliver Gierke
05baa851d8 DATAMONGO-502 - QueryMapper now translates property names into field names. 2012-08-08 18:56:18 +02:00
Oliver Gierke
35e8ae1224 DATAMONGO-500 - Index creation is only done for the correct MappingContext.
Index creation now double checks the MappingContext a MappingContextEvent originates from before actually creating indexes. This avoids invalid indexes being created in a multi-database scenario.
2012-07-31 15:34:50 +02:00
Oliver Gierke
ceec0bcc4a DATAMONGO-499 - Fixed namespace reference to repository XSD. 2012-07-31 10:39:03 +02:00
Oliver Gierke
0d87e7fa5f DATAMONGO-496 - AbstractMongoConfiguration now defaults mapping base package.
AbstractMongoConfiguration now uses the package of the class extending it to enable entity scanning for that package. To disable entity scanning entirely, override the method to return null or an empty String.
2012-07-30 18:00:42 +02:00
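A hedged sketch of a configuration class relying on the new default; the class, package, and database names are invented, and the parent class's abstract methods are overridden with public visibility to stay compatible either way.

```java
package com.acme;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;

@Configuration
public class AppConfig extends AbstractMongoConfiguration {

	@Override
	public String getDatabaseName() {
		return "acme";
	}

	@Override
	public Mongo mongo() throws Exception {
		return new Mongo(); // connects to localhost by default
	}

	// getMappingBasePackage() is intentionally not overridden, so "com.acme" is
	// scanned for mapped entities. Return null or an empty String from an
	// override to disable entity scanning entirely.
}
```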
Oliver Gierke
a530629d97 DATAMONGO-497 - Fixed reading of empty collections.
Reading an empty collection always returned a HashSet, assuming the returned value would be converted into the assigned property's value later on. However, the method should rather return the correct type right away, which it now does by invoking the potential conversion.
2012-07-30 15:39:22 +02:00
Oliver Gierke
ba0232b187 DATAMONGO-494 - QueryMapper now forwards entity metadata into nested $or/$nor criteria.
Introduced helper class to ease assertions on DBObjects as well.
2012-07-27 10:39:31 +02:00
Oliver Gierke
04a17cacb7 DATAMONGO-495 - Fixed debug output in MongoTemplate.doFind(…).
Using SerializationUtils to safely output the query to be executed.
2012-07-26 10:35:30 +02:00
Oliver Gierke
761d725fce DATAMONGO-493 - Fixed broken $ne handling in QueryMapper.
$ne values are now only converted into an ObjectId in case they follow an id property. Previously the conversion was attempted in every case, which might have led to Strings that accidentally were valid ObjectIds being converted even though they didn't represent an id at all.
2012-07-24 20:43:52 +02:00
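As an illustration of the failure mode (domain type, property name, and value are invented): a $ne comparison on a plain String property whose value happens to be 24 hex characters was previously at risk of being turned into an ObjectId.

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class NeMappingSketch {

	// "externalId" is a plain String property; its value just happens to look
	// like an ObjectId. After the fix it stays a String in the mapped query.
	static <T> List<T> findOthers(MongoOperations operations, Class<T> domainType) {
		Query query = new Query(where("externalId").ne("4c64d1d7744f4e6a4e4a4b1a"));
		return operations.find(query, domainType);
	}
}
```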
Oliver Gierke
726b0b1bcc DATAMONGO-493 - Added test case to show the described scenario is working. 2012-07-24 20:13:50 +02:00
Spring Buildmaster
888e031452 DATAMONGO-491 - Prepare next development iteration. 2012-07-24 07:57:13 -07:00
Spring Buildmaster
91b818b8c1 DATAMONGO-491 - Release 1.1.0.M2. 2012-07-24 07:57:10 -07:00
Oliver Gierke
7646a64770 DATAMONGO-491 - Prepare 1.1.0.M2 release.
Updated changelog and links to reference documentation.
2012-07-24 16:50:53 +02:00
Oliver Gierke
94c057e89c DATAMONGO-491 - Upgrade to Spring Data Commons 1.4.0.M1. 2012-07-24 15:27:55 +02:00
Oliver Gierke
190d7cefb0 DATAMONGO-474 - Populating ids after save now inspects the field only.
So far the algorithm that decides whether an id property has to be set after a save(…) operation used the plain BeanWrapper.getProperty(PersistentProperty property) method. This caused problems in case the getter of the id field returned something completely different (to be precise: a complex type not convertible out of the box).

We now inspect the id field only to retrieve the value.
2012-07-24 13:23:40 +02:00
Oliver Gierke
669bc071b1 DATAMONGO-490 - Fixed typos. 2012-07-23 17:01:12 +02:00
Oliver Gierke
31b9b6b5c0 DATAMONGO-489 - Ensure read collections get converted to appropriate target type.
When reading BasicDBLists we now make sure the resulting collection is eventually converted into the actual target type. That target might be an array and thus needs an additional round of massaging before being returned as the value.
2012-07-23 16:33:24 +02:00
Oliver Gierke
914ecd34fe DATAMONGO-486 - Polished namespace implementation.
Be a better citizen regarding namespace support in STS. Moved ParsingUtils to Spring Data Commons.
2012-07-19 01:24:02 +02:00
Oliver Gierke
74532ff199 DATAMONGO-485 - Added test case to show complex ids are working. 2012-07-17 12:40:27 +02:00
Oliver Gierke
c7bcb55bda DATAMONGO-476 - JavaConfig support for repositories. 2012-07-17 11:58:49 +02:00
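A hedged sketch of the JavaConfig style this enables; the class, package, and database names are invented, and @EnableMongoRepositories picks up the package of the annotated class by default.

```java
package com.acme;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

import com.mongodb.Mongo;

@Configuration
@EnableMongoRepositories // scans com.acme for repository interfaces by default
public class RepositoryConfig extends AbstractMongoConfiguration {

	@Override
	public String getDatabaseName() {
		return "acme";
	}

	@Override
	public Mongo mongo() throws Exception {
		return new Mongo();
	}
}
```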
Amol Nayak
afe560ca7d DATAMONGO-480 - Consider WriteResult for insert(…) and save(…) methods. 2012-07-16 18:51:05 +02:00
Oliver Gierke
1380c49fd0 DATAMONGO-483 - Indexes now use the field name even if index name is defined. 2012-07-16 18:30:54 +02:00
Oliver Gierke
90240a8da5 DATAMONGO-482 - Fixed typo in reference documentation. 2012-07-16 17:42:37 +02:00
Oliver Gierke
ad587f63ed DATAMONGO-474 - Fixed criteria mapping for MongoTemplate.group(…).
The criteria object handed to the group command needs to be mapped correctly to handle complex values. Improved error handling along the way.
2012-07-16 16:33:18 +02:00
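For illustration, a group(…) call whose criteria now gets run through the QueryMapper before being handed to MongoDB; the collection name, key, functions, and result type are invented.

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;

class GroupSketch {

	// The criteria handed to group(…) is now mapped like a regular query, so
	// property names and complex values end up in their MongoDB representation.
	static <T> GroupByResults<T> countByX(MongoOperations operations, Class<T> resultType) {
		return operations.group(where("x").gt(0), "group_test_collection",
				GroupBy.key("x").initialDocument("{ count: 0 }")
						.reduceFunction("function(doc, prev) { prev.count += 1 }"),
				resultType);
	}
}
```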
Oliver Gierke
3c7cb592b3 DATAMONGO-475 - Fixed debug output in map-reduce operations.
Using SerializationUtils.serializeToJsonSafely(…) instead of plain toString() as this might cause SerializationExceptions for complex objects.
2012-07-16 14:56:03 +02:00
Raman Gupta
5b15c9500a DATAMONGO-477 - Change upper bound of Guava dependency to 14. 2012-07-15 17:55:51 +02:00
Oliver Gierke
2d25f0d6e4 DATAMONGO-469 - Polishing in MongoQueryCreator. 2012-06-26 17:36:33 +02:00
Oliver Gierke
557fc869eb DATAMONGO-469 - Fixed parsing of And keyword in derived queries. 2012-06-26 13:32:54 +02:00
Oliver Gierke
033f44e802 DATAMONGO-470 - Implemented equals(…) and hashCode() for Query and Criteria. 2012-06-26 13:30:00 +02:00
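A small usage illustration of the new value semantics; the property name and value are invented.

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Query;

class QueryEqualitySketch {

	static boolean structurallyEqual() {
		Query first = new Query(where("lastname").is("Matthews"));
		Query second = new Query(where("lastname").is("Matthews"));
		// Structurally identical queries now compare equal and share a hash code.
		return first.equals(second) && first.hashCode() == second.hashCode();
	}
}
```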
Oliver Gierke
ee3c1bc007 DATAMONGO-467 - Fix identifier handling for Querydsl.
As we try to massage the value of the id property into an ObjectId if possible, we need to do so when mapping the Querydsl query as well. Adapted SpringDataMongoDbSerializer accordingly.
2012-06-25 13:09:18 +02:00
Oliver Gierke
f507fe2e4d DATAMONGO-464 - Fixed resource synchronization in MongoDbUtils.
MongoDbUtils now correctly returns DB instances other than the first one bound. So far the lookup for an alternate database resulted in the first one bound being returned. Polished log statements a bit.
2012-06-25 11:12:54 +02:00
Oliver Gierke
4a27ba0a3f DATAMONGO-466 - QueryMapper now only tries id conversion for top level document.
So far the QueryMapper tried to map id properties of nested documents to ObjectIds, which it actually shouldn't do.
2012-06-22 14:46:35 +02:00
Oliver Gierke
e2e5fd8b31 DATACMNS-186, DATACMNS-185 - Upgrade to Spring Data Commons 1.3.2.BUILD-SNAPSHOT.
Use the new initialize() method in test cases for MappingContext. Benefit from rejection of multiple id properties.
2012-06-22 14:43:23 +02:00
Oliver Gierke
ba9abd1dd0 DATAMONGO-455 - Documentation mentions BasicQuery. 2012-06-20 12:24:59 +02:00
Oliver Gierke
ab66614843 DATAMONGO-454 - Improvements to ServerAddressPropertyEditor.
ServerAddressPropertyEditor now only fails if none of the configured addresses can be parsed correctly. Strengthened the parsing implementation so it no longer fails on host-only entries or accidental double commas.
Cleaned up integration tests for replica set configuration.
2012-06-20 10:57:32 +02:00
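A hedged usage sketch of the behaviour described above, derived from the editor diff further down; the address list is invented.

```java
import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;

import com.mongodb.ServerAddress;

class ReplicaSetParsingSketch {

	static ServerAddress[] parse() {
		ServerAddressPropertyEditor editor = new ServerAddressPropertyEditor();
		// A host-only entry and an accidental double comma no longer break parsing;
		// unparseable entries are skipped, and only an entirely empty result fails.
		editor.setAsText("127.0.0.1:27017,,localhost");
		return (ServerAddress[]) editor.getValue();
	}
}
```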
Oliver Gierke
caa2f5f030 DATAMONGO-378 - Fixed potential ClassCastException for MapReduceResults and upcoming MongoDB release.
The type of the value returned for the total field of the timing map in map-reduce results has apparently changed from Integer to Long as of MongoDB 2.1.0. Changed MapReduceResults to accommodate both Integer and Long values.
2012-06-20 10:05:16 +02:00
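The gist of the accommodation, sketched against a raw timing DBObject; the helper and its fallback value are invented.

```java
import com.mongodb.DBObject;

class TimingSketch {

	// MongoDB 2.0 reports the total as an Integer, 2.1+ as a Long; reading it
	// as a Number covers both cases.
	static long totalTime(DBObject timing) {
		Object total = timing.get("total");
		return total instanceof Number ? ((Number) total).longValue() : -1L;
	}
}
```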
Oliver Gierke
6e7c8f5771 DATAMONGO-462 - Added custom converters for URL.
So far URL instances were treated as entities and serialized as a nested document. As there was no custom converter registered to re-instantiate the objects and URL does not have a no-arg constructor, reading the instances back in resulted in an ugly exception in ReflectionEntityInstantiator. We now register a custom Converter to serialize URL instances as their plain toString() representation. This makes reading work out of the box, as the StringToObjectConverter registered by default happens to use URL's constructor taking a String. To make sure this keeps working we added an explicit StringToURLConverter to implement symmetric conversions.
2012-06-19 18:44:42 +02:00
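A hedged sketch of a symmetric converter pair of the kind described; the framework ships its own implementations, so the type names here are purely illustrative.

```java
import java.net.MalformedURLException;
import java.net.URL;

import org.springframework.core.convert.converter.Converter;

// URL -> String: store URLs as their plain toString() representation.
enum UrlToStringConverter implements Converter<URL, String> {
	INSTANCE;

	public String convert(URL source) {
		return source == null ? null : source.toString();
	}
}

// String -> URL: make the conversion explicitly symmetric instead of relying on
// the generic String-to-Object fallback hitting URL's String constructor.
enum StringToUrlConverter implements Converter<String, URL> {
	INSTANCE;

	public URL convert(String source) {
		try {
			return source == null ? null : new URL(source);
		} catch (MalformedURLException e) {
			throw new IllegalArgumentException(e);
		}
	}
}
```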
Oliver Gierke
44c4566c9d DATAMONGO-428 - Fixed parsing of output collection for complex MapReduce result.
The raw result of a map-reduce operation might contain a complex element holding the output collection in case the original request configured an output database as an option. Adapted the parsing of the output collection to accommodate both scenarios (a plain String value as well as a DBObject wrapper).
2012-06-19 09:30:35 +02:00
Oliver Gierke
9a1e6226b1 DATAMONGO-424 - Don't resolve DBRefs eagerly if property type is DBRef.
So far we have resolved DBRef values eagerly without inspecting the actual property type they would eventually have to be assigned to. Now we simply skip the recursive resolve process if the property type (or component type, map value type) actually is DBRef.
2012-06-19 09:01:28 +02:00
Oliver Gierke
cdb6d54d6a DATAMONGO-460 - Improvements to Querydsl repository implementation.
Internalized the setup of MongodbSerializer, which makes the constructors of SpringDataMongodbQuery much simpler.
2012-06-18 20:18:48 +02:00
Oliver Gierke
1fbfd3f0cb DATAMONGO-450 - Log output uses mapped query for debug logging. 2012-06-14 12:37:14 +02:00
Oliver Gierke
848e6f59c2 DATAMONGO-447 - Fixed broken log output in debug level.
The debug output now uses the already mapped query object when concatenating the log string. Improved applying the id after save operations by inspecting whether the object already has the id set before trying to set it. This could have caused problems in case you use a complex id and don't provide a custom converter, as such an id cannot be converted out of the box. Fixed a minor glitch in MappingMongoConverter that was not really a bug, as another path through the code covered the scenario later on. Introduced a SerializationUtils class that provides a method to safely serialize objects to pseudo JSON. Pseudo in the sense that it simply renders a complex object as { $java : object.toString() }. This is useful for debug output before the DBObject has been mapped into Mongo-native types.
2012-06-14 12:26:55 +02:00
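The pseudo-JSON idea, roughly sketched; the real SerializationUtils.serializeToJsonSafely(…) is more involved, and the helper name here is invented.

```java
import com.mongodb.util.JSON;

class PseudoJsonSketch {

	// Values the MongoDB JSON serializer cannot handle are rendered as
	// { "$java" : value.toString() } so debug logging never blows up.
	static String serializeSafely(Object value) {
		try {
			return JSON.serialize(value);
		} catch (RuntimeException e) {
			return String.format("{ \"$java\" : %s }", value);
		}
	}
}
```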
Oliver Gierke
b3a891c69b DATAMONGO-458 - Read empty collections are modifiable now.
So far we've read empty collections and populated the property value of the Java object being created with Collections.emptySet(). This returns an unmodifiable Set so that further modifications fail with an UnsupportedOperationException. We now simply use new HashSet<Object>().
2012-06-08 12:59:10 +02:00
Oliver Gierke
54e105d19d General dependency and plugin version upgrades.
Upgraded to JUnit 4.10. Moved to the junit-dep dependency to allow cleaning up the Hamcrest dependencies. Use hamcrest-library instead of hamcrest-all. Upgraded to Mockito 1.9.0, using mockito-core instead of mockito-all. Upgraded to Spring Data Core 1.3.1.BUILD-SNAPSHOT. Upgraded to Querydsl 2.6.0. Upgraded the compiler and Surefire plugins.
2012-05-29 14:20:53 +02:00
Oliver Gierke
4cfb62a413 DATAMONGO-451 - Tweaked pom.xml to allow build run without Bundlor.
Exposed the failOnWarnings property so it can be overridden through a command-line argument. This is necessary as the Sonar build uses Clover, which instruments the source code and thus creates type dependencies that Bundlor complains about not being declared in template.mf.
2012-05-15 13:43:59 +02:00
Maciej Walkowiak
7cdf9cedf3 DATAMONGO-446 - Fixed bug in paging query methods returning Lists.
Using List as the return type of paginating query methods didn't work so far. Fixed by inspecting the Pageable parameter potentially handed into them and restricting the result set accordingly.
2012-05-15 13:17:56 +02:00
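The repository style the fix targets, for illustration; the domain type, its properties, and the query method are invented.

```java
import java.util.List;

import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.repository.MongoRepository;

class Person {
	String id;
	String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

	// A derived query method that takes a Pageable but returns a plain List;
	// the result is now restricted to the requested page.
	List<Person> findByLastname(String lastname, Pageable pageable);
}
```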
Spring Buildmaster
fd198c172b DATAMONGO-396 - Released 1.1.0.M1. 2012-05-07 06:11:26 -07:00
93 changed files with 3453 additions and 1118 deletions

View File

@@ -4,7 +4,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-dist</artifactId>
<name>Spring Data MongoDB Distribution</name>
<version>1.1.0.M1</version>
<version>1.1.0.RC1</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>

View File

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M1</version>
<version>1.1.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
@@ -42,7 +42,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.1.0.M1</version>
<version>1.1.0.RC1</version>
</dependency>
<dependency>

View File

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M1</version>
<version>1.1.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>

View File

@@ -5,21 +5,21 @@
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.1.0.M1</version>
<version>1.1.0.RC1</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- versions for commonly-used dependencies -->
<junit.version>4.8.1</junit.version>
<junit.version>4.10</junit.version>
<log4j.version>1.2.16</log4j.version>
<org.mockito.version>1.8.4</org.mockito.version>
<org.mockito.version>1.9.0</org.mockito.version>
<org.slf4j.version>1.6.1</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version.30>3.0.7.RELEASE</org.springframework.version.30>
<org.springframework.version.40>4.0.0.RELEASE</org.springframework.version.40>
<org.springframework.version.range>[${org.springframework.version.30}, ${org.springframework.version.40})</org.springframework.version.range>
<data.commons.version>1.3.0.RC2</data.commons.version>
<org.springframework.version.range>3.1.2.RELEASE</org.springframework.version.range>
<data.commons.version>1.4.0.RC1</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
<bundlor.failOnWarnings>true</bundlor.failOnWarnings>
</properties>
<distributionManagement>
@@ -46,27 +46,27 @@
<dependencies>
<!-- Test dependencies -->
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>${org.mockito.version}</version>
<scope>test</scope>
</dependency>
<!-- Test dependencies -->
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<artifactId>hamcrest-library</artifactId>
<version>1.2.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<artifactId>junit-dep</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>${org.mockito.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
@@ -160,7 +160,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<version>2.4</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
@@ -180,7 +180,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.8</version>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
@@ -189,7 +189,7 @@
<excludes>
<exclude>**/PerformanceTests.java</exclude>
</excludes>
<junitArtifactName>junit:junit</junitArtifactName>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
<plugin>
@@ -225,7 +225,7 @@
<artifactId>com.springsource.bundlor.maven</artifactId>
<version>1.0.0.RELEASE</version>
<configuration>
<failOnWarnings>true</failOnWarnings>
<failOnWarnings>${bundlor.failOnWarnings}</failOnWarnings>
</configuration>
<executions>
<execution>
@@ -247,8 +247,8 @@
</pluginRepositories>
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>http://repo.springsource.org/libs-snapshot</url>
<id>spring-libs-milestone</id>
<url>http://repo.springsource.org/libs-milestone</url>
</repository>
</repositories>
<reporting>

View File

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.1.0.M1</version>
<version>1.1.0.RC1</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
@@ -12,7 +12,7 @@
<properties>
<mongo.version>2.7.1</mongo.version>
<querydsl.version>2.5.0</querydsl.version>
<querydsl.version>2.6.0</querydsl.version>
<cdi.version>1.0</cdi.version>
<validation.version>1.0.0.GA</validation.version>
<webbeans.version>1.1.3</webbeans.version>
@@ -163,6 +163,7 @@
<excludes>
<exclude>none</exclude>
</excludes>
<junitArtifactName>junit:junit-dep</junitArtifactName>
</configuration>
</plugin>
</plugins>

View File

@@ -39,7 +39,7 @@ import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
/**
* Base class for Spring Data Mongo configuration using JavaConfig.
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
@@ -86,22 +86,26 @@ public abstract class AbstractMongoConfiguration {
@Bean
public SimpleMongoDbFactory mongoDbFactory() throws Exception {
UserCredentials creadentials = getUserCredentials();
UserCredentials credentials = getUserCredentials();
if (creadentials == null) {
if (credentials == null) {
return new SimpleMongoDbFactory(mongo(), getDatabaseName());
} else {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), creadentials);
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), credentials);
}
}
/**
* Return the base package to scan for mapped {@link Document}s.
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overriden to implement alternate behaviour.
*
* @return
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
*/
protected String getMappingBasePackage() {
return null;
return getClass().getPackage().getName();
}
/**
@@ -127,7 +131,7 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
mappingContext.initialize();
return mappingContext;
}

View File

@@ -48,6 +48,7 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
@@ -166,27 +167,35 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
BeanDefinition conversionsDefinition) {
String ctxRef = element.getAttribute("mapping-context-ref");
if (!StringUtils.hasText(ctxRef)) {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parserContext.getRegistry().registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
if (StringUtils.hasText(ctxRef)) {
return ctxRef;
}
BeanComponentDefinitionBuilder componentDefinitionBuilder = new BeanComponentDefinitionBuilder(element,
parserContext);
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element, mappingContextBuilder);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
}
if (conversionsDefinition != null) {
AbstractBeanDefinition simpleTypesDefinition = new GenericBeanDefinition();
simpleTypesDefinition.setFactoryBeanName("customConversions");
simpleTypesDefinition.setFactoryMethodName("getSimpleTypeHolder");
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
parserContext.getRegistry().registerBeanDefinition(MAPPING_CONTEXT, mappingContextBuilder.getBeanDefinition());
ctxRef = MAPPING_CONTEXT;
parserContext.registerBeanComponent(componentDefinitionBuilder.getComponent(mappingContextBuilder, ctxRef));
return ctxRef;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 by the original author(s).
* Copyright 2011-2012 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,19 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import static org.springframework.data.mongodb.config.ParsingUtils.*;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.util.StringUtils;
@@ -44,19 +44,29 @@ import com.mongodb.MongoURI;
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String id = element.getAttribute("id");
if (!StringUtils.hasText(id)) {
id = DB_FACTORY;
}
return id;
String id = super.resolveId(element, definition, parserContext);
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
@@ -64,12 +74,11 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
ParsingUtils.setPropertyValue(element, dbFactoryBuilder, "write-concern", "writeConcern");
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
@@ -77,19 +86,26 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
// Defaulting
mongoRef = StringUtils.hasText(mongoRef) ? mongoRef : registerMongoBeanDefinition(element, parserContext);
dbname = StringUtils.hasText(dbname) ? dbname : "db";
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
} else {
dbFactoryBuilder.addConstructorArgValue(registerMongoBeanDefinition(element, parserContext));
}
dbFactoryBuilder.addConstructorArgValue(new RuntimeBeanReference(mongoRef));
dbname = StringUtils.hasText(dbname) ? dbname : "db";
dbFactoryBuilder.addConstructorArgValue(dbname);
if (userCredentials != null) {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
ParsingUtils.registerWriteConcernPropertyEditor(parserContext.getRegistry());
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
BeanComponentDefinition component = helper.getComponent(writeConcernPropertyEditorBuilder);
parserContext.registerBeanComponent(component);
return (AbstractBeanDefinition) helper.getComponentIdButFallback(dbFactoryBuilder, BeanNames.DB_FACTORY)
.getBeanDefinition();
}
/**
@@ -100,14 +116,13 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
* @param parserContext must not be {@literal null}.
* @return
*/
private String registerMongoBeanDefinition(Element element, ParserContext parserContext) {
private BeanDefinition registerMongoBeanDefinition(Element element, ParserContext parserContext) {
BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(element, mongoBuilder, "host");
ParsingUtils.setPropertyValue(element, mongoBuilder, "port");
setPropertyValue(mongoBuilder, element, "host");
setPropertyValue(mongoBuilder, element, "port");
return BeanDefinitionReaderUtils.registerWithGeneratedName(mongoBuilder.getBeanDefinition(),
parserContext.getRegistry());
return getSourceBeanDefinition(mongoBuilder, parserContext, element);
}
/**

View File

@@ -16,7 +16,9 @@
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigParser;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryBeanDefinitionParser;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB based repositories.
@@ -32,7 +34,10 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
*/
public void init() {
registerBeanDefinitionParser("repositories", new MongoRepositoryConfigParser());
RepositoryConfigurationExtension extension = new MongoRepositoryConfigurationExtension();
RepositoryBeanDefinitionParser repositoryBeanDefinitionParser = new RepositoryBeanDefinitionParser(extension);
registerBeanDefinitionParser("repositories", repositoryBeanDefinitionParser);
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,20 +13,20 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
@@ -35,54 +35,59 @@ import org.w3c.dom.Element;
* Parser for &lt;mongo&gt; definitions.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public class MongoParser extends AbstractSingleBeanDefinitionParser {
public class MongoParser implements BeanDefinitionParser {
@Override
protected Class<?> getBeanClass(Element element) {
return MongoFactoryBean.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
ParsingUtils.setPropertyValue(element, builder, "port", "port");
ParsingUtils.setPropertyValue(element, builder, "host", "host");
ParsingUtils.setPropertyValue(element, builder, "write-concern", "writeConcern");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
ParsingUtils.parseMongoOptions(element, builder);
ParsingUtils.parseReplicaSet(element, builder);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "write-concern", "writeConcern");
registerServerAddressPropertyEditor(parserContext.getRegistry());
ParsingUtils.registerWriteConcernPropertyEditor(parserContext.getRegistry());
MongoParsingUtils.parseMongoOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernPropertyEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*
* @param parserContext the ParserContext to
*/
private void registerServerAddressPropertyEditor(BeanDefinitionRegistry registry) {
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder
.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
String name = super.resolveId(element, definition, parserContext);
if (!StringUtils.hasText(name)) {
name = "mongo";
}
return name;
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
}

View File

@@ -0,0 +1,103 @@
/*
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
*/
abstract class MongoParsingUtils {
private MongoParsingUtils() {
}
/**
* Parses the mongo replica-set element.
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
* @return
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occured, false otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsDefBuilder, optionsElement, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsDefBuilder, optionsElement, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsDefBuilder, optionsElement, "write-number", "writeNumber");
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,141 +0,0 @@
/*
* Copyright (c) 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionReaderUtils;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
abstract class ParsingUtils {
/**
* Parses the mongo replica-set element.
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
* @return true if parsing actually occured, false otherwise
*/
static boolean parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
String replicaSetString = element.getAttribute("replica-set");
if (StringUtils.hasText(replicaSetString)) {
mongoBuilder.addPropertyValue("replicaSetSeeds", replicaSetString);
}
return true;
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occured, false otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsElement, optionsDefBuilder, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsElement, optionsDefBuilder, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsElement, optionsDefBuilder, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsElement, optionsDefBuilder, "connect-timeout", "connectTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-timeout", "socketTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsElement, optionsDefBuilder, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsElement, optionsDefBuilder, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsElement, optionsDefBuilder, "write-number", "writeNumber");
setPropertyValue(optionsElement, optionsDefBuilder, "write-timeout", "writeTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "write-fsync", "writeFsync");
setPropertyValue(optionsElement, optionsDefBuilder, "slave-ok", "slaveOk");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName, String propertyName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(propertyName, attr);
}
}
/**
* Sets the property with the given attribute name on the given {@link BeanDefinitionBuilder} to the value of the
* attribute with the given name.
*
* @param element must not be {@literal null}.
* @param builder must not be {@literal null}.
* @param attrName must not be {@literal null} or empty.
*/
static void setPropertyValue(Element element, BeanDefinitionBuilder builder, String attrName) {
String attr = element.getAttribute(attrName);
if (StringUtils.hasText(attr)) {
builder.addPropertyValue(attrName, attr);
}
}
/**
* Returns the {@link BeanDefinition} built by the given {@link BeanDefinitionBuilder} enriched with source
* information derived from the given {@link Element}.
*
* @param builder must not be {@literal null}.
* @param context must not be {@literal null}.
* @param element must not be {@literal null}.
* @return
*/
static AbstractBeanDefinition getSourceBeanDefinition(BeanDefinitionBuilder builder, ParserContext context,
Element element) {
AbstractBeanDefinition definition = builder.getBeanDefinition();
definition.setSource(context.extractSource(element));
return definition;
}
/**
* Registers a {@link WriteConcernPropertyEditor} in the given {@link BeanDefinitionRegistry}.
*
* @param registry must not be {@literal null}.
*/
static void registerWriteConcernPropertyEditor(BeanDefinitionRegistry registry) {
Assert.notNull(registry);
BeanDefinitionBuilder customEditorConfigurer = BeanDefinitionBuilder
.genericBeanDefinition(CustomEditorConfigurer.class);
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
customEditorConfigurer.addPropertyValue("customEditors", customEditors);
BeanDefinitionReaderUtils.registerWithGeneratedName(customEditorConfigurer.getBeanDefinition(), registry);
}
}

View File

@@ -17,7 +17,11 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.StringUtils;
import com.mongodb.ServerAddress;
@@ -30,6 +34,8 @@ import com.mongodb.ServerAddress;
*/
public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
@@ -38,21 +44,49 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
public void setAsText(String replicaSetString) {
String[] replicaSetStringArray = StringUtils.commaDelimitedListToStringArray(replicaSetString);
ServerAddress[] serverAddresses = new ServerAddress[replicaSetStringArray.length];
Set<ServerAddress> serverAddresses = new HashSet<ServerAddress>(replicaSetStringArray.length);
for (int i = 0; i < replicaSetStringArray.length; i++) {
for (String element : replicaSetStringArray) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(replicaSetStringArray[i], ":");
ServerAddress address = parseServerAddress(element);
try {
serverAddresses[i] = new ServerAddress(hostAndPort[0], Integer.parseInt(hostAndPort[1]));
} catch (NumberFormatException e) {
throw new IllegalArgumentException("Could not parse port " + hostAndPort[1], e);
} catch (UnknownHostException e) {
throw new IllegalArgumentException("Could not parse host " + hostAndPort[0], e);
if (address != null) {
serverAddresses.add(address);
}
}
setValue(serverAddresses);
if (serverAddresses.isEmpty()) {
throw new IllegalArgumentException(
"Could not resolve at least one server of the replica set configuration! Validate your config!");
}
setValue(serverAddresses.toArray(new ServerAddress[serverAddresses.size()]));
}
/**
* Parses the given source into a {@link ServerAddress}.
*
* @param source
* @return the
*/
private ServerAddress parseServerAddress(String source) {
String[] hostAndPort = StringUtils.delimitedListToStringArray(source.trim(), ":");
if (!StringUtils.hasText(source) || hostAndPort.length > 2) {
LOG.warn("Could not parse address source '{}'. Check your replica set configuration!", source);
return null;
}
try {
return hostAndPort.length == 1 ? new ServerAddress(hostAndPort[0]) : new ServerAddress(hostAndPort[0],
Integer.parseInt(hostAndPort[1]));
} catch (UnknownHostException e) {
LOG.warn("Could not parse host '{}'. Check your replica set configuration!", hostAndPort[0]);
} catch (NumberFormatException e) {
LOG.warn("Could not parse port '{}'. Check your replica set configuration!", hostAndPort[1]);
}
return null;
}
}

View File

@@ -78,29 +78,36 @@ public abstract class MongoDbUtils {
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate) {
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
if (dbHolder != null && !dbHolder.isEmpty()) {
// pre-bound Mongo DB
DB db = null;
if (TransactionSynchronizationManager.isSynchronizationActive() && dbHolder.doesNotHoldNonDefaultDB()) {
// Spring transaction management is active ->
db = dbHolder.getDB();
db = dbHolder.getDB(databaseName);
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing Mongo DB");
LOGGER.debug("Registering Spring transaction synchronization for existing MongoDB {}.", databaseName);
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
}
if (db != null) {
return db;
}
}
LOGGER.trace("Getting Mongo Database name=[" + databaseName + "]");
DB db = mongo.getDB(databaseName);
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
if (credentialsGiven && !db.isAuthenticated()) {
// Note, can only authenticate once against the same com.mongodb.DB object.
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
@@ -113,16 +120,20 @@ public abstract class MongoDbUtils {
// Use same Session for further Mongo actions within the transaction.
// Thread object will get removed by synchronization at transaction completion.
if (TransactionSynchronizationManager.isSynchronizationActive()) {
// We're within a Spring-managed transaction, possibly from JtaTransactionManager.
LOGGER.debug("Registering Spring transaction synchronization for new Hibernate Session");
LOGGER.debug("Registering Spring transaction synchronization for MongoDB instance {}.", databaseName);
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(db);
holderToUse = new DbHolder(databaseName, db);
} else {
holderToUse.addDB(db);
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
}
@@ -146,6 +157,7 @@ public abstract class MongoDbUtils {
* @return whether the DB is transactional
*/
public static boolean isDBTransactional(DB db, Mongo mongo) {
if (mongo == null) {
return false;
}
@@ -159,6 +171,7 @@ public abstract class MongoDbUtils {
* @param db the DB to close (may be <code>null</code>)
*/
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {

View File

@@ -15,10 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.SerializationUtils.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
@@ -46,6 +46,7 @@ import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
@@ -105,6 +106,7 @@ import com.mongodb.util.JSON;
* @author Graeme Rocher
* @author Mark Pollack
* @author Oliver Gierke
* @author Amol Nayak
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -205,7 +207,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
}
}
}
/**
@@ -330,11 +331,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(query);
DBObject queryObject = query.getQueryObject();
DBObject sortObject = query.getSortObject();
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + queryObject + " fields: " + fieldsObject + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: $s",
serializeToJsonSafely(queryObject), sortObject, fieldsObject, collectionName));
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
@@ -729,11 +731,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.insert(dbDoc);
wr = collection.insert(dbDoc);
} else {
collection.insert(dbDoc, writeConcernToUse);
wr = collection.insert(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "insert");
return dbDoc.get(ID);
}
});
@@ -752,11 +756,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.insert(dbDocList);
wr = collection.insert(dbDocList);
} else {
collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
wr = collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
}
handleAnyWriteResultErrors(wr, null, "insert_list");
return null;
}
});
@@ -783,11 +789,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.save(dbDoc);
wr = collection.save(dbDoc);
} else {
collection.save(dbDoc, writeConcernToUse);
wr = collection.save(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "save");
return dbDoc.get(ID);
}
});
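The insert and save callbacks above now capture the WriteResult so that handleAnyWriteResultErrors(…) can actually inspect it. A minimal sketch of what that means for client code, assuming a reachable MongoDB instance (Person and the sample class are illustrative names only, not part of the change):
import com.mongodb.Mongo;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;
public class WriteResultCheckingSample {
    // hypothetical document class, for illustration only
    static class Person {
        String name;
        Person(String name) { this.name = name; }
    }
    public static void main(String[] args) throws Exception {
        MongoTemplate template = new MongoTemplate(new Mongo(), "database");
        // with EXCEPTION checking enabled, errors reported in the captured WriteResult
        // are translated into a DataIntegrityViolationException
        template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
        template.insert(new Person("Dave"));
    }
}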
@@ -872,7 +880,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Returns a {@link Query} for the given entity by its id.
*
*
* @param object must not be {@literal null}.
* @return
*/
@@ -932,7 +940,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
entityClass, null, queryObject);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("remove using query: " + queryObject + " in collection: " + collection.getName());
LOGGER.debug("remove using query: " + dboq + " in collection: " + collection.getName());
}
if (writeConcernToUse == null) {
wr = collection.remove(dboq);
@@ -990,26 +998,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
}
CommandResult commandResult = null;
try {
if (command.getOutputType() == MapReduceCommand.OutputType.INLINE) {
commandResult = executeCommand(commandObject, getDb().getOptions());
} else {
commandResult = executeCommand(commandObject);
}
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
CommandResult commandResult = command.getOutputType() == MapReduceCommand.OutputType.INLINE ? executeCommand(
commandObject, getDb().getOptions()) : executeCommand(commandObject);
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [" + commandResult + "]");
LOGGER.debug(String.format("MapReduce command result = [%s]", serializeToJsonSafely(commandObject)));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
@@ -1034,7 +1031,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", criteria.getCriteriaObject());
dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
@@ -1060,21 +1057,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing Group with DBObject [" + commandObject.toString() + "]");
}
CommandResult commandResult = null;
try {
commandResult = executeCommand(commandObject, getDb().getOptions());
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
LOGGER.debug(String.format("Executing Group with DBObject [%s]", serializeToJsonSafely(commandObject)));
}
CommandResult commandResult = executeCommand(commandObject, getDb().getOptions());
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
}
@@ -1181,7 +1169,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Create the specified collection using the provided options
*
*
* @param collectionName
* @param collectionOptions
* @return the collection that was created
@@ -1203,7 +1191,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1228,7 +1216,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* The query document is specified as a standard DBObject and so is the fields specification.
* <p/>
* Can be overridden by subclasses.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1245,11 +1233,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
CursorPreparer preparer, DbObjectCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + query + " fields: " + fields + " for class: " + entityClass
+ " in collection: " + collectionName);
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), preparer,
objectCallback, collectionName);
}
@@ -1258,7 +1249,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1297,7 +1288,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param entityClass the parameterized type of the returned list.
@@ -1331,18 +1322,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
updateObj.put(key, mongoConverter.convertToMongoType(updateObj.get(key)));
}
DBObject mappedQuery = mapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndModify using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " and update: " + updateObj + " in collection: " + collectionName);
LOGGER.debug("findAndModify using query: " + mappedQuery + " fields: " + fields + " sort: " + sort
+ " for class: " + entityClass + " and update: " + updateObj + " in collection: " + collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mapper.getMappedObject(query, entity), fields, sort,
updateObj, options), new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, updateObj, options),
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
/**
* Populates the id property of the saved object, if it's not set already.
*
*
* @param savedObject
* @param id
*/
@@ -1358,14 +1351,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
try {
BeanWrapper.create(savedObject, mongoConverter.getConversionService()).setProperty(idProp, id);
ConversionService conversionService = mongoConverter.getConversionService();
BeanWrapper<PersistentEntity<Object, ?>, Object> wrapper = BeanWrapper.create(savedObject, conversionService);
Object idValue = wrapper.getProperty(idProp, idProp.getType(), true);
if (idValue != null) {
return;
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
wrapper.setProperty(idProp, id);
}
private DBCollection getAndPrepareCollection(DB db, String collectionName) {
@@ -1385,7 +1380,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* <li>Execute the given {@link ConnectionCallback} for a {@link DBObject}.</li>
* <li>Apply the given {@link DbObjectCallback} to each of the {@link DBObject}s to obtain the result.</li>
* <ol>
*
*
* @param <T>
* @param collectionCallback the callback to retrieve the {@link DBObject} with
* @param objectCallback the {@link DbObjectCallback} to transform {@link DBObject}s into the actual domain type
@@ -1508,9 +1503,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
String error = wr.getError();
if (error != null) {
String message = String.format("Execution of %s%s failed: %s", operation, query == null ? "" : "' using '"
+ query.toString() + "' query", error);
String message;
if (operation.equals("insert") || operation.equals("save")) {
// insert operations are assumed to begin with the "insert" string
message = String.format("Insert/Save for %s failed: %s", query, error);
} else if (operation.equals("insert_list")) {
message = String.format("Insert list failed: %s", error);
} else {
message = String.format("Execution of %s%s failed: %s", operation,
query == null ? "" : "' using '" + query.toString() + "' query", error);
}
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
@@ -1533,6 +1535,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return resolved == null ? ex : resolved;
}
/**
* Inspects the given {@link CommandResult} for errors and potentially throws an
* {@link InvalidDataAccessApiUsageException} for that error.
*
* @param result must not be {@literal null}.
* @param source must not be {@literal null}.
*/
private void handleCommandError(CommandResult result, DBObject source) {
try {
result.throwOnError();
} catch (MongoException ex) {
String error = result.getErrorMessage();
error = error == null ? "NO MESSAGE" : error;
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ source, ex);
}
}
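With the extraction of handleCommandError(…), failed mapReduce and group commands now surface as InvalidDataAccessApiUsageException carrying the underlying MongoException as cause. A rough sketch of the caller-side effect (the map/reduce functions are deliberately bogus and template is assumed to be a configured MongoTemplate):
try {
    // invalid reduce function to force a command error, for illustration only
    template.mapReduce("orders", "function() { emit(this.customerId, 1); }", "not a function", Object.class);
} catch (InvalidDataAccessApiUsageException e) {
    // message contains the error and the executed command, getCause() is the MongoException
    System.out.println(e.getMessage());
}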
private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
converter.afterPropertiesSet();

View File

@@ -0,0 +1,110 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Collection;
import java.util.Iterator;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Utility methods for JSON serialization.
*
* @author Oliver Gierke
*/
public abstract class SerializationUtils {
private SerializationUtils() {
}
/**
* Serializes the given object into pseudo-JSON, i.e. it tries to create a proper JSON representation as far as possible
* but falls back to the given object's {@link Object#toString()} method for values that cannot be serialized. Useful for
* printing raw {@link DBObject}s containing complex values before actually converting them into Mongo native types.
*
* @param value
* @return
*/
public static String serializeToJsonSafely(Object value) {
if (value == null) {
return null;
}
try {
return JSON.serialize(value);
} catch (Exception e) {
if (value instanceof Collection) {
return toString((Collection<?>) value);
} else if (value instanceof Map) {
return toString((Map<?, ?>) value);
} else if (value instanceof DBObject) {
return toString(((DBObject) value).toMap());
} else {
return String.format("{ $java : %s }", value.toString());
}
}
}
private static String toString(Map<?, ?> source) {
return iterableToDelimitedString(source.entrySet(), "{ ", " }", new Converter<Entry<?, ?>, Object>() {
public Object convert(Entry<?, ?> source) {
return String.format("\"%s\" : %s", source.getKey(), serializeToJsonSafely(source.getValue()));
}
});
}
private static String toString(Collection<?> source) {
return iterableToDelimitedString(source, "[ ", " ]", new Converter<Object, Object>() {
public Object convert(Object source) {
return serializeToJsonSafely(source);
}
});
}
/**
* Creates a string representation from the given {@link Iterable} prepending the prefix, applying the given
* {@link Converter} to each element before adding it to the result {@link String}, concatenating the elements with
* {@literal ,} and appending the postfix.
*
* @param source
* @param prefix
* @param postfix
* @param transformer
* @return
*/
private static <T> String iterableToDelimitedString(Iterable<T> source, String prefix, String postfix,
Converter<? super T, Object> transformer) {
StringBuilder builder = new StringBuilder(prefix);
Iterator<T> iterator = source.iterator();
while (iterator.hasNext()) {
builder.append(transformer.convert(iterator.next()));
if (iterator.hasNext()) {
builder.append(", ");
}
}
return builder.append(postfix).toString();
}
}
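A quick illustration of the fallback behaviour of serializeToJsonSafely(…) (the printed output is approximate):
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.SerializationUtils;
public class SerializationUtilsSample {
    public static void main(String[] args) {
        DBObject query = new BasicDBObject("complex", new Object());
        // JSON.serialize(…) cannot handle the plain Object value, so the utility falls
        // back to the { $java : … } representation for that particular entry
        System.out.println(SerializationUtils.serializeToJsonSafely(query));
        // e.g. { "complex" : { $java : java.lang.Object@6d06d69c } }
    }
}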

View File

@@ -38,6 +38,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalT
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.util.Assert;
@@ -89,6 +91,8 @@ public class CustomConversions {
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.addAll(converters);
for (Object c : this.converters) {
@@ -219,7 +223,7 @@ public class CustomConversions {
/**
* Returns the target type we can write an object of the given source type to. The returned type might be a subclass
* oth the given expected type though. If {@code expexctedTargetType} is {@literal null} we will simply return the
* of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply return the
* first target type matching or {@literal null} if no conversion can be found.
*
* @param source must not be {@literal null}

View File

@@ -15,11 +15,11 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
@@ -252,13 +252,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
Object obj = getValueInternal(inverseProp, dbo, evaluator, result);
try {
wrapper.setProperty(inverseProp, obj);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
wrapper.setProperty(inverseProp, obj);
}
});
@@ -470,7 +466,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param collection must not be {@literal null}.
* @param property must not be {@literal null}.
*
* @return
*/
protected DBObject createCollection(Collection<?> collection, MongoPersistentProperty property) {
@@ -676,6 +671,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(target);
if (target instanceof DBRef) {
return (DBRef) target;
}
MongoPersistentEntity<?> targetEntity = mappingContext.getPersistentEntity(target.getClass());
if (null == targetEntity) {
@@ -713,36 +712,41 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @return the converted {@link Collections}, will never be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
private Collection<?> readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
Assert.notNull(targetType);
Class<?> collectionType = targetType.getType();
if (sourceValue.isEmpty()) {
return Collections.emptySet();
return getPotentiallyConvertedSimpleRead(new HashSet<Object>(), collectionType);
}
Class<?> collectionType = targetType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, sourceValue.size());
TypeInformation<?> componentType = targetType.getComponentType();
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
if (dbObjItem instanceof DBRef) {
items.add(read(componentType, ((DBRef) dbObjItem).fetch(), parent));
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, ((DBRef) dbObjItem).fetch(),
parent));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, parent));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, componentType == null ? null : componentType.getType()));
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
}
}
return items;
return getPotentiallyConvertedSimpleRead(items, targetType.getType());
}
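As readCollectionOrArray(…) now returns Object, array properties can be populated directly from a BasicDBList. A reading sketch (Container is a hypothetical type and factory an assumed MongoDbFactory, not part of the change):
// hypothetical domain type with an array property
public class Container {
    String[] tags;
}
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
converter.afterPropertiesSet();
BasicDBList rawTags = new BasicDBList();
rawTags.add("spring");
rawTags.add("data");
// the BasicDBList is read into a String[] array
Container container = converter.read(Container.class, new BasicDBObject("tags", rawTags));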
/**
@@ -776,9 +780,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object value = entry.getValue();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, ((DBRef) value).fetch()));
} else {
Class<?> valueClass = valueType == null ? null : valueType.getType();
map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
@@ -810,7 +817,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
Class<?> target = conversions.getCustomWriteTarget(getClass());
Class<?> target = conversions.getCustomWriteTarget(obj.getClass());
if (target != null) {
return conversionService.convert(obj, target);
}
@@ -939,9 +946,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
return (T) conversionService.convert(value, rawType);
} else if (value instanceof DBRef) {
return (T) read(type, ((DBRef) value).fetch(), parent);
return (T) (rawType.equals(DBRef.class) ? value : read(type, ((DBRef) value).fetch(), parent));
} else if (value instanceof BasicDBList) {
return (T) getPotentiallyConvertedSimpleRead(readCollectionOrArray(type, (BasicDBList) value, parent), rawType);
return (T) readCollectionOrArray(type, (BasicDBList) value, parent);
} else if (value instanceof DBObject) {
return (T) read(type, (DBObject) value, parent);
} else {

View File

@@ -17,8 +17,12 @@ package org.springframework.data.mongodb.core.convert;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.Converter;
import org.springframework.util.StringUtils;
@@ -119,4 +123,28 @@ abstract class MongoConverters {
return StringUtils.hasText(source) ? new BigInteger(source) : null;
}
}
public static enum URLToStringConverter implements Converter<URL, String> {
INSTANCE;
public String convert(URL source) {
return source == null ? null : source.toString();
}
}
public static enum StringToURLConverter implements Converter<String, URL> {
INSTANCE;
private static final TypeDescriptor SOURCE = TypeDescriptor.valueOf(String.class);
private static final TypeDescriptor TARGET = TypeDescriptor.valueOf(URL.class);
public URL convert(String source) {
try {
return source == null ? null : new URL(source);
} catch (MalformedURLException e) {
throw new ConversionFailedException(SOURCE, TARGET, source, e);
}
}
}
}
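Both converters are registered by default through CustomConversions (see above), so java.net.URL properties are written as plain Strings and parsed back on read. A small check of the default registration (illustrative only):
CustomConversions conversions = new CustomConversions(Collections.emptyList());
// URL now has a default write target of String…
System.out.println(conversions.getCustomWriteTarget(java.net.URL.class)); // class java.lang.String
// …and reading a String back into a URL is covered as well
System.out.println(conversions.hasCustomReadTarget(String.class, java.net.URL.class)); // true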

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,15 +17,16 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import org.bson.types.BasicBSONList;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
@@ -33,11 +34,12 @@ import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* A helper class to encapsulate any modifications of a Query object before it gets submitted to the database.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class QueryMapper {
@@ -47,6 +49,7 @@ public class QueryMapper {
private final ConversionService conversionService;
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link QueryMapper} with the given {@link MongoConverter}.
@@ -54,9 +57,12 @@ public class QueryMapper {
* @param converter must not be {@literal null}.
*/
public QueryMapper(MongoConverter converter) {
Assert.notNull(converter);
this.conversionService = converter.getConversionService();
this.converter = converter;
this.mappingContext = converter.getMappingContext();
}
/**
@@ -69,48 +75,151 @@ public class QueryMapper {
*/
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
DBObject newDbo = new BasicDBObject();
if (Keyword.isKeyword(query)) {
return getMappedKeyword(new Keyword(query), entity);
}
DBObject result = new BasicDBObject();
for (String key : query.keySet()) {
MongoPersistentEntity<?> nestedEntity = getNestedEntity(entity, key);
String newKey = key;
MongoPersistentProperty targetProperty = getTargetProperty(key, entity);
String newKey = determineKey(key, entity);
Object value = query.get(key);
if (isIdKey(key, entity)) {
if (value instanceof DBObject) {
DBObject valueDbo = (DBObject) value;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id));
}
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
} else {
value = getMappedObject((DBObject) value, nestedEntity);
}
} else {
value = convertId(value);
}
newKey = "_id";
} else if (key.matches(N_OR_PATTERN)) {
// $or/$nor
Iterable<?> conditions = (Iterable<?>) value;
BasicBSONList newConditions = new BasicBSONList();
Iterator<?> iter = conditions.iterator();
while (iter.hasNext()) {
newConditions.add(getMappedObject((DBObject) iter.next(), nestedEntity));
}
value = newConditions;
} else if (key.equals("$ne")) {
value = convertId(value);
}
newDbo.put(newKey, convertSimpleOrDBObject(value, nestedEntity));
result.put(newKey, getMappedValue(value, targetProperty, newKey));
}
return newDbo;
return result;
}
/**
* Returns the given {@link DBObject} representing a keyword by mapping the keyword's value.
*
* @param query the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param entity
* @return
*/
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {
// $or/$nor
if (query.key.matches(N_OR_PATTERN)) {
Iterable<?> conditions = (Iterable<?>) query.value;
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(getMappedObject((DBObject) condition, entity));
}
return new BasicDBObject(query.key, newConditions);
}
return new BasicDBObject(query.key, convertSimpleOrDBObject(query.value, entity));
}
/**
* Returns the mapped keyword, which is considered to define a criteria for the given property.
*
* @param keyword
* @param property
* @return
*/
public DBObject getMappedKeyword(Keyword keyword, MongoPersistentProperty property) {
if (property.isAssociation()) {
convertAssociation(keyword.value, property);
}
return new BasicDBObject(keyword.key, getMappedValue(keyword.value, property, keyword.key));
}
/**
* Returns the mapped value for the given source object assuming it's a value for the given
* {@link MongoPersistentProperty}.
*
* @param source the source object to be mapped
* @param property the property the value is a value for
* @param newKey the key the value will be bound to eventually
* @return
*/
private Object getMappedValue(Object source, MongoPersistentProperty property, String newKey) {
if (property == null) {
return convertSimpleOrDBObject(source, null);
}
if (property.isIdProperty() || "_id".equals(newKey)) {
if (source instanceof DBObject) {
DBObject valueDbo = (DBObject) source;
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id));
}
valueDbo.put(inKey, ids.toArray(new Object[ids.size()]));
} else if (valueDbo.containsField("$ne")) {
valueDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject((DBObject) source, null);
}
return valueDbo;
} else {
return convertId(source);
}
}
if (property.isAssociation()) {
return Keyword.isKeyword(source) ? getMappedKeyword(new Keyword(source), property) : convertAssociation(source,
property);
}
return convertSimpleOrDBObject(source, mappingContext.getPersistentEntity(property));
}
private MongoPersistentProperty getTargetProperty(String key, MongoPersistentEntity<?> entity) {
if (isIdKey(key, entity)) {
return entity.getIdProperty();
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? null : path.getLeafProperty();
}
private PersistentPropertyPath<MongoPersistentProperty> getPath(String key, MongoPersistentEntity<?> entity) {
if (entity == null) {
return null;
}
try {
PropertyPath path = PropertyPath.from(key, entity.getTypeInformation());
return mappingContext.getPersistentPropertyPath(path);
} catch (PropertyReferenceException e) {
return null;
}
}
/**
* Returns the translated key assuming the given one is a property (path) reference.
*
* @param key the source key
* @param entity the base entity
* @return the translated key
*/
private String determineKey(String key, MongoPersistentEntity<?> entity) {
if (entity == null && DEFAULT_ID_NAMES.contains(key)) {
return "_id";
}
PersistentPropertyPath<MongoPersistentProperty> path = getPath(key, entity);
return path == null ? key : path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE);
}
/**
@@ -133,6 +242,30 @@ public class QueryMapper {
return converter.convertToMongoType(source);
}
/**
* Converts the given source assuming it's actually an association to another object.
*
* @param source
* @param property
* @return
*/
private Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || !property.isAssociation()) {
return source;
}
if (source instanceof Iterable) {
BasicDBList result = new BasicDBList();
for (Object element : (Iterable<?>) source) {
result.add(element instanceof DBRef ? element : converter.toDBRef(element, property));
}
return result;
}
return source instanceof DBRef ? source : converter.toDBRef(source, property);
}
/**
* Returns whether the given key will be considered an id key.
*
@@ -142,26 +275,19 @@ public class QueryMapper {
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (null != entity && entity.getIdProperty() != null) {
MongoPersistentProperty idProperty = entity.getIdProperty();
if (entity == null) {
return false;
}
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}
return DEFAULT_ID_NAMES.contains(key);
}
private MongoPersistentEntity<?> getNestedEntity(MongoPersistentEntity<?> entity, String key) {
MongoPersistentProperty property = entity == null ? null : entity.getPersistentProperty(key);
if (property == null || !property.isEntity()) {
return null;
}
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context = converter.getMappingContext();
return context.getPersistentEntity(property);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link String}.
*
@@ -178,4 +304,44 @@ public class QueryMapper {
return converter.convertToMongoType(id);
}
/**
* Value object to capture a query keyword representation.
*
* @author Oliver Gierke
*/
private static class Keyword {
String key;
Object value;
Keyword(Object source) {
Assert.isInstanceOf(DBObject.class, source);
DBObject value = (DBObject) source;
Assert.isTrue(value.keySet().size() == 1, "Keyword must have a single key only!");
this.key = value.keySet().iterator().next();
this.value = value.get(key);
}
/**
* Returns whether the given value actually represents a keyword. If this returns {@literal true} it's safe to call
* the constructor.
*
* @param value
* @return
*/
static boolean isKeyword(Object value) {
if (!(value instanceof DBObject)) {
return false;
}
DBObject dbObject = (DBObject) value;
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
}
}
}
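To make the new mapping rules a bit more tangible, here is a sketch of mapping a plain id query (Person is a hypothetical entity with a String id, factory an assumed MongoDbFactory):
MongoMappingContext context = new MongoMappingContext();
MappingMongoConverter converter = new MappingMongoConverter(factory, context);
converter.afterPropertiesSet();
QueryMapper mapper = new QueryMapper(converter);
DBObject query = new BasicDBObject("id", "4711");
DBObject mapped = mapper.getMappedObject(query, context.getPersistentEntity(Person.class));
// the property reference "id" is translated into "_id" and the value is converted
// into an ObjectId if the given String actually is a valid one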

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,9 +13,9 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
@@ -24,14 +24,29 @@ import java.lang.annotation.Target;
/**
* Mark a class to use compound indexes.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
@Target({ ElementType.TYPE })
@Documented
@Retention(RetentionPolicy.RUNTIME)
public @interface CompoundIndex {
/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending).
*
* @return
*/
String def();
/**
* It does not make sense to use this attribute as the direction has to be defined in the {@link #def()}
* attribute anyway.
*
* @return
*/
@Deprecated
IndexDirection direction() default IndexDirection.ASCENDING;
boolean unique() default false;
@@ -40,8 +55,18 @@ public @interface CompoundIndex {
boolean dropDups() default false;
/**
* The name of the index to be created.
*
* @return
*/
String name() default "";
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
*
* @return
*/
String collection() default "";
}
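A usage sketch for the documented attributes (the domain class is hypothetical; the collection defaults to the one of the annotated class):
@Document
@CompoundIndexes({
    @CompoundIndex(name = "by_lastname_age", def = "{ 'lastName' : 1, 'age' : -1 }", unique = true)
})
public class Person {
    String lastName;
    int age;
}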

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,37 +13,51 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.data.mapping.event.MappingContextEvent;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.util.Assert;
/**
* An implementation of ApplicationEventPublisher that will only fire MappingContextEvents for use by the index creator
* when MongoTemplate is used 'stand-alone', that is not declared inside a Spring ApplicationContext.
* An implementation of ApplicationEventPublisher that will only fire {@link MappingContextEvent}s for use by the index
* creator when MongoTemplate is used 'stand-alone', that is, not declared inside a Spring {@link ApplicationContext}.
* Declare {@link MongoTemplate} inside an {@link ApplicationContext} to enable the publishing of all persistence events
* such as {@link AfterLoadEvent}, {@link AfterSaveEvent}, etc.
*
* Declare MongoTemplate inside an ApplicationContext to enable the publishing of all persistence events such as
* {@link AfterLoadEvent}, {@link AfterSaveEvent}, etc.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoMappingEventPublisher implements ApplicationEventPublisher {
private MongoPersistentEntityIndexCreator indexCreator;
private final MongoPersistentEntityIndexCreator indexCreator;
/**
* Creates a new {@link MongoMappingEventPublisher} for the given {@link MongoPersistentEntityIndexCreator}.
*
* @param indexCreator must not be {@literal null}.
*/
public MongoMappingEventPublisher(MongoPersistentEntityIndexCreator indexCreator) {
Assert.notNull(indexCreator);
this.indexCreator = indexCreator;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationEventPublisher#publishEvent(org.springframework.context.ApplicationEvent)
*/
@SuppressWarnings("unchecked")
public void publishEvent(ApplicationEvent event) {
if (event instanceof MappingContextEvent) {
indexCreator
.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty>) event);
indexCreator.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>) event);
}
}
}
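This mirrors what MongoTemplate sets up internally when used stand-alone. A wiring sketch (mongoDbFactory and Person are assumed to exist; lifecycle events such as AfterSaveEvent still require a full ApplicationContext):
MongoMappingContext mappingContext = new MongoMappingContext();
MongoPersistentEntityIndexCreator indexCreator =
        new MongoPersistentEntityIndexCreator(mappingContext, mongoDbFactory);
// the publisher simply forwards MappingContextEvents to the index creator…
mappingContext.setApplicationEventPublisher(new MongoMappingEventPublisher(indexCreator));
// …so entities added to the context later still get their indexes created
mappingContext.getPersistentEntity(Person.class);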

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.reflect.Field;
@@ -25,9 +24,8 @@ import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.event.MappingContextEvent;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -39,32 +37,35 @@ import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link BasicMongoPersistentEntity} instances contained in the given
* {@link MongoMappingContext} for indexing metadata and ensures the indexes to be available.
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
* for indexing metadata and ensures the indexes to be available.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty>> {
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
private static final Logger log = LoggerFactory.getLogger(MongoPersistentEntityIndexCreator.class);
private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<Class<?>, Boolean>();
private final MongoDbFactory mongoDbFactory;
private final MongoMappingContext mappingContext;
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@@iteral null}
* @param mongoDbFactory must not be {@@iteral null}
* @param mappingContext must not be {@literal null}
* @param mongoDbFactory must not be {@literal null}
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
for (MongoPersistentEntity<?> entity : mappingContext.getPersistentEntities()) {
checkForIndexes(entity);
@@ -75,8 +76,11 @@ public class MongoPersistentEntityIndexCreator implements
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(
MappingContextEvent<MongoPersistentEntity<MongoPersistentProperty>, MongoPersistentProperty> event) {
public void onApplicationEvent(MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
}
PersistentEntity<?, ?> entity = event.getPersistentEntity();
@@ -97,12 +101,12 @@ public class MongoPersistentEntityIndexCreator implements
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = index.collection();
if ("".equals(indexColl)) {
indexColl = entity.getCollection();
}
ensureIndex(indexColl, index.name(), index.def(), index.direction(), index.unique(), index.dropDups(),
index.sparse());
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
}
@@ -111,10 +115,14 @@ public class MongoPersistentEntityIndexCreator implements
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
Field field = persistentProperty.getField();
if (field.isAnnotationPresent(Indexed.class)) {
Indexed index = field.getAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = persistentProperty.getFieldName();
} else {
@@ -126,11 +134,17 @@ public class MongoPersistentEntityIndexCreator implements
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
ensureIndex(collection, name, null, index.direction(), index.unique(), index.dropDups(), index.sparse());
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
}
} else if (field.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = field.getAnnotation(GeoSpatialIndexed.class);
@@ -155,21 +169,15 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void ensureIndex(String collection, final String name, final String def, final IndexDirection direction,
final boolean unique, final boolean dropDups, final boolean sparse) {
DBObject defObj;
if (null != def) {
defObj = (DBObject) JSON.parse(def);
} else {
defObj = new BasicDBObject();
defObj.put(name, (direction == IndexDirection.ASCENDING ? 1 : -1));
}
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
mongoDbFactory.getDb().getCollection(collection).ensureIndex(defObj, opts);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}
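With the index definition for simple properties now being built as a DBObject up front, an @Indexed property translates as sketched below (hypothetical class; the resulting ensureIndex call is shown as a comment):
@Document
public class Person {
    @Indexed(direction = IndexDirection.DESCENDING, unique = true)
    String email;
}
// results in roughly:
// db.getCollection("person").ensureIndex({ "email" : -1 },
//     { "name" : "email", "unique" : true, "dropDups" : false, "sparse" : false })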

View File

@@ -87,6 +87,7 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
*/
@Override
public boolean isIdProperty() {
if (super.isIdProperty()) {
return true;
}

View File

@@ -42,6 +42,15 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.context.AbstractMappingContext#shouldCreatePersistentEntityFor(org.springframework.data.util.TypeInformation)
*/
@Override
protected boolean shouldCreatePersistentEntityFor(TypeInformation<?> type) {
return !MongoSimpleTypes.HOLDER.isSimpleType(type.getType());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.AbstractMappingContext#createPersistentProperty(java.lang.reflect.Field, java.beans.PropertyDescriptor, org.springframework.data.mapping.MutablePersistentEntity, org.springframework.data.mapping.SimpleTypeHolder)
@@ -72,6 +81,7 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.context = applicationContext;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,33 +26,39 @@ import com.mongodb.DBObject;
* Collects the results of performing a MapReduce operations.
*
* @author Mark Pollack
*
* @param <T> The class in which the results are mapped onto, accessible via an interator.
* @author Oliver Gierke
* @param <T> The class in which the results are mapped onto, accessible via an iterator.
*/
public class MapReduceResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private final DBObject rawResults;
private final String outputCollection;
private final MapReduceTiming mapReduceTiming;
private final MapReduceCounts mapReduceCounts;
private DBObject rawResults;
private MapReduceTiming mapReduceTiming;
private MapReduceCounts mapReduceCounts;
private String outputCollection;
/**
* Creates a new {@link MapReduceResults} from the given mapped results and the raw one.
*
* @param mappedResults must not be {@literal null}.
* @param rawResults must not be {@literal null}.
*/
public MapReduceResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
this.mappedResults = mappedResults;
this.rawResults = rawResults;
parseTiming(rawResults);
parseCounts(rawResults);
if (rawResults.get("result") != null) {
this.outputCollection = (String) rawResults.get("result");
}
this.mapReduceTiming = parseTiming(rawResults);
this.mapReduceCounts = parseCounts(rawResults);
this.outputCollection = parseOutputCollection(rawResults);
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}
@@ -73,28 +79,71 @@ public class MapReduceResults<T> implements Iterable<T> {
return rawResults;
}
protected void parseTiming(DBObject rawResults) {
private MapReduceTiming parseTiming(DBObject rawResults) {
DBObject timing = (DBObject) rawResults.get("timing");
if (timing != null) {
if (timing.get("mapTime") != null && timing.get("emitLoop") != null && timing.get("total") != null) {
mapReduceTiming = new MapReduceTiming((Long) timing.get("mapTime"), (Integer) timing.get("emitLoop"),
(Integer) timing.get("total"));
}
} else {
mapReduceTiming = new MapReduceTiming(-1, -1, -1);
if (timing == null) {
return new MapReduceTiming(-1, -1, -1);
}
if (timing.get("mapTime") != null && timing.get("emitLoop") != null && timing.get("total") != null) {
return new MapReduceTiming(getAsLong(timing, "mapTime"), getAsLong(timing, "emitLoop"),
getAsLong(timing, "total"));
}
return new MapReduceTiming(-1, -1, -1);
}
protected void parseCounts(DBObject rawResults) {
/**
* Returns the value of the source's field with the given key as {@link Long}.
*
* @param source
* @param key
* @return
*/
private Long getAsLong(DBObject source, String key) {
Object raw = source.get(key);
return raw instanceof Long ? (Long) raw : (Integer) raw;
}
/**
* Parses the raw {@link DBObject} result into a {@link MapReduceCounts} value object.
*
* @param rawResults
* @return
*/
private MapReduceCounts parseCounts(DBObject rawResults) {
DBObject counts = (DBObject) rawResults.get("counts");
if (counts != null) {
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
mapReduceCounts = new MapReduceCounts((Integer) counts.get("input"), (Integer) counts.get("emit"),
(Integer) counts.get("output"));
}
} else {
mapReduceCounts = new MapReduceCounts(-1, -1, -1);
if (counts == null) {
return new MapReduceCounts(-1, -1, -1);
}
if (counts.get("input") != null && counts.get("emit") != null && counts.get("output") != null) {
return new MapReduceCounts((Integer) counts.get("input"), (Integer) counts.get("emit"),
(Integer) counts.get("output"));
}
return new MapReduceCounts(-1, -1, -1);
}
/**
* Parses the output collection from the raw {@link DBObject} result.
*
* @param rawResults
* @return
*/
private String parseOutputCollection(DBObject rawResults) {
Object resultField = rawResults.get("result");
if (resultField == null) {
return null;
}
return resultField instanceof DBObject ? ((DBObject) resultField).get("collection").toString() : resultField
.toString();
}
}
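A small sketch of the parsing behaviour on a raw result document (values are made up; the "result" field may alternatively be a DBObject carrying a "collection" entry):
DBObject raw = new BasicDBObject();
raw.put("result", "mr_output");
raw.put("timing", new BasicDBObject("mapTime", 520L).append("emitLoop", 300).append("total", 850));
raw.put("counts", new BasicDBObject("input", 4).append("emit", 4).append("output", 2));
MapReduceResults<String> results = new MapReduceResults<String>(Collections.<String> emptyList(), raw);
// timing and counts are parsed into value objects, missing sections default to -1
results.getOutputCollection(); // "mr_output"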

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
@@ -23,13 +25,14 @@ import java.util.List;
import java.util.regex.Pattern;
import org.bson.BSON;
import org.bson.types.BasicBSONList;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -397,7 +400,7 @@ public class Criteria implements CriteriaDefinition {
* @param criteria
*/
public Criteria orOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$or").is(bsonList));
return this;
}
@@ -408,7 +411,7 @@ public class Criteria implements CriteriaDefinition {
* @param criteria
*/
public Criteria norOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$nor").is(bsonList));
return this;
}
@@ -419,7 +422,7 @@ public class Criteria implements CriteriaDefinition {
* @param criteria
*/
public Criteria andOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
BasicDBList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$and").is(bsonList));
return this;
}
@@ -429,11 +432,9 @@ public class Criteria implements CriteriaDefinition {
}
/*
* (non-Javadoc)
*
* @see org.springframework.datastore.document.mongodb.query.Criteria#
* getCriteriaObject(java.lang.String)
*/
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
public DBObject getCriteriaObject() {
if (this.criteriaChain.size() == 1) {
return criteriaChain.get(0).getSingleCriteriaObject();
@@ -477,8 +478,8 @@ public class Criteria implements CriteriaDefinition {
return queryCriteria;
}
private BasicBSONList createCriteriaList(Criteria[] criteria) {
BasicBSONList bsonList = new BasicBSONList();
private BasicDBList createCriteriaList(Criteria[] criteria) {
BasicDBList bsonList = new BasicDBList();
for (Criteria c : criteria) {
bsonList.add(c.getCriteriaObject());
}
@@ -496,4 +497,63 @@ public class Criteria implements CriteriaDefinition {
}
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Criteria that = (Criteria) obj;
boolean keyEqual = this.key == null ? that.key == null : this.key.equals(that.key);
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean valueEqual = isEqual(this.isValue, that.isValue);
return keyEqual && criteriaEqual && valueEqual;
}
/**
* Checks the given objects for equality. Handles {@link Pattern} and arrays correctly.
*
* @param left
* @param right
* @return
*/
private boolean isEqual(Object left, Object right) {
if (left == null) {
return right == null;
}
if (left instanceof Pattern) {
return right instanceof Pattern ? ((Pattern) left).pattern().equals(((Pattern) right).pattern()) : false;
}
return ObjectUtils.nullSafeEquals(left, right);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += nullSafeHashCode(key);
result += criteria.hashCode();
result += nullSafeHashCode(isValue);
return result;
}
}
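The new equals(…) and hashCode() make Criteria instances comparable by content, with Pattern values compared via their pattern string rather than by identity. For example:
Criteria left = Criteria.where("name").is("Dave");
Criteria right = Criteria.where("name").is("Dave");
left.equals(right);                  // true
left.hashCode() == right.hashCode(); // true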

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.data.mongodb.core.SerializationUtils.*;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
@@ -140,4 +143,60 @@ public class Query {
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Query: %s, Fields: %s, Sort: %s", serializeToJsonSafely(getQueryObject()),
serializeToJsonSafely(getFieldsObject()), serializeToJsonSafely(getSortObject()));
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Query that = (Query) obj;
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean fieldsEqual = this.fieldSpec == null ? that.fieldSpec == null : this.fieldSpec.equals(that.fieldSpec);
boolean sortEqual = this.sort == null ? that.sort == null : this.sort.equals(that.sort);
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * criteria.hashCode();
result += 31 * nullSafeHashCode(fieldSpec);
result += 31 * nullSafeHashCode(sort);
result += 31 * nullSafeHashCode(hint);
result += 31 * skip;
result += 31 * limit;
return result;
}
}
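Analogously for Query: equal criteria, field spec, sort, hint, skip and limit yield equal queries, and toString() now renders the query safely via serializeToJsonSafely(…). For example:
Query query = new Query(Criteria.where("name").is("Dave")).limit(10);
Query other = new Query(Criteria.where("name").is("Dave")).limit(10);
query.equals(other);       // true
System.out.println(query); // roughly: Query: { "name" : "Dave"}, Fields: null, Sort: null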

View File

@@ -0,0 +1,122 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.context.annotation.ComponentScan.Filter;
import org.springframework.context.annotation.Import;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
/**
* Annotation to activate MongoDB repositories. If no base package is configured through either {@link #value()},
* {@link #basePackages()} or {@link #basePackageClasses()}, it will trigger scanning of the package of the annotated class.
*
* @author Oliver Gierke
*/
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Inherited
@Import(MongoRepositoriesRegistrar.class)
public @interface EnableMongoRepositories {
/**
* Alias for the {@link #basePackages()} attribute. Allows for more concise annotation declarations, e.g.
* {@code @EnableMongoRepositories("org.my.pkg")} instead of {@code @EnableMongoRepositories(basePackages="org.my.pkg")}.
*/
String[] value() default {};
/**
* Base packages to scan for annotated components. {@link #value()} is an alias for (and mutually exclusive with) this
* attribute. Use {@link #basePackageClasses()} for a type-safe alternative to String-based package names.
*/
String[] basePackages() default {};
/**
* Type-safe alternative to {@link #basePackages()} for specifying the packages to scan for annotated components. The
* package of each class specified will be scanned. Consider creating a special no-op marker class or interface in
* each package that serves no purpose other than being referenced by this attribute.
*/
Class<?>[] basePackageClasses() default {};
/**
* Specifies which types are eligible for component scanning. Further narrows the set of candidate components from
* everything in {@link #basePackages()} to everything in the base packages that matches the given filter or filters.
*/
Filter[] includeFilters() default {};
/**
* Specifies which types are not eligible for component scanning.
*/
Filter[] excludeFilters() default {};
/**
* Returns the postfix to be used when looking up custom repository implementations. Defaults to {@literal Impl}. So
* for a repository named {@code PersonRepository} the corresponding implementation class will be looked up scanning
* for {@code PersonRepositoryImpl}.
*
* @return
*/
String repositoryImplementationPostfix() default "";
/**
* Configures the location of the Spring Data named queries properties file. Will default to
* {@code META-INF/mongo-named-queries.properties}.
*
* @return
*/
String namedQueriesLocation() default "";
/**
* Returns the key of the {@link QueryLookupStrategy} to be used for lookup queries for query methods. Defaults to
* {@link Key#CREATE_IF_NOT_FOUND}.
*
* @return
*/
Key queryLookupStrategy() default Key.CREATE_IF_NOT_FOUND;
/**
* Returns the {@link FactoryBean} class to be used for each repository instance. Defaults to
* {@link MongoRepositoryFactoryBean}.
*
* @return
*/
Class<?> repositoryFactoryBeanClass() default MongoRepositoryFactoryBean.class;
/**
* Configures the name of the {@link MongoTemplate} bean to be used with the repositories detected.
*
* @return
*/
String mongoTemplateRef() default "mongoTemplate";
/**
* Whether to automatically create indexes for query methods defined in the repository interface.
*
* @return
*/
boolean createIndexesForQueryMethods() default false;
}
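A hedged usage sketch of the new annotation on a JavaConfig class, modeled on the SampleMongoConfiguration test further down; package and database names are assumptions, not taken from the change itself.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

import com.mongodb.Mongo;

@Configuration
@EnableMongoRepositories(basePackages = "com.acme.repositories")
class ApplicationConfig extends AbstractMongoConfiguration {

	@Override
	protected String getDatabaseName() {
		return "database";
	}

	@Bean
	@Override
	public Mongo mongo() throws Exception {
		return new Mongo(); // connects to a local MongoDB instance on the default port
	}
}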

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import java.lang.annotation.Annotation;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
/**
* Mongo-specific {@link ImportBeanDefinitionRegistrar}.
*
* @author Oliver Gierke
*/
class MongoRepositoriesRegistrar extends RepositoryBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoRepositories.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport#getExtension()
*/
@Override
protected RepositoryConfigurationExtension getExtension() {
return new MongoRepositoryConfigurationExtension();
}
}

View File

@@ -1,54 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.data.mongodb.repository.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration;
import org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser;
import org.w3c.dom.Element;
/**
* {@link org.springframework.beans.factory.xml.BeanDefinitionParser} to create Mongo DB repositories from classpath
* scanning or manual definition.
*
* @author Oliver Gierke
*/
public class MongoRepositoryConfigParser extends
AbstractRepositoryConfigDefinitionParser<SimpleMongoRepositoryConfiguration, MongoRepositoryConfiguration> {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser#getGlobalRepositoryConfigInformation(org.w3c.dom.Element)
*/
@Override
protected SimpleMongoRepositoryConfiguration getGlobalRepositoryConfigInformation(Element element) {
return new SimpleMongoRepositoryConfiguration(element);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser#postProcessBeanDefinition(org.springframework.data.repository.config.SingleRepositoryConfigInformation, org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.beans.factory.support.BeanDefinitionRegistry, java.lang.Object)
*/
@Override
protected void postProcessBeanDefinition(MongoRepositoryConfiguration context, BeanDefinitionBuilder builder,
BeanDefinitionRegistry registry, Object beanSource) {
builder.addPropertyReference("mongoOperations", context.getMongoTemplateRef());
builder.addPropertyValue("createIndexesForQueryMethods", context.getCreateQueryIndexes());
}
}

View File

@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource;
import org.springframework.data.repository.config.RepositoryConfigurationExtension;
import org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport;
import org.springframework.data.repository.config.XmlRepositoryConfigurationSource;
import org.w3c.dom.Element;
/**
* {@link RepositoryConfigurationExtension} for MongoDB.
*
* @author Oliver Gierke
*/
public class MongoRepositoryConfigurationExtension extends RepositoryConfigurationExtensionSupport {
private static final String MONGO_TEMPLATE_REF = "mongo-template-ref";
private static final String CREATE_QUERY_INDEXES = "create-query-indexes";
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#getModulePrefix()
*/
@Override
protected String getModulePrefix() {
return "mongo";
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtension#getRepositoryFactoryClassName()
*/
public String getRepositoryFactoryClassName() {
return MongoRepositoryFactoryBean.class.getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.XmlRepositoryConfigurationSource)
*/
@Override
public void postProcess(BeanDefinitionBuilder builder, XmlRepositoryConfigurationSource config) {
Element element = config.getElement();
ParsingUtils.setPropertyReference(builder, element, MONGO_TEMPLATE_REF, "mongoOperations");
ParsingUtils.setPropertyValue(builder, element, CREATE_QUERY_INDEXES, "createIndexesForQueryMethods");
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfigurationExtensionSupport#postProcess(org.springframework.beans.factory.support.BeanDefinitionBuilder, org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource)
*/
@Override
public void postProcess(BeanDefinitionBuilder builder, AnnotationRepositoryConfigurationSource config) {
AnnotationAttributes attributes = config.getAttributes();
builder.addPropertyReference("mongoOperations", attributes.getString("mongoTemplateRef"));
builder.addPropertyValue("createIndexesForQueryMethods", attributes.getBoolean("createIndexesForQueryMethods"));
}
}
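As a hedged illustration of the two annotation attributes the extension translates into bean definition properties; the template bean name and base package are assumptions.

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories(
		basePackages = "com.acme.repositories",
		mongoTemplateRef = "myMongoTemplate", // translated into the mongoOperations property reference
		createIndexesForQueryMethods = true) // translated into the createIndexesForQueryMethods property value
class CustomizedRepositoryConfig {
}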

View File

@@ -1,196 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.AutomaticRepositoryConfigInformation;
import org.springframework.data.repository.config.ManualRepositoryConfigInformation;
import org.springframework.data.repository.config.RepositoryConfig;
import org.springframework.data.repository.config.SingleRepositoryConfigInformation;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link RepositoryConfig} implementation to create {@link MongoRepositoryConfiguration} instances for both automatic
* and manual configuration.
*
* @author Oliver Gierke
*/
public class SimpleMongoRepositoryConfiguration
extends
RepositoryConfig<SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration, SimpleMongoRepositoryConfiguration> {
private static final String MONGO_TEMPLATE_REF = "mongo-template-ref";
private static final String CREATE_QUERY_INDEXES = "create-query-indexes";
private static final String DEFAULT_MONGO_TEMPLATE_REF = "mongoTemplate";
/**
* Creates a new {@link SimpleMongoRepositoryConfiguration} for the given {@link Element}.
*
* @param repositoriesElement
*/
protected SimpleMongoRepositoryConfiguration(Element repositoriesElement) {
super(repositoriesElement, MongoRepositoryFactoryBean.class.getName());
}
/**
* Returns the bean name of the {@link org.springframework.data.mongodb.core.MongoTemplate} to be referenced.
*
* @return
*/
public String getMongoTemplateRef() {
String templateRef = getSource().getAttribute(MONGO_TEMPLATE_REF);
return StringUtils.hasText(templateRef) ? templateRef : DEFAULT_MONGO_TEMPLATE_REF;
}
/**
* Returns whether to create indexes for query methods.
*
* @return
*/
public boolean getCreateQueryIndexes() {
String createQueryIndexes = getSource().getAttribute(CREATE_QUERY_INDEXES);
return StringUtils.hasText(createQueryIndexes) ? Boolean.parseBoolean(createQueryIndexes) : false;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.config.GlobalRepositoryConfigInformation
* #getAutoconfigRepositoryInformation(java.lang.String)
*/
public MongoRepositoryConfiguration getAutoconfigRepositoryInformation(String interfaceName) {
return new AutomaticMongoRepositoryConfiguration(interfaceName, this);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.config.RepositoryConfig#getNamedQueriesLocation()
*/
public String getNamedQueriesLocation() {
return "classpath*:META-INF/mongo-named-queries.properties";
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.repository.config.RepositoryConfig#
* createSingleRepositoryConfigInformationFor(org.w3c.dom.Element)
*/
@Override
protected MongoRepositoryConfiguration createSingleRepositoryConfigInformationFor(Element element) {
return new ManualMongoRepositoryConfiguration(element, this);
}
/**
* Simple interface for configuration values specific to Mongo repositories.
*
* @author Oliver Gierke
*/
public interface MongoRepositoryConfiguration extends
SingleRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> {
String getMongoTemplateRef();
boolean getCreateQueryIndexes();
}
/**
* Implements manual lookup of the additional attributes.
*
* @author Oliver Gierke
*/
private static class ManualMongoRepositoryConfiguration extends
ManualRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> implements MongoRepositoryConfiguration {
/**
* Creates a new {@link ManualMongoRepositoryConfiguration} for the given {@link Element} and parent.
*
* @param element
* @param parent
*/
public ManualMongoRepositoryConfiguration(Element element, SimpleMongoRepositoryConfiguration parent) {
super(element, parent);
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.mongodb.repository.config.
* SimpleMongoRepositoryConfiguration
* .MongoRepositoryConfiguration#getMongoTemplateRef()
*/
public String getMongoTemplateRef() {
return getAttribute(MONGO_TEMPLATE_REF);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration#getCreateQueryIndexes()
*/
public boolean getCreateQueryIndexes() {
String attribute = getAttribute(CREATE_QUERY_INDEXES);
return attribute == null ? false : Boolean.parseBoolean(attribute);
}
}
/**
* Implements the lookup of the additional attributes during automatic configuration.
*
* @author Oliver Gierke
*/
private static class AutomaticMongoRepositoryConfiguration extends
AutomaticRepositoryConfigInformation<SimpleMongoRepositoryConfiguration> implements MongoRepositoryConfiguration {
/**
* Creates a new {@link AutomaticMongoRepositoryConfiguration} for the given interface and parent.
*
* @param interfaceName
* @param parent
*/
public AutomaticMongoRepositoryConfiguration(String interfaceName, SimpleMongoRepositoryConfiguration parent) {
super(interfaceName, parent);
}
/*
* (non-Javadoc)
*
* @see org.springframework.data.mongodb.repository.config.
* SimpleMongoRepositoryConfiguration
* .MongoRepositoryConfiguration#getMongoTemplateRef()
*/
public String getMongoTemplateRef() {
return getParent().getMongoTemplateRef();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration#getCreateQueryIndexes()
*/
public boolean getCreateQueryIndexes() {
return getParent().getCreateQueryIndexes();
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2011 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -85,7 +85,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
} else if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
} else if (method.isCollectionQuery()) {
return new CollectionExecution().execute(query);
return new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
} else {
@@ -133,12 +133,23 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
class CollectionExecution extends Execution {
private final Pageable pageable;
CollectionExecution(Pageable pageable) {
this.pageable = pageable;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Object execute(Query query) {
if (pageable != null) {
query = applyPagination(query, pageable);
}
return readCollection(query);
}
}
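A hedged sketch of the kind of query method this change affects (entity and repository types are illustrative): a Pageable handed to a List-returning method is now applied to the query instead of being ignored.

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.repository.MongoRepository;

class Person {
	@Id String id;
	String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

	// skip and limit from the Pageable are now applied by CollectionExecution
	List<Person> findByLastname(String lastname, Pageable pageable);
}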

View File

@@ -15,7 +15,11 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -25,6 +29,9 @@ import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import com.mongodb.DBRef;
/**
* Custom {@link ParameterAccessor} that uses a {@link MongoWriter} to serialize parameters into Mongo format.
@@ -158,7 +165,28 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
* @see org.springframework.data.mongodb.repository.ConvertingParameterAccessor.PotentiallyConvertingIterator#nextConverted()
*/
public Object nextConverted(MongoPersistentProperty property) {
return property.isAssociation() ? writer.toDBRef(next(), property) : getConvertedValue(next());
Object next = next();
if (next == null) {
return null;
}
if (property.isAssociation()) {
if (next.getClass().isArray() || next instanceof Iterable) {
List<DBRef> dbRefs = new ArrayList<DBRef>();
for (Object element : asCollection(next)) {
dbRefs.add(writer.toDBRef(element, property));
}
return dbRefs;
} else {
return writer.toDBRef(next, property);
}
}
return getConvertedValue(next);
}
/*
@@ -170,6 +198,33 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
}
}
/**
* Returns the given object as a {@link Collection}. Will create a copy of it if it implements {@link Iterable} or is
* an array. Will return an empty {@link Collection} in case {@literal null} is given. Will wrap all other types into
* a single-element collection.
*
* @param source
* @return
*/
private static Collection<?> asCollection(Object source) {
if (source instanceof Iterable) {
List<Object> result = new ArrayList<Object>();
for (Object element : (Iterable<?>) source) {
result.add(element);
}
return result;
}
if (source == null) {
return Collections.emptySet();
}
return source.getClass().isArray() ? CollectionUtils.arrayToList(source) : Collections.singleton(source);
}
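A hedged sketch of the scenario the new conversion targets (domain and repository types are illustrative): when a query method parameter is bound against a @DBRef association and a collection or array is passed in, each element is now turned into a DBRef individually.

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.repository.MongoRepository;

class User {
	@Id String id;
	@DBRef List<User> friends; // association persisted as DBRefs
}

interface UserRepository extends MongoRepository<User, String> {

	// every element of the given collection is converted into a DBRef before the criteria is built
	List<User> findByFriendsIn(List<User> friends);
}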
/**
* Custom {@link Iterator} that adds a method to access elements in a converted manner.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2010 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -87,9 +87,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#create(org.springframework.data.repository.query.parser.Part, java.util.Iterator)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#create(org.springframework.data.repository.query.parser.Part, java.util.Iterator)
*/
@Override
protected Criteria create(Part part, Iterator<Object> iterator) {
@@ -107,9 +107,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#and(org.springframework.data.repository.query.parser.Part, java.lang.Object, java.util.Iterator)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#and(org.springframework.data.repository.query.parser.Part, java.lang.Object, java.util.Iterator)
*/
@Override
protected Criteria and(Part part, Criteria base, Iterator<Object> iterator) {
@@ -120,20 +120,15 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
MongoPersistentProperty property = path.getLeafProperty();
Criteria criteria = from(part.getType(), property,
where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
return from(part.getType(), property,
base.and(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return criteria.andOperator(criteria);
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.query.parser.AbstractQueryCreator
* #or(java.lang.Object, java.lang.Object)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#or(java.lang.Object, java.lang.Object)
*/
@Override
protected Criteria or(Criteria base, Criteria criteria) {
@@ -142,12 +137,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.query.parser.AbstractQueryCreator
* #complete(java.lang.Object, org.springframework.data.domain.Sort)
*/
* (non-Javadoc)
* @see org.springframework.data.repository.query.parser.AbstractQueryCreator#complete(java.lang.Object, org.springframework.data.domain.Sort)
*/
@Override
protected Query complete(Criteria criteria, Sort sort) {
@@ -159,7 +151,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
QueryUtils.applySorting(query, sort);
if (LOG.isDebugEnabled()) {
LOG.debug("Created query " + query.getQueryObject());
LOG.debug("Created query " + query);
}
return query;
@@ -236,7 +228,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case SIMPLE_PROPERTY:
return criteria.is(parameters.nextConverted(property));
case NEGATING_SIMPLE_PROPERTY:
return criteria.not().is(parameters.nextConverted(property));
return criteria.ne(parameters.nextConverted(property));
}
throw new IllegalArgumentException("Unsupported keyword!");
@@ -290,4 +282,4 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return source.replaceAll("\\*", ".*");
}
}
}
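A hedged illustration of the resulting criteria for a negating simple property (the property name and value are assumptions): the creator now builds an $ne constraint instead of wrapping an equality in $not.

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class NegationExample {

	public static void main(String[] args) {

		// roughly what a derived query such as findByLastnameNot("Matthews") now produces
		Query query = new Query(Criteria.where("lastname").ne("Matthews"));

		System.out.println(query.getQueryObject()); // { "lastname" : { "$ne" : "Matthews"}}
	}
}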

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,19 +17,14 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import java.util.List;
import java.util.regex.Pattern;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.querydsl.EntityPathResolver;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
@@ -37,14 +32,10 @@ import org.springframework.data.querydsl.SimpleEntityPathResolver;
import org.springframework.data.repository.core.EntityMetadata;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbQuery;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.EntityPath;
import com.mysema.query.types.Expression;
import com.mysema.query.types.OrderSpecifier;
import com.mysema.query.types.Path;
import com.mysema.query.types.PathMetadata;
import com.mysema.query.types.Predicate;
import com.mysema.query.types.path.PathBuilder;
@@ -56,7 +47,6 @@ import com.mysema.query.types.path.PathBuilder;
public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleMongoRepository<T, ID> implements
QueryDslPredicateExecutor<T> {
private final MongodbSerializer serializer;
private final PathBuilder<T> builder;
/**
@@ -67,7 +57,6 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
* @param template
*/
public QueryDslMongoRepository(MongoEntityInformation<T, ID> entityInformation, MongoOperations mongoOperations) {
this(entityInformation, mongoOperations, SimpleEntityPathResolver.INSTANCE);
}
@@ -86,40 +75,27 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
Assert.notNull(resolver);
EntityPath<T> path = resolver.createPath(entityInformation.getJavaType());
this.builder = new PathBuilder<T>(path.getType(), path.getMetadata());
this.serializer = new SpringDataMongodbSerializer(mongoOperations.getConverter());
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findOne(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findOne(com.mysema.query.types.Predicate)
*/
public T findOne(Predicate predicate) {
return createQueryFor(predicate).uniqueResult();
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findAll(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate)
*/
public List<T> findAll(Predicate predicate) {
return createQueryFor(predicate).list();
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findAll(com.mysema.query.types.Predicate,
* com.mysema.query.types.OrderSpecifier<?>[])
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, com.mysema.query.types.OrderSpecifier<?>[])
*/
public List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
@@ -128,11 +104,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #findAll(com.mysema.query.types.Predicate,
* org.springframework.data.domain.Pageable)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, org.springframework.data.domain.Pageable)
*/
public Page<T> findAll(Predicate predicate, Pageable pageable) {
@@ -144,13 +116,9 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.mongodb.repository.QueryDslExecutor
* #count(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#count(com.mysema.query.types.Predicate)
*/
public long count(Predicate predicate) {
return createQueryFor(predicate).count();
}
@@ -164,7 +132,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
Class<T> domainType = getEntityInformation().getJavaType();
MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), serializer, domainType);
MongodbQuery<T> query = new SpringDataMongodbQuery<T>(getMongoOperations(), domainType);
return query.where(predicate);
}
@@ -219,40 +187,4 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
return new OrderSpecifier(order.isAscending() ? com.mysema.query.types.Order.ASC
: com.mysema.query.types.Order.DESC, property);
}
/**
* Custom {@link MongodbSerializer} to take mapping information into account when building keys for constraints.
*
* @author Oliver Gierke
*/
static class SpringDataMongodbSerializer extends MongodbSerializer {
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
*
* @param mappingContext
*/
public SpringDataMongodbSerializer(MongoConverter converter) {
this.mappingContext = converter.getMappingContext();
this.converter = converter;
}
@Override
protected String getKeyForPath(Path<?> expr, PathMetadata<?> metadata) {
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}
@Override
protected DBObject asDBObject(String key, Object value) {
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}
}
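A hedged sketch of how the repository is used from client code; the domain type, repository and path names are assumptions. Ordinarily a generated query type (e.g. QPerson) would supply the predicate; a PathBuilder keeps the sketch self-contained.

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;

import com.mysema.query.types.Predicate;
import com.mysema.query.types.path.PathBuilder;

class Person {
	@Id String id;
	String lastname;
}

interface PersonRepository extends MongoRepository<Person, String>, QueryDslPredicateExecutor<Person> {
}

class PredicateClient {

	List<Person> matthews(PersonRepository repository) {

		// the serializer maps property names to field names and converts values on the way out
		Predicate predicate = new PathBuilder<Person>(Person.class, "person").getString("lastname").eq("Matthews");
		return repository.findAll(predicate);
	}
}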

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,6 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mysema.query.mongodb.MongodbQuery;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.EntityPath;
/**
@@ -33,7 +32,6 @@ public abstract class QuerydslRepositorySupport {
private final MongoOperations template;
private final MappingContext<? extends MongoPersistentEntity<?>, ?> context;
private final MongodbSerializer serializer;
/**
* Creates a new {@link QuerydslRepositorySupport} for the given {@link MongoOperations}.
@@ -41,10 +39,11 @@ public abstract class QuerydslRepositorySupport {
* @param operations must not be {@literal null}
*/
public QuerydslRepositorySupport(MongoOperations operations) {
Assert.notNull(operations);
this.template = operations;
this.context = operations.getConverter().getMappingContext();
this.serializer = new MongodbSerializer();
}
/**
@@ -72,6 +71,6 @@ public abstract class QuerydslRepositorySupport {
Assert.notNull(path);
Assert.hasText(collection);
return new SpringDataMongodbQuery<T>(template, serializer, path.getType(), collection);
return new SpringDataMongodbQuery<T>(template, path.getType(), collection);
}
}

View File

@@ -115,8 +115,11 @@ public class SimpleMongoRepository<T, ID extends Serializable> implements MongoR
public boolean exists(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.findOne(new Query(Criteria.where("_id").is(id)), Object.class,
entityInformation.getCollectionName()) != null;
final Query idQuery = getIdQuery(id);
idQuery.fields();
return mongoOperations.findOne(idQuery, entityInformation.getJavaType(), entityInformation.getCollectionName()) != null;
}
/*

View File

@@ -21,7 +21,6 @@ import com.google.common.base.Function;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbQuery;
import com.mysema.query.mongodb.MongodbSerializer;
/**
* Spring Data specific {@link MongodbQuery} implementation.
@@ -36,30 +35,26 @@ class SpringDataMongodbQuery<T> extends MongodbQuery<T> {
* Creates a new {@link SpringDataMongodbQuery}.
*
* @param operations must not be {@literal null}.
* @param serializer must not be {@literal null}.
* @param type must not be {@literal null}.
*/
public SpringDataMongodbQuery(final MongoOperations operations, final MongodbSerializer serializer,
final Class<? extends T> type) {
this(operations, serializer, type, operations.getCollectionName(type));
public SpringDataMongodbQuery(final MongoOperations operations, final Class<? extends T> type) {
this(operations, type, operations.getCollectionName(type));
}
/**
* Creates a new {@link SpringDataMongodbQuery} to query the given collection.
*
* @param operations must not be {@literal null}.
* @param serializer must not be {@literal null}.
* @param type must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
*/
public SpringDataMongodbQuery(final MongoOperations operations, final MongodbSerializer serializer,
final Class<? extends T> type, String collectionName) {
public SpringDataMongodbQuery(final MongoOperations operations, final Class<? extends T> type, String collectionName) {
super(operations.getCollection(collectionName), new Function<DBObject, T>() {
public T apply(DBObject input) {
return operations.getConverter().read(type, input);
}
}, serializer);
}, new SpringDataMongodbSerializer(operations.getConverter()));
this.operations = operations;
}
@@ -72,4 +67,4 @@ class SpringDataMongodbQuery<T> extends MongodbQuery<T> {
protected DBCollection getCollection(Class<?> type) {
return operations.getCollection(operations.getCollectionName(type));
}
}
}

View File

@@ -0,0 +1,83 @@
/*
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import java.util.regex.Pattern;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.Path;
import com.mysema.query.types.PathMetadata;
/**
* Custom {@link MongodbSerializer} to take mapping information into account when building keys for constraints.
*
* @author Oliver Gierke
*/
class SpringDataMongodbSerializer extends MongodbSerializer {
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
*
* @param mappingContext
*/
public SpringDataMongodbSerializer(MongoConverter converter) {
Assert.notNull(converter, "MongoConverter must not be null!");
this.mappingContext = converter.getMappingContext();
this.converter = converter;
this.mapper = new QueryMapper(converter);
}
/*
* (non-Javadoc)
* @see com.mysema.query.mongodb.MongodbSerializer#getKeyForPath(com.mysema.query.types.Path, com.mysema.query.types.PathMetadata)
*/
@Override
protected String getKeyForPath(Path<?> expr, PathMetadata<?> metadata) {
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getExpression().toString());
return property == null ? super.getKeyForPath(expr, metadata) : property.getFieldName();
}
/*
* (non-Javadoc)
* @see com.mysema.query.mongodb.MongodbSerializer#asDBObject(java.lang.String, java.lang.Object)
*/
@Override
protected DBObject asDBObject(String key, Object value) {
if ("_id".equals(key)) {
return super.asDBObject(key, mapper.convertId(value));
}
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}

View File

@@ -14,7 +14,7 @@
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>

View File

@@ -103,15 +103,6 @@ The Mongo URI string.]]></xsd:documentation>
</xsd:complexType>
</xsd:element>
<xsd:complexType name="mongo-repository">
<xsd:complexContent>
<xsd:extension base="repository:repository">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
@@ -134,9 +125,6 @@ The Mongo URI string.]]></xsd:documentation>
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:sequence>
<xsd:element name="repository" minOccurs="0" maxOccurs="unbounded" type="mongo-repository"/>
</xsd:sequence>
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>

View File

@@ -0,0 +1,97 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.mapping.Document;
import com.mongodb.Mongo;
/**
* Unit tests for {@link AbstractMongoConfiguration}.
*
* @author Oliver Gierke
*/
public class AbstractMongoConfigurationUnitTests {
/**
* @see DATAMONGO-496
*/
@Test
public void usesConfigClassPackageAsBaseMappingPackage() throws ClassNotFoundException {
AbstractMongoConfiguration configuration = new SampleMongoConfiguration();
assertThat(configuration.getMappingBasePackage(), is(SampleMongoConfiguration.class.getPackage().getName()));
assertThat(configuration.getInitialEntitySet(), hasSize(1));
assertThat(configuration.getInitialEntitySet(), hasItem(Entity.class));
}
/**
* @see DATAMONGO-496
*/
@Test
public void doesNotScanPackageIfMappingPackageIsNull() throws ClassNotFoundException {
assertScanningDisabled(null);
}
/**
* @see DATAMONGO-496
*/
@Test
public void doesNotScanPackageIfMappingPackageIsEmpty() throws ClassNotFoundException {
assertScanningDisabled("");
assertScanningDisabled(" ");
}
private static void assertScanningDisabled(final String value) throws ClassNotFoundException {
AbstractMongoConfiguration configuration = new SampleMongoConfiguration() {
@Override
protected String getMappingBasePackage() {
return value;
}
};
assertThat(configuration.getMappingBasePackage(), is(value));
assertThat(configuration.getInitialEntitySet(), hasSize(0));
}
static class SampleMongoConfiguration extends AbstractMongoConfiguration {
@Override
protected String getDatabaseName() {
return "database";
}
@Bean
@Override
public Mongo mongo() throws Exception {
return new Mongo();
}
}
@Document
static class Entity {
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.CommandResult;
import com.mongodb.Mongo;
@@ -36,47 +38,45 @@ import com.mongodb.ServerAddress;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
public class MongoNamespaceReplicaSetTests {
@Autowired
private ApplicationContext ctx;
@Test
@SuppressWarnings("unchecked")
public void testParsingMongoWithReplicaSets() throws Exception {
assertTrue(ctx.containsBean("replicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&replicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("127.0.0.1", replicaSetSeeds.get(0).getHost());
assertEquals(10001, replicaSetSeeds.get(0).getPort());
assertEquals("localhost", replicaSetSeeds.get(1).getHost());
assertEquals(10002, replicaSetSeeds.get(1).getPort());
List<ServerAddress> replicaSetSeeds = (List<ServerAddress>) ReflectionTestUtils.getField(mfb, "replicaSetSeeds");
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasItems(new ServerAddress("127.0.0.1", 10001), new ServerAddress("localhost", 10002)));
}
@Test
@SuppressWarnings("unchecked")
public void testParsingWithPropertyPlaceHolder() throws Exception {
assertTrue(ctx.containsBean("manyReplicaSetMongo"));
MongoFactoryBean mfb = (MongoFactoryBean) ctx.getBean("&manyReplicaSetMongo");
List<ServerAddress> replicaSetSeeds = readField("replicaSetSeeds", mfb);
assertNotNull(replicaSetSeeds);
assertEquals("192.168.174.130", replicaSetSeeds.get(0).getHost());
assertEquals(27017, replicaSetSeeds.get(0).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(1).getHost());
assertEquals(27018, replicaSetSeeds.get(1).getPort());
assertEquals("192.168.174.130", replicaSetSeeds.get(2).getHost());
assertEquals(27019, replicaSetSeeds.get(2).getPort());
List<ServerAddress> replicaSetSeeds = (List<ServerAddress>) ReflectionTestUtils.getField(mfb, "replicaSetSeeds");
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasSize(3));
assertThat(
replicaSetSeeds,
hasItems(new ServerAddress("192.168.174.130", 27017), new ServerAddress("192.168.174.130", 27018),
new ServerAddress("192.168.174.130", 27019)));
}
@Test
@Ignore("CI infrastructure does not yet support replica sets")
public void testMongoWithReplicaSets() {
Mongo mongo = ctx.getBean(Mongo.class);
assertEquals(2, mongo.getAllAddress().size());
List<ServerAddress> servers = mongo.getAllAddress();
@@ -88,6 +88,5 @@ public class MongoNamespaceReplicaSetTests extends NamespaceTestSupport {
MongoTemplate template = new MongoTemplate(mongo, "admin");
CommandResult result = template.executeCommand("{replSetGetStatus : 1}");
assertEquals("blort", result.getString("set"));
}
}

View File

@@ -1,42 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.reflect.Field;
public class NamespaceTestSupport {
@SuppressWarnings({ "unchecked" })
public static <T> T readField(String name, Object target) throws Exception {
Field field = null;
Class<?> clazz = target.getClass();
do {
try {
field = clazz.getDeclaredField(name);
} catch (Exception ex) {
}
clazz = clazz.getSuperclass();
} while (field == null && !clazz.equals(Object.class));
if (field == null)
throw new IllegalArgumentException("Cannot find field '" + name + "' in the class hierarchy of "
+ target.getClass());
field.setAccessible(true);
return (T) field.get(target);
}
}

View File

@@ -0,0 +1,80 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.ServerAddress;
/**
* Unit tests for {@link ServerAddressPropertyEditor}.
*
* @author Oliver Gierke
*/
public class ServerAddressPropertyEditorUnitTests {
ServerAddressPropertyEditor editor;
@Before
public void setUp() {
editor = new ServerAddressPropertyEditor();
}
/**
* @see DATAMONGO-454
*/
@Test(expected = IllegalArgumentException.class)
public void rejectsAddressConfigWithoutASingleParsableServerAddress() {
editor.setAsText("foo, bar");
}
/**
* @see DATAMONGO-454
*/
@Test
public void skipsUnparsableAddressIfAtLeastOneIsParsable() throws UnknownHostException {
editor.setAsText("foo, localhost");
assertSingleAddressOfLocalhost(editor.getValue());
}
/**
* @see DATAMONGO-454
*/
@Test
public void handlesEmptyAddressAsParseError() throws UnknownHostException {
editor.setAsText(", localhost");
assertSingleAddressOfLocalhost(editor.getValue());
}
private static void assertSingleAddressOfLocalhost(Object result) throws UnknownHostException {
assertThat(result, is(instanceOf(ServerAddress[].class)));
Collection<ServerAddress> addresses = Arrays.asList((ServerAddress[]) result);
assertThat(addresses, hasSize(1));
assertThat(addresses, hasItem(new ServerAddress("localhost")));
}
}

View File

@@ -0,0 +1,81 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Helper class to ease assertions on {@link DBObject}s.
*
* @author Oliver Gierke
*/
public abstract class DBObjectUtils {
private DBObjectUtils() {
}
/**
* Expects the field with the given key to be a non-{@literal null} {@link DBObject} and returns it.
*
* @param source the {@link DBObject} to look up the nested {@link DBObject} in
* @param key the key of the field holding the nested {@link DBObject}
* @return
*/
public static DBObject getAsDBObject(DBObject source, String key) {
return getTypedValue(source, key, DBObject.class);
}
/**
* Expects the field with the given key to be a non-{@literal null} {@link BasicDBList} and returns it.
*
* @param source the {@link DBObject} to look up the {@link BasicDBList} in
* @param key the key of the field to find the {@link BasicDBList} in
* @return
*/
public static BasicDBList getAsDBList(DBObject source, String key) {
return getTypedValue(source, key, BasicDBList.class);
}
/**
* Expects the list element with the given index to be a non-{@literal null} {@link DBObject} and returns it.
*
* @param source the {@link BasicDBList} to look up the {@link DBObject} element in
* @param index the index of the element expected to contain a {@link DBObject}
* @return
*/
public static DBObject getAsDBObject(BasicDBList source, int index) {
assertThat(source.size(), greaterThanOrEqualTo(index + 1));
Object value = source.get(index);
assertThat(value, is(instanceOf(DBObject.class)));
return (DBObject) value;
}
@SuppressWarnings("unchecked")
private static <T> T getTypedValue(DBObject source, String key, Class<T> type) {
Object value = source.get(key);
assertThat(value, is(notNullValue()));
assertThat(value, is(instanceOf(type)));
return (T) value;
}
}
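A hedged usage sketch from a hypothetical test; the queried document structure is made up.

import static org.springframework.data.mongodb.core.DBObjectUtils.*;

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class DBObjectUtilsClient {

	void assertOrStructure() {

		BasicDBList ors = new BasicDBList();
		ors.add(new BasicDBObject("lastname", "Matthews"));
		DBObject query = new BasicDBObject("$or", ors);

		// each accessor fails the test with a descriptive assertion error if the structure is off
		BasicDBList list = getAsDBList(query, "$or");
		DBObject first = getAsDBObject(list, 0);
		System.out.println(first);
	}
}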

View File

@@ -0,0 +1,65 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
*/
public class MongoDbUtilsUnitTests {
Mongo mongo;
@Before
public void setUp() throws Exception {
this.mongo = new Mongo();
TransactionSynchronizationManager.initSynchronization();
}
@After
public void tearDown() {
for (Object key : TransactionSynchronizationManager.getResourceMap().keySet()) {
TransactionSynchronizationManager.unbindResource(key);
}
TransactionSynchronizationManager.clearSynchronization();
}
@Test
public void returnsNewInstanceForDifferentDatabaseName() {
DB first = MongoDbUtils.getDB(mongo, "first");
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(first));
DB second = MongoDbUtils.getDB(mongo, "second");
assertThat(second, is(not(first)));
assertThat(MongoDbUtils.getDB(mongo, "second"), is(second));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,7 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
@@ -50,6 +51,7 @@ import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.mongodb.core.query.Query;
@@ -72,6 +74,7 @@ import com.mongodb.WriteResult;
*
* @author Oliver Gierke
* @author Thomas Risberg
* @author Amol Nayak
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
@@ -81,6 +84,7 @@ public class MongoTemplateTests {
MongoTemplate template;
@Autowired
MongoDbFactory factory;
MongoTemplate mappingTemplate;
@Rule
@@ -100,7 +104,7 @@ public class MongoTemplateTests {
PersonWithIdPropertyOfPrimitiveInt.class, PersonWithIdPropertyOfTypeLong.class,
PersonWithIdPropertyOfPrimitiveLong.class)));
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
mappingContext.initialize();
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.setCustomConversions(conversions);
@@ -120,18 +124,19 @@ public class MongoTemplateTests {
}
protected void cleanDb() {
template.dropCollection(template.getCollectionName(Person.class));
template.dropCollection(template.getCollectionName(PersonWithAList.class));
template.dropCollection(template.getCollectionName(PersonWith_idPropertyOfTypeObjectId.class));
template.dropCollection(template.getCollectionName(PersonWith_idPropertyOfTypeString.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeObjectId.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeString.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeInteger.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfPrimitiveInt.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfTypeLong.class));
template.dropCollection(template.getCollectionName(PersonWithIdPropertyOfPrimitiveLong.class));
template.dropCollection(template.getCollectionName(TestClass.class));
template.dropCollection(Person.class);
template.dropCollection(PersonWithAList.class);
template.dropCollection(PersonWith_idPropertyOfTypeObjectId.class);
template.dropCollection(PersonWith_idPropertyOfTypeString.class);
template.dropCollection(PersonWithIdPropertyOfTypeObjectId.class);
template.dropCollection(PersonWithIdPropertyOfTypeString.class);
template.dropCollection(PersonWithIdPropertyOfTypeInteger.class);
template.dropCollection(PersonWithIdPropertyOfPrimitiveInt.class);
template.dropCollection(PersonWithIdPropertyOfTypeLong.class);
template.dropCollection(PersonWithIdPropertyOfPrimitiveLong.class);
template.dropCollection(TestClass.class);
template.dropCollection(Sample.class);
template.dropCollection(MyPerson.class);
}
@Test
@@ -161,6 +166,112 @@ public class MongoTemplateTests {
mongoTemplate.updateFirst(q, u, Person.class);
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForDuplicateIds() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
template.insert(person);
try {
template.insert(person);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(), containsString("E11000 duplicate key error index: database.person.$_id_ dup key:"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForUpdateWithInvalidPushOperator() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
ObjectId id = new ObjectId();
Person person = new Person(id, "Amol");
person.setAge(28);
template.insert(person);
try {
Query query = new Query(Criteria.where("firstName").is("Amol"));
Update upd = new Update().push("age", 29);
template.updateFirst(query, upd, Person.class);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(),
is("Execution of update with '{ \"$push\" : { \"age\" : 29}}'' using '{ \"firstName\" : \"Amol\"}' "
+ "query failed: Cannot apply $push/$pushAll modifier to non-array"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForIndexViolationIfConfigured() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
template.indexOps(Person.class).ensureIndex(new Index().on("firstName", Order.DESCENDING).unique());
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
template.save(person);
person = new Person(new ObjectId(), "Amol");
person.setAge(28);
try {
template.save(person);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(),
containsString("E11000 duplicate key error index: database.person.$firstName_-1 dup key:"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void rejectsDuplicateIdInInsertAll() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
ObjectId id = new ObjectId();
Person person = new Person(id, "Amol");
person.setAge(28);
List<Person> records = new ArrayList<Person>();
records.add(person);
records.add(person);
try {
template.insertAll(records);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(
e.getMessage(),
startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
}
}
@Test
public void testEnsureIndex() throws Exception {
@@ -191,7 +302,7 @@ public class MongoTemplateTests {
assertThat(dropDupes, is(true));
List<IndexInfo> indexInfoList = template.indexOps(Person.class).getIndexInfo();
System.out.println(indexInfoList);
assertThat(indexInfoList.size(), is(2));
IndexInfo ii = indexInfoList.get(1);
assertThat(ii.isUnique(), is(true));
@@ -911,7 +1022,6 @@ public class MongoTemplateTests {
assertThat(lastMongoAction.getEntityClass().toString(), is(PersonWithIdPropertyOfTypeObjectId.class.toString()));
assertThat(lastMongoAction.getMongoActionOperation(), is(MongoActionOperation.UPDATE));
assertThat(lastMongoAction.getQuery(), equalTo(q.getQueryObject()));
assertThat(lastMongoAction.getDocument(), equalTo(u.getUpdateObject()));
}
@@ -938,7 +1048,7 @@ public class MongoTemplateTests {
DBRef first = new DBRef(factory.getDb(), "foo", new ObjectId());
DBRef second = new DBRef(factory.getDb(), "bar", new ObjectId());
template.updateFirst(null, Update.update("dbRefs", Arrays.asList(first, second)), ClassWithDBRefs.class);
template.updateFirst(null, update("dbRefs", Arrays.asList(first, second)), ClassWithDBRefs.class);
}
class ClassWithDBRefs {
@@ -1101,12 +1211,70 @@ public class MongoTemplateTests {
template.save(second);
Query query = query(where("field").not().regex("Matthews"));
System.out.println(query.getQueryObject());
List<Sample> result = template.find(query, Sample.class);
assertThat(result.size(), is(1));
assertThat(result.get(0).field, is("Beauford"));
}
/**
* @see DATAMONGO-447
*/
@Test
public void storesAndRemovesTypeWithComplexId() {
MyId id = new MyId();
id.first = "foo";
id.second = "bar";
TypeWithMyId source = new TypeWithMyId();
source.id = id;
template.save(source);
template.remove(query(where("id").is(id)), TypeWithMyId.class);
}
/**
* @see DATAMONGO-506
*/
@Test
public void executesBasicQueryCorrectly() {
Address address = new Address();
address.state = "PA";
address.city = "Philadelphia";
MyPerson person = new MyPerson();
person.name = "Oleg";
person.address = address;
template.save(person);
Query query = new BasicQuery("{'name' : 'Oleg'}");
List<MyPerson> result = template.find(query, MyPerson.class);
assertThat(result, hasSize(1));
assertThat(result.get(0), hasProperty("name", is("Oleg")));
query = new BasicQuery("{'address.state' : 'PA' }");
result = template.find(query, MyPerson.class);
assertThat(result, hasSize(1));
assertThat(result.get(0), hasProperty("name", is("Oleg")));
}
static class MyId {
String first;
String second;
}
static class TypeWithMyId {
@Id
MyId id;
}
public static class Sample {
@Id
@@ -1141,4 +1309,21 @@ public class MongoTemplateTests {
return source == null ? null : new DateTime(source.getTime());
}
}
public static class MyPerson {
String id;
String name;
Address address;
public String getName() {
return name;
}
}
static class Address {
String state;
String city;
}
}
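A minimal sketch, outside the diff above, of the WriteResultChecking behaviour the DATAMONGO-480 tests exercise; the helper class and method are made up for illustration, while Person, MongoDbFactory and WriteResultChecking are the types the tests already use.
import org.bson.types.ObjectId;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.WriteResultChecking;

class WriteResultCheckingSketch {

    // Inserts the same document twice; with EXCEPTION checking the E11000 duplicate key
    // error surfaces as a DataIntegrityViolationException instead of being ignored.
    void insertTwice(MongoDbFactory factory) {
        MongoTemplate template = new MongoTemplate(factory);
        template.setWriteResultChecking(WriteResultChecking.EXCEPTION);

        Person person = new Person(new ObjectId(), "Amol");
        template.insert(person);

        try {
            template.insert(person); // same _id again
        } catch (DataIntegrityViolationException e) {
            // expected: the duplicate key violation is translated by the template
        }
    }
}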


@@ -22,6 +22,7 @@ import static org.mockito.Mockito.*;
import java.math.BigInteger;
import java.util.Collections;
import java.util.regex.Pattern;
import org.bson.types.ObjectId;
import org.junit.Before;
@@ -172,6 +173,31 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
verify(collection, times(1)).update(Mockito.any(DBObject.class), eq(reference), anyBoolean(), anyBoolean());
}
/**
* @see DATAMONGO-474
*/
@Test
public void setsUnpopulatedIdField() {
NotAutogenerateableId entity = new NotAutogenerateableId();
template.populateIdIfNecessary(entity, 5);
assertThat(entity.id, is(5));
}
/**
* @see DATAMONGO-474
*/
@Test
public void doesNotSetAlreadyPopulatedId() {
NotAutogenerateableId entity = new NotAutogenerateableId();
entity.id = 5;
template.populateIdIfNecessary(entity, 7);
assertThat(entity.id, is(5));
}
class AutogenerateableId {
@Id
@@ -182,6 +208,10 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Id
Integer id;
public Pattern getId() {
return Pattern.compile(".");
}
}
enum MyConverter implements Converter<AutogenerateableId, String> {


@@ -0,0 +1,66 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.SerializationUtils.*;
import java.util.Arrays;
import org.hamcrest.Matcher;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link SerializationUtils}.
*
* @author Oliver Gierke
*/
public class SerializationUtilsUnitTests {
@Test
public void writesSimpleDBObject() {
DBObject dbObject = new BasicDBObject("foo", "bar");
assertThat(serializeToJsonSafely(dbObject), is("{ \"foo\" : \"bar\"}"));
}
@Test
public void writesComplexObjectAsPlainToString() {
DBObject dbObject = new BasicDBObject("foo", new Complex());
assertThat(serializeToJsonSafely(dbObject),
startsWith("{ \"foo\" : { $java : org.springframework.data.mongodb.core.SerializationUtilsUnitTests$Complex"));
}
@Test
public void writesCollection() {
DBObject dbObject = new BasicDBObject("foo", Arrays.asList("bar", new Complex()));
Matcher<String> expectedOutput = allOf(
startsWith("{ \"foo\" : [ \"bar\", { $java : org.springframework.data.mongodb.core.SerializationUtilsUnitTests$Complex"),
endsWith(" } ] }"));
assertThat(serializeToJsonSafely(dbObject), is(expectedOutput));
}
static class Complex {
}
}
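A short usage sketch (an assumption, not taken from this diff): serializeToJsonSafely(…) lends itself to building readable error messages from arbitrary DBObjects, falling back to the $java representation shown above when plain JSON serialization is not possible.
import static org.springframework.data.mongodb.core.SerializationUtils.serializeToJsonSafely;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class SerializationUtilsSketch {

    // Renders both documents safely, even if they contain values the JSON serializer
    // cannot handle; such values are printed via the "$java : ..." fallback.
    String describeFailedUpdate(DBObject update, DBObject query) {
        return String.format("Execution of update with '%s' using '%s' query failed",
                serializeToJsonSafely(update), serializeToJsonSafely(query));
    }

    String example() {
        DBObject query = new BasicDBObject("firstName", "Amol");
        DBObject update = new BasicDBObject("$push", new BasicDBObject("age", 29));
        return describeFailedUpdate(update, query);
    }
}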


@@ -21,7 +21,7 @@ public class TestMongoConfiguration extends AbstractMongoConfiguration {
@Override
@Bean
public Mongo mongo() throws Exception {
return new Mongo("localhost", 27017);
return new Mongo("127.0.0.1", 27017);
}
@Override


@@ -3,6 +3,7 @@ package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.URL;
import java.text.DateFormat;
import java.text.Format;
import java.util.Arrays;
@@ -152,6 +153,25 @@ public class CustomConversionsUnitTests {
assertThat(conversions.isSimpleType(Binary.class), is(true));
}
/**
* @see DATAMONGO-462
*/
@Test
public void hasWriteConverterForURL() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.hasCustomWriteTarget(URL.class), is(true));
}
/**
* @see DATAMONGO-462
*/
@Test
public void readTargetForURL() {
CustomConversions conversions = new CustomConversions();
assertThat(conversions.hasCustomReadTarget(String.class, URL.class), is(true));
}
enum FormatToStringConverter implements Converter<Format, String> {
INSTANCE;


@@ -15,13 +15,12 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import java.util.Arrays;
import java.util.HashSet;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.hamcrest.CoreMatchers;
import org.junit.Assert;
import org.junit.Before;
@@ -32,11 +31,12 @@ import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Test case to verify correct usage of custom {@link Converter} implementations.
*
@@ -71,7 +71,7 @@ public class CustomConvertersUnitTests {
context = new MongoMappingContext();
context.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(Foo.class, Bar.class)));
context.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
context.afterPropertiesSet();
context.initialize();
converter = new MappingMongoConverter(mongoDbFactory, context);
converter.setCustomConversions(conversions);


@@ -46,8 +46,6 @@ public class DataMongo273Tests {
public void setupMongoConv() {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.afterPropertiesSet();
MongoDbFactory factory = mock(MongoDbFactory.class);
converter = new MappingMongoConverter(factory, mappingContext);
@@ -186,4 +184,4 @@ public class DataMongo273Tests {
return boxes;
}
}
}
}


@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,14 +13,15 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
@@ -35,6 +36,7 @@ import java.util.Set;
import java.util.SortedMap;
import org.bson.types.ObjectId;
import org.hamcrest.Matcher;
import org.joda.time.LocalDate;
import org.junit.Before;
import org.junit.Test;
@@ -42,6 +44,8 @@ import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
@@ -57,6 +61,7 @@ import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -72,12 +77,15 @@ public class MappingMongoConverterUnitTests {
MongoMappingContext mappingContext;
@Mock
MongoDbFactory factory;
@Mock
ApplicationContext context;
@Before
public void setUp() {
mappingContext = new MongoMappingContext();
mappingContext.afterPropertiesSet();
mappingContext.setApplicationContext(context);
mappingContext.onApplicationEvent(new ContextRefreshedEvent(context));
converter = new MappingMongoConverter(factory, mappingContext);
converter.afterPropertiesSet();
@@ -118,12 +126,30 @@ public class MappingMongoConverterUnitTests {
DBObject dbObject = new BasicDBObject();
converter.write(person, dbObject);
assertThat(dbObject.get("birthDate"), is(Date.class));
assertThat(dbObject.get("birthDate"), is(instanceOf(Date.class)));
Person result = converter.read(Person.class, dbObject);
assertThat(result.birthDate, is(notNullValue()));
}
@Test
public void convertsCustomTypeOnConvertToMongoType() {
List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
converters.add(new LocalDateToDateConverter());
converters.add(new DateToLocalDateConverter());
CustomConversions conversions = new CustomConversions(converters);
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
converter = new MappingMongoConverter(factory, mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();
LocalDate date = new LocalDate();
converter.convertToMongoType(date);
}
/**
* @see DATAMONGO-130
*/
@@ -161,7 +187,7 @@ public class MappingMongoConverterUnitTests {
dbObject.put("birthDate", new LocalDate());
dbObject.put(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Person.class.getName());
assertThat(converter.read(Contact.class, dbObject), is(Person.class));
assertThat(converter.read(Contact.class, dbObject), is(instanceOf(Person.class)));
}
/**
@@ -174,14 +200,13 @@ public class MappingMongoConverterUnitTests {
dbObject.put("birthDate", new LocalDate());
dbObject.put(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Person.class.getName());
assertThat(converter.read(BirthDateContainer.class, dbObject), is(BirthDateContainer.class));
assertThat(converter.read(BirthDateContainer.class, dbObject), is(instanceOf(BirthDateContainer.class)));
}
@Test
public void writesTypeDiscriminatorIntoRootObject() {
Person person = new Person();
person.birthDate = new LocalDate();
DBObject result = new BasicDBObject();
converter.write(person, result);
@@ -202,7 +227,7 @@ public class MappingMongoConverterUnitTests {
DBObject result = new BasicDBObject();
converter.write(value, result);
assertThat(result.get("sampleEnum"), is(String.class));
assertThat(result.get("sampleEnum"), is(instanceOf(String.class)));
assertThat(result.get("sampleEnum").toString(), is("FIRST"));
}
@@ -218,7 +243,7 @@ public class MappingMongoConverterUnitTests {
DBObject result = new BasicDBObject();
converter.write(value, result);
assertThat(result.get("enums"), is(BasicDBList.class));
assertThat(result.get("enums"), is(instanceOf(BasicDBList.class)));
BasicDBList enums = (BasicDBList) result.get("enums");
assertThat(enums.size(), is(1));
@@ -248,7 +273,7 @@ public class MappingMongoConverterUnitTests {
ClassWithEnumProperty result = converter.read(ClassWithEnumProperty.class, dbObject);
assertThat(result.enums, is(List.class));
assertThat(result.enums, is(instanceOf(List.class)));
assertThat(result.enums.size(), is(1));
assertThat(result.enums, hasItem(SampleEnum.FIRST));
}
@@ -302,8 +327,8 @@ public class MappingMongoConverterUnitTests {
*/
@Test
public void writesCollectionWithInterfaceCorrectly() {
Person person = new Person();
person.birthDate = new LocalDate();
person.firstname = "Oliver";
CollectionWrapper wrapper = new CollectionWrapper();
@@ -313,7 +338,7 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, dbObject);
Object result = dbObject.get("contacts");
assertThat(result, is(BasicDBList.class));
assertThat(result, is(instanceOf(BasicDBList.class)));
BasicDBList contacts = (BasicDBList) result;
DBObject personDbObject = (DBObject) contacts.get(0);
assertThat(personDbObject.get("foo").toString(), is("Oliver"));
@@ -336,9 +361,8 @@ public class MappingMongoConverterUnitTests {
assertThat(result.contacts, is(notNullValue()));
assertThat(result.contacts.size(), is(1));
Contact contact = result.contacts.get(0);
assertThat(contact, is(Person.class));
assertThat(contact, is(instanceOf(Person.class)));
assertThat(((Person) contact).firstname, is("Oliver"));
}
@Test
@@ -350,7 +374,7 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, dbObject);
Object localeField = dbObject.get("locale");
assertThat(localeField, is(String.class));
assertThat(localeField, is(instanceOf(String.class)));
assertThat((String) localeField, is("en_US"));
LocaleWrapper read = converter.read(LocaleWrapper.class, dbObject);
@@ -458,13 +482,13 @@ public class MappingMongoConverterUnitTests {
DBObject dbo1 = new BasicDBObject();
converter.write(p1, dbo1);
assertThat(dbo1.get("_id"), is(String.class));
assertThat(dbo1.get("_id"), is(instanceOf(String.class)));
PersonPojoStringId p2 = new PersonPojoStringId(new ObjectId().toString(), "Text-1");
DBObject dbo2 = new BasicDBObject();
converter.write(p2, dbo2);
assertThat(dbo2.get("_id"), is(ObjectId.class));
assertThat(dbo2.get("_id"), is(instanceOf(ObjectId.class)));
}
/**
@@ -478,8 +502,8 @@ public class MappingMongoConverterUnitTests {
ClassWithSortedMap result = converter.read(ClassWithSortedMap.class, wrapper);
assertThat(result, is(ClassWithSortedMap.class));
assertThat(result.map, is(SortedMap.class));
assertThat(result, is(instanceOf(ClassWithSortedMap.class)));
assertThat(result.map, is(instanceOf(SortedMap.class)));
}
/**
@@ -745,7 +769,7 @@ public class MappingMongoConverterUnitTests {
assertThat(result.containsField("Foo"), is(true));
assertThat(result.get("Foo"), is(notNullValue()));
assertThat(result.get("Foo"), is(BasicDBList.class));
assertThat(result.get("Foo"), is(instanceOf(BasicDBList.class)));
BasicDBList list = (BasicDBList) result.get("Foo");
@@ -796,11 +820,11 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, result);
Object mapObject = result.get("mapOfObjects");
assertThat(mapObject, is(BasicDBObject.class));
assertThat(mapObject, is(instanceOf(BasicDBObject.class)));
DBObject map = (DBObject) mapObject;
Object valueObject = map.get("foo");
assertThat(valueObject, is(BasicDBList.class));
assertThat(valueObject, is(instanceOf(BasicDBList.class)));
List<Object> list = (List<Object>) valueObject;
assertThat(list.size(), is(1));
@@ -891,7 +915,7 @@ public class MappingMongoConverterUnitTests {
converter.write(wrapper, result);
Object contacts = result.get("contacts");
assertThat(contacts, is(Collection.class));
assertThat(contacts, is(instanceOf(Collection.class)));
assertThat(((Collection<?>) contacts).size(), is(2));
assertThat(((Collection<Object>) contacts), hasItem(nullValue()));
}
@@ -954,7 +978,7 @@ public class MappingMongoConverterUnitTests {
Item read = converter.read(Item.class, result);
assertThat(read.attributes.size(), is(1));
assertThat(read.attributes.get(0).key, is(attribute.key));
assertThat(read.attributes.get(0).value, is(Collection.class));
assertThat(read.attributes.get(0).value, is(instanceOf(Collection.class)));
@SuppressWarnings("unchecked")
Collection<String> values = (Collection<String>) read.attributes.get(0).value;
@@ -1013,11 +1037,11 @@ public class MappingMongoConverterUnitTests {
address.street = "Foo";
Object result = converter.convertToMongoType(Collections.singleton(address));
assertThat(result, is(BasicDBList.class));
assertThat(result, is(instanceOf(BasicDBList.class)));
Set<?> readResult = converter.read(Set.class, (BasicDBList) result);
assertThat(readResult.size(), is(1));
assertThat(readResult.iterator().next(), is(Map.class));
assertThat(readResult.iterator().next(), is(instanceOf(Map.class)));
}
/**
@@ -1065,6 +1089,181 @@ public class MappingMongoConverterUnitTests {
assertThat(dbRef.getRef(), is("person"));
}
/**
* @see DATAMONGO-458
*/
@Test
public void readEmptyCollectionIsModifiable() {
DBObject dbObject = new BasicDBObject("contactsSet", new BasicDBList());
CollectionWrapper wrapper = converter.read(CollectionWrapper.class, dbObject);
assertThat(wrapper.contactsSet, is(notNullValue()));
wrapper.contactsSet.add(new Contact() {
});
}
/**
* @see DATAMONGO-424
*/
@Test
public void readsPlainDBRefObject() {
DBRef dbRef = new DBRef(mock(DB.class), "foo", 2);
DBObject dbObject = new BasicDBObject("ref", dbRef);
DBRefWrapper result = converter.read(DBRefWrapper.class, dbObject);
assertThat(result.ref, is(dbRef));
}
/**
* @see DATAMONGO-424
*/
@Test
public void readsCollectionOfDBRefs() {
DBRef dbRef = new DBRef(mock(DB.class), "foo", 2);
BasicDBList refs = new BasicDBList();
refs.add(dbRef);
DBObject dbObject = new BasicDBObject("refs", refs);
DBRefWrapper result = converter.read(DBRefWrapper.class, dbObject);
assertThat(result.refs, hasSize(1));
assertThat(result.refs, hasItem(dbRef));
}
/**
* @see DATAMONGO-424
*/
@Test
public void readsDBRefMap() {
DBRef dbRef = mock(DBRef.class);
BasicDBObject refMap = new BasicDBObject("foo", dbRef);
DBObject dbObject = new BasicDBObject("refMap", refMap);
DBRefWrapper result = converter.read(DBRefWrapper.class, dbObject);
assertThat(result.refMap.entrySet(), hasSize(1));
assertThat(result.refMap.values(), hasItem(dbRef));
}
/**
* @see DATAMONGO-424
*/
@Test
@SuppressWarnings({ "rawtypes", "unchecked" })
public void resolvesDBRefMapValue() {
DBRef dbRef = mock(DBRef.class);
when(dbRef.fetch()).thenReturn(new BasicDBObject());
BasicDBObject refMap = new BasicDBObject("foo", dbRef);
DBObject dbObject = new BasicDBObject("personMap", refMap);
DBRefWrapper result = converter.read(DBRefWrapper.class, dbObject);
Matcher isPerson = instanceOf(Person.class);
assertThat(result.personMap.entrySet(), hasSize(1));
assertThat(result.personMap.values(), hasItem(isPerson));
}
/**
* @see DATAMONGO-462
*/
@Test
public void writesURLsAsStringOutOfTheBox() throws Exception {
URLWrapper wrapper = new URLWrapper();
wrapper.url = new URL("http://springsource.org");
DBObject sink = new BasicDBObject();
converter.write(wrapper, sink);
assertThat(sink.get("url"), is((Object) "http://springsource.org"));
}
/**
* @see DATAMONGO-462
*/
@Test
public void readsURLFromStringOutOfTheBox() throws Exception {
DBObject dbObject = new BasicDBObject("url", "http://springsource.org");
URLWrapper result = converter.read(URLWrapper.class, dbObject);
assertThat(result.url, is(new URL("http://springsource.org")));
}
/**
* @see DATAMONGO-485
*/
@Test
public void writesComplexIdCorrectly() {
ComplexId id = new ComplexId();
id.innerId = 4711L;
ClassWithComplexId entity = new ClassWithComplexId();
entity.complexId = id;
DBObject dbObject = new BasicDBObject();
converter.write(entity, dbObject);
Object idField = dbObject.get("_id");
assertThat(idField, is(notNullValue()));
assertThat(idField, is(instanceOf(DBObject.class)));
assertThat(((DBObject) idField).get("innerId"), is((Object) 4711L));
}
/**
* @see DATAMONGO-485
*/
@Test
public void readsComplexIdCorrectly() {
DBObject innerId = new BasicDBObject("innerId", 4711L);
DBObject entity = new BasicDBObject("_id", innerId);
ClassWithComplexId result = converter.read(ClassWithComplexId.class, entity);
assertThat(result.complexId, is(notNullValue()));
assertThat(result.complexId.innerId, is(4711L));
}
/**
* @see DATAMONGO-489
*/
@Test
public void readsArraysAsMapValuesCorrectly() {
BasicDBList list = new BasicDBList();
list.add("Foo");
list.add("Bar");
DBObject map = new BasicDBObject("key", list);
DBObject wrapper = new BasicDBObject("mapOfStrings", map);
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, wrapper);
assertThat(result.mapOfStrings, is(notNullValue()));
String[] values = result.mapOfStrings.get("key");
assertThat(values, is(notNullValue()));
assertThat(values, is(arrayWithSize(2)));
}
/**
* @see DATAMONGO-497
*/
@Test
public void readsEmptyCollectionIntoConstructorCorrectly() {
DBObject source = new BasicDBObject("attributes", new BasicDBList());
TypWithCollectionConstructor result = converter.read(TypWithCollectionConstructor.class, source);
assertThat(result.attributes, is(notNullValue()));
}
private static void assertSyntheticFieldValueOf(Object target, Object expected) {
for (int i = 0; i < 10; i++) {
@@ -1079,6 +1278,20 @@ public class MappingMongoConverterUnitTests {
fail(String.format("Didn't find synthetic field on %s!", target));
}
/**
* @see DATAMONGO-508
*/
@Test
public void eagerlyReturnsDBRefObjectIfTargetAlreadyIsOne() {
DB db = mock(DB.class);
DBRef dbRef = new DBRef(db, "collection", "id");
org.springframework.data.mongodb.core.mapping.DBRef annotation = mock(org.springframework.data.mongodb.core.mapping.DBRef.class);
assertThat(converter.createDBRef(dbRef, annotation), is(dbRef));
}
static class GenericType<T> {
T content;
}
@@ -1144,6 +1357,7 @@ public class MappingMongoConverterUnitTests {
Map<Locale, String> map;
Map<String, List<String>> mapOfLists;
Map<String, Object> mapOfObjects;
Map<String, String[]> mapOfStrings;
}
static class ClassWithNestedMaps {
@@ -1164,6 +1378,7 @@ public class MappingMongoConverterUnitTests {
List<Contact> contacts;
List<List<String>> strings;
List<Map<String, Locale>> listOfMaps;
Set<Contact> contactsSet;
}
static class LocaleWrapper {
@@ -1229,6 +1444,37 @@ public class MappingMongoConverterUnitTests {
Person person;
}
static class DBRefWrapper {
DBRef ref;
List<DBRef> refs;
Map<String, DBRef> refMap;
Map<String, Person> personMap;
}
static class URLWrapper {
URL url;
}
static class ClassWithComplexId {
@Id
ComplexId complexId;
}
static class ComplexId {
Long innerId;
}
static class TypWithCollectionConstructor {
List<Attribute> attributes;
public TypWithCollectionConstructor(List<Attribute> attributes) {
this.attributes = attributes;
}
}
private class LocalDateToDateConverter implements Converter<LocalDate, Date> {
public Date convert(LocalDate source) {


@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,11 +17,14 @@ package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.DBObjectUtils.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.bson.types.ObjectId;
import org.junit.Before;
@@ -31,7 +34,10 @@ import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.DBObjectUtils;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.BasicQuery;
@@ -49,11 +55,11 @@ import com.mongodb.QueryBuilder;
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
@SuppressWarnings("unused")
public class QueryMapperUnitTests {
QueryMapper mapper;
MongoMappingContext context;
MappingMongoConverter converter;
@Mock
MongoDbFactory factory;
@@ -63,7 +69,7 @@ public class QueryMapperUnitTests {
context = new MongoMappingContext();
MappingMongoConverter converter = new MappingMongoConverter(factory, context);
converter = new MappingMongoConverter(factory, context);
converter.afterPropertiesSet();
mapper = new QueryMapper(converter);
@@ -84,15 +90,15 @@ public class QueryMapperUnitTests {
public void convertsStringIntoObjectId() {
DBObject query = new BasicDBObject("_id", new ObjectId().toString());
DBObject result = mapper.getMappedObject(query, null);
assertThat(result.get("_id"), is(ObjectId.class));
DBObject result = mapper.getMappedObject(query, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
}
@Test
public void handlesBigIntegerIdsCorrectly() {
DBObject dbObject = new BasicDBObject("id", new BigInteger("1"));
DBObject result = mapper.getMappedObject(dbObject, null);
DBObject result = mapper.getMappedObject(dbObject, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is((Object) "1"));
}
@@ -101,7 +107,7 @@ public class QueryMapperUnitTests {
ObjectId id = new ObjectId();
DBObject dbObject = new BasicDBObject("id", new BigInteger(id.toString(), 16));
DBObject result = mapper.getMappedObject(dbObject, null);
DBObject result = mapper.getMappedObject(dbObject, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is((Object) id));
}
@@ -115,9 +121,9 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(criteria.getCriteriaObject(), context.getPersistentEntity(Sample.class));
Object object = result.get("_id");
assertThat(object, is(DBObject.class));
assertThat(object, is(instanceOf(DBObject.class)));
DBObject dbObject = (DBObject) object;
assertThat(dbObject.get("$ne"), is(ObjectId.class));
assertThat(dbObject.get("$ne"), is(instanceOf(ObjectId.class)));
}
/**
@@ -129,7 +135,7 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(String.class));
assertThat(object, is(instanceOf(String.class)));
}
@Test
@@ -138,10 +144,10 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(DBObject.class));
assertThat(object, is(instanceOf(DBObject.class)));
Object ne = ((DBObject) object).get("$ne");
assertThat(ne, is(String.class));
assertThat(ne, is(instanceOf(String.class)));
assertThat(ne.toString(), is(Enum.INSTANCE.name()));
}
@@ -152,14 +158,14 @@ public class QueryMapperUnitTests {
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
Object object = result.get("foo");
assertThat(object, is(DBObject.class));
assertThat(object, is(instanceOf(DBObject.class)));
Object in = ((DBObject) object).get("$in");
assertThat(in, is(BasicDBList.class));
assertThat(in, is(instanceOf(BasicDBList.class)));
BasicDBList list = (BasicDBList) in;
assertThat(list.size(), is(1));
assertThat(list.get(0), is(String.class));
assertThat(list.get(0), is(instanceOf(String.class)));
assertThat(list.get(0).toString(), is(Enum.INSTANCE.name()));
}
@@ -199,6 +205,146 @@ public class QueryMapperUnitTests {
assertThat(result, is(query.getQueryObject()));
}
@Test
public void doesHandleNestedFieldsWithDefaultIdNames() {
BasicDBObject dbObject = new BasicDBObject("id", new ObjectId().toString());
dbObject.put("nested", new BasicDBObject("id", new ObjectId().toString()));
MongoPersistentEntity<?> entity = context.getPersistentEntity(ClassWithDefaultId.class);
DBObject result = mapper.getMappedObject(dbObject, entity);
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
assertThat(((DBObject) result.get("nested")).get("_id"), is(instanceOf(ObjectId.class)));
}
/**
* @see DATAMONGO-493
*/
@Test
public void doesNotTranslateNonIdPropertiesFor$NeCriteria() {
ObjectId accidentallyAnObjectId = new ObjectId();
Query query = Query.query(Criteria.where("id").is("id_value").and("publishers")
.ne(accidentallyAnObjectId.toString()));
DBObject dbObject = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(UserEntity.class));
assertThat(dbObject.get("publishers"), is(instanceOf(DBObject.class)));
DBObject publishers = (DBObject) dbObject.get("publishers");
assertThat(publishers.containsField("$ne"), is(true));
assertThat(publishers.get("$ne"), is(instanceOf(String.class)));
}
/**
* @see DATAMONGO-494
*/
@Test
public void usesEntityMetadataInOr() {
Query query = query(new Criteria().orOperator(where("foo").is("bar")));
DBObject result = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Sample.class));
assertThat(result.keySet(), hasSize(1));
assertThat(result.keySet(), hasItem("$or"));
BasicDBList ors = getAsDBList(result, "$or");
assertThat(ors, hasSize(1));
DBObject criterias = getAsDBObject(ors, 0);
assertThat(criterias.keySet(), hasSize(1));
assertThat(criterias.get("_id"), is(notNullValue()));
assertThat(criterias.get("foo"), is(nullValue()));
}
@Test
public void translatesPropertyReferenceCorrectly() {
Query query = query(where("field").is(new CustomizedField()));
DBObject result = mapper
.getMappedObject(query.getQueryObject(), context.getPersistentEntity(CustomizedField.class));
assertThat(result.containsField("foo"), is(true));
assertThat(result.keySet().size(), is(1));
}
@Test
public void translatesNestedPropertyReferenceCorrectly() {
Query query = query(where("field.field").is(new CustomizedField()));
DBObject result = mapper
.getMappedObject(query.getQueryObject(), context.getPersistentEntity(CustomizedField.class));
assertThat(result.containsField("foo.foo"), is(true));
assertThat(result.keySet().size(), is(1));
}
@Test
public void returnsOriginalKeyIfNoPropertyReference() {
Query query = query(where("bar").is(new CustomizedField()));
DBObject result = mapper
.getMappedObject(query.getQueryObject(), context.getPersistentEntity(CustomizedField.class));
assertThat(result.containsField("bar"), is(true));
assertThat(result.keySet().size(), is(1));
}
@Test
public void convertsAssociationCorrectly() {
Reference reference = new Reference();
reference.id = 5L;
Query query = query(where("reference").is(reference));
DBObject object = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(WithDBRef.class));
Object referenceObject = object.get("reference");
assertThat(referenceObject, is(instanceOf(com.mongodb.DBRef.class)));
}
@Test
public void convertsNestedAssociationCorrectly() {
Reference reference = new Reference();
reference.id = 5L;
Query query = query(where("withDbRef.reference").is(reference));
DBObject object = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WithDBRefWrapper.class));
Object referenceObject = object.get("withDbRef.reference");
assertThat(referenceObject, is(instanceOf(com.mongodb.DBRef.class)));
}
@Test
public void convertsInKeywordCorrectly() {
Reference first = new Reference();
first.id = 5L;
Reference second = new Reference();
second.id = 6L;
Query query = query(where("reference").in(first, second));
DBObject result = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(WithDBRef.class));
DBObject reference = DBObjectUtils.getAsDBObject(result, "reference");
assertThat(reference.containsField("$in"), is(true));
}
class IdWrapper {
Object id;
}
class ClassWithDefaultId {
String id;
ClassWithDefaultId nested;
}
class Sample {
@Id
@@ -214,4 +360,31 @@ public class QueryMapperUnitTests {
enum Enum {
INSTANCE;
}
class UserEntity {
String id;
List<String> publishers = new ArrayList<String>();
}
class CustomizedField {
@Field("foo")
CustomizedField field;
}
class WithDBRef {
@DBRef
Reference reference;
}
class Reference {
Long id;
}
class WithDBRefWrapper {
WithDBRef withDbRef;
}
}
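For reference, a condensed sketch of the wiring the tests above use; it assumes the same imports and fixture types (CustomizedField, a mocked MongoDbFactory) as QueryMapperUnitTests and exercises no API beyond what the tests already call.
// Build a QueryMapper on top of a MappingMongoConverter and map a query against entity
// metadata so property names, id properties and @DBRef associations get translated.
DBObject mapPropertyReference(MongoDbFactory factory) {
    MongoMappingContext context = new MongoMappingContext();
    MappingMongoConverter converter = new MappingMongoConverter(factory, context);
    converter.afterPropertiesSet();
    QueryMapper mapper = new QueryMapper(converter);

    Query query = query(where("field").is("value"));
    // the key "field" is rendered as "foo", the name declared via @Field on CustomizedField
    return mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(CustomizedField.class));
}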


@@ -0,0 +1,66 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
import org.hamcrest.Matchers;
import org.junit.After;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
* Integration tests for {@link MongoPersistentEntityIndexCreator}.
*
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoPersistentEntityIndexCreatorIntegrationTests {
@Autowired
@Qualifier("mongo1")
MongoOperations templateOne;
@Autowired
@Qualifier("mongo2")
MongoOperations templateTwo;
@After
public void cleanUp() {
templateOne.dropCollection(SampleEntity.class);
templateTwo.dropCollection(SampleEntity.class);
}
@Test
public void foo() {
List<IndexInfo> indexInfo = templateOne.indexOps(SampleEntity.class).getIndexInfo();
assertThat(indexInfo, hasSize(greaterThan(0)));
assertThat(indexInfo, Matchers.<IndexInfo> hasItem(hasProperty("name", is("prop"))));
indexInfo = templateTwo.indexOps(SampleEntity.class).getIndexInfo();
assertThat(indexInfo, hasSize(0));
}
}


@@ -0,0 +1,108 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Unit tests for {@link MongoPersistentEntityIndexCreator}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoPersistentEntityIndexCreatorUnitTests {
@Mock
MongoDbFactory factory;
@Mock
ApplicationContext context;
@Test
public void buildsIndexDefinitionUsingFieldName() {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(Person.class));
mappingContext.initialize();
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("fieldname"));
assertThat(creator.name, is("indexName"));
}
@Test
public void doesNotCreateIndexForEntityComingFromDifferentMappingContext() {
MongoMappingContext mappingContext = new MongoMappingContext();
MongoMappingContext personMappingContext = new MongoMappingContext();
personMappingContext.setInitialEntitySet(Collections.singleton(Person.class));
personMappingContext.initialize();
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
MongoPersistentEntity<?> entity = personMappingContext.getPersistentEntity(Person.class);
MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event = new MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>(
personMappingContext, entity);
creator.onApplicationEvent(event);
assertThat(creator.indexDefinition, is(nullValue()));
}
static class Person {
@Indexed(name = "indexName")
@Field("fieldname")
String field;
}
static class DummyMongoPersistentEntityIndexCreator extends MongoPersistentEntityIndexCreator {
DBObject indexDefinition;
String name;
public DummyMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
super(mappingContext, mongoDbFactory);
}
@Override
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
this.name = name;
this.indexDefinition = indexDefinition;
}
}
}
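A minimal sketch of the DATAMONGO-500 guard these unit tests cover, assuming the same imports and the Person fixture defined above: the creator only reacts to MappingContextEvents emitted by its own MappingContext.
// Events originating from a foreign MappingContext must not trigger index creation.
void ignoresForeignMappingContext(MongoDbFactory factory) {
    MongoMappingContext ownContext = new MongoMappingContext();
    MongoMappingContext foreignContext = new MongoMappingContext();
    MongoPersistentEntityIndexCreator creator = new MongoPersistentEntityIndexCreator(ownContext, factory);

    MongoPersistentEntity<?> entity = foreignContext.getPersistentEntity(Person.class);
    creator.onApplicationEvent(new MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>(
            foreignContext, entity));
    // no ensureIndex(...) call is issued for this event
}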


@@ -0,0 +1,14 @@
package org.springframework.data.mongodb.core.index;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class SampleEntity {
@Id
String id;
@Indexed
String prop;
}


@@ -15,18 +15,15 @@
*/
package org.springframework.data.mongodb.core.mapping;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.lang.reflect.Field;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.util.ReflectionUtils;


@@ -51,7 +51,7 @@ public class GenericMappingTests {
public void setUp() throws Exception {
context = new MongoMappingContext();
context.setInitialEntitySet(Collections.singleton(StringWrapper.class));
context.afterPropertiesSet();
context.initialize();
converter = new MappingMongoConverter(factory, context);
}


@@ -16,11 +16,17 @@
package org.springframework.data.mongodb.core.mapping;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import java.util.Map;
import org.junit.Test;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.model.MappingException;
import com.mongodb.DBRef;
/**
* Unit tests for {@link MongoMappingContext}.
@@ -31,9 +37,33 @@ public class MongoMappingContextUnitTests {
@Test
public void addsSelfReferencingPersistentEntityCorrectly() throws Exception {
MongoMappingContext context = new MongoMappingContext();
context.setInitialEntitySet(Collections.singleton(SampleClass.class));
context.afterPropertiesSet();
context.initialize();
}
@Test(expected = MappingException.class)
public void rejectsEntityWithMultipleIdProperties() {
MongoMappingContext context = new MongoMappingContext();
context.getPersistentEntity(ClassWithMultipleIdProperties.class);
}
@Test
public void doesNotReturnPersistentEntityForMongoSimpleType() {
MongoMappingContext context = new MongoMappingContext();
assertThat(context.getPersistentEntity(DBRef.class), is(nullValue()));
}
class ClassWithMultipleIdProperties {
@Id
String myId;
String id;
}
public class SampleClass {


@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.mapreduce;
import static org.springframework.data.mongodb.core.mapreduce.GroupBy.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.util.Arrays;
import java.util.HashSet;
@@ -37,9 +40,6 @@ import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.mapreduce.GroupBy.*;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class GroupByTests {
@@ -50,9 +50,6 @@ public class GroupByTests {
@Autowired
ApplicationContext applicationContext;
// @Autowired
// MongoTemplate mongoTemplate;
MongoTemplate mongoTemplate;
@Autowired
@@ -61,7 +58,7 @@ public class GroupByTests {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(XObject.class)));
mappingContext.afterPropertiesSet();
mappingContext.initialize();
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.afterPropertiesSet();


@@ -0,0 +1,72 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link MapReduceResults}.
*
* @author Oliver Gierke
*/
public class MapReduceResultsUnitTests {
/**
* @see DATAMONGO-428
*/
@Test
public void resolvesOutputCollectionForPlainResult() {
DBObject rawResult = new BasicDBObject("result", "FOO");
MapReduceResults<Object> results = new MapReduceResults<Object>(Collections.emptyList(), rawResult);
assertThat(results.getOutputCollection(), is("FOO"));
}
/**
* @see DATAMONGO-428
*/
@Test
public void resolvesOutputCollectionForDBObjectResult() {
DBObject rawResult = new BasicDBObject("result", new BasicDBObject("collection", "FOO"));
MapReduceResults<Object> results = new MapReduceResults<Object>(Collections.emptyList(), rawResult);
assertThat(results.getOutputCollection(), is("FOO"));
}
/**
* @see DATAMONGO-378
*/
@Test
public void handlesLongTotalInResult() {
DBObject inner = new BasicDBObject("total", 1L);
inner.put("mapTime", 1L);
inner.put("emitLoop", 1);
DBObject source = new BasicDBObject("timing", inner);
new MapReduceResults<Object>(Collections.emptyList(), source);
}
}


@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core.mapreduce;
import static org.junit.Assert.assertEquals;
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.mapreduce.MapReduceOptions.options;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.mapreduce.MapReduceOptions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.util.ArrayList;
import java.util.Arrays;
@@ -69,7 +69,7 @@ public class MapReduceTests {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(new HashSet<Class<?>>(Arrays.asList(ValueObject.class)));
mappingContext.afterPropertiesSet();
mappingContext.initialize();
MappingMongoConverter mappingConverter = new MappingMongoConverter(factory, mappingContext);
mappingConverter.afterPropertiesSet();


@@ -318,6 +318,16 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(females.get(0), is(alicia));
}
/**
* @see DATAMONGO-446
*/
@Test
public void findsPeopleBySexPaginated() {
List<Person> males = repository.findBySex(Sex.MALE, new PageRequest(0, 2));
assertThat(males.size(), is(2));
}
@Test
public void findsPeopleByNamedQuery() {
List<Person> result = repository.findByNamedQuery("Dave");
@@ -485,4 +495,15 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
List<Person> result = repository.findByCreatedAtLessThanManually(boyd.createdAt);
assertThat(result.isEmpty(), is(false));
}
}
/**
* @see DATAMONGO-472
*/
@Test
public void findsPeopleUsingNotPredicate() {
List<Person> result = repository.findByLastnameNot("Matthews");
assertThat(result, not(hasItem(dave)));
assertThat(result, hasSize(5));
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2002-2010 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -147,6 +147,8 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
List<Person> findBySex(Sex sex);
List<Person> findBySex(Sex sex, Pageable pageable);
List<Person> findByNamedQuery(String firstname);
GeoResults<Person> findByLocationNear(Point point, Distance maxDistance);
@@ -181,4 +183,11 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
List<Person> findByCreatedAtAfter(Date date);
/**
* @see DATAMONGO-472
* @param lastname
* @return
*/
List<Person> findByLastnameNot(String lastname);
}
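Usage sketch for the two methods added above, assuming an injected PersonRepository as in the integration tests: DATAMONGO-446 adds the paginated findBySex overload, and DATAMONGO-472 renders the Not keyword as $ne.
// repository: an injected PersonRepository; PageRequest is org.springframework.data.domain.PageRequest
List<Person> firstTwoMales = repository.findBySex(Sex.MALE, new PageRequest(0, 2));
List<Person> everyoneButMatthews = repository.findByLastnameNot("Matthews"); // rendered as { "lastname" : { "$ne" : "Matthews"}}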


@@ -20,6 +20,7 @@ import static org.junit.Assert.*;
import org.apache.webbeans.cditest.CdiTestContainer;
import org.apache.webbeans.cditest.CdiTestContainerLoader;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import org.springframework.data.mongodb.repository.Person;
@@ -39,11 +40,16 @@ public class CdiExtensionIntegrationTests {
container.bootContainer();
}
@AfterClass
public static void tearDown() throws Exception {
container.shutdownContainer();
}
@Test
public void bootstrapsRepositoryCorrectly() {
RepositoryClient client = container.getInstance(RepositoryClient.class);
PersonRepository repository = client.getRepository();
CdiPersonRepository repository = client.getRepository();
assertThat(repository, is(notNullValue()));


@@ -18,7 +18,7 @@ package org.springframework.data.mongodb.repository.cdi;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.data.repository.Repository;
public interface PersonRepository extends Repository<Person, String> {
public interface CdiPersonRepository extends Repository<Person, String> {
void deleteAll();


@@ -18,18 +18,17 @@ package org.springframework.data.mongodb.repository.cdi;
import javax.inject.Inject;
/**
*
* @author Oliver Gierke
*/
class RepositoryClient {
@Inject
PersonRepository repository;
CdiPersonRepository repository;
/**
* @return the repository
*/
public PersonRepository getRepository() {
public CdiPersonRepository getRepository() {
return repository;
}
}


@@ -0,0 +1,58 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.config;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.repository.PersonRepository;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
/**
* Integration tests for {@link MongoRepositoriesRegistrar}.
*
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoRepositoriesRegistrarIntegrationTests {
@Configuration
@EnableMongoRepositories(basePackages = "org.springframework.data.mongodb.repository")
static class Config {
@Bean
public MongoOperations mongoTemplate() throws Exception {
return new MongoTemplate(new SimpleMongoDbFactory(new Mongo(), "database"));
}
}
@Autowired
PersonRepository personRepository;
@Test
public void testConfiguration() {
}
}


@@ -15,11 +15,13 @@
*/
package org.springframework.data.mongodb.repository.query;
import static org.mockito.Mockito.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import static org.mockito.Mockito.*;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -27,7 +29,11 @@ import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor.PotentiallyConvertingIterator;
import com.mongodb.BasicDBList;
@@ -76,4 +82,66 @@ public class ConvertingParameterAccessorUnitTests {
assertThat(result, is((Object) reference));
}
/**
* @see DATAMONGO-505
*/
@Test
public void convertsAssociationsToDBRef() {
Property property = new Property();
property.id = 5L;
Object result = setupAndConvert(property);
assertThat(result, is(instanceOf(com.mongodb.DBRef.class)));
com.mongodb.DBRef dbRef = (com.mongodb.DBRef) result;
assertThat(dbRef.getRef(), is("property"));
assertThat(dbRef.getId(), is((Object) 5L));
}
/**
* @see DATAMONGO-505
*/
@Test
public void convertsAssociationsToDBRefForCollections() {
Property property = new Property();
property.id = 5L;
Object result = setupAndConvert(Arrays.asList(property));
assertThat(result, is(instanceOf(Collection.class)));
Collection<?> collection = (Collection<?>) result;
assertThat(collection, hasSize(1));
Object element = collection.iterator().next();
assertThat(element, is(instanceOf(com.mongodb.DBRef.class)));
com.mongodb.DBRef dbRef = (com.mongodb.DBRef) element;
assertThat(dbRef.getRef(), is("property"));
assertThat(dbRef.getId(), is((Object) 5L));
}
private Object setupAndConvert(Object... parameters) {
MongoParameterAccessor delegate = new StubParameterAccessor(parameters);
PotentiallyConvertingIterator iterator = new ConvertingParameterAccessor(converter, delegate).iterator();
MongoPersistentEntity<?> entity = context.getPersistentEntity(Entity.class);
MongoPersistentProperty property = entity.getPersistentProperty("property");
return iterator.nextConverted(property);
}
static class Entity {
@DBRef
Property property;
}
static class Property {
Long id;
}
}
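The two tests above pin down what the conversion delivers; at the repository level it means derived queries can take the referenced object directly. A minimal, hypothetical sketch follows — the EntityRepository interface and its method are illustrations, not part of the diff.
import java.util.List;
import org.springframework.data.repository.Repository;
interface EntityRepository extends Repository<Entity, String> {
    // The bound Property argument is converted into a com.mongodb.DBRef by
    // ConvertingParameterAccessor before the query is built, so the criteria
    // matches the stored reference rather than the raw object.
    List<Entity> findByProperty(Property property);
}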

View File

@@ -84,12 +84,22 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstName", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Oliver"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("firstName").is("Oliver"))));
}
creator.createQuery();
/**
* @see DATAMONGO-469
*/
@Test
public void createsAndQueryCorrectly() {
creator = new MongoQueryCreator(new PartTree("findByFirstNameAndFriend", Person.class), getAccessor(converter,
"Oliver", new Person()), context);
creator.createQuery();
Person person = new Person();
MongoQueryCreator creator = new MongoQueryCreator(new PartTree("findByFirstNameAndFriend", Person.class),
getAccessor(converter, "Oliver", person), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("firstName").is("Oliver").and("friend").is(person))));
}
@Test
@@ -98,7 +108,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstNameNotNull", Person.class);
Query query = new MongoQueryCreator(tree, getAccessor(converter), context).createQuery();
assertThat(query.getQueryObject(), is(new Query(Criteria.where("firstName").ne(null)).getQueryObject()));
assertThat(query, is(new Query(Criteria.where("firstName").ne(null))));
}
@Test
@@ -107,7 +117,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstNameIsNull", Person.class);
Query query = new MongoQueryCreator(tree, getAccessor(converter), context).createQuery();
assertThat(query.getQueryObject(), is(new Query(Criteria.where("firstName").is(null)).getQueryObject()));
assertThat(query, is(new Query(Criteria.where("firstName").is(null))));
}
@Test
@@ -139,7 +149,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, 18), context);
Query reference = query(where("age").lte(18));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
assertThat(creator.createQuery(), is(reference));
}
@Test
@@ -149,7 +159,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, 18), context);
Query reference = query(where("age").gte(18));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
assertThat(creator.createQuery(), is(reference));
}
/**
@@ -162,7 +172,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(partTree, getAccessor(converter, "Oliver"), context);
Query reference = query(where("foo").is("Oliver"));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
assertThat(creator.createQuery(), is(reference));
}
/**
@@ -174,7 +184,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByAgeExists", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, true), context);
Query query = query(where("age").exists(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -186,7 +196,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstNameRegex", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, ".*"), context);
Query query = query(where("firstName").regex(".*"));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -198,7 +208,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByActiveTrue", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -210,7 +220,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByActiveFalse", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(false));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -223,8 +233,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Dave", 42), context);
Query query = creator.createQuery();
assertThat(query.getQueryObject(),
is(query(new Criteria().orOperator(where("firstName").is("Dave"), where("age").is(42))).getQueryObject()));
assertThat(query, is(query(new Criteria().orOperator(where("firstName").is("Dave"), where("age").is(42)))));
}
/**
@@ -241,7 +250,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, user), context);
Query query = creator.createQuery();
assertThat(query.getQueryObject(), is(query(where("creator").is(dbref)).getQueryObject()));
assertThat(query, is(query(where("creator").is(dbref))));
}
/**
@@ -254,7 +263,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Matt"), context);
Query query = creator.createQuery();
assertThat(query.getQueryObject(), is(query(where("foo").regex("Matt.*")).getQueryObject()));
assertThat(query, is(query(where("foo").regex("Matt.*"))));
}
/**
@@ -267,7 +276,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "ews"), context);
Query query = creator.createQuery();
assertThat(query.getQueryObject(), is(query(where("foo").regex(".*ews")).getQueryObject()));
assertThat(query, is(query(where("foo").regex(".*ews"))));
}
/**
@@ -280,7 +289,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "thew"), context);
Query query = creator.createQuery();
assertThat(query.getQueryObject(), is(query(where("foo").regex(".*thew.*")).getQueryObject()));
assertThat(query, is(query(where("foo").regex(".*thew.*"))));
}
private void assertBindsDistanceToQuery(Point point, Distance distance, Query reference) throws Exception {
@@ -298,7 +307,7 @@ public class MongoQueryCreatorUnitTests {
Query query = new MongoQueryCreator(tree, new ConvertingParameterAccessor(converter, accessor), context)
.createQuery();
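// Note: the following comparison is against the query itself (not the reference parameter)
// and is trivially true; the effective check is that createQuery() binds the distance without throwing.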
assertThat(query.getQueryObject(), is(query.getQueryObject()));
assertThat(query, is(query));
}
interface PersonRepository extends Repository<Person, Long> {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,9 +15,10 @@
*/
package org.springframework.data.mongodb.repository.support;
import static org.hamcrest.CoreMatchers.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -30,10 +31,11 @@ import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.QAddress;
import org.springframework.data.mongodb.repository.QPerson;
import org.springframework.data.mongodb.repository.support.QueryDslMongoRepository.SpringDataMongodbSerializer;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mysema.query.types.expr.BooleanOperation;
import com.mysema.query.types.path.PathBuilder;
import com.mysema.query.types.path.StringPath;
/**
@@ -53,7 +55,7 @@ public class SpringDataMongodbSerializerUnitTests {
public void setUp() {
MongoMappingContext context = new MongoMappingContext();
converter = new MappingMongoConverter(dbFactory, context);
serializer = new QueryDslMongoRepository.SpringDataMongodbSerializer(converter);
serializer = new SpringDataMongodbSerializer(converter);
}
@Test
@@ -77,12 +79,12 @@ public class SpringDataMongodbSerializerUnitTests {
address.zipCode = "01234";
DBObject result = serializer.asDBObject("foo", address);
assertThat(result, is(BasicDBObject.class));
assertThat(result, is(instanceOf(BasicDBObject.class)));
BasicDBObject dbObject = (BasicDBObject) result;
Object value = dbObject.get("foo");
assertThat(value, is(notNullValue()));
assertThat(value, is(BasicDBObject.class));
assertThat(value, is(instanceOf(BasicDBObject.class)));
Object reference = converter.convertToMongoType(address);
assertThat(value, is(reference));
@@ -98,7 +100,25 @@ public class SpringDataMongodbSerializerUnitTests {
assertThat(serializer.getKeyForPath(address, address.getMetadata()), is(""));
}
/**
* @see DATAMONGO-467
*/
@Test
public void convertsIdPropertyCorrectly() {
ObjectId id = new ObjectId();
PathBuilder<Address> builder = new PathBuilder<Address>(Address.class, "address");
StringPath idPath = builder.getString("id");
DBObject result = (DBObject) serializer.visit((BooleanOperation) idPath.eq(id.toString()), (Void) null);
assertThat(result.get("_id"), is(notNullValue()));
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
assertThat(result.get("_id"), is((Object) id));
}
class Address {
String id;
String street;
@Field("zip_code")
String zipCode;

View File

@@ -3,7 +3,7 @@
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd">
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">
<mongo:mongo host="localhost" port="27017"/>

View File

@@ -3,7 +3,7 @@
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd">
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>

View File

@@ -2,7 +2,7 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="first" mongo-ref="mongo" write-concern="rack1" />

View File

@@ -2,7 +2,7 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="first" mongo-ref="mongo" write-concern="SAFE" />

View File

@@ -3,7 +3,7 @@
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">

View File

@@ -3,7 +3,7 @@
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">

View File

@@ -0,0 +1,23 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:mapping-converter id="mongoConverter" db-factory-ref="factory1"
base-package="org.springframework.data.mongodb.core.index" />
<bean id="mongo1" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="factory1" />
<constructor-arg ref="mongoConverter" />
</bean>
<bean id="mongo2" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="factory2" />
</bean>
<mongo:db-factory id="factory1" host="127.0.0.1" dbname="mongo-index-db1" />
<mongo:db-factory id="factory2" host="127.0.0.1" dbname="mongo-index-db2" />
</beans>
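The two factories above back two separate MappingContexts; the point of this fixture is that index creation only reacts to events originating from its own context. A minimal, hypothetical entity illustrates what would get indexed — the class is an assumption for illustration only.
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
class IndexedSample {
    @Id
    String id;
    // With the fix, the index for this property is only created in the database behind
    // the MappingContext that published the event (factory1 above), not in both databases.
    @Indexed
    String name;
}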

View File

@@ -3,9 +3,9 @@
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-3.0.xsd">
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<mongo:db-factory dbname="repositories" />

View File

@@ -3,9 +3,9 @@
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-3.0.xsd">
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<mongo:db-factory dbname="repositories"/>
<mongo:mapping-converter base-package="org.springframework.data.mongodb.repository"/>
@@ -15,7 +15,6 @@
<constructor-arg ref="mappingConverter"/>
</bean>
<mongo:repositories base-package="org.springframework.data.mongodb.repository"
create-query-indexes="true" />
<mongo:repositories base-package="org.springframework.data.mongodb.repository" create-query-indexes="true" />
</beans>
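The create-query-indexes="true" attribute asks the infrastructure to derive indexes from repository query methods at bootstrap time. A minimal, hypothetical repository shows the kind of method that benefits; the interface and method name are illustrative.
import java.util.List;
import org.springframework.data.repository.Repository;
interface PersonRepository extends Repository<Person, String> {
    // With create-query-indexes="true" the property referenced by this derived
    // query ("lastname") is a candidate for automatic index creation at startup.
    List<Person> findByLastname(String lastname);
}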

View File

@@ -4,7 +4,7 @@
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:repository="http://www.springframework.org/schema/data/repository"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/data/repository http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-3.0.xsd">

View File

@@ -4,7 +4,7 @@
xmlns:p="http://www.springframework.org/schema/p"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">

View File

@@ -2,8 +2,8 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd">
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd">
<bean class="org.springframework.data.mongodb.core.TestMongoConfiguration"/>

View File

@@ -7,7 +7,7 @@ Import-Package:
Export-Template:
org.springframework.data.mongodb.*;version="${project.version}"
Import-Template:
com.google.common.base.*;version="[11.0.0,12.0.0)";resolution:=optional,
com.google.common.base.*;version="[11.0.0,14.0.0)";resolution:=optional,
com.mongodb.*;version="0",
com.mysema.query.*;version="[2.1.1, 3.0.0)";resolution:=optional,
javax.annotation.processing.*;version="0",

View File

@@ -52,7 +52,7 @@
<xi:include href="introduction/why-sd-doc.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/master/src/docbkx/repositories.xml">
<xi:include href="https://github.com/SpringSource/spring-data-commons/raw/1.4.0.RC1/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -72,9 +72,12 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/master/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.4.0.RC1/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.4.0.RC1/src/docbkx/repository-query-keywords-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-query-keywords-reference.xml" />
</xi:include>
</part>
</book>

View File

@@ -9,8 +9,8 @@
<para>This chapter will point out the specialties for repository support
for MongoDB. This builds on the core repository support explained in <xref
linkend="repositories" />. So make sure you've got a sound understanding
of the basic concepts explained there.</para>
linkend="repositories"/>. So make sure you've got a sound understanding of
the basic concepts explained there.</para>
</section>
<section id="mongo-repo-usage">
@@ -91,6 +91,36 @@
configure <code>mongo-template-ref</code> explicitly if you deviate from
this convention.</para>
<para>If you would rather use JavaConfig, annotate a configuration class
with <interfacename>@EnableMongoRepositories</interfacename>. The annotation
carries the very same attributes as the namespace element. If no base
package is configured, the infrastructure will scan the package of the
annotated configuration class.</para>
<example>
<title>JavaConfig for repositories</title>
<programlisting id="id2371855_07-mongodb" language="java">@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
protected String getDatabaseName() {
return "e-store";
}
@Override
public Mongo mongo() throws Exception {
return new Mongo();
}
@Override
protected String getMappingBasePackage() {
return "com.oreilly.springdata.mongodb"
}
}</programlisting>
</example>
<para>As our domain repository extends
<interfacename>PagingAndSortingRepository</interfacename> it provides you
with CRUD operations as well as methods for paginated and sorted access to
@@ -169,11 +199,11 @@ public class PersonRepositoryTests {
<title>Supported keywords for query methods</title>
<tgroup cols="3">
<colspec colwidth="1*" />
<colspec colwidth="1*"/>
<colspec colwidth="2*" />
<colspec colwidth="2*"/>
<colspec colwidth="2*" />
<colspec colwidth="2*"/>
<thead>
<row>
@@ -370,7 +400,7 @@ Distance distance = new Distance(200, Metrics.KILOMETERS);
<simplesect>
<title>Geo-near queries</title>
<para></para>
<para/>
<programlisting language="java">public interface PersonRepository extends MongoRepository&lt;Person, String&gt;
@@ -512,4 +542,4 @@ Page&lt;Person&gt; page = repository.findAll(person.lastname.contains("a"),
MongoDB queries.</para>
</section>
</section>
</chapter>
</chapter>

View File

@@ -624,7 +624,7 @@ public class MongoConfiguration {
}
</programlisting>
<para></para>
<para/>
</section>
<section id="mongo.mongo-db-factory-xml">
@@ -686,7 +686,7 @@ public class MongoConfiguration {
&lt;constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/&gt;
&lt;/bean&gt;</programlisting>
<para></para>
<para/>
</section>
</section>
@@ -1546,8 +1546,17 @@ assertThat(p.getAge(), is(1));</programlisting>
follow a fluent API style so that you can easily chain together multiple
method criteria and queries while having easy to understand code. Static
imports in Java are used to help remove the need to see the 'new' keyword
for creating Query and Criteria instances so as to improve
readability.</para>
for creating <classname>Query</classname> and
<classname>Criteria</classname> instances so as to improve readability. If
you would like to create <classname>Query</classname> instances from a plain
JSON String, use <classname>BasicQuery</classname>.</para>
<example>
<title>Creating a Query instance from a plain JSON String</title>
<programlisting language="java">BasicQuery query = new BasicQuery("{ age : { $lt : 50 }, accounts.balance : { $gt : 1000.00 }}");
List&lt;Person&gt; result = mongoTemplate.find(query, Person.class); </programlisting>
</example>
<para>GeoSpatial queries are also supported and are described more in the
section <link linkend="mongo.geospatial">GeoSpatial Queries</link>.</para>
@@ -1572,10 +1581,10 @@ assertThat(p.getAge(), is(1));</programlisting>
<programlisting language="java">import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
...
List&lt;Person&gt; result = mongoTemplate.find(query(where("age").lt(50).and("accounts.balance").gt(1000.00d)), Person.class);
</programlisting>
List&lt;Person&gt; result = mongoTemplate.find(query(where("age").lt(50)
.and("accounts.balance").gt(1000.00d)), Person.class); </programlisting>
</example>
<para>All find methods take a <classname>Query</classname> object as a
@@ -1769,7 +1778,7 @@ import static org.springframework.data.mongodb.core.query.Query.query;
<para><literal>Criteria</literal> <emphasis role="bold">withinBox
</emphasis> <literal>(Box box)</literal> Creates a geospatial
criterion using a <literal>$within $box</literal> operation
<literal /></para>
<literal/></para>
</listitem>
<listitem>
@@ -2126,7 +2135,7 @@ MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce(query, "
<section id="mongo.group">
<title>Group Operations</title>
<para>As an alternative to usiing Map-Reduce to perform data aggregation,
<para>As an alternative to using Map-Reduce to perform data aggregation,
you can use the <ulink
url="http://www.mongodb.org/display/DOCS/Aggregation#Aggregation-Group"><literal>group</literal>
operation</ulink> which feels similar to using SQL's group by query style,
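For the group operation described just above, here is a minimal usage sketch through MongoOperations; the collection name, key, and result type are assumptions for illustration.
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
class GroupOperationSample {
    static class XObject {
        float x;
        float count;
    }
    // Groups the documents of "group_test_collection" by their "x" field and counts
    // the documents per group, comparable to SQL's GROUP BY with COUNT(*).
    GroupByResults<XObject> countPerX(MongoOperations mongoOperations) {
        return mongoOperations.group("group_test_collection",
                GroupBy.key("x").initialDocument("{ count: 0 }")
                        .reduceFunction("function(doc, prev) { prev.count += 1 }"),
                XObject.class);
    }
}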
@@ -2722,4 +2731,4 @@ mongoTemplate.dropCollection("MyNewCollection"); </programlisting>
}
});</programlisting>
</section>
</chapter>
</chapter>

View File

@@ -1,6 +1,135 @@
Spring Data MongoDB Changelog
=============================================
Changes in version 1.1.0.RC1 (2012-08-24)
-----------------------------------------
** Bug
* [DATAMONGO-493] - Criteria.ne() method converts all values into ObjectId
* [DATAMONGO-494] - $or/$nor expressions do not consider entity class mapping
* [DATAMONGO-495] - JSON can't serialize Enum when printing Query in DEBUG message
* [DATAMONGO-497] - Reading an empty List throws a MappingInstantiationException because it returns a HashSet instead of returning an ArrayList
* [DATAMONGO-505] - Conversion of associations doesn't work for collection values
* [DATAMONGO-508] - DBRef can accidentally get added as PersistentProperty
* [DATAMONGO-517] - QueryMapping incorrectly translates complex keywords
** Improvement
* [DATAMONGO-496] - AbstractMongoConfiguration.getMappingBasePackage() could default to config class' package
* [DATAMONGO-499] - Namespace XSDs of current release version should refer to repositories XSD in version 1.0
* [DATAMONGO-500] - Index creation reacts on events not intended for it
* [DATAMONGO-502] - QueryMapper should transparently translate property names to field names
* [DATAMONGO-509] - SimpleMongoRepository.exists(…) can be improved.
* [DATAMONGO-510] - Criteria should only use BasicDBList internally
* [DATAMONGO-511] - QueryMapper should correctly transform associations
* [DATAMONGO-516] - Make Spring 3.1.2.RELEASE default Spring dependency version
** Task
* [DATAMONGO-513] - Release 1.1 RC1
Changes in version 1.0.4.RELEASE MongoDB (2012-08-24)
-----------------------------------------------------
** Bug
* [DATAMONGO-493] - Criteria.ne() method converts all values into ObjectId
* [DATAMONGO-494] - $or/$nor expressions do not consider entity class mapping
* [DATAMONGO-495] - JSON can't serialize Enum when printing Query in DEBUG message
** Improvement
* [DATAMONGO-499] - Namespace XSDs of current release version should refer to repositories XSD in version 1.0
** Task
* [DATAMONGO-514] - Release 1.0.4.
Changes in version 1.1.0.M2 (2012-07-24)
----------------------------------------
** Bug
* [DATAMONGO-378] - MapReduceResults ClassCastException due to raw results counts as Long
* [DATAMONGO-424] - Declaring a list of DBRef in a domain class results in Null for each DBRef when reading from mongo database
* [DATAMONGO-425] - Binding a Date to a manually defined repository query fails
* [DATAMONGO-428] - ClassCastException when using outputDatabase option in map-reduce
* [DATAMONGO-446] - Pageable query methods returning List are broken
* [DATAMONGO-447] - Removal of Documents fails in debug mode for Documents with complex ids
* [DATAMONGO-450] - enabling DEBUG causes RuntimeException
* [DATAMONGO-454] - ServerAddressPropertyEditor fails if a hostname is unresolvable
* [DATAMONGO-458] - When reading back empty collections, unmodifiable instances of Collections.emptyList/Set are returned.
* [DATAMONGO-462] - findAll() fails with NPE - discovering the root cause
* [DATAMONGO-465] - Mongo inserts document with "_id" as an integer but saves with "_id" as a string.
* [DATAMONGO-467] - String @id field is not mapped to ObjectId when using QueryDSL ".id" path
* [DATAMONGO-469] - Query creation from method names using AND criteria does not work anymore
* [DATAMONGO-474] - Wrong property is used for Id mapping
* [DATAMONGO-475] - 'group' operation fails where query references non primitive property
* [DATAMONGO-480] - The WriteResultChecking is not used in case of insert or save of documents.
* [DATAMONGO-483] - @Indexed(unique=true, name="foo") puts name's value to the 'key' in the MongoDB
* [DATAMONGO-489] - ClassCastException when loading Map<String, String[]>
** Improvement
* [DATAMONGO-448] - Remove the need for Converters for complex classes that are used as IDs
* [DATAMONGO-455] - Document how to use raw queries using BasicQuery
* [DATAMONGO-460] - Improve Querydsl implementation internals
* [DATAMONGO-466] - QueryMapper shouldn't map id properties of nested classes
* [DATAMONGO-470] - Criteria and Query should have proper equals(…) and hashCode() method.
* [DATAMONGO-477] - Change upper bound of Google Guava package import to 13
* [DATAMONGO-482] - typo in documentation - 2 i's in usiing
* [DATAMONGO-486] - Polish namespace implementation
* [DATAMONGO-491] - Release 1.1.0.M2
** New Feature
* [DATAMONGO-476] - JavaConfig support for Mongo repositories
** Task
* [DATAMONGO-451] - Tweak pom.xml to let Sonar build run without Bundlor
* [DATAMONGO-490] - Fix minor typos
Changes in version 1.0.3.RELEASE (2012-07-24)
---------------------------------------------
** Bug
* [DATAMONGO-467] - String @id field is not mapped to ObjectId when using QueryDSL ".id" path
* [DATAMONGO-469] - Query creation from method names using AND criteria does not work anymore
* [DATAMONGO-474] - Wrong property is used for Id mapping
* [DATAMONGO-475] - 'group' operation fails where query references non primitive property
* [DATAMONGO-480] - The WriteResultChecking is not used in case of insert or save of documents.
* [DATAMONGO-483] - @Indexed(unique=true, name="foo") puts name's value to the 'key' in the MongoDB
* [DATAMONGO-489] - ClassCastException when loading Map<String, String[]>
** Improvement
* [DATAMONGO-466] - QueryMapper shouldn't map id properties of nested classes
* [DATAMONGO-470] - Criteria and Query should have proper equals(…) and hashCode() method.
* [DATAMONGO-482] - typo in documentation - 2 i's in usiing
** Task
* [DATAMONGO-492] - Release 1.0.3
Changes in version 1.0.2.RELEASE (2012-06-20)
---------------------------------------------
** Bug
* [DATAMONGO-360] - java.lang.ClassCastException when placing GeospatialIndex into IndexOperations and invoking IndexOperations.getIndexInfo()
* [DATAMONGO-366] - Chapter 3.2. points to wrong bugtracker
* [DATAMONGO-378] - MapReduceResults ClassCastException due to raw results counts as Long
* [DATAMONGO-382] - ClassCastException: "com.mongodb.BasicDBObject cannot be cast to com.mongodb.BasicDBList" during find()
* [DATAMONGO-411] - Potential ClassCastExceptions in MongoPersistentEntityIndexCreator
* [DATAMONGO-412] - getUserCredentials() is called twice in AbstractMongoConfiguration::mongoDbFactory()
* [DATAMONGO-413] - Using "Or" in repository query yields a ClassCastException
* [DATAMONGO-422] - UUIDToBinaryConverter not compatible with mongo java driver
* [DATAMONGO-423] - Criteria.regex should use java.util.Pattern instead of $regex
* [DATAMONGO-425] - Binding a Date to a manually defined repository query fails
* [DATAMONGO-428] - ClassCastException when using outputDatabase option in map-reduce
* [DATAMONGO-429] - using @Query annotation, arrays are translated somewhere between query creation and mongo interpretation
* [DATAMONGO-446] - Pageable query methods returning List are broken
* [DATAMONGO-447] - Removal of Documents fails in debug mode for Documents with complex ids
* [DATAMONGO-450] - enabling DEBUG causes RuntimeException
* [DATAMONGO-454] - ServerAddressPropertyEditor fails if a hostname is unresolvable
* [DATAMONGO-461] - MappedConstructor potentially throws NullPointerException
* [DATAMONGO-462] - findAll() fails with NPE - discovering the root cause
** Improvement
* [DATAMONGO-448] - Remove the need for Converters for complex classes that are used as IDs
* [DATAMONGO-455] - Document how to use raw queries using BasicQuery
** Task
* [DATAMONGO-463] - Release 1.0.2
Changes in version 1.1.0.M1 (2012-05-07)
----------------------------------------