Compare commits

..

30 Commits

Author SHA1 Message Date
Spring Buildmaster
4777dd2e5e DATAMONGO-919 - Release version 1.5.0.RC1. 2014-05-02 05:58:13 -07:00
Oliver Gierke
64b4591b72 DATAMONGO-919 - Prepare 1.5.0.RC1.
Upgraded to Spring Data Parent 1.4.0.RC1 and Spring Data Commons 1.8.0.RC1. Switched to milestone repository.
2014-05-02 14:50:48 +02:00
Oliver Gierke
bac5961fd8 DATAMONGO-919 - Updated changelog. 2014-05-02 14:50:48 +02:00
Thomas Darimont
b3cac862d8 DATAMONGO-924 - Improve aggregation field reference resolving.
Previously we didn't support referring to aliased fields defined in former stages of an aggregation pipeline. We now also consider field aliases during field reference lookup.

Original pull request: #176.
2014-05-02 14:46:13 +02:00
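A minimal sketch of the now-supported pattern (the orders collection, field names and the CustomerTotal result type are illustrative): the sort stage refers to the alias "total" exposed by the preceding group stage.

// assumes static imports from org.springframework.data.mongodb.core.aggregation.Aggregation
class CustomerTotal { String id; double total; }   // hypothetical result type

Aggregation agg = newAggregation(
    group("customerId").sum("amount").as("total"),   // exposes the aliased field "total"
    sort(Sort.Direction.DESC, "total"));             // later stage referring to that alias

AggregationResults<CustomerTotal> results = mongoTemplate.aggregate(agg, "orders", CustomerTotal.class);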
Oliver Gierke
b8ab2ad539 DATAMONGO-919 - Forward port changelog entries from bugfix branch. 2014-05-02 12:48:18 +02:00
Christoph Strobl
618d5bd5e9 DATAMONGO-919 - Prepare Release 1.5.0.RC1
Upgrade AspectJ Maven plugin to recent version which enables building the module on java 8 using AspectJ 1.8.
2014-05-01 21:02:01 +02:00
Oliver Gierke
096c3278b3 DATAMONGO-92 - Upgraded to MongoDB Java driver 2.12.1. 2014-04-30 08:51:18 +02:00
Kim Toms
c9d9976c22 DATAMONGO-920 - Improve debug message for delete events in AbstractMongoEventListener.
Adjusted debug message to reflect the actual operation.

Original pull request: #95.
2014-04-29 15:52:12 +02:00
Thomas Darimont
65497f93d4 DATAMONGO-827 - Allow index creation to use MongoDB auto-generated names.
We now support letting MongoDB generate index names by introducing the attribute "useGeneratedName" on the @Indexed, @GeoSpatialIndexed and @CompoundIndex annotations.
2014-04-28 19:58:56 +02:00
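A minimal sketch of the new attribute on an illustrative document class; MongoDB generates the index name itself.

@Document
class Person {

    @Indexed(useGeneratedName = true)   // index name is generated by MongoDB instead of derived from the field
    String lastname;
}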
Christoph Strobl
af8a53bef6 DATAMONGO-909 - Compound index on inherited class should be created correctly.
With the overhaul of index creation done in DATAMONGO-899, the CompoundIndex annotation is no longer looked up only on the concrete type but also on all its interfaces and super classes. We added an additional test to verify this behaviour.
2014-04-28 19:58:56 +02:00
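A minimal sketch of the verified behaviour, using illustrative types: the compound index declared on the super class is picked up for the inheriting @Document class.

@CompoundIndexes(@CompoundIndex(name = "name_age_idx", def = "{ 'name' : 1, 'age' : -1 }"))
class BaseEntity {}

@Document
class Person extends BaseEntity {
    String name;
    int age;
}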
Oliver Gierke
916b856e97 DATAMONGO-899 - Polished API of new index creation abstractions.
Removed the collection awareness that had been introduced on IndexDefinition. The collection an index is created in is now held in the IndexDefinitionHolder instead. This is mostly because the IndexDefinition implementations can be used with MongoTemplate and the index operations take a collection alongside the index definition.

Made the IndexResolver API package protected so that we can further change it going forward. We should think about deprecating the collectionName attributes on index annotations, as it doesn't make much sense to manually configure the collection name for indexes when the collection is already predefined through the domain type. This would allow us to remove the entire collection handling code inside the IndexResolver implementation.

Turned IndexDefinitionHolder into a value object.

Original pull request: #168.
2014-04-28 19:58:50 +02:00
Christoph Strobl
7848da63f2 DATAMONGO-899 - Ensure proper creation of index structures for nested entities.
Index creation did not consider the property's path when creating the index. This led to broken index creation when nesting entities that might require index structures.

From now on, index creation traverses the entity's property path for all top-level entities (namely those carrying the @Document annotation) and creates the index using the full property path.

This is a breaking change, as requiring an entity to carry the @Document annotation has not been necessary until now.

Original Pull Request: #168
2014-04-28 16:50:26 +02:00
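A minimal sketch of the resulting behaviour with illustrative types: the index declared on the nested type is created using the full property path in the collection of the top-level @Document class.

@Document
class Customer {
    Address address;
}

class Address {
    @Indexed String city;   // resolved to an index on "address.city" in the collection of Customer
}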
Christoph Strobl
f359a1d31a DATAMONGO-847 - Allow usage of Query within an Update clause.
If we detect a Query within a value used in an Update, we map the query itself to build the expression to use. This allows forming query statements, e.g. for $pull, using the same API as for the query itself.

Update update = new Update().pull("list", query(where("value").in("foo", "bar")));

Original Pull Request: #172.
2014-04-28 13:30:13 +02:00
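For illustration, such an update could then be applied through the template API (the Order domain type and field names are assumed):

Update update = new Update().pull("list", query(where("value").in("foo", "bar")));
mongoTemplate.updateMulti(query(where("status").is("open")), update, Order.class);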
Thomas Darimont
72d645feae DATAMONGO-917 - Improve Spring 4.0 framework version detection to avoid NullPointerExceptions.
We now check for the presence of DefaultParameterNameDiscoverer in order to determine whether we are running with Spring 4.0 or later. This avoids potential NullPointerExceptions in cases where the package version information is not available, e.g. when the application was bundled into an uberjar via the maven-shade-plugin.

Original pull request: #173.
2014-04-28 13:21:09 +02:00
Thomas Darimont
72fe382bba DATAMONGO-913 - Improve DBRef handling for LazyLoadingProxies.
We now use the captured DBRef of a given LazyLoadingProxy in MappingMongoConverter.toDBRef(..) in order to avoid a new DBRef creation that would fail for the proxy.

Original pull request: #174.
2014-04-28 13:07:57 +02:00
Thomas Darimont
d25e840cf5 DATAMONGO-914 - Improve resolving of lazy-loading proxies for classes that override equals(…)/hashCode().
We now properly resolve lazy-loading proxies for @DBRefs when an overridden equals(…) or hashCode() method is called with Spring 4. We fall back to our old Objenesis proxy generation in order to circumvent the default handling of overridden hashCode() and equals(…) methods in the CGLIB AOP proxies generated by Spring 4.

If we detect that we are running with Spring 4, we use the repackaged Objenesis that is included in Spring 4. Previously, the generated proxy used generic hashCode() and equals(…) logic that did not trigger proper lazy loading in such cases.

Original pull request: #171.
2014-04-23 09:29:33 +02:00
Thomas Darimont
df1775572a DATAMONGO-912 - Consider custom conversions in all stages of an aggregation pipeline.
We now consider custom mongo conversions in all stages of an aggregation pipeline. Previously we did this only for the first stage and returned objects basically unmapped in later stages. We now pass the root AggregationOperationContext on to nested ExposedFieldsAggregationOperationContexts so that those can delegate any mongo mapping to the root context.

Original pull request: #170.
2014-04-23 09:00:56 +02:00
Christoph Strobl
86670cd49f DATAMONGO-893 - Converter must not write "_class" information for known types.
We now actively pass on property type information to MetadataBackedField to ensure type hints get picked up correctly when converting a value to the corresponding DBObject.

This has to be done as the fix for DATAMONGO-812 enforced proper writing of _class information for Updates, which caused trouble when querying documents by nested (complex) properties using an 'in' clause.

Original pull request: #169.
2014-04-15 17:36:11 +02:00
Christoph Strobl
31a4bf906e DATAMONGO-892 - Reject nested MappingMongoConverter declarations in XML.
Mapping information is potentially required by multiple instances and thus must not be registered as a nested bean. We now actively check for such an invalid scenario and explicitly reject it.

Original pull request: #165.
2014-04-15 09:11:12 +02:00
Christoph Strobl
599291e8b7 DATAMONGO-897 - Fixed potential NullPointerException in QueryMapper.
If an association property points to an interface not containing the id property, QueryMapper threw a NullPointerException in isAssociationConversionNecessary(…) because the lookup of the id property failed.

We now check for the presence of an id property on the target type and check for assignability to indicate the need for conversion (usually the case when developers use raw ids in their update clauses instead of the actual target instance).

Original pull request: #164.
2014-04-15 09:11:00 +02:00
Thomas Darimont
f5a04fb9fb DATAMONGO-908 - Support for nested field references in group operations.
We now allow referring to nested field expressions if the root segment of the nested field expression was exposed in earlier stages of the aggregation pipeline.

Original pull request: #167.
2014-04-15 07:55:57 +02:00
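A minimal sketch with illustrative field names: the root segment "invoice" is exposed by the project stage, so the subsequent group stage may refer to the nested "invoice.customerId".

// assumes static imports from org.springframework.data.mongodb.core.aggregation.Aggregation
Aggregation agg = newAggregation(
    project("invoice", "amount"),
    group("invoice.customerId").sum("amount").as("total"));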
Oliver Gierke
88558b67c3 DATAMONGO-866 - Polishing for new field naming strategy configuration support.
Use the camel case split logic from Spring Data Commons (introduced for DATACMNS-486) in a common CamelCaseSplittingFieldNamingStrategy super class.

MappingMongoConverterParser now also rejects the configuration if both abbreviate-field-names and field-naming-strategy-ref are configured.
2014-04-10 18:36:37 +02:00
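A minimal JavaConfig sketch of the new hook (the database name and Mongo instance are placeholders):

@Configuration
class MongoConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "database";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient();
    }

    @Override
    protected FieldNamingStrategy fieldNamingStrategy() {
        return new CamelCaseAbbreviatingFieldNamingStrategy();   // or any custom FieldNamingStrategy
    }
}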
Ryan Tenney
d52cb255e0 DATAMONGO-866 - Add means to configure field naming strategy. 2014-04-10 18:03:55 +02:00
Oliver Gierke
6c214cbc37 DATAMONGO-910 - Upgraded to latest MongoDB Java driver 2.12.0.
Removed special mongo-osgi property as the driver version is now a valid OSGi version number.
2014-04-10 16:55:22 +02:00
Jeff Yemin
01012c1448 DATAMONGO-895, DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
Upgraded the MongoDB driver version used by the mongo-next profile to 2.12.0. The actual upgrade of the default driver version is coming in a subsequent commit to make sure we can backport the compatibility checks to the bugfix branch without forcing users into a driver upgrade.

Relaxed the error message comparison in the assertion so that it still matches the message returned by MongoDB 2.6. When comparing the value of the version field, we compare against a Long rather than an Integer, since the generated version field is a Long. This allows the test to pass against the upcoming 2.12.0 release of the Java driver, which has a stricter implementation of BasicDBObject.equals(…).

Original pull requests: #159, #160.
2014-04-10 15:57:45 +02:00
Christoph Strobl
4c7befb910 DATAMONGO-888 - Sorting now considers mapping information.
We now pipe the DBObject containing sorting information for queries through the QueryMapper to make sure potential field mappings are applied.

Original Pull Request: #162.
2014-04-10 15:43:51 +02:00
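A minimal sketch of the effect with an illustrative mapped field: sorting by the property name now translates to the mapped field name when the query is executed.

@Document
class Person {
    @Field("lname") String lastname;
}

Query query = new Query().with(new Sort(Sort.Direction.ASC, "lastname"));
List<Person> result = mongoTemplate.find(query, Person.class);   // the sort document sent to MongoDB uses "lname"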
Christoph Strobl
b62669ec8f DATAMONGO-907 - Assert compatibility with MongoDB 2.6.
Fixed the test to only check the parts of the expected error message that are common to both 2.4 and 2.6.

Original Pull Request: #166.
2014-04-10 13:33:34 +02:00
Oliver Gierke
0fb74caf9b DATAMONGO-905 - Removed obsolete dependency to CGLib from cross-store support.
We also now optionally depend on the Hibernate JPA API JAR so that using other persistence providers doesn't cause an API JAR duplication.
2014-04-09 12:12:29 +02:00
Oliver Gierke
9623dac01f DATAMONGO-901 - Fixed regression of not registering type predicting post processor.
The changes for DATAMONGO-843 introduced a regression by skipping the registration of the RepositoryInterfaceAwareBeanPostProcessor. This can cause the wiring of repository bean definitions to fail, depending on the order in which the bean definitions get instantiated.

This change reintroduces the registration and adds an explicit test case for it.
2014-04-02 17:58:36 +02:00
Spring Buildmaster
695b27968c DATAMONGO-859 - Prepare next development iteration. 2014-03-31 17:10:40 +02:00
60 changed files with 3061 additions and 307 deletions

14
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.5.0.RC1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.4.0.M1</version>
<version>1.4.0.RC1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -29,9 +29,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.8.0.M1</springdata.commons>
<mongo>2.11.4</mongo>
<mongo-osgi>${mongo}</mongo-osgi>
<springdata.commons>1.8.0.RC1</springdata.commons>
<mongo>2.12.1</mongo>
</properties>
<developers>
@@ -107,8 +106,7 @@
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.12.0-rc0</mongo>
<mongo-osgi>2.12.0</mongo-osgi>
<mongo>2.12.0</mongo>
</properties>
</profile>
</profiles>
@@ -125,7 +123,7 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone/</url>
<url>http://repo.spring.io/libs-milestone</url>
</repository>
</repositories>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.5.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.M1</version>
<version>1.5.0.RC1</version>
</dependency>
<dependency>
@@ -56,17 +56,13 @@
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->
@@ -102,7 +98,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.4</version>
<version>1.6</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
@@ -130,7 +126,8 @@
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
</configuration>

View File

@@ -2,12 +2,12 @@ Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
Import-Template:
com.mongodb.*;version="${mongo:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.5.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.5.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,5 +5,5 @@ Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.5.0.RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,7 +37,9 @@ import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.FieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.PropertyNameFieldNamingStrategy;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
@@ -51,6 +53,7 @@ import com.mongodb.Mongo;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -144,10 +147,7 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
if (abbreviateFieldNames()) {
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
}
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
@@ -232,4 +232,15 @@ public abstract class AbstractMongoConfiguration {
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}

View File

@@ -30,6 +30,7 @@ import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.parsing.ReaderContext;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -71,6 +72,7 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -83,8 +85,11 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (parserContext.isNested()) {
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
}
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
@@ -209,11 +214,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
if ("true".equals(abbreviateFieldNames)) {
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
@@ -222,6 +223,32 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return ctxRef;
}
private static void parseFieldNamingStrategy(Element element, ReaderContext context, BeanDefinitionBuilder builder) {
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
String fieldNamingStrategy = element.getAttribute("field-naming-strategy-ref");
if (StringUtils.hasText(fieldNamingStrategy) && StringUtils.hasText(abbreviateFieldNames)) {
context
.error("Only one of the attributes abbreviate-field-names and field-naming-strategy-ref can be configured!",
element);
return;
}
Object value = null;
if ("true".equals(abbreviateFieldNames)) {
value = new RootBeanDefinition(CamelCaseAbbreviatingFieldNamingStrategy.class);
} else if (StringUtils.hasText(fieldNamingStrategy)) {
value = new RuntimeBeanReference(fieldNamingStrategy);
}
if (value != null) {
builder.addPropertyValue("fieldNamingStrategy", value);
}
}
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");

View File

@@ -115,6 +115,7 @@ import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
*
@@ -354,7 +355,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query, null));
}
/**
@@ -532,7 +533,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query));
new QueryCursorPreparer(query, entityClass));
}
public <T> T findById(Object id, Class<T> entityClass) {
@@ -614,8 +615,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName) {
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass, update, options);
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass, update, options);
}
// Find methods that take a Query to express the query and that return a single object that is also removed from the
@@ -626,8 +627,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> T findAndRemove(Query query, Class<T> entityClass, String collectionName) {
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass);
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass);
}
public long count(Query query, Class<?> entityClass) {
@@ -1941,6 +1943,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return converter;
}
private DBObject getMappedSortObject(Query query, Class<?> type) {
if (query == null || query.getSortObject() == null) {
return null;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
return queryMapper.getMappedObject(query.getSortObject(), entity);
}
// Callback implementations
/**
@@ -2134,9 +2146,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
private final Class<?> type;
public QueryCursorPreparer(Query query, Class<?> type) {
public QueryCursorPreparer(Query query) {
this.query = query;
this.type = type;
}
/*
@@ -2164,7 +2179,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
cursorToUse = cursorToUse.sort(getMappedSortObject(query, type));
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -248,7 +248,7 @@ public class Aggregation {
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields());
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}

View File

@@ -268,14 +268,21 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.isAliased();
}
/**
* @return the synthetic
*/
public boolean isSynthetic() {
return synthetic;
}
/**
* Returns whether the field can be referred to using the given name.
*
* @param input
* @param name
* @return
*/
public boolean canBeReferredToBy(String input) {
return getTarget().equals(input);
public boolean canBeReferredToBy(String name) {
return getName().equals(name) || getTarget().equals(name);
}
/*
@@ -340,6 +347,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
public FieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,16 +32,22 @@ import com.mongodb.DBObject;
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
private final ExposedFields exposedFields;
private final AggregationOperationContext rootContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}.
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields) {
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
this.exposedFields = exposedFields;
this.rootContext = rootContext;
}
/*
@@ -50,7 +56,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
return rootContext.getMappedObject(dbObject);
}
/*
@@ -59,7 +65,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(Field field) {
return getReference(field.getTarget());
return getReference(field, field.getTarget());
}
/*
@@ -68,11 +74,42 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(String name) {
return getReference(null, name);
}
ExposedField field = exposedFields.getField(name);
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
*/
private FieldReference getReference(Field field, String name) {
if (field != null) {
return new FieldReference(field);
Assert.notNull(name, "Name must not be null!");
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
if (field != null) {
// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
return new FieldReference(new ExposedField(field, exposedField.isSynthetic()));
}
return new FieldReference(exposedField);
}
if (name.contains(".")) {
// for nested field references we only check that the root field exists.
ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);
if (rootField != null) {
// We have to synthetic to true, in order to render the field-name as is.
return new FieldReference(new ExposedField(name, true));
}
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));

View File

@@ -28,11 +28,11 @@ import org.aopalliance.intercept.MethodInvocation;
import org.objenesis.Objenesis;
import org.objenesis.ObjenesisStd;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.BeanUtils;
import org.springframework.cglib.proxy.Callback;
import org.springframework.cglib.proxy.Enhancer;
import org.springframework.cglib.proxy.Factory;
import org.springframework.cglib.proxy.MethodProxy;
import org.springframework.core.SpringVersion;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.LazyLoadingException;
@@ -57,7 +57,6 @@ import com.mongodb.DBRef;
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final boolean IS_SPRING_4_OR_BETTER = SpringVersion.getVersion().startsWith("4");
private static final boolean OBJENESIS_PRESENT = ClassUtils.isPresent("org.objenesis.Objenesis", null);
private final MongoDbFactory mongoDbFactory;
@@ -138,7 +137,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
proxyFactory.setProxyTargetClass(true);
proxyFactory.setTargetClass(propertyType);
if (IS_SPRING_4_OR_BETTER || !OBJENESIS_PRESENT) {
if (!OBJENESIS_PRESENT) {
proxyFactory.addAdvice(interceptor);
return proxyFactory.getProxy();
}
@@ -374,13 +373,27 @@ public class DefaultDbRefResolver implements DbRefResolver {
}
/**
* Static class to accomodate optional dependency on Objenesis.
* Static class to accommodate optional dependency on Objenesis.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.4
*/
private static class ObjenesisProxyEnhancer {
private static final Objenesis OBJENESIS = new ObjenesisStd(true);
private static final boolean IS_SPRING_4_OR_BETTER = ClassUtils.isPresent(
"org.springframework.core.DefaultParameterNameDiscoverer", null);
private static final InstanceCreatorStrategy INSTANCE_CREATOR;
static {
if (IS_SPRING_4_OR_BETTER) {
INSTANCE_CREATOR = new Spring4ObjenesisInstanceCreatorStrategy();
} else {
INSTANCE_CREATOR = new DefaultObjenesisInstanceCreatorStrategy();
}
}
public static Object enhanceAndGet(ProxyFactory proxyFactory, Class<?> type,
org.springframework.cglib.proxy.MethodInterceptor interceptor) {
@@ -390,9 +403,77 @@ public class DefaultDbRefResolver implements DbRefResolver {
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
Factory factory = (Factory) OBJENESIS.newInstance(enhancer.createClass());
Factory factory = (Factory) INSTANCE_CREATOR.newInstance(enhancer.createClass());
factory.setCallbacks(new Callback[] { interceptor });
return factory;
}
/**
* Strategy for constructing new instances of a given {@link Class}.
*
* @author Thomas Darimont
*/
interface InstanceCreatorStrategy {
Object newInstance(Class<?> clazz);
}
/**
* An {@link InstanceCreatorStrategy} that uses Objenesis from the classpath.
*
* @author Thomas Darimont
*/
private static class DefaultObjenesisInstanceCreatorStrategy implements InstanceCreatorStrategy {
private static final Objenesis OBJENESIS = new ObjenesisStd(true);
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.ObjenesisProxyEnhancer.InstanceCreatorStrategy#newInstance(java.lang.Class)
*/
@Override
public Object newInstance(Class<?> clazz) {
return OBJENESIS.newInstance(clazz);
}
}
/**
* An {@link InstanceCreatorStrategy} that uses a repackaged version of Objenesis from Spring 4.
*
* @author Thomas Darimont
*/
private static class Spring4ObjenesisInstanceCreatorStrategy implements InstanceCreatorStrategy {
private static final String SPRING4_OBJENESIS_CLASS_NAME = "org.springframework.objenesis.ObjenesisStd";
private static final Object OBJENESIS;
private static final Method NEW_INSTANCE_METHOD;
static {
try {
Class<?> objenesisClass = ClassUtils.forName(SPRING4_OBJENESIS_CLASS_NAME,
ObjenesisProxyEnhancer.class.getClassLoader());
OBJENESIS = BeanUtils.instantiateClass(objenesisClass.getConstructor(boolean.class), true);
NEW_INSTANCE_METHOD = objenesisClass.getMethod("newInstance", Class.class);
} catch (Exception e) {
throw new RuntimeException("Could not setup Objenesis infrastructure with Spring 4 ", e);
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.ObjenesisProxyEnhancer.InstanceCreatorStrategy#newInstance(java.lang.Class)
*/
@Override
public Object newInstance(Class<?> clazz) {
try {
return NEW_INSTANCE_METHOD.invoke(OBJENESIS, clazz);
} catch (Exception e) {
throw new RuntimeException("Could not created instance for " + clazz, e);
}
}
}
}
}

View File

@@ -294,6 +294,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.isTrue(annotation != null, "The referenced property has to be mapped with @DBRef!");
}
// @see DATAMONGO-913
if (object instanceof LazyLoadingProxy) {
return ((LazyLoadingProxy) object).toDBRef();
}
return createDBRef(object, referingProperty);
}
@@ -919,7 +924,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? null : ClassTypeInformation.OBJECT;
TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);

View File

@@ -250,14 +250,31 @@ public class QueryMapper {
* type of the given value is compatible with the type of the given document field in order to deal with potential
* query field exclusions, since MongoDB uses the {@code int} {@literal 0} as an indicator for an excluded field.
*
* @param documentField
* @param documentField must not be {@literal null}.
* @param value
* @return
*/
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
return documentField.isAssociation() && value != null
&& (documentField.getProperty().getActualType().isAssignableFrom(value.getClass()) //
|| documentField.getPropertyEntity().getIdProperty().getActualType().isAssignableFrom(value.getClass()));
Assert.notNull(documentField, "Document field must not be null!");
if (value == null) {
return false;
}
if (!documentField.isAssociation()) {
return false;
}
Class<? extends Object> type = value.getClass();
MongoPersistentProperty property = documentField.getProperty();
if (property.getActualType().isAssignableFrom(type)) {
return true;
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
}
/**
@@ -289,7 +306,7 @@ public class QueryMapper {
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source);
return converter.convertToMongoType(source, entity == null ? null : entity.getTypeInformation());
}
protected Object convertAssociation(Object source, Field field) {
@@ -594,6 +611,21 @@ public class QueryMapper {
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this(name, entity, context, null);
}
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
* {@link MappingContext} with the given {@link MongoPersistentProperty}.
*
* @param name must not be {@literal null} or empty.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
* @param property may be {@literal null}.
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
MongoPersistentProperty property) {
super(name);
@@ -603,7 +635,7 @@ public class QueryMapper {
this.mappingContext = context;
this.path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
this.property = path == null ? property : path.getLeafProperty();
this.association = findAssociation();
}
@@ -613,7 +645,7 @@ public class QueryMapper {
*/
@Override
public MetadataBackedField with(String name) {
return new MetadataBackedField(name, entity, mappingContext);
return new MetadataBackedField(name, entity, mappingContext, property);
}
/*

View File

@@ -25,6 +25,7 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
@@ -79,10 +80,19 @@ public class UpdateMapper extends QueryMapper {
return createMapEntry(field, convertSimpleOrDBObject(rawValue, field.getPropertyEntity()));
}
if (!isUpdateModifier(rawValue)) {
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
if (isQuery(rawValue)) {
return createMapEntry(field,
super.getMappedObject(((Query) rawValue).getQueryObject(), field.getPropertyEntity()));
}
if (isUpdateModifier(rawValue)) {
return getMappedUpdateModifier(field, rawValue);
}
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
}
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
Object value = null;
if (rawValue instanceof Modifier) {
@@ -99,7 +109,6 @@ public class UpdateMapper extends QueryMapper {
value = modificationOperations;
} else {
throw new IllegalArgumentException(String.format("Unable to map value of type '%s'!", rawValue.getClass()));
}
@@ -119,6 +128,10 @@ public class UpdateMapper extends QueryMapper {
return value instanceof Modifier || value instanceof Modifiers;
}
private boolean isQuery(Object value) {
return value instanceof Query;
}
private DBObject getMappedValue(Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);

View File

@@ -78,6 +78,15 @@ public @interface CompoundIndex {
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.

View File

@@ -0,0 +1,56 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Index definition to span multiple keys.
*
* @author Christoph Strobl
* @since 1.5
*/
public class CompoundIndexDefinition extends Index {
private DBObject keys;
/**
* Creates a new {@link CompoundIndexDefinition} for the given keys.
*
* @param keys must not be {@literal null}.
*/
public CompoundIndexDefinition(DBObject keys) {
Assert.notNull(keys, "Keys must not be null!");
this.keys = keys;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.Index#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
BasicDBObject dbo = new BasicDBObject();
dbo.putAll(this.keys);
dbo.putAll(super.getIndexKeys());
return dbo;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,6 +25,7 @@ import java.lang.annotation.Target;
*
* @author Jon Brisbin
* @author Laurent Canet
* @author Thomas Darimont
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -37,6 +38,15 @@ public @interface GeoSpatialIndexed {
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* Name of the collection in which to create the index.
*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import com.mongodb.DBObject;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Laurent Canet
* @author Christoph Strobl
*/
public class GeospatialIndex implements IndexDefinition {
@@ -57,8 +58,6 @@ public class GeospatialIndex implements IndexDefinition {
*/
public GeospatialIndex named(String name) {
Assert.hasText(name, "Name must have text!");
this.name = name;
return this;
}
@@ -151,12 +150,12 @@ public class GeospatialIndex implements IndexDefinition {
public DBObject getIndexOptions() {
if (name == null && min == null && max == null && bucketSize == null) {
if (!StringUtils.hasText(name) && min == null && max == null && bucketSize == null) {
return null;
}
DBObject dbo = new BasicDBObject();
if (name != null) {
if (StringUtils.hasText(name)) {
dbo.put("name", name);
}
@@ -186,6 +185,7 @@ public class GeospatialIndex implements IndexDefinition {
}
break;
}
return dbo;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,13 +17,20 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Oliver Gierke
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
public class Index implements IndexDefinition {
@@ -41,6 +48,10 @@ public class Index implements IndexDefinition {
private boolean sparse = false;
private boolean background = false;
private long expire = -1;
public Index() {}
public Index(String key, Direction direction) {
@@ -104,6 +115,44 @@ public class Index implements IndexDefinition {
return this;
}
/**
* Build the index in background (non blocking).
*
* @return
* @since 1.5
*/
public Index background() {
this.background = true;
return this;
}
/**
* Specifies TTL in seconds.
*
* @param value
* @return
* @since 1.5
*/
public Index expire(long value) {
return expire(value, TimeUnit.SECONDS);
}
/**
* Specifies TTL with given {@link TimeUnit}.
*
* @param value
* @param unit
* @return
* @since 1.5
*/
public Index expire(long value, TimeUnit unit) {
Assert.notNull(unit, "TimeUnit for expiration must not be null.");
this.expire = unit.toSeconds(value);
return this;
}
/**
* @see http://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping
* @param duplicates
@@ -125,11 +174,9 @@ public class Index implements IndexDefinition {
}
public DBObject getIndexOptions() {
if (name == null && !unique) {
return null;
}
DBObject dbo = new BasicDBObject();
if (name != null) {
if (StringUtils.hasText(name)) {
dbo.put("name", name);
}
if (unique) {
@@ -141,6 +188,13 @@ public class Index implements IndexDefinition {
if (sparse) {
dbo.put("sparse", true);
}
if (background) {
dbo.put("background", true);
}
if (expire >= 0) {
dbo.put("expireAfterSeconds", expire);
}
return dbo;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright (c) 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,11 +20,11 @@ import com.mongodb.DBObject;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Christoph Strobl
*/
public interface IndexDefinition {
DBObject getIndexKeys();
DBObject getIndexOptions();
}

View File

@@ -0,0 +1,37 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
/**
* {@link IndexResolver} finds those {@link IndexDefinition}s to be created for a given class.
*
* @author Christoph Strobl
* @since 1.5
*/
interface IndexResolver {
/**
* Find and create {@link IndexDefinition}s for properties of given {@code type}. {@link IndexDefinition}s are created
* for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param type
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
*/
Iterable<? extends IndexDefinitionHolder> resolveIndexForClass(Class<?> type);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,6 +27,7 @@ import java.lang.annotation.Target;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Thomas Darimont
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@@ -63,6 +64,15 @@ public @interface Indexed {
*/
String name() default "";
/**
* If set to {@literal true} then MongoDB will ignore the given index name and instead generate a new name. Defaults
* to {@literal false}.
*
* @return
* @since 1.5
*/
boolean useGeneratedName() default false;
/**
* Colleciton name for index to be created on.
*

View File

@@ -22,19 +22,15 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
@@ -45,6 +41,7 @@ import com.mongodb.util.JSON;
* @author Philipp Schneider
* @author Johno Crawford
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
@@ -54,21 +51,37 @@ public class MongoPersistentEntityIndexCreator implements
private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<Class<?>, Boolean>();
private final MongoDbFactory mongoDbFactory;
private final MongoMappingContext mappingContext;
private final IndexResolver indexResolver;
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}
* @param mongoDbFactory must not be {@literal null}
* @param mappingContext must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
this(mappingContext, mongoDbFactory, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
* @param indexResolver must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory,
IndexResolver indexResolver) {
Assert.notNull(mongoDbFactory);
Assert.notNull(mappingContext);
Assert.notNull(indexResolver);
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.indexResolver = indexResolver;
for (MongoPersistentEntity<?> entity : mappingContext.getPersistentEntities()) {
checkForIndexes(entity);
@@ -93,87 +106,36 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void checkForIndexes(final MongoPersistentEntity<?> entity) {
final Class<?> type = entity.getType();
private void checkForIndexes(final MongoPersistentEntity<?> entity) {
Class<?> type = entity.getType();
if (!classesSeen.containsKey(type)) {
this.classesSeen.put(type, Boolean.TRUE);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Analyzing class " + type + " for index information.");
}
// Make sure indexes get created
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created compound index " + index);
}
}
}
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.isAnnotationPresent(Indexed.class)) {
Indexed index = property.findAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = property.getFieldName();
} else {
if (!name.equals(property.getName()) && index.unique() && !index.sparse()) {
// Names don't match, and sparse is not true. This situation will generate an error on the server.
if (LOGGER.isWarnEnabled()) {
LOGGER.warn("The index name " + name + " doesn't match this property name: " + property.getName()
+ ". Setting sparse=true on this index will prevent errors when inserting documents.");
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(property.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse(),
index.background(), index.expireAfterSeconds());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created property index " + index);
}
} else if (property.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = property.findAnnotation(GeoSpatialIndexed.class);
GeospatialIndex indexObject = new GeospatialIndex(property.getFieldName());
indexObject.withMin(index.min()).withMax(index.max());
indexObject.named(StringUtils.hasText(index.name()) ? index.name() : property.getName());
indexObject.typed(index.type()).withBucketSize(index.bucketSize())
.withAdditionalField(index.additionalField());
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
mongoDbFactory.getDb().getCollection(collection)
.ensureIndex(indexObject.getIndexKeys(), indexObject.getIndexOptions());
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Created %s for entity %s in collection %s! ", indexObject, entity.getType(),
collection));
}
}
}
});
classesSeen.put(type, true);
checkForAndCreateIndexes(entity);
}
}
private void checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
if (entity.findAnnotation(Document.class) != null) {
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexForClass(entity.getType())) {
createIndex(indexToCreate);
}
}
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
}
/**
* Returns whether the current index creator was registered for the given {@link MappingContext}.
*
@@ -183,33 +145,4 @@ public class MongoPersistentEntityIndexCreator implements
public boolean isIndexCreatorFor(MappingContext<?, ?> context) {
return this.mappingContext.equals(context);
}
/**
* Triggers the actual index creation.
*
* @param collection the collection to create the index in
* @param name the name of the index about to be created
* @param indexDefinition the index definition
* @param unique whether it shall be a unique index
* @param dropDups whether to drop duplicates
* @param sparse sparse or not
* @param background whether the index will be created in the background
* @param expireAfterSeconds the time to live for documents in the collection
*/
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background, int expireAfterSeconds) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
opts.put("background", background);
if (expireAfterSeconds != -1) {
opts.put("expireAfterSeconds", expireAfterSeconds);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}

View File

@@ -0,0 +1,344 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.TimeUnit;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* {@link IndexResolver} implementation inspecting {@link MongoPersistentEntity} for index definitions to be
* created. <br />
* All {@link MongoPersistentProperty}s of the {@link MongoPersistentEntity} are inspected for potential indexes by
* scanning related annotations.
*
* @author Christoph Strobl
* @since 1.5
*/
public class MongoPersistentEntityIndexResolver implements IndexResolver {
private final MongoMappingContext mappingContext;
/**
* Create new {@link MongoPersistentEntityIndexResolver}.
*
* @param mappingContext must not be {@literal null}.
*/
public MongoPersistentEntityIndexResolver(MongoMappingContext mappingContext) {
Assert.notNull(mappingContext, "Mapping context must not be null in order to resolve index definitions");
this.mappingContext = mappingContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexResolver#resolveIndexForClass(java.lang.Class)
*/
@Override
public List<IndexDefinitionHolder> resolveIndexForClass(Class<?> type) {
return resolveIndexForEntity(mappingContext.getPersistentEntity(type));
}
/**
* Resolve the {@link IndexDefinition}s for given {@literal root} entity by traversing {@link MongoPersistentProperty}
* scanning for the index annotations {@link Indexed}, {@link CompoundIndex} and {@link GeoSpatialIndexed}. The given
* {@literal root} therefore has to be annotated with {@link Document}.
*
* @param root must not be null.
* @return List of {@link IndexDefinitionHolder}. Will never be {@code null}.
* @throws IllegalArgumentException in case of missing {@link Document} annotation marking root entities.
*/
public List<IndexDefinitionHolder> resolveIndexForEntity(final MongoPersistentEntity<?> root) {
Assert.notNull(root, "Index cannot be resolved for given 'null' entity.");
Document document = root.findAnnotation(Document.class);
Assert.notNull(document, "Given entity is not collection root.");
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", root.getCollection(), root.getType()));
root.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
if (persistentProperty.isEntity()) {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(),
persistentProperty.getFieldName(), root.getCollection()));
}
IndexDefinitionHolder indexDefinitionHolder = createIndexDefinitionHolderForProperty(
persistentProperty.getFieldName(), root.getCollection(), persistentProperty);
if (indexDefinitionHolder != null) {
indexInformation.add(indexDefinitionHolder);
}
}
});
return indexInformation;
}
/**
* Recursively resolve and inspect properties of given {@literal type} for indexes to be created.
*
* @param type The type to resolve indexes for.
* @param path The properties {@literal "dot"} path from the document root.
* @param collection The collection the index is to be created in.
* @return List of {@link IndexDefinitionHolder} representing indexes for given type and its referenced property
* types. Will never be {@code null}.
*/
private List<IndexDefinitionHolder> resolveIndexForClass(Class<?> type, final String path, final String collection) {
final List<IndexDefinitionHolder> indexInformation = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions(path, collection, type));
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
String propertyDotPath = (StringUtils.hasText(path) ? path + "." : "") + persistentProperty.getFieldName();
if (persistentProperty.isEntity()) {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(), propertyDotPath, collection));
}
IndexDefinitionHolder indexDefinitionHolder = createIndexDefinitionHolderForProperty(propertyDotPath,
collection, persistentProperty);
if (indexDefinitionHolder != null) {
indexInformation.add(indexDefinitionHolder);
}
}
});
return indexInformation;
}
private IndexDefinitionHolder createIndexDefinitionHolderForProperty(String dotPath, String collection,
MongoPersistentProperty persistentProperty) {
if (persistentProperty.isAnnotationPresent(Indexed.class)) {
return createIndexDefinition(dotPath, collection, persistentProperty);
} else if (persistentProperty.isAnnotationPresent(GeoSpatialIndexed.class)) {
return createGeoSpatialIndexDefinition(dotPath, collection, persistentProperty);
}
return null;
}
private List<IndexDefinitionHolder> potentiallyCreateCompoundIndexDefinitions(String dotPath, String collection,
Class<?> type) {
if (AnnotationUtils.findAnnotation(type, CompoundIndexes.class) == null) {
return Collections.emptyList();
}
return createCompoundIndexDefinitions(dotPath, collection, type);
}
/**
* Create {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} for {@link CompoundIndexes} of given type.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection The collection to be used if the annotation does not define one.
* @param type The type to be inspected for {@link CompoundIndexes}.
* @return List of {@link IndexDefinitionHolder}. Will never be {@code null}.
*/
protected List<IndexDefinitionHolder> createCompoundIndexDefinitions(String dotPath, String fallbackCollection,
Class<?> type) {
CompoundIndexes indexes = AnnotationUtils.findAnnotation(type, CompoundIndexes.class);
List<IndexDefinitionHolder> indexDefinitions = new ArrayList<MongoPersistentEntityIndexResolver.IndexDefinitionHolder>(
indexes.value().length);
for (CompoundIndex index : indexes.value()) {
String path = StringUtils.hasText(index.name()) ? index.name() : dotPath;
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
CompoundIndexDefinition indexDefinition = new CompoundIndexDefinition((DBObject) JSON.parse(index.def()));
if (!index.useGeneratedName()) {
indexDefinition.named(index.name());
}
if (index.unique()) {
indexDefinition.unique(index.dropDups() ? Duplicates.DROP : Duplicates.RETAIN);
}
if (index.sparse()) {
indexDefinition.sparse();
}
if (index.background()) {
indexDefinition.background();
}
if (index.expireAfterSeconds() >= 0) {
indexDefinition.expire(index.expireAfterSeconds(), TimeUnit.SECONDS);
}
indexDefinitions.add(new IndexDefinitionHolder(path, indexDefinition, collection));
}
return indexDefinitions;
}
/**
* Creates {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} out of {@link Indexed} for given
* {@link MongoPersistentProperty}.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection The collection to be used if the annotation does not define one.
* @param persistentProperty The property carrying the {@link Indexed} annotation.
* @return The {@link IndexDefinitionHolder} for the given property.
*/
protected IndexDefinitionHolder createIndexDefinition(String dotPath, String fallbackCollection,
MongoPersistentProperty persistentProperty) {
Indexed index = persistentProperty.findAnnotation(Indexed.class);
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
Index indexDefinition = new Index();
if (!index.useGeneratedName()) {
indexDefinition.named(StringUtils.hasText(index.name()) ? index.name() : persistentProperty.getFieldName());
}
indexDefinition.on(persistentProperty.getFieldName(),
IndexDirection.ASCENDING.equals(index.direction()) ? Sort.Direction.ASC : Sort.Direction.DESC);
if (index.unique()) {
indexDefinition.unique(index.dropDups() ? Duplicates.DROP : Duplicates.RETAIN);
}
if (index.sparse()) {
indexDefinition.sparse();
}
if (index.background()) {
indexDefinition.background();
}
if (index.expireAfterSeconds() >= 0) {
indexDefinition.expire(index.expireAfterSeconds(), TimeUnit.SECONDS);
}
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
/**
* Creates {@link IndexDefinition} wrapped in {@link IndexDefinitionHolder} out of {@link GeoSpatialIndexed} for
* {@link MongoPersistentProperty}.
*
* @param dotPath The properties {@literal "dot"} path representation from its document root.
* @param fallbackCollection The collection to be used if the annotation does not define one.
* @param persistentProperty The property carrying the {@link GeoSpatialIndexed} annotation.
* @return The {@link IndexDefinitionHolder} for the given property.
*/
protected IndexDefinitionHolder createGeoSpatialIndexDefinition(String dotPath, String fallbackCollection,
MongoPersistentProperty persistentProperty) {
GeoSpatialIndexed index = persistentProperty.findAnnotation(GeoSpatialIndexed.class);
String collection = StringUtils.hasText(index.collection()) ? index.collection() : fallbackCollection;
GeospatialIndex indexDefinition = new GeospatialIndex(dotPath);
indexDefinition.withBits(index.bits());
indexDefinition.withMin(index.min()).withMax(index.max());
if (!index.useGeneratedName()) {
indexDefinition.named(StringUtils.hasText(index.name()) ? index.name() : persistentProperty.getName());
}
indexDefinition.typed(index.type()).withBucketSize(index.bucketSize()).withAdditionalField(index.additionalField());
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
/**
* Implementation of {@link IndexDefinition} holding additional (property)path information used for creating the
* index. The path itself is the properties {@literal "dot"} path representation from its root document.
*
* @author Christoph Strobl
* @since 1.5
*/
public static class IndexDefinitionHolder implements IndexDefinition {
private String path;
private IndexDefinition indexDefinition;
private String collection;
/**
* Creates a new {@link IndexDefinitionHolder} from the given dot path, {@link IndexDefinition} and collection.
*
* @param path The properties {@literal "dot"} path from the document root.
* @param definition The {@link IndexDefinition} to be held.
* @param collection The collection the index is to be created in.
*/
public IndexDefinitionHolder(String path, IndexDefinition definition, String collection) {
this.path = path;
this.indexDefinition = definition;
this.collection = collection;
}
public String getCollection() {
return collection;
}
/**
* Get the {@literal "dot"} path used to create the index.
*
* @return
*/
public String getPath() {
return path;
}
/**
* Get the {@literal raw} {@link IndexDefinition}.
*
* @return
*/
public IndexDefinition getIndexDefinition() {
return indexDefinition;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexKeys()
*/
@Override
public DBObject getIndexKeys() {
return indexDefinition.getIndexKeys();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexOptions()
*/
@Override
public DBObject getIndexOptions() {
return indexDefinition.getIndexOptions();
}
}
}
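
A minimal usage sketch of the new resolver. It assumes the hypothetical User type from the earlier sketch is available in the same package and that the MongoMappingContext registers entities lazily when asked for them; the printed output format is illustrative only.

import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class IndexResolutionSketch {

	public static void main(String[] args) {

		MongoMappingContext mappingContext = new MongoMappingContext();
		MongoPersistentEntityIndexResolver resolver = new MongoPersistentEntityIndexResolver(mappingContext);

		// Resolves compound, plain and geo-spatial index definitions for User and its nested entities.
		for (IndexDefinitionHolder holder : resolver.resolveIndexForClass(User.class)) {
			System.out.println(holder.getCollection() + " <- keys: " + holder.getIndexKeys()
					+ ", options: " + holder.getIndexOptions() + ", path: " + holder.getPath());
		}
	}
}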

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.mapping;
import java.util.Locale;
/**
* {@link FieldNamingStrategy} that abbreviates field names by using the very first letter of the camel case parts of
* the {@link MongoPersistentProperty}'s name.
@@ -24,23 +22,18 @@ import java.util.Locale;
* @since 1.3
* @author Oliver Gierke
*/
public class CamelCaseAbbreviatingFieldNamingStrategy implements FieldNamingStrategy {
public class CamelCaseAbbreviatingFieldNamingStrategy extends CamelCaseSplittingFieldNamingStrategy {
private static final String CAMEL_CASE_PATTERN = "(?<!(^|[A-Z]))(?=[A-Z])|(?<!^)(?=[A-Z][a-z])";
public CamelCaseAbbreviatingFieldNamingStrategy() {
super("");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
* @see org.springframework.data.mongodb.core.mapping.CamelCaseSplittingFieldNamingStrategy#preparePart(java.lang.String)
*/
public String getFieldName(MongoPersistentProperty property) {
String[] parts = property.getName().split(CAMEL_CASE_PATTERN);
StringBuilder builder = new StringBuilder();
for (String part : parts) {
builder.append(part.substring(0, 1).toLowerCase(Locale.US));
}
return builder.toString();
@Override
protected String preparePart(String part) {
return part.substring(0, 1);
}
}

View File

@@ -0,0 +1,79 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.util.ArrayList;
import java.util.List;
import org.springframework.data.util.ParsingUtils;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Configurable {@link FieldNamingStrategy} that splits up camel-case property names and reconcatenates them using a
* configured delimiter. Individual parts of the name can be manipulated using {@link #preparePart(String)}.
*
* @author Oliver Gierke
* @since 1.5
*/
public class CamelCaseSplittingFieldNamingStrategy implements FieldNamingStrategy {
private final String delimiter;
/**
* Creates a new {@link CamelCaseSplittingFieldNamingStrategy}.
*
* @param delimiter must not be {@literal null}.
*/
public CamelCaseSplittingFieldNamingStrategy(String delimiter) {
Assert.notNull(delimiter, "Delimiter must not be null!");
this.delimiter = delimiter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.FieldNamingStrategy#getFieldName(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
@Override
public String getFieldName(MongoPersistentProperty property) {
List<String> parts = ParsingUtils.splitCamelCaseToLower(property.getName());
List<String> result = new ArrayList<String>();
for (String part : parts) {
String candidate = preparePart(part);
if (StringUtils.hasText(candidate)) {
result.add(candidate);
}
}
return StringUtils.collectionToDelimitedString(result, delimiter);
}
/**
* Callback to prepare the uncapitalized part obtained from the split up of the camel case source. Default
* implementation returns the part as is.
*
* @param part
* @return
*/
protected String preparePart(String part) {
return part;
}
}
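
As a hypothetical illustration of the callback-based design, a strategy that joins the lower-cased camel-case parts with a dash instead of dropping the delimiter or using an underscore. KebabCaseFieldNamingStrategy is not part of this change set.

// Hypothetical subclass: "fooBar" would be mapped to the field name "foo-bar".
class KebabCaseFieldNamingStrategy extends CamelCaseSplittingFieldNamingStrategy {

	public KebabCaseFieldNamingStrategy() {
		super("-"); // delimiter used to re-join the lower-cased parts
	}
}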

View File

@@ -21,6 +21,7 @@ package org.springframework.data.mongodb.core.mapping;
* @see DocumentField
* @see PropertyNameFieldNamingStrategy
* @see CamelCaseAbbreviatingFieldNamingStrategy
* @see SnakeCaseFieldNamingStrategy
* @since 1.3
* @author Oliver Gierke
*/

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
/**
* {@link FieldNamingStrategy} that translates typical camel case Java property names to lower case JSON element names,
* separated by underscores.
*
* @since 1.5
* @author Ryan Tenney
* @author Oliver Gierke
*/
public class SnakeCaseFieldNamingStrategy extends CamelCaseSplittingFieldNamingStrategy {
/**
* Creates a new {@link SnakeCaseFieldNamingStrategy}.
*/
public SnakeCaseFieldNamingStrategy() {
super("_");
}
}
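
A short sketch of wiring the new strategy programmatically. It assumes MongoMappingContext exposes the fieldNamingStrategy property that the XML namespace sets on the mapping context bean (see the converter parser tests further down); the surrounding configuration class is made up.

import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.SnakeCaseFieldNamingStrategy;

class SnakeCaseConfigurationSketch {

	MongoMappingContext mongoMappingContext() {

		MongoMappingContext mappingContext = new MongoMappingContext();
		// A property named "firstName" is now persisted as the document field "first_name".
		mappingContext.setFieldNamingStrategy(new SnakeCaseFieldNamingStrategy());
		return mappingContext;
	}
}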

View File

@@ -126,13 +126,13 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
public void onAfterDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
LOG.debug("onAfterDelete({})", dbo);
}
}
public void onBeforeDelete(DBObject dbo) {
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert({})", dbo);
LOG.debug("onBeforeDelete({})", dbo);
}
}
}
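
For context, a minimal listener sketch exercising the corrected callbacks. The listener class and its output are made up; only the onBeforeDelete/onAfterDelete signatures are taken from the diff above.

import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import com.mongodb.DBObject;

// Hypothetical listener relying on the delete callbacks whose debug messages were fixed above.
class DeleteAuditingListener extends AbstractMongoEventListener<Object> {

	@Override
	public void onBeforeDelete(DBObject dbo) {
		System.out.println("About to delete documents matching: " + dbo);
	}

	@Override
	public void onAfterDelete(DBObject dbo) {
		System.out.println("Deleted documents matching: " + dbo);
	}
}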

View File

@@ -105,6 +105,8 @@ public class MongoRepositoryConfigurationExtension extends RepositoryConfigurati
@Override
public void registerBeansForRoot(BeanDefinitionRegistry registry, RepositoryConfigurationSource configurationSource) {
super.registerBeansForRoot(registry, configurationSource);
if (!registry.containsBeanDefinition(BeanNames.MAPPING_CONTEXT_BEAN_NAME)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);

View File

@@ -3,4 +3,5 @@ http\://www.springframework.org/schema/data/mongo/spring-mongo-1.1.xsd=org/sprin
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.2.xsd=org/springframework/data/mongodb/config/spring-mongo-1.2.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.3.xsd=org/springframework/data/mongodb/config/spring-mongo-1.3.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.4.xsd=org/springframework/data/mongodb/config/spring-mongo-1.4.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.4.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.5.xsd=org/springframework/data/mongodb/config/spring-mongo-1.5.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.5.xsd

View File

@@ -0,0 +1,657 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/mongo"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.MongoFactoryBean"><![CDATA[
Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.Mongo"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="db-factory">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="authentication-dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the authentication database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="create-query-indexes" type="xsd:boolean" default="false">
<xsd:annotation>
<xsd:documentation>
Enables creation of indexes for queries that get derived from the method name
and thus reference domain class properties. Defaults to false.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attributeGroup>
<xsd:element name="repositories">
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="custom-converters" minOccurs="0">
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="base-package" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The base package in which to scan for entities annotated with @Document
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="type-mapper-ref" type="typeMapperRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTypeMapper to be used by this MappingMongoConverter.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mapping.model.MappingContext">
The reference to a MappingContext. Will default to 'mappingContext'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
Disables JSR-303 validation on MongoDB documents before they are saved. By default it is set to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="abbreviate-field-names" use="optional" default="false">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy">
Enables abbreviating the field names for domain class properties to the
first character of their camel case names, e.g. fooBar -> fb.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="field-naming-strategy-ref" type="fieldNamingStrategyRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.FieldNamingStrategy">
The reference to a FieldNamingStrategy to be used to derive document field names from property names.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the Mongo object that determines which server to monitor (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="auditing">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="org.springframework.data.mongodb.core.mapping.event.AuditingEventListener" />
<tool:exports type="org.springframework.data.auditing.IsNewAwareAuditingHandler" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:attributeGroup ref="repository:auditing-attributes" />
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" />
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="typeMapperRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoTypeMapper"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mapping.model.MappingContext"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="fieldNamingStrategyRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.mapping.FieldNamingStrategy"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoTemplateRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.MongoTemplate"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.MongoFactoryBean"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="sslSocketFactoryRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="javax.net.ssl.SSLSocketFactory"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
<xsd:restriction base="xsd:string">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. The driver blocks if the pool is exhausted. Default is 10. The system property MONGO.POOLSIZE can override it.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost for the number of threads allowed to block waiting for a connection. Default is 5.
If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
then up to 50 threads can block; any more and an exception will be thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum time a thread blocks waiting for a connection. Default is 120000 ms (2 minutes).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. The default is 0, which means no timeout (infinite).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout in milliseconds. The default is 0, which means no timeout (infinite).
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep-alive flag that controls whether socket keep-alive is enabled. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver retries automatically on connect. Default is false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Default is 0, which means to use the default of 15s if autoConnectRetry is on.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-number" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Specifies the number of servers to wait for on a write operation, and the exception-raising behavior. Corresponds to the 'w' option of the getlasterror command. Defaults to 0.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls the timeout for write operations in milliseconds. Corresponds to the 'wtimeout' option of the getlasterror command. Defaults to 0 (indefinite). A value greater than zero is the number of milliseconds to wait.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-fsync" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver is allowed to read from secondaries or slaves. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="ssl" type="xsd:boolean">
<xsd:annotation>
<xsd:documentation><![CDATA[
Controls whether the driver should use an SSL connection. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="ssl-socket-factory-ref" type="sslSocketFactoryRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The SSLSocketFactory to use for the SSL connection. If none is configured here, SSLSocketFactory#getDefault() will be used.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:group name="beanElementGroup">
<xsd:choice>
<xsd:element ref="beans:bean"/>
<xsd:element ref="beans:ref"/>
</xsd:choice>
</xsd:group>
<xsd:complexType name="customConverterType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Element defining a custom converter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:group ref="beanElementGroup" minOccurs="0" maxOccurs="1"/>
<xsd:attribute name="ref" type="xsd:string">
<xsd:annotation>
<xsd:documentation>
A reference to a custom converter.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref"/>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:simpleType name="converterRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:element name="template">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoTemplate.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MongoTemplate definition (by default "mongoTemplate").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string"
use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to
type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="gridFsTemplate">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a GridFsTemplate.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the GridFsTemplate definition (by default "gridFsTemplate").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="bucket" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The GridFs bucket string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
</xsd:schema>

View File

@@ -21,9 +21,12 @@ import static org.junit.Assert.*;
import java.util.Collections;
import java.util.Set;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanDefinitionParsingException;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.convert.TypeDescriptor;
@@ -45,17 +48,13 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParserIntegrationTests {
DefaultListableBeanFactory factory;
@Rule public ExpectedException exception = ExpectedException.none();
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter.xml"));
}
DefaultListableBeanFactory factory;
/**
* @see DATAMONGO-243
@@ -63,6 +62,7 @@ public class MappingMongoConverterParserIntegrationTests {
@Test
public void allowsDbFactoryRefAttribute() {
loadValidConfiguration();
factory.getBeanDefinition("converter");
factory.getBean("converter");
}
@@ -73,6 +73,7 @@ public class MappingMongoConverterParserIntegrationTests {
@Test
public void hasCustomTypeMapper() {
loadValidConfiguration();
MappingMongoConverter converter = factory.getBean("converter", MappingMongoConverter.class);
MongoTypeMapper customMongoTypeMapper = factory.getBean(CustomMongoTypeMapper.class);
@@ -85,6 +86,7 @@ public class MappingMongoConverterParserIntegrationTests {
@Test
public void scansForConverterAndSetsUpCustomConversionsAccordingly() {
loadValidConfiguration();
CustomConversions conversions = factory.getBean(CustomConversions.class);
assertThat(conversions.hasCustomWriteTarget(Person.class), is(true));
assertThat(conversions.hasCustomWriteTarget(Account.class), is(true));
@@ -96,6 +98,7 @@ public class MappingMongoConverterParserIntegrationTests {
@Test
public void activatesAbbreviatingPropertiesCorrectly() {
loadValidConfiguration();
BeanDefinition definition = factory.getBeanDefinition("abbreviatingConverter.mongoMappingContext");
Object value = definition.getPropertyValues().getPropertyValue("fieldNamingStrategy").getValue();
@@ -104,6 +107,47 @@ public class MappingMongoConverterParserIntegrationTests {
assertThat(strategy.getBeanClassName(), is(CamelCaseAbbreviatingFieldNamingStrategy.class.getName()));
}
/**
* @see DATAMONGO-866
*/
@Test
public void rejectsInvalidFieldNamingStrategyConfiguration() {
exception.expect(BeanDefinitionParsingException.class);
exception.expectMessage("abbreviate-field-names");
exception.expectMessage("field-naming-strategy-ref");
BeanDefinitionRegistry factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter-invalid.xml"));
}
/**
* @see DATAMONGO-892
*/
@Test
public void shouldThrowBeanDefinitionParsingExceptionIfConverterDefinedAsNestedBean() {
exception.expect(BeanDefinitionParsingException.class);
exception.expectMessage("Mongo Converter must not be defined as nested bean.");
loadNestedBeanConfiguration();
}
private void loadValidConfiguration() {
this.loadConfiguration("namespace/converter.xml");
}
private void loadNestedBeanConfiguration() {
this.loadConfiguration("namespace/converter-nested-bean-definition.xml");
}
private void loadConfiguration(String configLocation) {
factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource(configLocation));
}
@Component
public static class SampleConverter implements Converter<Person, DBObject> {
public DBObject convert(Person source) {

View File

@@ -34,7 +34,6 @@ import java.util.List;
import java.util.Map;
import org.bson.types.ObjectId;
import org.hamcrest.CoreMatchers;
import org.joda.time.DateTime;
import org.junit.After;
import org.junit.Before;
@@ -187,6 +186,8 @@ public class MongoTemplateTests {
template.dropCollection(DocumentWithCollectionOfSimpleType.class);
template.dropCollection(DocumentWithMultipleCollections.class);
template.dropCollection(DocumentWithDBRefCollection.class);
template.dropCollection(SomeContent.class);
template.dropCollection(SomeTemplate.class);
}
@Test
@@ -299,6 +300,9 @@ public class MongoTemplateTests {
@Test
public void rejectsDuplicateIdInInsertAll() {
thrown.expect(DataIntegrityViolationException.class);
thrown.expectMessage("E11000 duplicate key error index: database.person.$_id_");
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
@@ -310,15 +314,7 @@ public class MongoTemplateTests {
records.add(person);
records.add(person);
try {
template.insertAll(records);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(
e.getMessage(),
CoreMatchers
.startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
}
template.insertAll(records);
}
@Test
@@ -2599,6 +2595,151 @@ public class MongoTemplateTests {
assertThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(3));
}
/**
* @see DATAMONGO-888
*/
@Test
public void sortOnIdFieldPropertyShouldBeMappedCorrectly() {
DoucmentWithNamedIdField one = new DoucmentWithNamedIdField();
one.someIdKey = "1";
one.value = "a";
DoucmentWithNamedIdField two = new DoucmentWithNamedIdField();
two.someIdKey = "2";
two.value = "b";
template.save(one);
template.save(two);
Query query = query(where("_id").in("1", "2")).with(new Sort(Direction.DESC, "someIdKey"));
assertThat(template.find(query, DoucmentWithNamedIdField.class), contains(two, one));
}
/**
* @see DATAMONGO-888
*/
@Test
public void sortOnAnnotatedFieldPropertyShouldBeMappedCorrectly() {
DoucmentWithNamedIdField one = new DoucmentWithNamedIdField();
one.someIdKey = "1";
one.value = "a";
DoucmentWithNamedIdField two = new DoucmentWithNamedIdField();
two.someIdKey = "2";
two.value = "b";
template.save(one);
template.save(two);
Query query = query(where("_id").in("1", "2")).with(new Sort(Direction.DESC, "value"));
assertThat(template.find(query, DoucmentWithNamedIdField.class), contains(two, one));
}
/**
* @see DATAMONGO-913
*/
@Test
public void shouldRetrieveInitializedValueFromDbRefAssociationAfterLoad() {
SomeContent content = new SomeContent();
content.id = "content-1";
content.name = "Content 1";
content.text = "Some text";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "template-1";
tmpl.content = content;
template.save(tmpl);
SomeTemplate result = template.findOne(query(where("content").is(tmpl.getContent())), SomeTemplate.class);
assertThat(result, is(notNullValue()));
assertThat(result.getContent(), is(notNullValue()));
assertThat(result.getContent().getId(), is(notNullValue()));
assertThat(result.getContent().getName(), is(notNullValue()));
assertThat(result.getContent().getText(), is(content.getText()));
}
/**
* @see DATAMONGO-913
*/
@Test
public void shouldReuseExistingDBRefInQueryFromDbRefAssociationAfterLoad() {
SomeContent content = new SomeContent();
content.id = "content-1";
content.name = "Content 1";
content.text = "Some text";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "template-1";
tmpl.content = content;
template.save(tmpl);
SomeTemplate result = template.findOne(query(where("content").is(tmpl.getContent())), SomeTemplate.class);
// Use lazy-loading-proxy in query
result = template.findOne(query(where("content").is(result.getContent())), SomeTemplate.class);
assertNotNull(result.getContent().getName());
assertThat(result.getContent().getName(), is(content.getName()));
}
static class DoucmentWithNamedIdField {
@Id String someIdKey;
@Field(value = "val")//
String value;
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (someIdKey == null ? 0 : someIdKey.hashCode());
result = prime * result + (value == null ? 0 : value.hashCode());
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (!(obj instanceof DoucmentWithNamedIdField)) {
return false;
}
DoucmentWithNamedIdField other = (DoucmentWithNamedIdField) obj;
if (someIdKey == null) {
if (other.someIdKey != null) {
return false;
}
} else if (!someIdKey.equals(other.someIdKey)) {
return false;
}
if (value == null) {
if (other.value != null) {
return false;
}
} else if (!value.equals(other.value)) {
return false;
}
return true;
}
}
static class DocumentWithDBRefCollection {
@Id public String id;
@@ -2793,6 +2934,15 @@ public class MongoTemplateTests {
String id;
String text;
String name;
public String getName() {
return name;
}
public String getId() {
return id;
}
public String getText() {
return text;

View File

@@ -235,7 +235,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
org.mockito.Matchers.isNull(DBObject.class), org.mockito.Matchers.isNull(DBObject.class), eq(false),
captor.capture(), eq(false), eq(false));
Assert.assertThat(captor.getValue().get("$inc"), Is.<Object> is(new BasicDBObject("version", 1)));
Assert.assertThat(captor.getValue().get("$inc"), Is.<Object> is(new BasicDBObject("version", 1L)));
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -33,14 +33,13 @@ import com.mongodb.DBCursor;
* Unit tests for {@link QueryCursorPreparer}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryCursorPreparerUnitTests {
@Mock
MongoDbFactory factory;
@Mock
DBCursor cursor;
@Mock MongoDbFactory factory;
@Mock DBCursor cursor;
/**
* @see DATAMONGO-185
@@ -50,7 +49,7 @@ public class QueryCursorPreparerUnitTests {
Query query = query(where("foo").is("bar")).withHint("hint");
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query);
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query, null);
preparer.prepare(cursor);
verify(cursor).hint("hint");

View File

@@ -47,6 +47,7 @@ import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.AggregationTests.CarDescriptor.Entry;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.Version;
import org.springframework.test.context.ContextConfiguration;
@@ -822,6 +823,40 @@ public class AggregationTests {
assertThat(invoice.getTotalAmount(), is(closeTo(9.877, 000001)));
}
/**
* @see DATAMONGO-924
*/
@Test
public void shouldAllowGroupingByAliasedFieldDefinedInFormerAggregationStage() {
mongoTemplate.dropCollection(CarPerson.class);
CarPerson person1 = new CarPerson("first1", "last1", new CarDescriptor.Entry("MAKE1", "MODEL1", 2000),
new CarDescriptor.Entry("MAKE1", "MODEL2", 2001), new CarDescriptor.Entry("MAKE2", "MODEL3", 2010),
new CarDescriptor.Entry("MAKE3", "MODEL4", 2014));
CarPerson person2 = new CarPerson("first2", "last2", new CarDescriptor.Entry("MAKE3", "MODEL4", 2014));
CarPerson person3 = new CarPerson("first3", "last3", new CarDescriptor.Entry("MAKE2", "MODEL5", 2011));
mongoTemplate.save(person1);
mongoTemplate.save(person2);
mongoTemplate.save(person3);
TypedAggregation<CarPerson> agg = Aggregation.newAggregation(CarPerson.class,
unwind("descriptors.carDescriptor.entries"), //
project() //
.and("descriptors.carDescriptor.entries.make").as("make") //
.and("descriptors.carDescriptor.entries.model").as("model") //
.and("firstName").as("firstName") //
.and("lastName").as("lastName"), //
group("make"));
AggregationResults<DBObject> result = mongoTemplate.aggregate(agg, DBObject.class);
assertThat(result.getMappedResults(), hasSize(3));
}
private void assertLikeStats(LikeStats like, String id, long count) {
assertThat(like, is(notNullValue()));
@@ -938,4 +973,52 @@ public class AggregationTests {
this.createDate = createDate;
}
}
@org.springframework.data.mongodb.core.mapping.Document
static class CarPerson {
@Id private String id;
private String firstName;
private String lastName;
private Descriptors descriptors;
public CarPerson(String firstname, String lastname, Entry... entries) {
this.firstName = firstname;
this.lastName = lastname;
this.descriptors = new Descriptors();
this.descriptors.carDescriptor = new CarDescriptor(entries);
}
}
static class Descriptors {
private CarDescriptor carDescriptor;
}
static class CarDescriptor {
private List<Entry> entries = new ArrayList<AggregationTests.CarDescriptor.Entry>();
public CarDescriptor(Entry... entries) {
for (Entry entry : entries) {
this.entries.add(entry);
}
}
static class Entry {
private String make;
private String model;
private int year;
public Entry() {}
public Entry(String make, String model, int year) {
this.make = make;
this.model = model;
this.year = year;
}
}
}
}

View File

@@ -179,4 +179,49 @@ public class AggregationUnitTests {
DBObject fields = getAsDBObject(secondProjection, "$group");
assertThat(fields.get("foosum"), is((Object) new BasicDBObject("$sum", "$foo")));
}
/**
* @see DATAMONGO-908
*/
@Test
public void shouldSupportReferingToNestedPropertiesInGroupOperation() {
DBObject agg = newAggregation( //
project("cmsParameterId", "rules"), //
unwind("rules"), //
group("cmsParameterId", "rules.ruleType").count().as("totol") //
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg, is(notNullValue()));
DBObject group = ((List<DBObject>) agg.get("pipeline")).get(2);
DBObject fields = getAsDBObject(group, "$group");
DBObject id = getAsDBObject(fields, "_id");
assertThat(id.get("ruleType"), is((Object) "$rules.ruleType"));
}
/**
* @see DATAMONGO-924
*/
@Test
public void referencingProjectionAliasesFromPreviousStepShouldReferToTheSameFieldTarget() {
DBObject agg = newAggregation( //
project().and("foo.bar").as("ba") //
, project().and("ba").as("b") //
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT);
DBObject projection0 = extractPipelineElement(agg, 0, "$project");
assertThat(projection0, is((DBObject) new BasicDBObject("ba", "$foo.bar")));
DBObject projection1 = extractPipelineElement(agg, 1, "$project");
assertThat(projection1, is((DBObject) new BasicDBObject("b", "$ba")));
}
private DBObject extractPipelineElement(DBObject agg, int index, String operation) {
List<DBObject> pipeline = (List<DBObject>) agg.get("pipeline");
return (DBObject) pipeline.get(index).get(operation);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,25 +17,40 @@ package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import java.util.Arrays;
import java.util.List;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link TypeBasedAggregationOperationContext}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public class TypeBasedAggregationOperationContextUnitTests {
@@ -89,6 +104,104 @@ public class TypeBasedAggregationOperationContextUnitTests {
assertThat(context.getReference("id"), is(new FieldReference(new ExposedField(Fields.field("id", "_id"), true))));
}
/**
* @see DATAMONGO-912
*/
@Test
public void shouldUseCustomConversionIfPresentAndConversionIsRequiredInFirstStage() {
CustomConversions customConversions = customAgeConversions();
converter.setCustomConversions(customConversions);
customConversions.registerConvertersIn((GenericConversionService) converter.getConversionService());
AggregationOperationContext context = getContext(FooPerson.class);
MatchOperation matchStage = match(Criteria.where("age").is(new Age(10)));
ProjectionOperation projectStage = project("age", "name");
DBObject agg = newAggregation(matchStage, projectStage).toDbObject("test", context);
DBObject age = getValue((DBObject) getValue(getPipelineElementFromAggregationAt(agg, 0), "$match"), "age");
assertThat(age, is((DBObject) new BasicDBObject("v", 10)));
}
/**
* @see DATAMONGO-912
*/
@Test
public void shouldUseCustomConversionIfPresentAndConversionIsRequiredInLaterStage() {
CustomConversions customConversions = customAgeConversions();
converter.setCustomConversions(customConversions);
customConversions.registerConvertersIn((GenericConversionService) converter.getConversionService());
AggregationOperationContext context = getContext(FooPerson.class);
MatchOperation matchStage = match(Criteria.where("age").is(new Age(10)));
ProjectionOperation projectStage = project("age", "name");
DBObject agg = newAggregation(projectStage, matchStage).toDbObject("test", context);
DBObject age = getValue((DBObject) getValue(getPipelineElementFromAggregationAt(agg, 1), "$match"), "age");
assertThat(age, is((DBObject) new BasicDBObject("v", 10)));
}
@Document(collection = "person")
public static class FooPerson {
final ObjectId id;
final String name;
final Age age;
@PersistenceConstructor
FooPerson(ObjectId id, String name, Age age) {
this.id = id;
this.name = name;
this.age = age;
}
}
public static class Age {
final int value;
Age(int value) {
this.value = value;
}
}
public CustomConversions customAgeConversions() {
return new CustomConversions(Arrays.<Converter<?, ?>> asList(ageWriteConverter(), ageReadConverter()));
}
Converter<Age, DBObject> ageWriteConverter() {
return new Converter<Age, DBObject>() {
@Override
public DBObject convert(Age age) {
return new BasicDBObject("v", age.value);
}
};
}
Converter<DBObject, Age> ageReadConverter() {
return new Converter<DBObject, Age>() {
@Override
public Age convert(DBObject dbObject) {
return new Age(((Integer) dbObject.get("v")));
}
};
}
@SuppressWarnings("unchecked")
static DBObject getPipelineElementFromAggregationAt(DBObject agg, int index) {
return ((List<DBObject>) agg.get("pipeline")).get(index);
}
@SuppressWarnings("unchecked")
static <T> T getValue(DBObject o, String key) {
return (T) o.get(key);
}
private TypeBasedAggregationOperationContext getContext(Class<?> type) {
return new TypeBasedAggregationOperationContext(type, context, mapper);
}

View File

@@ -1398,6 +1398,7 @@ public class MappingMongoConverterUnitTests {
/**
* @see DATAMONGO-812
* @see DATAMONGO-893
*/
@Test
public void convertsListToBasicDBListAndRetainsTypeInformationForComplexObjects() {
@@ -1407,7 +1408,7 @@ public class MappingMongoConverterUnitTests {
address.street = "Foo";
Object result = converter.convertToMongoType(Collections.singletonList(address),
ClassTypeInformation.from(Address.class));
ClassTypeInformation.from(InterfaceType.class));
assertThat(result, is(instanceOf(BasicDBList.class)));
@@ -1833,7 +1834,11 @@ public class MappingMongoConverterUnitTests {
abstract void method();
}
static class Address {
static interface InterfaceType {
}
static class Address implements InterfaceType {
String street;
String city;
}

View File

@@ -39,6 +39,7 @@ import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
@@ -568,10 +569,42 @@ public class QueryMapperUnitTests {
assertThat(mappedFields, is(notNullValue()));
}
/**
* @see DATAMONGO-893
*/
@Test
public void classInformationShouldNotBePresentInDBObjectUsedInFinderMethods() {
EmbeddedClass embedded = new EmbeddedClass();
embedded.id = "1";
EmbeddedClass embedded2 = new EmbeddedClass();
embedded2.id = "2";
Query query = query(where("embedded").in(Arrays.asList(embedded, embedded2)));
DBObject dbo = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Foo.class));
assertThat(dbo.toString(), equalTo("{ \"embedded\" : { \"$in\" : [ { \"_id\" : \"1\"} , { \"_id\" : \"2\"}]}}"));
}
@Document
public class Foo {
@Id private ObjectId id;
EmbeddedClass embedded;
}
public class EmbeddedClass {
public String id;
}
class IdWrapper {
Object id;
}
class ClassWithEmbedded {
@Id String id;
Sample sample;
}
class ClassWithDefaultId {
String id;

View File

@@ -26,6 +26,7 @@ import java.util.List;
import org.hamcrest.Matcher;
import org.hamcrest.collection.IsIterableContainingInOrder;
import org.hamcrest.core.IsEqual;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -37,12 +38,16 @@ import org.springframework.data.annotation.Id;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -452,6 +457,115 @@ public class UpdateMapperUnitTests {
}
/**
* @see DATAMONGO-897
*/
@Test
public void updateOnDbrefPropertyOfInterfaceTypeWithoutExplicitGetterForIdShouldBeMappedCorrectly() {
Update update = new Update().set("referencedDocument", new InterfaceDocumentDefinitionImpl("1", "Foo"));
DBObject mappedObject = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithReferenceToInterfaceImpl.class));
DBObject $set = DBObjectTestUtils.getAsDBObject(mappedObject, "$set");
Object model = $set.get("referencedDocument");
DBRef expectedDBRef = new DBRef(factory.getDb(), "interfaceDocumentDefinitionImpl", "1");
assertThat(model, allOf(instanceOf(DBRef.class), IsEqual.<Object> equalTo(expectedDBRef)));
}
/**
* @see DATAMONGO-847
*/
@Test
public void updateMapperConvertsNestedQueryCorrectly() {
Update update = new Update().pull("list", Query.query(Criteria.where("value").in("foo", "bar")));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ParentClass.class));
DBObject $pull = DBObjectTestUtils.getAsDBObject(mappedUpdate, "$pull");
DBObject list = DBObjectTestUtils.getAsDBObject($pull, "aliased");
DBObject value = DBObjectTestUtils.getAsDBObject(list, "value");
BasicDBList $in = DBObjectTestUtils.getAsDBList(value, "$in");
assertThat($in, IsIterableContainingInOrder.<Object> contains("foo", "bar"));
}
/**
* @see DATAMONGO-847
*/
@Test
public void updateMapperConvertsPullWithNestedQueryOnDBRefCorrectly() {
Update update = new Update().pull("dbRefAnnotatedList", Query.query(Criteria.where("id").is("1")));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithDBRefCollection.class));
DBObject $pull = DBObjectTestUtils.getAsDBObject(mappedUpdate, "$pull");
DBObject list = DBObjectTestUtils.getAsDBObject($pull, "dbRefAnnotatedList");
assertThat(list, equalTo(new BasicDBObjectBuilder().add("_id", "1").get()));
}
@org.springframework.data.mongodb.core.mapping.Document(collection = "DocumentWithReferenceToInterface")
static interface DocumentWithReferenceToInterface {
String getId();
InterfaceDocumentDefinitionWithoutId getReferencedDocument();
}
static interface InterfaceDocumentDefinitionWithoutId {
String getValue();
}
static class InterfaceDocumentDefinitionImpl implements InterfaceDocumentDefinitionWithoutId {
@Id String id;
String value;
public InterfaceDocumentDefinitionImpl(String id, String value) {
this.id = id;
this.value = value;
}
@Override
public String getValue() {
return this.value;
}
}
static class DocumentWithReferenceToInterfaceImpl implements DocumentWithReferenceToInterface {
private @Id String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
private InterfaceDocumentDefinitionWithoutId referencedDocument;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public void setModel(InterfaceDocumentDefinitionWithoutId referencedDocument) {
this.referencedDocument = referencedDocument;
}
@Override
public InterfaceDocumentDefinitionWithoutId getReferencedDocument() {
return this.referencedDocument;
}
}
static interface Model {}
static class ModelImpl implements Model {

View File

@@ -15,9 +15,10 @@
*/
package org.springframework.data.mongodb.core.geo;
import static org.hamcrest.CoreMatchers.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.List;
import java.util.Map;
import org.junit.Before;
@@ -27,10 +28,13 @@ import org.springframework.dao.DataAccessException;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.config.AbstractIntegrationTests;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.IndexOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.Document;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@@ -43,6 +47,7 @@ import com.mongodb.WriteConcern;
* @author Laurent Canet
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class GeoSpatialIndexTests extends AbstractIntegrationTests {
@@ -97,6 +102,29 @@ public class GeoSpatialIndexTests extends AbstractIntegrationTests {
}
}
/**
* @see DATAMONGO-827
*/
@Test
public void useGeneratedNameShouldGenerateAnIndexName() {
try {
GeoSpatialEntity2dWithGeneratedIndex geo = new GeoSpatialEntity2dWithGeneratedIndex(45.2, 4.6);
template.save(geo);
IndexOperations indexOps = template.indexOps(GeoSpatialEntity2dWithGeneratedIndex.class);
List<IndexInfo> indexInfo = indexOps.getIndexInfo();
assertThat(indexInfo, hasSize(2));
assertThat(indexInfo.get(1), is(notNullValue()));
assertThat(indexInfo.get(1).getName(), is("location_2d"));
} finally {
template.dropCollection(GeoSpatialEntity2dWithGeneratedIndex.class);
}
}
/**
* Returns whether an index with the given name exists for the given entity type.
*
@@ -129,6 +157,7 @@ public class GeoSpatialIndexTests extends AbstractIntegrationTests {
});
}
@Document
static class GeoSpatialEntity2D {
public String id;
@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2D) public org.springframework.data.geo.Point location;
@@ -138,6 +167,7 @@ public class GeoSpatialIndexTests extends AbstractIntegrationTests {
}
}
@Document
static class GeoSpatialEntityHaystack {
public String id;
public String name;
@@ -154,6 +184,7 @@ public class GeoSpatialIndexTests extends AbstractIntegrationTests {
double coordinates[];
}
@Document
static class GeoSpatialEntity2DSphere {
public String id;
@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE) public GeoJsonPoint location;
@@ -163,4 +194,15 @@ public class GeoSpatialIndexTests extends AbstractIntegrationTests {
this.location.coordinates = new double[] { x, y };
}
}
@Document
static class GeoSpatialEntity2dWithGeneratedIndex {
public String id;
@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2D, useGeneratedName = true) public Point location;
public GeoSpatialEntity2dWithGeneratedIndex(double x, double y) {
this.location = new Point(x, y);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright (c) 2011-2014 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb.core.index;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Test;
@@ -25,7 +25,7 @@ import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@@ -38,13 +38,13 @@ import com.mongodb.MongoException;
* Integration tests for index handling.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class IndexingIntegrationTests {
@Autowired
MongoOperations operations;
@Autowired MongoOperations operations;
@After
public void tearDown() {
@@ -61,11 +61,10 @@ public class IndexingIntegrationTests {
assertThat(hasIndex("_firstname", IndexedPerson.class), is(true));
}
@Document
class IndexedPerson {
@Field("_firstname")
@Indexed
String firstname;
@Field("_firstname") @Indexed String firstname;
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,18 +21,27 @@ import static org.junit.Assert.*;
import java.util.Collections;
import java.util.Date;
import org.hamcrest.core.IsEqual;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.data.geo.Point;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
@@ -41,24 +50,46 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @author Philipp Schneider
* @author Johno Crawford
* @author Christoph Strobl
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoPersistentEntityIndexCreatorUnitTests {
@Mock MongoDbFactory factory;
@Mock ApplicationContext context;
private @Mock MongoDbFactory factory;
private @Mock ApplicationContext context;
private @Mock DB db;
private @Mock DBCollection collection;
ArgumentCaptor<DBObject> keysCaptor;
ArgumentCaptor<DBObject> optionsCaptor;
ArgumentCaptor<String> collectionCaptor;
@Before
public void setUp() {
keysCaptor = ArgumentCaptor.forClass(DBObject.class);
optionsCaptor = ArgumentCaptor.forClass(DBObject.class);
collectionCaptor = ArgumentCaptor.forClass(String.class);
Mockito.when(factory.getDb()).thenReturn(db);
Mockito.when(db.getCollection(collectionCaptor.capture())).thenReturn(collection);
Mockito.doNothing().when(collection).createIndex(keysCaptor.capture(), optionsCaptor.capture());
}
@Test
public void buildsIndexDefinitionUsingFieldName() {
MongoMappingContext mappingContext = prepareMappingContext(Person.class);
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("fieldname"));
assertThat(creator.name, is("indexName"));
assertThat(creator.background, is(false));
assertThat(creator.expireAfterSeconds, is(-1));
new MongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(keysCaptor.getValue(), is(notNullValue()));
assertThat(keysCaptor.getValue().keySet(), hasItem("fieldname"));
assertThat(optionsCaptor.getValue().get("name").toString(), is("indexName"));
assertThat(optionsCaptor.getValue().get("background"), nullValue());
assertThat(optionsCaptor.getValue().get("expireAfterSeconds"), nullValue());
}
@Test
@@ -67,7 +98,7 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
MongoMappingContext mappingContext = new MongoMappingContext();
MongoMappingContext personMappingContext = prepareMappingContext(Person.class);
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
MongoPersistentEntityIndexCreator creator = new MongoPersistentEntityIndexCreator(mappingContext, factory);
MongoPersistentEntity<?> entity = personMappingContext.getPersistentEntity(Person.class);
MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event = new MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>(
@@ -75,7 +106,7 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
creator.onApplicationEvent(event);
assertThat(creator.indexDefinition, is(nullValue()));
Mockito.verifyZeroInteractions(collection);
}
/**
@@ -87,7 +118,7 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.initialize();
MongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
MongoPersistentEntityIndexCreator creator = new MongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.isIndexCreatorFor(mappingContext), is(true));
assertThat(creator.isIndexCreatorFor(new MongoMappingContext()), is(false));
}
@@ -99,12 +130,13 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
public void triggersBackgroundIndexingIfConfigured() {
MongoMappingContext mappingContext = prepareMappingContext(AnotherPerson.class);
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
new MongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("lastname"));
assertThat(creator.name, is("lastname"));
assertThat(creator.background, is(true));
assertThat(keysCaptor.getValue(), is(notNullValue()));
assertThat(keysCaptor.getValue().keySet(), hasItem("lastname"));
assertThat(optionsCaptor.getValue().get("name").toString(), is("lastname"));
assertThat(optionsCaptor.getValue().get("background"), IsEqual.<Object> equalTo(true));
assertThat(optionsCaptor.getValue().get("expireAfterSeconds"), nullValue());
}
/**
@@ -114,11 +146,39 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
public void expireAfterSecondsIfConfigured() {
MongoMappingContext mappingContext = prepareMappingContext(Milk.class);
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
new MongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("expiry"));
assertThat(creator.expireAfterSeconds, is(60));
assertThat(keysCaptor.getValue(), is(notNullValue()));
assertThat(keysCaptor.getValue().keySet(), hasItem("expiry"));
assertThat(optionsCaptor.getValue().get("expireAfterSeconds"), IsEqual.<Object> equalTo(60L));
}
/**
* @see DATAMONGO-899
*/
@Test
public void createsNestedGeoSpatialIndexCorrectly() {
MongoMappingContext mappingContext = prepareMappingContext(Wrapper.class);
new MongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(keysCaptor.getValue(), equalTo(new BasicDBObjectBuilder().add("company.address.location", "2d").get()));
assertThat(optionsCaptor.getValue(), equalTo(new BasicDBObjectBuilder().add("name", "location").add("min", -180)
.add("max", 180).add("bits", 26).get()));
}
/**
* @see DATAMONGO-827
*/
@Test
public void autoGeneratedIndexNameShouldGenerateNoName() {
MongoMappingContext mappingContext = prepareMappingContext(EntityWithGeneratedIndexName.class);
new MongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(keysCaptor.getValue().containsField("name"), is(false));
assertThat(keysCaptor.getValue().keySet(), hasItem("lastname"));
assertThat(optionsCaptor.getValue(), is(new BasicDBObjectBuilder().get()));
}
private static MongoMappingContext prepareMappingContext(Class<?> type) {
@@ -130,41 +190,52 @@ public class MongoPersistentEntityIndexCreatorUnitTests {
return mappingContext;
}
@Document
static class Person {
@Indexed(name = "indexName") @Field("fieldname") String field;
@Indexed(name = "indexName")//
@Field("fieldname")//
String field;
}
@Document
static class AnotherPerson {
@Indexed(background = true) String lastname;
}
@Document
static class Milk {
@Indexed(expireAfterSeconds = 60) Date expiry;
}
static class DummyMongoPersistentEntityIndexCreator extends MongoPersistentEntityIndexCreator {
@Document
static class Wrapper {
String id;
Company company;
}
static class Company {
DBObject indexDefinition;
String name;
boolean background;
int expireAfterSeconds;
Address address;
}
public DummyMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
super(mappingContext, mongoDbFactory);
}
static class Address {
@Override
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse, boolean background, int expireAfterSeconds) {
String street;
String city;
this.name = name;
this.indexDefinition = indexDefinition;
this.background = background;
this.expireAfterSeconds = expireAfterSeconds;
}
@GeoSpatialIndexed Point location;
}
@Document
class EntityWithGeneratedIndexName {
@Indexed(useGeneratedName = true, name = "ignored") String lastname;
}
}

View File

@@ -0,0 +1,432 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.collection.IsCollectionWithSize.*;
import static org.hamcrest.collection.IsEmptyCollection.*;
import static org.hamcrest.core.IsEqual.*;
import static org.hamcrest.core.IsInstanceOf.*;
import static org.junit.Assert.*;
import java.util.Collections;
import java.util.List;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import org.junit.runners.Suite.SuiteClasses;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolverUnitTests.CompoundIndexResolutionTests;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolverUnitTests.GeoSpatialIndexResolutionTests;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolverUnitTests.IndexResolutionTests;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolverUnitTests.MixedIndexResolutionTests;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.BasicDBObjectBuilder;
/**
* @author Christoph Strobl
*/
@RunWith(Suite.class)
@SuiteClasses({ IndexResolutionTests.class, GeoSpatialIndexResolutionTests.class, CompoundIndexResolutionTests.class,
MixedIndexResolutionTests.class })
public class MongoPersistentEntityIndexResolverUnitTests {
/**
* Test resolution of {@link Indexed}.
*
* @author Christoph Strobl
*/
public static class IndexResolutionTests {
/**
* @see DATAMONGO-899
*/
@Test
public void indexPathOnLevelZeroIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(IndexOnLevelZero.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("indexedProperty", "Zero", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void indexPathOnLevelOneIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(IndexOnLevelOne.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("zero.indexedProperty", "One", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void deeplyNestedIndexPathIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(IndexOnLevelTwo.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("one.zero.indexedProperty", "Two", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void resolvesIndexPathNameForNamedPropertiesCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(IndexOnLevelOneWithExplicitlyNamedField.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("customZero.customFieldName", "indexOnLevelOneWithExplicitlyNamedField",
indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void resolvesIndexDefinitionCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(IndexOnLevelZero.class);
IndexDefinition indexDefinition = indexDefinitions.get(0).getIndexDefinition();
assertThat(indexDefinition.getIndexOptions(), equalTo(new BasicDBObjectBuilder().add("name", "indexedProperty")
.get()));
}
/**
* @see DATAMONGO-899
*/
@Test
public void resolvesIndexDefinitionOptionsCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(WithOptionsOnIndexedProperty.class);
IndexDefinition indexDefinition = indexDefinitions.get(0).getIndexDefinition();
assertThat(indexDefinition.getIndexOptions(),
equalTo(new BasicDBObjectBuilder().add("name", "indexedProperty").add("unique", true).add("dropDups", true)
.add("sparse", true).add("background", true).add("expireAfterSeconds", 10L).get()));
}
/**
* @see DATAMONGO-899
*/
@Test
public void resolvesIndexCollectionNameCorrectlyWhenDefinedInAnnotation() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(WithOptionsOnIndexedProperty.class);
assertThat(indexDefinitions.get(0).getCollection(), equalTo("CollectionOverride"));
}
@Document(collection = "Zero")
static class IndexOnLevelZero {
@Indexed String indexedProperty;
}
@Document(collection = "One")
static class IndexOnLevelOne {
IndexOnLevelZero zero;
}
@Document(collection = "Two")
static class IndexOnLevelTwo {
IndexOnLevelOne one;
}
@Document(collection = "WithOptionsOnIndexedProperty")
static class WithOptionsOnIndexedProperty {
@Indexed(background = true, collection = "CollectionOverride", direction = IndexDirection.DESCENDING,
dropDups = true, expireAfterSeconds = 10, sparse = true, unique = true)//
String indexedProperty;
}
@Document
static class IndexOnLevelOneWithExplicitlyNamedField {
@Field("customZero") IndexOnLevelZeroWithExplicityNamedField zero;
}
static class IndexOnLevelZeroWithExplicityNamedField {
@Indexed @Field("customFieldName") String namedProperty;
}
}
/**
* Test resolution of {@link GeoSpatialIndexed}.
*
* @author Christoph Strobl
*/
public static class GeoSpatialIndexResolutionTests {
/**
* @see DATAMONGO-899
*/
@Test
public void geoSpatialIndexPathOnLevelZeroIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(GeoSpatialIndexOnLevelZero.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("geoIndexedProperty", "Zero", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void geoSpatialIndexPathOnLevelOneIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(GeoSpatialIndexOnLevelOne.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("zero.geoIndexedProperty", "One", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void deeplyNestedGeoSpatialIndexPathIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(GeoSpatialIndexOnLevelTwo.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("one.zero.geoIndexedProperty", "Two", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void resolvesIndexDefinitionOptionsCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(WithOptionsOnGeoSpatialIndexProperty.class);
IndexDefinition indexDefinition = indexDefinitions.get(0).getIndexDefinition();
assertThat(
indexDefinition.getIndexOptions(),
equalTo(new BasicDBObjectBuilder().add("name", "location").add("min", 1).add("max", 100).add("bits", 2).get()));
}
@Document(collection = "Zero")
static class GeoSpatialIndexOnLevelZero {
@GeoSpatialIndexed Point geoIndexedProperty;
}
@Document(collection = "One")
static class GeoSpatialIndexOnLevelOne {
GeoSpatialIndexOnLevelZero zero;
}
@Document(collection = "Two")
static class GeoSpatialIndexOnLevelTwo {
GeoSpatialIndexOnLevelOne one;
}
@Document(collection = "WithOptionsOnGeoSpatialIndexProperty")
static class WithOptionsOnGeoSpatialIndexProperty {
@GeoSpatialIndexed(collection = "CollectionOverride", bits = 2, max = 100, min = 1,
type = GeoSpatialIndexType.GEO_2D)//
Point location;
}
}
/**
* Test resolution of {@link CompoundIndexes}.
*
* @author Christoph Strobl
*/
public static class CompoundIndexResolutionTests {
/**
* @see DATAMONGO-899
*/
@Test
public void compoundIndexPathOnLevelZeroIsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(CompoundIndexOnLevelZero.class);
assertThat(indexDefinitions, hasSize(1));
assertIndexPathAndCollection("compound_index", "CompoundIndexOnLevelZero", indexDefinitions.get(0));
}
/**
* @see DATAMONGO-899
*/
@Test
public void compoundIndexOptionsResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(CompoundIndexOnLevelZero.class);
IndexDefinition indexDefinition = indexDefinitions.get(0).getIndexDefinition();
assertThat(indexDefinition.getIndexOptions(),
equalTo(new BasicDBObjectBuilder().add("name", "compound_index").add("unique", true).add("dropDups", true)
.add("sparse", true).add("background", true).add("expireAfterSeconds", 10L).get()));
assertThat(indexDefinition.getIndexKeys(), equalTo(new BasicDBObjectBuilder().add("foo", 1).add("bar", -1).get()));
}
/**
* @see DATAMONGO-909
*/
@Test
public void compoundIndexOnSuperClassResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(IndexDefinedOnSuperClass.class);
IndexDefinition indexDefinition = indexDefinitions.get(0).getIndexDefinition();
assertThat(indexDefinition.getIndexOptions(),
equalTo(new BasicDBObjectBuilder().add("name", "compound_index").add("unique", true).add("dropDups", true)
.add("sparse", true).add("background", true).add("expireAfterSeconds", 10L).get()));
assertThat(indexDefinition.getIndexKeys(), equalTo(new BasicDBObjectBuilder().add("foo", 1).add("bar", -1).get()));
}
/**
* @see DATAMONGO-827
*/
@Test
public void compoundIndexDoesNotSpecifyNameWhenUsingGeneratedName() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(CompoundIndexWithAutogeneratedName.class);
IndexDefinition indexDefinition = indexDefinitions.get(0).getIndexDefinition();
assertThat(
indexDefinition.getIndexOptions(),
equalTo(new BasicDBObjectBuilder().add("unique", true).add("dropDups", true).add("sparse", true)
.add("background", true).add("expireAfterSeconds", 10L).get()));
assertThat(indexDefinition.getIndexKeys(), equalTo(new BasicDBObjectBuilder().add("foo", 1).add("bar", -1).get()));
}
@Document(collection = "CompoundIndexOnLevelZero")
@CompoundIndexes({ @CompoundIndex(name = "compound_index", def = "{'foo': 1, 'bar': -1}", background = true,
dropDups = true, expireAfterSeconds = 10, sparse = true, unique = true) })
static class CompoundIndexOnLevelZero {}
static class IndexDefinedOnSuperClass extends CompoundIndexOnLevelZero {
}
@Document(collection = "ComountIndexWithAutogeneratedName")
@CompoundIndexes({ @CompoundIndex(useGeneratedName = true, def = "{'foo': 1, 'bar': -1}", background = true,
dropDups = true, expireAfterSeconds = 10, sparse = true, unique = true) })
static class CompoundIndexWithAutogeneratedName {
}
}
public static class MixedIndexResolutionTests {
/**
* @see DATAMONGO-899
*/
@Test
public void multipleIndexesResolvedCorrectly() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(MixedIndexRoot.class);
assertThat(indexDefinitions, hasSize(2));
assertThat(indexDefinitions.get(0).getIndexDefinition(), instanceOf(Index.class));
assertThat(indexDefinitions.get(1).getIndexDefinition(), instanceOf(GeospatialIndex.class));
}
/**
* @see DATAMONGO-899
*/
@Test
public void cyclicPropertyReferenceOverDBRefShouldNotBeTraversed() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(Inner.class);
assertThat(indexDefinitions, hasSize(1));
assertThat(indexDefinitions.get(0).getIndexDefinition().getIndexKeys(),
equalTo(new BasicDBObjectBuilder().add("outer", 1).get()));
}
/**
* @see DATAMONGO-899
*/
@Test
public void associationsShouldNotBeTraversed() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(Outer.class);
assertThat(indexDefinitions, empty());
}
@Document
static class MixedIndexRoot {
@Indexed String first;
NestedGeoIndex nestedGeo;
}
static class NestedGeoIndex {
@GeoSpatialIndexed Point location;
}
@Document
static class Outer {
@DBRef Inner inner;
}
@Document
static class Inner {
@Indexed Outer outer;
}
}
private static List<IndexDefinitionHolder> prepareMappingContextAndResolveIndexForType(Class<?> type) {
MongoMappingContext mappingContext = prepareMappingContext(type);
MongoPersistentEntityIndexResolver resolver = new MongoPersistentEntityIndexResolver(mappingContext);
return resolver.resolveIndexForEntity(mappingContext.getPersistentEntity(type));
}
private static MongoMappingContext prepareMappingContext(Class<?> type) {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(type));
mappingContext.initialize();
return mappingContext;
}
private static void assertIndexPathAndCollection(String expectedPath, String expectedCollection,
IndexDefinitionHolder holder) {
assertThat(holder.getPath(), equalTo(expectedPath));
assertThat(holder.getCollection(), equalTo(expectedCollection));
}
}

View File

@@ -0,0 +1,57 @@
/*
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
/**
* Unit tests for {@link SnakeCaseFieldNamingStrategy}.
*
* @author Ryan Tenney
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class SnakeCaseFieldNamingStrategyUnitTests {
FieldNamingStrategy strategy = new SnakeCaseFieldNamingStrategy();
@Mock MongoPersistentProperty property;
/**
* @see DATAMONGO-866
*/
@Test
public void rendersSnakeCaseFieldNames() {
assertFieldNameForPropertyName("fooBar", "foo_bar");
assertFieldNameForPropertyName("FooBar", "foo_bar");
assertFieldNameForPropertyName("foo_bar", "foo_bar");
assertFieldNameForPropertyName("FOO_BAR", "foo_bar");
}
private void assertFieldNameForPropertyName(String propertyName, String fieldName) {
when(property.getName()).thenReturn(propertyName);
assertThat(strategy.getFieldName(property), is(fieldName));
}
}

View File

@@ -863,4 +863,21 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
public void deleteByUsingAnnotatedQueryShouldReturnNumberOfDocumentsRemovedIfReturnTypeIsLong() {
assertThat(repository.removePersonByLastnameUsingAnnotatedQuery("Beauford"), is(1L));
}
/**
* @see DATAMONGO-893
*/
@Test
public void findByNestedPropertyInCollectionShouldFindMatchingDocuments() {
Person p = new Person("Mary", "Poppins");
Address adr = new Address("some", "2", "where");
p.setAddress(adr);
repository.save(p);
Page<Person> result = repository.findByAddressIn(Arrays.asList(adr), new PageRequest(0, 10));
assertThat(result.getContent(), hasSize(1));
}
}

View File

@@ -286,4 +286,8 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
@Query(value = "{ 'lastname' : ?0 }", delete = true)
Long removePersonByLastnameUsingAnnotatedQuery(String lastname);
/**
* @see DATAMONGO-893
*/
Page<Person> findByAddressIn(List<Address> address, Pageable page);
}

View File

@@ -15,9 +15,12 @@
*/
package org.springframework.data.mongodb.repository.config;
import static org.junit.Assert.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoOperations;
@@ -49,9 +52,25 @@ public class MongoRepositoriesRegistrarIntegrationTests {
}
@Autowired PersonRepository personRepository;
@Autowired ApplicationContext context;
@Test
public void testConfiguration() {
}
/**
* @see DATAMONGO-901
*/
@Test
public void registersTypePredictingPostProcessor() {
for (String name : context.getBeanDefinitionNames()) {
if (name.startsWith("org.springframework.data.repository.core.support.RepositoryInterfaceAwareBeanPostProcessor")) {
return;
}
}
fail("Expected to find a bean with name starting with RepositoryInterfaceAwareBeanPostProcessor");
}
}

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:mapping-converter abbreviate-field-names="true" field-naming-strategy-ref="foo" />
</beans>

View File

@@ -0,0 +1,16 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="factory" />
<bean class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<mongo:mapping-converter />
</constructor-arg>
</bean>
</beans>

View File

@@ -2,13 +2,13 @@ Bundle-SymbolicName: org.springframework.data.mongodb
Bundle-Name: Spring Data MongoDB Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.*;version="${project.version}"
Import-Template:
Import-Template:
com.google.common.base.*;version="[11.0.0,14.0.0)";resolution:=optional,
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo:[=.=.=,+1.0.0)}",
com.mysema.query.*;version="[2.1.1, 3.0.0)";resolution:=optional,
javax.annotation.processing.*;version="0",
javax.enterprise.*;version="${cdi:[=.=.=,+1.0.0)}";resolution:=optional,

View File

@@ -68,7 +68,7 @@
<xi:include href="introduction/introduction.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.8.0.M1/src/docbkx/repositories.xml">
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.8.0.RC1/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -88,10 +88,10 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.8.0.M1/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.8.0.RC1/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.8.0.M1/src/docbkx/repository-query-keywords-reference.xml">
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.8.0.RC1/src/docbkx/repository-query-keywords-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-query-keywords-reference.xml" />
</xi:include>
</part>

View File

@@ -312,6 +312,11 @@ public class Person {
document, making searches faster.</para>
</important>
<important>
<para>Automatic index creation is only done for types annotated with
<classname>@Document</classname>.</para>
</important>
<section id="mapping-usage-annotations">
<title>Mapping annotation overview</title>

View File

@@ -1,6 +1,58 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.5.0.RC1 (2014-05-02)
-----------------------------------------
* DATAMONGO-924 - Aggregation not working with as() method in project() pipeline operator.
* DATAMONGO-921 - Upgrade to MongoDB Java driver 2.12.1.
* DATAMONGO-920 - Fix debug messages for delete events in AbstractMongoEventListener.
* DATAMONGO-919 - Release 1.5 RC1.
* DATAMONGO-917 - DefaultDbRefResolver throws NPE when bundled into an uberjar.
* DATAMONGO-914 - Improve resolving of LazyLoading proxies for classes that override equals/hashcode.
* DATAMONGO-913 - Can't query using lazy DBRef objects.
* DATAMONGO-912 - Aggregation#project followed by Aggregation#match with custom converter causes IllegalArgumentException.
* DATAMONGO-910 - Upgrade to latest MongoDB Java driver (2.12).
* DATAMONGO-909 - @CompoundIndex on inherited entity classes.
* DATAMONGO-908 - Nested field references in group operations broken.
* DATAMONGO-907 - Assert compatibility with mongodb 2.6.0.
* DATAMONGO-905 - Remove obsolete CGLib dependency from cross store module.
* DATAMONGO-901 - MongoRepositoryConfigurationExtension fails to invoke super method.
* DATAMONGO-899 - Overhaul automatic index creation.
* DATAMONGO-898 - MapReduce seems not to work when JavaScript is not escaped.
* DATAMONGO-897 - FindAndUpdate broken when using @DbRef and interface as target.
* DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
* DATAMONGO-895 - Use most specific type for checks against values in DBObjects.
* DATAMONGO-893 - Mapping Converter does not remove "_class" property on collection of embedded objects.
* DATAMONGO-892 - <mongo:mapping-converter> can't be configured as nested bean definition.
* DATAMONGO-888 - Mapping is not applied to SortObject during queries.
* DATAMONGO-866 - Add new field naming strategy and make it configurable through XML/Java config.
* DATAMONGO-847 - Allow usage of Criteria within Update.
* DATAMONGO-827 - @Indexed and @CompoundIndex cannot be created without giving an index name.
Changes in version 1.4.2.RELEASE (2014-04-15)
---------------------------------------------
** Fix
* [DATAMONGO-880] - Improved handling of persistence of lazy-loaded DBRefs.
* [DATAMONGO-884] - Improved handling for Object methods in LazyLoadingInterceptor.
* [DATAMONGO-887] - Added unit tests to verify TreeMaps can be converted.
* [DATAMONGO-888] - Sorting now considers mapping information.
* [DATAMONGO-890] - Fixed Point.toString().
* [DATAMONGO-892] - Reject nested MappingMongoConverter declarations in XML.
* [DATAMONGO-893] - Converter must not write "_class" information for known types.
* [DATAMONGO-897] - Fixed potential NullPointerException in QueryMapper.
* [DATAMONGO-908] - Support for nested field references in group operations.
** Improvement
* [DATAMONGO-881] - Allow custom conversions to override default conversions.
** Task
* [DATAMONGO-895] - Use most specific type for checks against values in DBObjects.
* [DATAMONGO-896] - Assert compatibility with latest MongoDB Java driver.
* [DATAMONGO-905] - Removed obsolete dependency to CGLib from cross-store support.
* [DATAMONGO-907] - Assert compatibility with mongodb 2.6.
* [DATAMONGO-911] - Release 1.4.2
Changes in version 1.5.0.M1 (2014-03-31)
----------------------------------------
** Fix
@@ -30,7 +82,7 @@ Changes in version 1.5.0.M1 (2014-03-31)
** New Feature
* [DATAMONGO-566] - Provide support for removeBy… / deleteBy… methods like for findBy… on repository interfaces.
* [DATAMONGO-870] - Add support for sliced query method execution.
** Task
* [DATAMONGO-876] - Adapt to changes introduced for property access configuration.
* [DATAMONGO-883] - Update auditing configuration to enable auditing annotations on accessors.
@@ -58,7 +110,7 @@ Changes in version 1.3.5.RELEASE (2014-03-10)
** Fix
* [DATAMONGO-829] - NearQuery, when used in conjunction with a Query, no longer sets num=0, unless Query specifies otherwise.
* [DATAMONGO-871] - Repository queries support array return type.
** Improvement
* [DATAMONGO-865] - Avoid ClassNotFoundException during test runs.
@@ -84,7 +136,7 @@ Changes in version 1.4.0.RELEASE (2014-02-24)
* [DATAMONGO-848] - Ensure compatibility with Mongo Java driver 2.12.
* [DATAMONGO-853] - Update no longer allows null keys.
* [DATAMONGO-856] - Update documentation.
Changes in version 1.3.4.RELEASE (2014-02-17)
---------------------------------------------
** Bug
@@ -134,7 +186,7 @@ Changes in version 1.4.0.RC1 (2014-01-29)
* [DATAMONGO-824] - Add contribution guidelines
* [DATAMONGO-826] - Release Spring Data MongoDB 1.4.0.RC1
* [DATAMONGO-835] - Code cleanups
Changes in version 1.3.3.RELEASE (2013-12-11)
---------------------------------------------
** Bug
@@ -352,7 +404,7 @@ Changes in version 1.3.0.M1 (2013-06-04)
* [DATAMONGO-672] - Upgrade to latest Spring Data Build and Commons
* [DATAMONGO-678] - Performance improvements in CustomConversions
* [DATAMONGO-690] - Release 1.3 M1
Changes in version 1.2.1.GA (2013-04-17)
----------------------------------------
** Bug
@@ -378,7 +430,7 @@ Changes in version 1.2.1.GA (2013-04-17)
* [DATAMONGO-637] - Typo in Query.query(…)
* [DATAMONGO-651] - WriteResult not available from thrown Exception
* [DATAMONGO-656] - Potential NullPointerException when debugging in MongoTemplate
** Task
* [DATAMONGO-597] - Website is severely out-of-date
* [DATAMONGO-654] - Release 1.2.1
@@ -663,7 +715,7 @@ Changes in version 1.1.0.M1 (2012-05-07)
* [DATAMONGO-396] - Release 1.1.0.M1.
* [DATAMONGO-432] - Upgrade to Spring Data Commons 1.3.0.RC1
* [DATAMONGO-439] - Add performance tests
Changes in version 1.0.1.RELEASE MongoDB (2012-02-11)
-----------------------------------------------------

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 1.4.0.RELEASE
Spring Data MongoDB 1.5 RC1
Copyright (c) [2010-2014] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
@@ -7,4 +7,4 @@ You may not use this product except in compliance with the License.
This product may include a number of subcomponents with
separate copyright notices and license terms. Your use of the source
code for these subcomponents is subject to the terms and
conditions of the subcomponent's license, as noted in the LICENSE file.
conditions of the subcomponent's license, as noted in the LICENSE file.