Compare commits

...

32 Commits

Author SHA1 Message Date
Spring Buildmaster
b1068687bb DATAMONGO-911 - Release version 1.4.2.RELEASE (Codd SR2). 2014-04-15 10:08:32 -07:00
Christoph Strobl
6eae6d3e2c DATAMONGO-911 - Prepare 1.4.2.RELEASE (Codd SR2).
Updated to Spring Data Build 1.3.2.RELEASE and Spring Data Commons 1.7.2.RELEASE. Update readme, changelog, notice to reflect recent version. Update pom.xml, index.xml to recent version.
2014-04-15 18:29:38 +02:00
Christoph Strobl
abfb98afe1 DATAMONGO-893 - Converter must not write "_class" information for known types.
We now actively pass on property type information to MetadataBackedField to ensure type hints get picked up correctly when converting a value to the corresponding DBObject.

This has to be done as the fix for DATAMONGO-812 enforced proper writing of _class information for Updates, which caused trouble when querying documents by nested (complex) properties using an 'in' clause.

Original pull request: #169.
2014-04-15 17:29:50 +02:00
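A minimal sketch of the scenario the fix addresses, assuming a hypothetical Person collection with a complex Address property; the 'in' clause on the nested complex property is now mapped with proper type hints so no spurious "_class" entry is written for the known type:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

// hypothetical nested complex type
class Address {
    String city;
    Address(String city) { this.city = city; }
}

class NestedInClauseSketch {
    void updateByNestedComplexProperty(MongoTemplate template) {
        // querying by the nested (complex) 'address' property using an 'in' clause;
        // property type information is now handed to MetadataBackedField
        template.updateFirst(
                query(where("address").in(new Address("Berlin"), new Address("Vienna"))),
                new Update().set("active", true),
                "person");
    }
}
```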
Christoph Strobl
f361368893 DATAMONGO-892 - Reject nested MappingMongoConverter declarations in XML.
Mapping information is potentially required by multiple instances and thus must not be registered as a nested bean. We now actively check for such an invalid scenario and explicitly reject it.

Original pull request: #165.
2014-04-15 09:06:54 +02:00
Christoph Strobl
063438002b DATAMONGO-897 - Fixed potential NullPointerException in QueryMapper.
If an association property points to an interface that does not contain the id property, QueryMapper threw a NullPointerException in isAssociationConversionNecessary(…), as the lookup of the id property failed.

We now check for the presence of an id property on the target type and check for assignability to indicate the need for conversion (usually the case when developers use raw ids in their update clauses rather than the actual target instance).

Original pull request: #164.
2014-04-15 09:01:22 +02:00
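A hedged illustration of the update style the guard caters for — using the raw id value for an association rather than the referenced instance (the order/customer names are hypothetical):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

class AssociationRawIdUpdateSketch {
    void relinkCustomer(MongoTemplate template, String orderId, String customerId) {
        // the association value is the raw id, not a Customer instance; QueryMapper now
        // checks assignability against the id property (if present) instead of failing
        // with a NullPointerException when the target type has no id property
        template.updateFirst(
                query(where("id").is(orderId)),
                new Update().set("customer", customerId),
                "order");
    }
}
```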
Thomas Darimont
9b54a5cd39 DATAMONGO-908 - Support for nested field references in group operations.
We now allow referring to nested field expressions if the root segment of the nested field expression was exposed in earlier stages of the aggregation pipeline.

Original pull request: #167.
2014-04-15 07:57:24 +02:00
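A short sketch, with hypothetical field and collection names, of a pipeline that now works: the group stage refers to a nested path whose root field was exposed by the preceding projection:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

import com.mongodb.DBObject;

class NestedGroupReferenceSketch {
    AggregationResults<DBObject> ordersPerCustomerName(MongoOperations operations) {
        Aggregation aggregation = newAggregation(
                project("customer"),           // exposes the root field "customer"
                group("customer.name")         // nested reference below an exposed root is now accepted
                        .count().as("orders"));
        return operations.aggregate(aggregation, "order", DBObject.class);
    }
}
```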
Jeff Yemin
14360f2ab4 DATAMONGO-895, DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
Upgraded the MongoDB driver version used in the mongo-next profile to 2.12.0. The hard driver upgrade will follow in a subsequent commit to make sure we can backport the compatibility checks to the bugfix branch without forcing users into a driver upgrade.

Relaxed the error message comparison in the assertion so that it still matches the message returned by MongoDB 2.6. When comparing the value of the version field, we now compare against a Long rather than an Integer, since the generated version field is a Long. This allows the test to pass against the upcoming 2.12.0 release of the Java driver, which has a stricter implementation of BasicDBObject.equals(…).

Original pull requests: #159, #160.
2014-04-10 15:57:54 +02:00
Christoph Strobl
81c368c851 DATAMONGO-888 - Sorting now considers mapping information.
We now pipe the DBObject containing sorting information for queries through the QueryMapper to make sure potential field mappings are applied.

Original Pull Request: #162.
2014-04-10 15:45:09 +02:00
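A minimal sketch, assuming a hypothetical Person document whose lastname property maps to a different field name; the sort document is now piped through the QueryMapper, so sorting by the property name resolves to the mapped field:

```java
import java.util.List;

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Query;

class Person {
    @Field("last_name") String lastname;
}

class MappedSortSketch {
    List<Person> findSorted(MongoTemplate template) {
        // the sort is declared against the property name but rendered against "last_name"
        Query query = new Query().with(new Sort(Sort.Direction.ASC, "lastname"));
        return template.find(query, Person.class);
    }
}
```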
Christoph Strobl
cf3818e04c DATAMONGO-907 - Assert compatibility with MongoDB 2.6.
Fixed the test to only check the parts of the expected error message that are common to both 2.4 and 2.6.

Original Pull Request: #166.
2014-04-10 13:34:00 +02:00
Oliver Gierke
da9870504f DATAMONGO-905 - Removed obsolete dependency to CGLib from cross-store support.
We also now optionally depend on the Hibernate JPA API JAR so that using other persistence providers doesn't cause API JAR duplication.
2014-04-09 20:45:56 +02:00
Thomas Darimont
1285f4f26e DATAMONGO-884 - Improved handling for Object methods in LazyLoadingInterceptor.
We now handle invocations of equals(…)/hashCode()/toString() methods that are not overridden with custom proxy-aware logic. This avoids potential NullPointerExceptions and makes it easier to debug code that deals with proxies (due to a proper toString representation of a proxy).

Original pull request: #158.
2014-03-31 15:19:40 +02:00
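A hedged sketch (the Book/Publisher domain types are hypothetical) of what now works on a lazily resolved @DBRef: the standard Object methods are answered by the proxy itself instead of triggering resolution or failing; the toString format shown in the comment follows the interceptor in this change set:

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Publisher {
    String id;
    String name;
}

@Document
class Book {
    String id;
    @DBRef(lazy = true) Publisher publisher;
}

class LazyProxyDebuggingSketch {
    void inspect(MongoOperations operations, String bookId) {
        Book book = operations.findById(bookId, Book.class);
        Publisher proxy = book.publisher;

        // none of these calls resolves the proxy or throws a NullPointerException anymore
        String description = proxy.toString();   // e.g. "publisher:<id>$LazyLoadingProxy"
        int hash = proxy.hashCode();
        boolean same = proxy.equals(proxy);
    }
}
```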
Thomas Darimont
791938f05d DATAMONGO-884 - Fix potential NullPointerException for lazy DBRefs.
We now initialize the proxy in case an Object method is called that is overridden in the target class. Removed the additional check for initialization and to-DBRef methods as they're repeated in the target method.

Original pull requests: #152, #153.
2014-03-27 17:59:44 +01:00
Oliver Gierke
1b2d98dd3d DATAMONGO-890 - Fixed Point.toString().
The toString() representation now lists x, y instead of the previously wrong latitude, longitude.
2014-03-27 16:27:05 +01:00
Oliver Gierke
de364c65ab DATAMONGO-887 - Added unit tests to verify TreeMaps can be converted. 2014-03-27 09:28:55 +01:00
Oliver Gierke
57a74b0427 DATAMONGO-880 - Minor polishing in lazy-loading area.
Took the chance to add @since tags to the types introduced for lazy loading. Polished JavaDoc where necessary. Removed methods solely existing for testing purposes and use reflection in tests to minimize the API being published.
2014-03-20 09:33:55 +01:00
Thomas Darimont
f35df8fe69 DATAMONGO-880 - Improved handling of persistence of lazy-loaded DBRefs.
Added a LazyLoadingProxy interface that is implemented by every lazy-loading proxy created by the DefaultDbRefResolver. Clients can now cast those proxies to this interface and call its methods to initialize a proxy explicitly or to get the referenced DBRef if possible.

We now keep a reference to the DBRef that led to the creation of a LazyLoadingProxy in order to be able to reuse it in case one assigns the proxy to a field that should be a DBRef. This avoids unnecessary conversion.

Previously, saving proxies wasn't possible since the mapping infrastructure did not know how to extract the entity information from the proxy. We now either store the DBRef backing the proxy directly or initialize the proxy first and use the result of LazyLoadingProxy.initialize().

Original pull request: #151.
2014-03-20 09:33:42 +01:00
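A sketch of the new client-side contract; the LazyLoadingProxy interface is the one added in this change set, the surrounding code is hypothetical:

```java
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;

import com.mongodb.DBRef;

class LazyLoadingProxyUsageSketch {
    void useProxy(Object publisher) {
        if (publisher instanceof LazyLoadingProxy) {
            LazyLoadingProxy proxy = (LazyLoadingProxy) publisher;

            DBRef dbref = proxy.toDBRef();        // the DBRef backing the proxy, may be null
            Object resolved = proxy.initialize(); // forces resolution and returns the wrapped value
        }
    }
}
```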
Oliver Gierke
2d3aac1826 DATAMONGO-881 - Allow custom conversions to override default conversions.
User provided converters are now registered *after* the default converters to make sure they enjoy precedence over the default ones.

This is achieved by inverting the order of converters after the conversions have been registered. This is necessary as the registration order for convertible pairs is different from the one of the converters. For the pairs, earlier registered instances take precedence, while for the actual converter instances, instances registered later trump ones registered before.
2014-03-18 09:34:03 +01:00
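A minimal sketch of a user-provided converter that now takes precedence over the matching default one; the converter itself is hypothetical, BigDecimalToStringConverter is one of the defaults registered by CustomConversions:

```java
import java.math.BigDecimal;
import java.util.Arrays;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

class CustomConversionsPrecedenceSketch {

    // hypothetical override of the default BigDecimal-to-String conversion
    static class PlainBigDecimalToStringConverter implements Converter<BigDecimal, String> {
        public String convert(BigDecimal source) {
            return source.toPlainString();
        }
    }

    CustomConversions conversions() {
        // user converters are registered after the defaults and therefore win
        return new CustomConversions(Arrays.asList(new PlainBigDecimalToStringConverter()));
    }
}
```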
Spring Buildmaster
15db4ba6ea DATAMONGO-860 - Prepare next development iteration. 2014-03-13 12:34:19 +01:00
Spring Buildmaster
f02ac5ea44 DATAMONGO-860 - Release version 1.4.1.RELEASE (Codd SR1). 2014-03-13 04:25:30 -07:00
Christoph Strobl
86633e01db DATAMONGO-860 - Prepare Release 1.4.1.
Update readme.md & mongodb.xml to reflect recent version. Update sd-commons/sb-build versions in pom.xml. Update pom.xml to use release repository.
Update docbkx to use recent sd-commons version. Update changelog to reflect changes and releases.

Original Pull Request: #148.
2014-03-13 12:02:19 +01:00
Oliver Gierke
5fe3763f9c DATAMONGO-877 - Added guard against null package in AbstractMongoConfiguration.
AbstractMongoConfiguration.getMappingBasePackage() now guards against a null package returned for the configuration class. This can happen if the class resides in the default package.
2014-03-10 13:13:49 +01:00
Thomas Darimont
d1e2b143f3 DATAMONGO-773 - Verify that @DBRef fields can be included in query.
Added test cases to verify that projection search with included @DBRef fields works as expected.
2014-03-06 13:43:01 +01:00
Christoph Strobl
61ab232bc1 DATAMONGO-868 - MongoTemplate.findAndModify(…) increases version if not handled manually.
MongoTemplate.findAndModify(…) increments the version property in case it's not manually set in the Update object given.

Original Pull Request: #141.
2014-03-06 11:52:17 +01:00
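A hedged sketch (Account is a hypothetical versioned entity) of the behaviour: the @Version property is incremented by findAndModify(…) as long as the given Update does not modify it itself:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

class Account {
    @Id String id;
    @Version Long version;
    double balance;
}

class FindAndModifyVersionSketch {
    Account deposit(MongoTemplate template, String accountId, double amount) {
        // 'version' is not touched by the Update, so findAndModify(…) bumps it automatically
        return template.findAndModify(
                query(where("id").is(accountId)),
                new Update().inc("balance", amount),
                Account.class);
    }
}
```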
Christoph Strobl
443cde6236 DATAMONGO-863 - UpdateMapper doesn't convert raw DBObjects anymore.
UpdateMapper now only performs a simple conversion if it encounters a DBObject, instead of a deep inspection of the keywords used. This allows using custom clauses nested in an Update for operations that are not directly supported.

Original Pull Request: #138.
2014-03-06 11:46:57 +01:00
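A sketch of the kind of hand-written clause this enables, assuming a hypothetical 'scores' array: the nested DBObject is passed through with only a simple conversion instead of being re-interpreted:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.Arrays;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.BasicDBObject;

class RawUpdateClauseSketch {
    void capScores(MongoTemplate template, String playerId) {
        // $each/$slice had no dedicated Update API at the time; the raw DBObject is kept as-is
        Update update = new Update().push("scores",
                new BasicDBObject("$each", Arrays.asList(89, 92)).append("$slice", -10));
        template.updateFirst(query(where("id").is(playerId)), update, "player");
    }
}
```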
Oliver Gierke
b23796fb45 DATAMONGO-821 - Fixed handling of keyword expressions for DBRefs.
QueryMapper skips DBRef conversion in case the given source value is a nested DBObject. This allows directly using MongoDB operators wrapped in a DBObject on association properties.

Original Pull Request: #139.
2014-03-06 11:26:37 +01:00
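A hedged sketch (the Book/Publisher association is hypothetical): wrapping a MongoDB operator in a DBObject and using it as the criteria value of an association property is now passed through instead of being forced into a DBRef conversion:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.DBRef;

import com.mongodb.BasicDBObject;

class Publisher { String id; }

class Book {
    String id;
    @DBRef Publisher publisher;
}

class AssociationKeywordQuerySketch {
    List<Book> booksWithAnyPublisher(MongoTemplate template) {
        // the nested operator document is handed through as-is: { publisher: { $exists: true } }
        return template.find(
                query(where("publisher").is(new BasicDBObject("$exists", true))),
                Book.class);
    }
}
```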
Oliver Gierke
605f7459f7 DATAMONGO-843 - Back-port of defaulting of the MappingContext for auditing.
@EnableMongoAuditing defaults the mapping context to make sure it can be used without a MappingContext defined explicitly.
2014-03-06 09:26:14 +01:00
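A minimal configuration sketch matching the test added in this change set: @EnableMongoAuditing now works without an explicitly defined MappingContext bean, because a MongoMappingContext is registered as a default:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing
class AuditingConfig {
    // no MappingContext bean declared – the registrar defaults one if none is present
}
```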
Oliver Gierke
ef6db5970b DATAMONGO-871 - Add support for arrays as query method return types.
Changed AbstractMongoQuery to potentially convert all query execution results using the DefaultConversionService in case the query result doesn't match the expected return type.

This allows arrays to be returned for collection queries as the conversion service can transparently convert between collections and arrays.
2014-03-05 10:07:37 +01:00
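A sketch of a repository interface taking advantage of this (Person and the query method are hypothetical): the query method may now declare an array instead of a collection as its return type:

```java
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

@Document
class Person {
    String id;
    String lastname;
}

interface PersonRepository extends MongoRepository<Person, String> {

    // previously only collection types were supported here; the collection query result
    // is now converted into an array by the DefaultConversionService
    Person[] findByLastname(String lastname);
}
```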
Thomas Darimont
47a5a32713 DATAMONGO-865 - Adjust test dependencies to avoid ClassNotFoundException during test runs.
Added jul-to-slf4j dependency to avoid exceptions being logged during test runs.

Original pull request: #135.
2014-03-04 09:48:34 +01:00
Christoph Strobl
1675528fc7 DATAMONGO-829 - NearQuery should not default 'num' to zero.
NearQuery now ignores a query.getLimit() of zero when adding a Query to a NearQuery. This has to be done as the limit defaults to zero within Query, which would otherwise result in unintended propagation of the parameter.

In case 'num' should be explicitly set to zero one might use 'NearQuery.num(0)' as an alternative to the query approach.

Introduced 'null' check for 'NearQuery.query(Query)' and 'NearQuery.with(Pageable)' along the way.

Original Pull Request: #133
2014-03-03 15:34:00 +01:00
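A sketch (coordinates and field names are hypothetical) of how the zero limit is now handled and how an explicit zero can still be set:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metrics;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;

class NearQueryNumSketch {

    NearQuery restaurantsNearby() {
        Query filter = query(where("category").is("restaurant"));  // limit defaults to 0
        return NearQuery.near(new Point(8.68, 50.11))
                .maxDistance(new Distance(10, Metrics.KILOMETERS))
                .query(filter);                                     // zero limit is no longer copied into 'num'
    }

    NearQuery explicitZero() {
        return NearQuery.near(new Point(8.68, 50.11)).num(0);       // explicit zero is still possible
    }
}
```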
Christoph Strobl
3455cbc634 DATAMONGO-862 - Fixed handling of unmapped paths for updates.
UpdateMapper uses the raw key instead of the cleaned property path when the key does not point directly to a property.

Original pull request: #132.
2014-02-27 16:56:11 +01:00
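A hedged sketch of an update whose key does not map onto a property directly — for example a positional operator path on a hypothetical 'grades' array — which is now rendered with the raw key:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

class UnmappedUpdatePathSketch {
    void fixGrade(MongoTemplate template, String studentId) {
        // "grades.$" is not a property path; the UpdateMapper keeps the key as given
        template.updateFirst(
                query(where("id").is(studentId).and("grades").is(80)),
                new Update().set("grades.$", 82),
                "student");
    }
}
```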
Oliver Gierke
ed779e52b7 DATAMONGO-833 - Support for EnumSet and EnumMap in MappingMongoConverter.
Re-implemented the fix we already applied to master without referring to the custom CollectionFactory, which is only introduced in Spring Data Commons' master.

Related pull request: #113.
2014-02-26 05:56:19 +01:00
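A sketch of a document type (the enum and field names are hypothetical) that the MappingMongoConverter can now read back, creating the EnumSet/EnumMap via their dedicated factory methods instead of a generic collection or map:

```java
import java.util.EnumMap;
import java.util.EnumSet;

import org.springframework.data.mongodb.core.mapping.Document;

enum Color { RED, GREEN, BLUE }

@Document
class Preferences {

    String id;

    // both properties can now be materialized when reading the document back
    EnumSet<Color> favourites;
    EnumMap<Color, String> labels;
}
```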
Spring Buildmaster
c70898b019 DATAMONGO-854 - Prepare next development iteration. 2014-02-24 15:30:19 +01:00
45 changed files with 1742 additions and 166 deletions

View File

@@ -26,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
</dependency>
```

View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent>
@@ -29,7 +29,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.7.0.RELEASE</springdata.commons>
<springdata.commons>1.7.2.RELEASE</springdata.commons>
<mongo>2.11.4</mongo>
<mongo-osgi>${mongo}</mongo-osgi>
</properties>
@@ -107,7 +107,7 @@
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.12.0-rc0</mongo>
<mongo>2.12.0</mongo>
<mongo-osgi>2.12.0</mongo-osgi>
</properties>
</profile>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
</dependency>
<dependency>
@@ -56,17 +56,13 @@
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.4.0.RELEASE</version>
<version>1.4.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -137,6 +137,13 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>

View File

@@ -116,7 +116,9 @@ public abstract class AbstractMongoConfiguration {
* entities.
*/
protected String getMappingBasePackage() {
return getClass().getPackage().getName();
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**

View File

@@ -71,6 +71,7 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -83,8 +84,11 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (parserContext.isNested()) {
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
}
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : "mappingConverter";

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,17 +15,24 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AnnotationAuditingConfiguration;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
@@ -57,7 +64,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
registerIsNewStrategyFactoryIfNecessary(registry);
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
@@ -92,14 +99,33 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
}
/**
* @param registry, the {@link BeanDefinitionRegistry} to use to register an {@link IsNewStrategyFactory} to.
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
*/
private void registerIsNewStrategyFactoryIfNecessary(BeanDefinitionRegistry registry) {
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
if (!registry.containsBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY)) {
registry.registerBeanDefinition(BeanNames.IS_NEW_STRATEGY_FACTORY,
BeanDefinitionBuilder.rootBeanDefinition(MappingContextIsNewStrategyFactory.class)
.addConstructorArgReference(BeanNames.MAPPING_CONTEXT).getBeanDefinition());
if (!registry.containsBeanDefinition(MAPPING_CONTEXT)) {
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
registry.registerBeanDefinition(MAPPING_CONTEXT, definition);
}
if (!registry.containsBeanDefinition(IS_NEW_STRATEGY_FACTORY)) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder
.rootBeanDefinition(MappingContextIsNewStrategyFactory.class);
builder.addConstructorArgReference(MAPPING_CONTEXT);
AbstractBeanDefinition definition = ParsingUtils.getSourceBeanDefinition(builder, source);
definition.setRole(ROLE_INFRASTRUCTURE);
registry.registerBeanDefinition(IS_NEW_STRATEGY_FACTORY, definition);
}
}
}

View File

@@ -352,7 +352,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch) {
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query));
executeQuery(query, collectionName, dch, new QueryCursorPreparer(query, null));
}
/**
@@ -530,7 +530,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query));
new QueryCursorPreparer(query, entityClass));
}
public <T> T findById(Object id, Class<T> entityClass) {
@@ -612,8 +612,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName) {
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass, update, options);
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass, update, options);
}
// Find methods that take a Query to express the query and that return a single object that is also removed from the
@@ -624,8 +624,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> T findAndRemove(Query query, Class<T> entityClass, String collectionName) {
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(), query.getSortObject(),
entityClass);
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass);
}
public long count(Query query, Class<?> entityClass) {
@@ -1573,6 +1574,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
increaseVersionForUpdateIfNecessary(entity, update);
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
@@ -1856,6 +1859,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return converter;
}
private DBObject getMappedSortObject(Query query, Class<?> type) {
if (query == null || query.getSortObject() == null) {
return null;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(type);
return queryMapper.getMappedObject(query.getSortObject(), entity);
}
// Callback implementations
/**
@@ -2049,9 +2062,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
private final Class<?> type;
public QueryCursorPreparer(Query query, Class<?> type) {
public QueryCursorPreparer(Query query) {
this.query = query;
this.type = type;
}
/*
@@ -2079,7 +2095,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
cursorToUse = cursorToUse.sort(query.getSortObject());
cursorToUse = cursorToUse.sort(getMappedSortObject(query, type));
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -69,12 +69,26 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
@Override
public FieldReference getReference(String name) {
Assert.notNull(name, "Name must not be null!");
ExposedField field = exposedFields.getField(name);
if (field != null) {
return new FieldReference(field);
}
if (name.contains(".")) {
// for nested field references we only check that the root field exists.
ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);
if (rootField != null) {
// We have to set synthetic to true in order to render the field name as is.
return new FieldReference(new ExposedField(name, true));
}
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
}

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
@@ -93,22 +94,28 @@ public class CustomConversions {
this.customSimpleTypes = new HashSet<Class<?>>();
this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
this.converters = new ArrayList<Object>();
this.converters.addAll(converters);
this.converters.add(CustomToStringConverter.INSTANCE);
this.converters.add(BigDecimalToStringConverter.INSTANCE);
this.converters.add(StringToBigDecimalConverter.INSTANCE);
this.converters.add(BigIntegerToStringConverter.INSTANCE);
this.converters.add(StringToBigIntegerConverter.INSTANCE);
this.converters.add(URLToStringConverter.INSTANCE);
this.converters.add(StringToURLConverter.INSTANCE);
this.converters.add(DBObjectToStringConverter.INSTANCE);
this.converters.addAll(JodaTimeConverters.getConvertersToRegister());
List<Object> toRegister = new ArrayList<Object>();
for (Object c : this.converters) {
toRegister.addAll(converters);
toRegister.add(CustomToStringConverter.INSTANCE);
toRegister.add(BigDecimalToStringConverter.INSTANCE);
toRegister.add(StringToBigDecimalConverter.INSTANCE);
toRegister.add(BigIntegerToStringConverter.INSTANCE);
toRegister.add(StringToBigIntegerConverter.INSTANCE);
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
// Add user provided converters to make sure they can override the defaults
for (Object c : toRegister) {
registerConversion(c);
}
Collections.reverse(toRegister);
this.converters = Collections.unmodifiableList(toRegister);
this.simpleTypeHolder = new SimpleTypeHolder(customSimpleTypes, MongoSimpleTypes.HOLDER);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,15 +24,22 @@ import com.mongodb.DBRef;
* Used to resolve associations annotated with {@link org.springframework.data.mongodb.core.mapping.DBRef}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.4
*/
public interface DbRefResolver {
/**
* Resolves the given {@link DBRef} into an object of the given {@link MongoPersistentProperty}'s type. The method
* might return a proxy object for the {@link DBRef} or resolve it immediately. In both cases the
* {@link DbRefResolverCallback} will be used to obtain the actual backing object.
*
* @param property will never be {@literal null}.
* @param dbref the {@link DBRef} to resolve.
* @param callback will never be {@literal null}.
* @return
*/
Object resolveDbRef(MongoPersistentProperty property, DbRefResolverCallback callback);
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
/**
* Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.util.ReflectionUtils.*;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
@@ -51,6 +53,7 @@ import com.mongodb.DBRef;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.4
*/
public class DefaultDbRefResolver implements DbRefResolver {
@@ -78,13 +81,13 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
*/
@Override
public Object resolveDbRef(MongoPersistentProperty property, DbRefResolverCallback callback) {
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, callback);
return createLazyLoadingProxy(property, dbref, callback);
}
return callback.resolve(property);
@@ -109,10 +112,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
* eventually resolve the value of the property.
*
* @param property must not be {@literal null}.
* @param dbref can be {@literal null}.
* @param callback must not be {@literal null}.
* @return
*/
private Object createLazyLoadingProxy(MongoPersistentProperty property, DbRefResolverCallback callback) {
private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
ProxyFactory proxyFactory = new ProxyFactory();
Class<?> propertyType = property.getType();
@@ -121,7 +125,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
proxyFactory.addInterface(type);
}
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, exceptionTranslator, callback);
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
proxyFactory.addInterface(LazyLoadingProxy.class);
if (propertyType.isInterface()) {
proxyFactory.addInterface(propertyType);
@@ -141,7 +147,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
}
/**
* @param property
* Returns whether the property shall be resolved lazily.
*
* @param property must not be {@literal null}.
* @return
*/
private boolean isLazyDbRef(MongoPersistentProperty property) {
@@ -154,31 +162,46 @@ public class DefaultDbRefResolver implements DbRefResolver {
* guaranteed to be performed only once.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
private final DbRefResolverCallback callback;
private final MongoPersistentProperty property;
private final PersistenceExceptionTranslator exceptionTranslator;
private volatile boolean resolved;
private Object result;
private DBRef dbref;
static {
try {
INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
} catch (Exception e) {
throw new RuntimeException(e);
}
}
/**
* Creates a new {@link LazyLoadingInterceptor} for the given {@link MongoPersistentProperty},
* {@link PersistenceExceptionTranslator} and {@link DbRefResolverCallback}.
*
* @param property must not be {@literal null}.
* @param dbref can be {@literal null}.
* @param callback must not be {@literal null}.
*/
public LazyLoadingInterceptor(MongoPersistentProperty property, PersistenceExceptionTranslator exceptionTranslator,
DbRefResolverCallback callback) {
public LazyLoadingInterceptor(MongoPersistentProperty property, DBRef dbref,
PersistenceExceptionTranslator exceptionTranslator, DbRefResolverCallback callback) {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(exceptionTranslator, "Exception translator must not be null!");
Assert.notNull(callback, "Callback must not be null!");
this.dbref = dbref;
this.callback = callback;
this.exceptionTranslator = exceptionTranslator;
this.property = property;
@@ -199,9 +222,95 @@ public class DefaultDbRefResolver implements DbRefResolver {
*/
@Override
public Object intercept(Object obj, Method method, Object[] args, MethodProxy proxy) throws Throwable {
return ReflectionUtils.isObjectMethod(method) ? method.invoke(obj, args) : method.invoke(ensureResolved(), args);
if (INITIALIZE_METHOD.equals(method)) {
return ensureResolved();
}
if (TO_DBREF_METHOD.equals(method)) {
return this.dbref;
}
if (isObjectMethod(method) && Object.class.equals(method.getDeclaringClass())) {
if (ReflectionUtils.isToStringMethod(method)) {
return proxyToString(proxy);
}
if (ReflectionUtils.isEqualsMethod(method)) {
return proxyEquals(proxy, args[0]);
}
if (ReflectionUtils.isHashCodeMethod(method)) {
return proxyHashCode(proxy);
}
}
Object target = ensureResolved();
if (target == null) {
return null;
}
return method.invoke(target, args);
}
/**
* Returns a to string representation for the given {@code proxy}.
*
* @param proxy
* @return
*/
private String proxyToString(Object proxy) {
StringBuilder description = new StringBuilder();
if (dbref != null) {
description.append(dbref.getRef());
description.append(":");
description.append(dbref.getId());
} else {
description.append(System.identityHashCode(proxy));
}
description.append("$").append(LazyLoadingProxy.class.getSimpleName());
return description.toString();
}
/**
* Returns the hashcode for the given {@code proxy}.
*
* @param proxy
* @return
*/
private int proxyHashCode(Object proxy) {
return proxyToString(proxy).hashCode();
}
/**
* Performs an equality check for the given {@code proxy}.
*
* @param proxy
* @param that
* @return
*/
private boolean proxyEquals(Object proxy, Object that) {
if (!(that instanceof LazyLoadingProxy)) {
return false;
}
if (that == proxy) {
return true;
}
return proxyToString(proxy).equals(that.toString());
}
/**
* Will trigger the resolution if the proxy is not resolved already or return a previously resolved result.
*
* @return
*/
private Object ensureResolved() {
if (!resolved) {
@@ -212,16 +321,28 @@ public class DefaultDbRefResolver implements DbRefResolver {
return this.result;
}
/**
* Callback method for serialization.
*
* @param out
* @throws IOException
*/
private void writeObject(ObjectOutputStream out) throws IOException {
ensureResolved();
out.writeObject(this.result);
}
/**
* Callback method for deserialization.
*
* @param in
* @throws IOException
*/
private void readObject(ObjectInputStream in) throws IOException {
try {
this.resolved = true; // Object is guaranteed to be resolved after serializations
this.resolved = true;
this.result = in.readObject();
} catch (ClassNotFoundException e) {
throw new LazyLoadingException("Could not deserialize result", e);
@@ -229,6 +350,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
}
/**
* Resolves the proxy into its backing object.
*
* @return
*/
private synchronized Object resolve() {
@@ -248,14 +371,6 @@ public class DefaultDbRefResolver implements DbRefResolver {
return result;
}
public boolean isResolved() {
return resolved;
}
public Object getResult() {
return result;
}
}
/**
@@ -273,6 +388,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(type);
enhancer.setCallbackType(org.springframework.cglib.proxy.MethodInterceptor.class);
enhancer.setInterfaces(new Class[] { LazyLoadingProxy.class });
Factory factory = (Factory) OBJENESIS.newInstance(enhancer.createClass());
factory.setCallbacks(new Callback[] { interceptor });

View File

@@ -0,0 +1,45 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;
import com.mongodb.DBRef;
/**
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @since 1.5
*/
public interface LazyLoadingProxy {
/**
* Initializes the proxy and returns the wrapped value.
*
* @return
* @since 1.5
*/
Object initialize();
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.
*
* @return
* @since 1.5
*/
DBRef toDBRef();
}

View File

@@ -19,6 +19,8 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
@@ -274,9 +276,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty inverseProp = association.getInverse();
MongoPersistentProperty property = association.getInverse();
Object obj = dbRefResolver.resolveDbRef(inverseProp, new DbRefResolverCallback() {
Object value = dbo.get(property.getName());
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
@Override
public Object resolve(MongoPersistentProperty property) {
@@ -284,7 +288,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
});
wrapper.setProperty(inverseProp, obj);
wrapper.setProperty(property, obj);
}
});
@@ -401,6 +405,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object propertyObj = wrapper.getProperty(prop, prop.getType(), fieldAccessOnly);
if (null != propertyObj) {
if (!conversions.isSimpleType(propertyObj.getClass())) {
writePropertyInternal(propertyObj, dbo, prop);
} else {
@@ -447,13 +452,32 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (prop.isDbReference()) {
DBRef dbRefObj = createDBRef(obj, prop);
DBRef dbRefObj = null;
/*
* If we already have a LazyLoadingProxy, we use its cached DBRef value instead of
* unnecessarily initializing it only to convert it to a DBRef a few instructions later.
*/
if (obj instanceof LazyLoadingProxy) {
dbRefObj = ((LazyLoadingProxy) obj).toDBRef();
}
dbRefObj = dbRefObj != null ? dbRefObj : createDBRef(obj, prop);
if (null != dbRefObj) {
accessor.put(prop, dbRefObj);
return;
}
}
/*
* If we have a LazyLoadingProxy we make sure it is initialized first.
*/
if (obj instanceof LazyLoadingProxy) {
obj = ((LazyLoadingProxy) obj).initialize();
}
// Lookup potential custom target type
Class<?> basicTargetType = conversions.getCustomWriteTarget(obj.getClass(), null);
@@ -794,7 +818,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param sourceValue must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
@SuppressWarnings({ "unchecked", "null" })
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
Assert.notNull(targetType);
@@ -807,11 +831,20 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, sourceValue.size());
TypeInformation<?> componentType = targetType.getComponentType();
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
Collection<Object> items;
if (targetType.getType().isArray()) {
items = new ArrayList<Object>();
} else if (EnumSet.class.isAssignableFrom(collectionType)) {
Assert.notNull(rawComponentType, "Component type must not be null for enum sets!");
items = EnumSet.noneOf(rawComponentType.asSubclass(Enum.class));
} else {
items = CollectionFactory.createCollection(collectionType, sourceValue.size());
}
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
@@ -836,31 +869,43 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param dbObject
* @return
*/
@SuppressWarnings("unchecked")
@SuppressWarnings({ "unchecked", "null", "rawtypes" })
protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, Object parent) {
Assert.notNull(dbObject);
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
Map<Object, Object> map = CollectionFactory.createMap(mapType, dbObject.keySet().size());
TypeInformation<?> keyType = type.getComponentType();
Class<?> rawKeyType = keyType == null ? null : keyType.getType();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();
Map<Object, Object> map;
if (EnumMap.class.isAssignableFrom(mapType)) {
Assert.notNull(keyType, "Key type must not be null for enum maps!");
map = new EnumMap(rawKeyType.asSubclass(Enum.class));
} else {
map = CollectionFactory.createMap(mapType, dbObject.keySet().size());
}
Map<String, Object> sourceMap = dbObject.toMap();
for (Entry<String, Object> entry : sourceMap.entrySet()) {
if (typeMapper.isTypeKey(entry.getKey())) {
continue;
}
Object key = potentiallyUnescapeMapKey(entry.getKey());
TypeInformation<?> keyTypeInformation = type.getComponentType();
if (keyTypeInformation != null) {
Class<?> keyType = keyTypeInformation.getType();
key = conversionService.convert(key, keyType);
if (rawKeyType != null) {
key = conversionService.convert(key, rawKeyType);
}
Object value = entry.getValue();
TypeInformation<?> valueType = type.getMapValueType();
Class<?> rawValueType = valueType == null ? null : valueType.getType();
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, parent));
@@ -911,7 +956,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation == null ? null : ClassTypeInformation.OBJECT;
TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;
if (obj instanceof BasicDBList) {
return maybeConvertList((BasicDBList) obj, typeHint);

View File

@@ -138,7 +138,7 @@ public class QueryMapper {
value = getMappedValue(field, rawValue);
}
return Collections.singletonMap(key, value).entrySet().iterator().next();
return createMapEntry(key, value);
}
/**
@@ -168,7 +168,7 @@ public class QueryMapper {
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
newConditions.add(condition instanceof DBObject ? getMappedObject((DBObject) condition, entity)
newConditions.add(isDBObject(condition) ? getMappedObject((DBObject) condition, entity)
: convertSimpleOrDBObject(condition, entity));
}
@@ -209,7 +209,7 @@ public class QueryMapper {
if (documentField.isIdField()) {
if (value instanceof DBObject) {
if (isDBObject(value)) {
DBObject valueDbo = (DBObject) value;
DBObject resultDbo = new BasicDBObject(valueDbo.toMap());
@@ -250,14 +250,31 @@ public class QueryMapper {
* type of the given value is compatible with the type of the given document field in order to deal with potential
* query field exclusions, since MongoDB uses the {@code int} {@literal 0} as an indicator for an excluded field.
*
* @param documentField
* @param documentField must not be {@literal null}.
* @param value
* @return
*/
protected boolean isAssociationConversionNecessary(Field documentField, Object value) {
return documentField.isAssociation() && value != null
&& (documentField.getProperty().getActualType().isAssignableFrom(value.getClass()) //
|| documentField.getPropertyEntity().getIdProperty().getActualType().isAssignableFrom(value.getClass()));
Assert.notNull(documentField, "Document field must not be null!");
if (value == null) {
return false;
}
if (!documentField.isAssociation()) {
return false;
}
Class<? extends Object> type = value.getClass();
MongoPersistentProperty property = documentField.getProperty();
if (property.getActualType().isAssignableFrom(type)) {
return true;
}
MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
}
/**
@@ -267,13 +284,13 @@ public class QueryMapper {
* @param entity
* @return
*/
private Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
protected Object convertSimpleOrDBObject(Object source, MongoPersistentEntity<?> entity) {
if (source instanceof BasicDBList) {
return delegateConvertToMongoType(source, entity);
}
if (source instanceof DBObject) {
if (isDBObject(source)) {
return getMappedObject((DBObject) source, entity);
}
@@ -289,7 +306,7 @@ public class QueryMapper {
* @return the converted mongo type or null if source is null
*/
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return converter.convertToMongoType(source);
return converter.convertToMongoType(source, entity == null ? null : entity.getTypeInformation());
}
protected Object convertAssociation(Object source, Field field) {
@@ -305,7 +322,7 @@ public class QueryMapper {
*/
protected Object convertAssociation(Object source, MongoPersistentProperty property) {
if (property == null || source == null || source instanceof DBRef) {
if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
return source;
}
@@ -329,6 +346,40 @@ public class QueryMapper {
return createDbRefFor(source, property);
}
/**
* Checks whether the given value is a {@link DBObject}.
*
* @param value can be {@literal null}.
* @return
*/
protected final boolean isDBObject(Object value) {
return value instanceof DBObject;
}
/**
* Creates a new {@link Entry} for the given {@link Field} with the given value.
*
* @param field must not be {@literal null}.
* @param value can be {@literal null}.
* @return
*/
protected final Entry<String, Object> createMapEntry(Field field, Object value) {
return createMapEntry(field.getMappedKey(), value);
}
/**
* Creates a new {@link Entry} with the given key and value.
*
* @param key must not be {@literal null} or empty.
* @param value can be {@literal null}
* @return
*/
private Entry<String, Object> createMapEntry(String key, Object value) {
Assert.hasText(key, "Key must not be null or empty!");
return Collections.singletonMap(key, value).entrySet().iterator().next();
}
private DBRef createDbRefFor(Object source, MongoPersistentProperty property) {
if (source instanceof DBRef) {
@@ -560,6 +611,21 @@ public class QueryMapper {
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this(name, entity, context, null);
}
/**
* Creates a new {@link MetadataBackedField} with the given name, {@link MongoPersistentEntity} and
* {@link MappingContext} with the given {@link MongoPersistentProperty}.
*
* @param name must not be {@literal null} or empty.
* @param entity must not be {@literal null}.
* @param context must not be {@literal null}.
* @param property may be {@literal null}.
*/
public MetadataBackedField(String name, MongoPersistentEntity<?> entity,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
MongoPersistentProperty property) {
super(name);
@@ -569,7 +635,7 @@ public class QueryMapper {
this.mappingContext = context;
this.path = getPath(name);
this.property = path == null ? null : path.getLeafProperty();
this.property = path == null ? property : path.getLeafProperty();
this.association = findAssociation();
}
@@ -579,7 +645,7 @@ public class QueryMapper {
*/
@Override
public MetadataBackedField with(String name) {
return new MetadataBackedField(name, entity, mappingContext);
return new MetadataBackedField(name, entity, mappingContext, property);
}
/*

View File

@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.Map.Entry;
@@ -76,6 +75,10 @@ public class UpdateMapper extends QueryMapper {
@Override
protected Entry<String, Object> getMappedObjectForField(Field field, Object rawValue) {
if (isDBObject(rawValue)) {
return createMapEntry(field, convertSimpleOrDBObject(rawValue, field.getPropertyEntity()));
}
if (!isUpdateModifier(rawValue)) {
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
}
@@ -100,7 +103,7 @@ public class UpdateMapper extends QueryMapper {
throw new IllegalArgumentException(String.format("Unable to map value of type '%s'!", rawValue.getClass()));
}
return Collections.singletonMap(field.getMappedKey(), value).entrySet().iterator().next();
return createMapEntry(field, value);
}
/*
@@ -163,6 +166,15 @@ public class UpdateMapper extends QueryMapper {
this.key = key;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getMappedKey()
*/
@Override
public String getMappedKey() {
return this.getPath() == null ? key : super.getMappedKey();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getPropertyConverter()

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,10 +30,8 @@ import org.springframework.util.Assert;
*/
public class Point {
@Field(order = 10)
private final double x;
@Field(order = 20)
private final double y;
@Field(order = 10) private final double x;
@Field(order = 20) private final double y;
@PersistenceConstructor
public Point(double x, double y) {
@@ -69,9 +67,9 @@ public class Point {
int result = 1;
long temp;
temp = Double.doubleToLongBits(x);
result = prime * result + (int) (temp ^ (temp >>> 32));
result = prime * result + (int) (temp ^ temp >>> 32);
temp = Double.doubleToLongBits(y);
result = prime * result + (int) (temp ^ (temp >>> 32));
result = prime * result + (int) (temp ^ temp >>> 32);
return result;
}
@@ -98,6 +96,6 @@ public class Point {
@Override
public String toString() {
return String.format("Point [latitude=%f, longitude=%f]", x, y);
return String.format("Point [x=%f, y=%f]", x, y);
}
}

View File

@@ -31,6 +31,7 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public final class NearQuery {
@@ -143,10 +144,12 @@ public final class NearQuery {
/**
* Configures the {@link Pageable} to use.
*
* @param pageable
* @param pageable must not be {@literal null}
* @return
*/
public NearQuery with(Pageable pageable) {
Assert.notNull(pageable, "Pageable must not be 'null'.");
this.num = pageable.getOffset() + pageable.getPageSize();
this.skip = pageable.getOffset();
return this;
@@ -311,13 +314,18 @@ public final class NearQuery {
/**
* Adds an actual query to the {@link NearQuery} to restrict the objects considered for the actual near operation.
*
* @param query
* @param query must not be {@literal null}.
* @return
*/
public NearQuery query(Query query) {
Assert.notNull(query, "Cannot apply 'null' query on NearQuery.");
this.query = query;
this.skip = query.getSkip();
this.num = query.getLimit();
if (query.getLimit() != 0) {
this.num = query.getLimit();
}
return this;
}

View File

@@ -79,22 +79,24 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(method, parameters);
Query query = createQuery(new ConvertingParameterAccessor(operations.getConverter(), accessor));
Object result = null;
if (method.isGeoNearQuery() && method.isPageQuery()) {
MongoParameterAccessor countAccessor = new MongoParametersParameterAccessor(method, parameters);
Query countQuery = createCountQuery(new ConvertingParameterAccessor(operations.getConverter(), countAccessor));
return new GeoNearExecution(accessor).execute(query, countQuery);
result = new GeoNearExecution(accessor).execute(query, countQuery);
} else if (method.isGeoNearQuery()) {
return new GeoNearExecution(accessor).execute(query);
} else if (method.isCollectionQuery()) {
return new CollectionExecution(accessor.getPageable()).execute(query);
result = new CollectionExecution(accessor.getPageable()).execute(query);
} else if (method.isPageQuery()) {
return new PagedExecution(accessor.getPageable()).execute(query);
result = new PagedExecution(accessor.getPageable()).execute(query);
} else {
result = new SingleEntityExecution(isCountQuery()).execute(query);
}
Object result = new SingleEntityExecution(isCountQuery()).execute(query);
if (result == null) {
return result;
}

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Sample configuration class in default package.
*
* @see DATAMONGO-877
* @author Oliver Gierke
*/
@Configuration
public class ConfigClassInDefaultPackage extends AbstractMongoConfiguration {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#getDatabaseName()
*/
@Override
protected String getDatabaseName() {
return "default";
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#mongo()
*/
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
}

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import org.junit.Test;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
/**
* Unit test for {@link ConfigClassInDefaultPackage}.
*
* @see DATAMONGO-877
* @author Oliver Gierke
*/
public class ConfigClassInDefaultPackageUnitTests {
/**
* @see DATAMONGO-877
*/
@Test
public void loadsConfigClassFromDefaultPackage() {
new AnnotationConfigApplicationContext(ConfigClassInDefaultPackage.class).close();
}
}

View File

@@ -23,6 +23,7 @@ import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.AuditorAware;
@@ -96,6 +97,21 @@ public class AuditingViaJavaConfigRepositoriesTests {
assertThat(createdBy.getFirstname(), is(this.auditor.getFirstname()));
}
/**
* @see DATAMONGO-843
*/
@Test
@SuppressWarnings("resource")
public void defaultsMappingContextIfNoneConfigured() {
new AnnotationConfigApplicationContext(SampleConfig.class);
}
@Repository
static interface AuditablePersonRepository extends MongoRepository<AuditablePerson, String> {}
@Configuration
@EnableMongoAuditing
static class SampleConfig {
}
}

View File

@@ -21,9 +21,11 @@ import static org.junit.Assert.*;
import java.util.Collections;
import java.util.Set;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanDefinitionParsingException;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.core.convert.TypeDescriptor;
@@ -45,37 +47,45 @@ import com.mongodb.DBObject;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParserIntegrationTests {
@Rule public ExpectedException exception = ExpectedException.none();
DefaultListableBeanFactory factory;
@Before
public void setUp() {
factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource("namespace/converter.xml"));
}
/**
* @see DATAMONGO-243
*/
@Test
public void allowsDbFactoryRefAttribute() {
loadValidConfiguration();
factory.getBeanDefinition("converter");
factory.getBean("converter");
}
/**
* @see DATAMONGO-725
*/
@Test
public void hasCustomTypeMapper() {
loadValidConfiguration();
MappingMongoConverter converter = factory.getBean("converter", MappingMongoConverter.class);
MongoTypeMapper customMongoTypeMapper = factory.getBean(CustomMongoTypeMapper.class);
assertThat(converter.getTypeMapper(), is(customMongoTypeMapper));
}
/**
* @see DATAMONGO-301
*/
@Test
public void scansForConverterAndSetsUpCustomConversionsAccordingly() {
loadValidConfiguration();
CustomConversions conversions = factory.getBean(CustomConversions.class);
assertThat(conversions.hasCustomWriteTarget(Person.class), is(true));
assertThat(conversions.hasCustomWriteTarget(Account.class), is(true));
@@ -87,6 +97,7 @@ public class MappingMongoConverterParserIntegrationTests {
@Test
public void activatesAbbreviatingPropertiesCorrectly() {
loadValidConfiguration();
BeanDefinition definition = factory.getBeanDefinition("abbreviatingConverter.mappingContext");
Object value = definition.getPropertyValues().getPropertyValue("fieldNamingStrategy").getValue();
@@ -95,6 +106,32 @@ public class MappingMongoConverterParserIntegrationTests {
assertThat(strategy.getBeanClassName(), is(CamelCaseAbbreviatingFieldNamingStrategy.class.getName()));
}
/**
* @see DATAMONGO-892
*/
@Test
public void shouldThrowBeanDefinitionParsingExceptionIfConverterDefinedAsNestedBean() {
exception.expect(BeanDefinitionParsingException.class);
exception.expectMessage("Mongo Converter must not be defined as nested bean.");
loadNestedBeanConfiguration();
}
private void loadValidConfiguration() {
this.loadConfiguration("namespace/converter.xml");
}
private void loadNestedBeanConfiguration() {
this.loadConfiguration("namespace/converter-nested-bean-definition.xml");
}
private void loadConfiguration(String configLocation) {
factory = new DefaultListableBeanFactory();
XmlBeanDefinitionReader reader = new XmlBeanDefinitionReader(factory);
reader.loadBeanDefinitions(new ClassPathResource(configLocation));
}
@Component
public static class SampleConverter implements Converter<Person, DBObject> {
public DBObject convert(Person source) {

View File

@@ -34,7 +34,6 @@ import java.util.List;
import java.util.Map;
import org.bson.types.ObjectId;
import org.hamcrest.CoreMatchers;
import org.joda.time.DateTime;
import org.junit.After;
import org.junit.Before;
@@ -61,6 +60,7 @@ import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
@@ -298,6 +298,9 @@ public class MongoTemplateTests {
@Test
public void rejectsDuplicateIdInInsertAll() {
thrown.expect(DataIntegrityViolationException.class);
thrown.expectMessage("E11000 duplicate key error index: database.person.$_id_");
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
@@ -309,15 +312,7 @@ public class MongoTemplateTests {
records.add(person);
records.add(person);
try {
template.insertAll(records);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(
e.getMessage(),
CoreMatchers
.startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
}
template.insertAll(records);
}
@Test
@@ -2293,8 +2288,6 @@ public class MongoTemplateTests {
}
/**
* @see DATAMONGO-828
*/
@Test
@@ -2394,7 +2387,7 @@ public class MongoTemplateTests {
assertThat(result.dbRefAnnotatedList.get(0), is(notNullValue()));
assertThat(result.dbRefAnnotatedList.get(0).id, is((Object) "1"));
}
/**
* @see DATAMONGO-852
*/
@@ -2422,7 +2415,7 @@ public class MongoTemplateTests {
* @see DATAMONGO-468
*/
@Test
public void shouldBeAbleToUpdateDbRefPropertyWithDomainObject(){
public void shouldBeAbleToUpdateDbRefPropertyWithDomainObject() {
Sample sample1 = new Sample("1", "A");
Sample sample2 = new Sample("2", "B");
@@ -2434,17 +2427,217 @@ public class MongoTemplateTests {
doc.dbRefProperty = sample1;
template.save(doc);
Update update = new Update().set("dbRefProperty",sample2);
Update update = new Update().set("dbRefProperty", sample2);
Query qry = query(where("id").is("1"));
template.updateFirst(qry, update, DocumentWithDBRefCollection.class);
DocumentWithDBRefCollection updatedDoc = template.findOne(qry, DocumentWithDBRefCollection.class);
assertThat(updatedDoc,is(notNullValue()));
assertThat(updatedDoc.dbRefProperty,is(notNullValue()));
assertThat(updatedDoc.dbRefProperty.id,is(sample2.id));
assertThat(updatedDoc.dbRefProperty.field,is(sample2.field));
assertThat(updatedDoc, is(notNullValue()));
assertThat(updatedDoc.dbRefProperty, is(notNullValue()));
assertThat(updatedDoc.dbRefProperty.id, is(sample2.id));
assertThat(updatedDoc.dbRefProperty.field, is(sample2.field));
}
/**
* @see DATAMONGO-862
*/
@Test
public void testUpdateShouldWorkForPathsOnInterfaceMethods() {
DocumentWithCollection document = new DocumentWithCollection(Arrays.<Model> asList(new ModelA("spring"),
new ModelA("data")));
template.save(document);
Query query = query(where("id").is(document.id).and("models._id").exists(true));
Update update = new Update().set("models.$.value", "mongodb");
template.findAndModify(query, update, DocumentWithCollection.class);
DocumentWithCollection result = template.findOne(query(where("id").is(document.id)), DocumentWithCollection.class);
assertThat(result.models.get(0).value(), is("mongodb"));
}
/**
* @see DATAMONGO-773
*/
@Test
public void testShouldSupportQueryWithIncludedDbRefField() {
Sample sample = new Sample("47111", "foo");
template.save(sample);
DocumentWithDBRefCollection doc = new DocumentWithDBRefCollection();
doc.id = "4711";
doc.dbRefProperty = sample;
template.save(doc);
Query qry = query(where("id").is(doc.id));
qry.fields().include("dbRefProperty");
List<DocumentWithDBRefCollection> result = template.find(qry, DocumentWithDBRefCollection.class);
assertThat(result, is(notNullValue()));
assertThat(result, hasSize(1));
assertThat(result.get(0), is(notNullValue()));
assertThat(result.get(0).dbRefProperty, is(notNullValue()));
assertThat(result.get(0).dbRefProperty.field, is(sample.field));
}
/**
* @see DATAMONGO-880
*/
@Test
public void savingAndReassigningLazyLoadingProxies() {
template.dropCollection(SomeTemplate.class);
template.dropCollection(SomeMessage.class);
template.dropCollection(SomeContent.class);
SomeContent content = new SomeContent();
content.id = "C1";
content.text = "BUBU";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "T1";
tmpl.content = content; // @DBRef(lazy=true) tmpl.content
template.save(tmpl);
SomeTemplate savedTmpl = template.findById(tmpl.id, SomeTemplate.class);
SomeMessage message = new SomeMessage();
message.id = "M1";
message.dbrefContent = savedTmpl.content; // @DBRef message.dbrefContent
message.normalContent = savedTmpl.content;
template.save(message);
SomeMessage savedMessage = template.findById(message.id, SomeMessage.class);
assertThat(savedMessage.dbrefContent.text, is(content.text));
assertThat(savedMessage.normalContent.text, is(content.text));
}
/**
* @see DATAMONGO-884
*/
@Test
public void callingNonObjectMethodsOnLazyLoadingProxyShouldReturnNullIfUnderlyingDbrefWasDeletedInbetween() {
template.dropCollection(SomeTemplate.class);
template.dropCollection(SomeContent.class);
SomeContent content = new SomeContent();
content.id = "C1";
content.text = "BUBU";
template.save(content);
SomeTemplate tmpl = new SomeTemplate();
tmpl.id = "T1";
tmpl.content = content; // @DBRef(lazy=true) tmpl.content
template.save(tmpl);
SomeTemplate savedTmpl = template.findById(tmpl.id, SomeTemplate.class);
template.remove(content);
assertThat(savedTmpl.getContent().toString(), is("someContent:C1$LazyLoadingProxy"));
assertThat(savedTmpl.getContent(), is(instanceOf(LazyLoadingProxy.class)));
assertThat(savedTmpl.getContent().getText(), is(nullValue()));
}
/**
* @see DATAMONGO-888
*/
@Test
public void sortOnIdFieldPropertyShouldBeMappedCorrectly() {
DoucmentWithNamedIdField one = new DoucmentWithNamedIdField();
one.someIdKey = "1";
one.value = "a";
DoucmentWithNamedIdField two = new DoucmentWithNamedIdField();
two.someIdKey = "2";
two.value = "b";
template.save(one);
template.save(two);
Query query = query(where("_id").in("1", "2")).with(new Sort(Direction.DESC, "someIdKey"));
assertThat(template.find(query, DoucmentWithNamedIdField.class), contains(two, one));
}
/**
* @see DATAMONGO-888
*/
@Test
public void sortOnAnnotatedFieldPropertyShouldBeMappedCorrectly() {
DoucmentWithNamedIdField one = new DoucmentWithNamedIdField();
one.someIdKey = "1";
one.value = "a";
DoucmentWithNamedIdField two = new DoucmentWithNamedIdField();
two.someIdKey = "2";
two.value = "b";
template.save(one);
template.save(two);
Query query = query(where("_id").in("1", "2")).with(new Sort(Direction.DESC, "value"));
assertThat(template.find(query, DoucmentWithNamedIdField.class), contains(two, one));
}
static class DoucmentWithNamedIdField {
@Id String someIdKey;
@Field(value = "val")//
String value;
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (someIdKey == null ? 0 : someIdKey.hashCode());
result = prime * result + (value == null ? 0 : value.hashCode());
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null) {
return false;
}
if (!(obj instanceof DoucmentWithNamedIdField)) {
return false;
}
DoucmentWithNamedIdField other = (DoucmentWithNamedIdField) obj;
if (someIdKey == null) {
if (other.someIdKey != null) {
return false;
}
} else if (!someIdKey.equals(other.someIdKey)) {
return false;
}
if (value == null) {
if (other.value != null) {
return false;
}
} else if (!value.equals(other.value)) {
return false;
}
return true;
}
}
static class DocumentWithDBRefCollection {
@@ -2454,7 +2647,7 @@ public class MongoTemplateTests {
@org.springframework.data.mongodb.core.mapping.DBRef//
public List<Sample> dbRefAnnotatedList;
@org.springframework.data.mongodb.core.mapping.DBRef
@org.springframework.data.mongodb.core.mapping.DBRef//
public Sample dbRefProperty;
}
@@ -2482,10 +2675,13 @@ public class MongoTemplateTests {
static interface Model {
String value();
String id();
}
static class ModelA implements Model {
@Id String id;
private String value;
ModelA(String value) {
@@ -2496,6 +2692,11 @@ public class MongoTemplateTests {
public String value() {
return this.value;
}
@Override
public String id() {
return id;
}
}
static class Document {
@@ -2618,4 +2819,30 @@ public class MongoTemplateTests {
@Id String id;
EnumValue value;
}
public static class SomeTemplate {
String id;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) SomeContent content;
public SomeContent getContent() {
return content;
}
}
public static class SomeContent {
String id;
String text;
public String getText() {
return text;
}
}
static class SomeMessage {
String id;
@org.springframework.data.mongodb.core.mapping.DBRef SomeContent dbrefContent;
SomeContent normalContent;
}
}

View File

@@ -25,10 +25,14 @@ import java.util.Collections;
import java.util.regex.Pattern;
import org.bson.types.ObjectId;
import org.hamcrest.core.Is;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.ArgumentMatcher;
import org.mockito.Matchers;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.runners.MockitoJUnitRunner;
@@ -37,6 +41,7 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
@@ -48,6 +53,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@@ -58,6 +64,7 @@ import com.mongodb.MongoException;
* Unit tests for {@link MongoTemplate}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@@ -205,6 +212,46 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertThat(entity.id, is(5));
}
/**
* @see DATAMONGO-868
*/
@Test
public void findAndModifyShouldBumpVersionByOneWhenVersionFieldNotIncludedInUpdate() {
VersionedEntity v = new VersionedEntity();
v.id = 1;
v.version = 0;
ArgumentCaptor<DBObject> captor = ArgumentCaptor.forClass(DBObject.class);
template.findAndModify(new Query(), new Update().set("id", "10"), VersionedEntity.class);
verify(collection, times(1)).findAndModify(Matchers.any(DBObject.class),
org.mockito.Matchers.isNull(DBObject.class), org.mockito.Matchers.isNull(DBObject.class), eq(false),
captor.capture(), eq(false), eq(false));
Assert.assertThat(captor.getValue().get("$inc"), Is.<Object> is(new BasicDBObject("version", 1L)));
}
/**
* @see DATAMONGO-868
*/
@Test
public void findAndModifyShouldNotBumpVersionByOneWhenVersionFieldAlreadyIncludedInUpdate() {
VersionedEntity v = new VersionedEntity();
v.id = 1;
v.version = 0;
ArgumentCaptor<DBObject> captor = ArgumentCaptor.forClass(DBObject.class);
template.findAndModify(new Query(), new Update().set("version", 100), VersionedEntity.class);
verify(collection, times(1)).findAndModify(Matchers.any(DBObject.class), isNull(DBObject.class),
isNull(DBObject.class), eq(false), captor.capture(), eq(false), eq(false));
Assert.assertThat(captor.getValue().get("$set"), Is.<Object> is(new BasicDBObject("version", 100)));
Assert.assertThat(captor.getValue().get("$inc"), nullValue());
}
/**
* @see DATAMONGO-533
*/
@@ -249,6 +296,12 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
}
}
static class VersionedEntity {
@Id Integer id;
@Version Integer version;
}
enum MyConverter implements Converter<AutogenerateableId, String> {
INSTANCE;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -33,14 +33,13 @@ import com.mongodb.DBCursor;
* Unit tests for {@link QueryCursorPreparer}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryCursorPreparerUnitTests {
@Mock
MongoDbFactory factory;
@Mock
DBCursor cursor;
@Mock MongoDbFactory factory;
@Mock DBCursor cursor;
/**
* @see DATAMONGO-185
@@ -50,7 +49,7 @@ public class QueryCursorPreparerUnitTests {
Query query = query(where("foo").is("bar")).withHint("hint");
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query);
CursorPreparer preparer = new MongoTemplate(factory).new QueryCursorPreparer(query, null);
preparer.prepare(cursor);
verify(cursor).hint("hint");

View File

@@ -179,4 +179,25 @@ public class AggregationUnitTests {
DBObject fields = getAsDBObject(secondProjection, "$group");
assertThat(fields.get("foosum"), is((Object) new BasicDBObject("$sum", "$foo")));
}
/**
* @see DATAMONGO-908
*/
@Test
public void shouldSupportReferingToNestedPropertiesInGroupOperation() {
DBObject agg = newAggregation( //
project("cmsParameterId", "rules"), //
unwind("rules"), //
group("cmsParameterId", "rules.ruleType").count().as("totol") //
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg, is(notNullValue()));
DBObject group = ((List<DBObject>) agg.get("pipeline")).get(2);
DBObject fields = getAsDBObject(group, "$group");
DBObject id = getAsDBObject(fields, "_id");
assertThat(id.get("ruleType"), is((Object) "$rules.ruleType"));
}
}

View File

@@ -7,6 +7,7 @@ import java.net.URL;
import java.text.DateFormat;
import java.text.Format;
import java.util.Arrays;
import java.util.Date;
import java.util.Locale;
import java.util.UUID;
@@ -183,6 +184,19 @@ public class CustomConversionsUnitTests {
assertThat(conversions.getCustomWriteTarget(DateTime.class, null), is(equalTo((Class) String.class)));
}
/**
* @see DATAMONGO-881
*/
@Test
public void customConverterOverridesDefault() {
CustomConversions conversions = new CustomConversions(Arrays.asList(CustomDateTimeConverter.INSTANCE));
GenericConversionService conversionService = new DefaultConversionService();
conversions.registerConvertersIn(conversionService);
assertThat(conversionService.convert(new DateTime(), Date.class), is(new Date(0)));
}
enum FormatToStringConverter implements Converter<Format, String> {
INSTANCE;
@@ -227,4 +241,14 @@ public class CustomConversionsUnitTests {
return "";
}
}
enum CustomDateTimeConverter implements Converter<DateTime, Date> {
INSTANCE;
@Override
public Date convert(DateTime source) {
return new Date(0);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -50,7 +50,10 @@ import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* Unit tests for lazy DBRef handling in {@link MappingMongoConverter}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public class DbRefMappingMongoConverterUnitTests {
@@ -283,6 +286,149 @@ public class DbRefMappingMongoConverterUnitTests {
assertThat(deserializedResult.dbRefToSerializableTarget.getValue(), is(value));
}
/**
* @see DATAMONGO-884
*/
@Test
public void lazyLoadingProxyForToStringObjectMethodOverridingDbref() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToToStringObjectMethodOverride = new ToStringObjectMethodOverrideLazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToToStringObjectMethodOverride, is(notNullValue()));
assertProxyIsResolved(result.dbRefToToStringObjectMethodOverride, false);
assertThat(result.dbRefToToStringObjectMethodOverride.toString(), is(id + ":" + value));
assertProxyIsResolved(result.dbRefToToStringObjectMethodOverride, true);
}
/**
* @see DATAMONGO-884
*/
@Test
public void callingToStringObjectMethodOnLazyLoadingDbrefShouldNotInitializeProxy() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToPlainObject = new LazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToPlainObject, is(notNullValue()));
assertProxyIsResolved(result.dbRefToPlainObject, false);
// calling Object#toString does not initialize the proxy.
String proxyString = result.dbRefToPlainObject.toString();
assertThat(proxyString, is("lazyDbRefTarget" + ":" + id + "$LazyLoadingProxy"));
assertProxyIsResolved(result.dbRefToPlainObject, false);
// calling another method not declared on object triggers proxy initialization.
assertThat(result.dbRefToPlainObject.getValue(), is(value));
assertProxyIsResolved(result.dbRefToPlainObject, true);
}
/**
* @see DATAMONGO-884
*/
@Test
public void equalsObjectMethodOnLazyLoadingDbrefShouldNotInitializeProxy() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToPlainObject = new LazyDbRefTarget(id, value);
lazyDbRefs.dbRefToToStringObjectMethodOverride = new ToStringObjectMethodOverrideLazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToPlainObject, is(notNullValue()));
assertProxyIsResolved(result.dbRefToPlainObject, false);
assertThat(result.dbRefToPlainObject, is(equalTo(result.dbRefToPlainObject)));
assertThat(result.dbRefToPlainObject, is(not(equalTo(null))));
assertThat(result.dbRefToPlainObject, is(not(equalTo((Object) lazyDbRefs.dbRefToToStringObjectMethodOverride))));
assertProxyIsResolved(result.dbRefToPlainObject, false);
}
/**
* @see DATAMONGO-884
*/
@Test
public void hashcodeObjectMethodOnLazyLoadingDbrefShouldNotInitializeProxy() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefToPlainObject = new LazyDbRefTarget(id, value);
lazyDbRefs.dbRefToToStringObjectMethodOverride = new ToStringObjectMethodOverrideLazyDbRefTarget(id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertThat(result.dbRefToPlainObject, is(notNullValue()));
assertProxyIsResolved(result.dbRefToPlainObject, false);
assertThat(result.dbRefToPlainObject.hashCode(), is(311365444));
assertProxyIsResolved(result.dbRefToPlainObject, false);
}
/**
* @see DATAMONGO-884
*/
@Test
public void lazyLoadingProxyForEqualsAndHashcodeObjectMethodOverridingDbref() {
String id = "42";
String value = "bubu";
MappingMongoConverter converterSpy = spy(converter);
doReturn(new BasicDBObject("_id", id).append("value", value)).when(converterSpy).readRef((DBRef) any());
BasicDBObject dbo = new BasicDBObject();
WithObjectMethodOverrideLazyDbRefs lazyDbRefs = new WithObjectMethodOverrideLazyDbRefs();
lazyDbRefs.dbRefEqualsAndHashcodeObjectMethodOverride1 = new EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget(
id, value);
lazyDbRefs.dbRefEqualsAndHashcodeObjectMethodOverride2 = new EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget(
id, value);
converterSpy.write(lazyDbRefs, dbo);
WithObjectMethodOverrideLazyDbRefs result = converterSpy.read(WithObjectMethodOverrideLazyDbRefs.class, dbo);
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride1, false);
assertThat(result.dbRefEqualsAndHashcodeObjectMethodOverride1, is(notNullValue()));
result.dbRefEqualsAndHashcodeObjectMethodOverride1.equals(null);
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride1, true);
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride2, false);
assertThat(result.dbRefEqualsAndHashcodeObjectMethodOverride2, is(notNullValue()));
result.dbRefEqualsAndHashcodeObjectMethodOverride2.hashCode();
assertProxyIsResolved(result.dbRefEqualsAndHashcodeObjectMethodOverride2, true);
}
private Object transport(Object result) {
return SerializationUtils.deserialize(SerializationUtils.serialize(result));
}
@@ -345,6 +491,7 @@ public class DbRefMappingMongoConverterUnitTests {
}
}
@SuppressWarnings("serial")
static class LazyDbRefTargetWithPeristenceConstructor extends LazyDbRefTarget {
boolean persistenceConstructorCalled;
@@ -362,6 +509,7 @@ public class DbRefMappingMongoConverterUnitTests {
}
}
@SuppressWarnings("serial")
static class LazyDbRefTargetWithPeristenceConstructorWithoutDefaultConstructor extends LazyDbRefTarget {
boolean persistenceConstructorCalled;
@@ -387,4 +535,74 @@ public class DbRefMappingMongoConverterUnitTests {
private static final long serialVersionUID = 1L;
}
static class ToStringObjectMethodOverrideLazyDbRefTarget extends LazyDbRefTarget {
private static final long serialVersionUID = 1L;
public ToStringObjectMethodOverrideLazyDbRefTarget() {}
public ToStringObjectMethodOverrideLazyDbRefTarget(String id, String value) {
super(id, value);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return this.id + ":" + this.value;
}
}
static class EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget extends LazyDbRefTarget {
private static final long serialVersionUID = 1L;
public EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget() {}
public EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget(String id, String value) {
super(id, value);
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ((id == null) ? 0 : id.hashCode());
result = prime * result + ((value == null) ? 0 : value.hashCode());
return result;
}
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget other = (EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget) obj;
if (id == null) {
if (other.id != null)
return false;
} else if (!id.equals(other.id))
return false;
if (value == null) {
if (other.value != null)
return false;
} else if (!value.equals(other.value))
return false;
return true;
}
}
static class WithObjectMethodOverrideLazyDbRefs {
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) LazyDbRefTarget dbRefToPlainObject;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) ToStringObjectMethodOverrideLazyDbRefTarget dbRefToToStringObjectMethodOverride;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget dbRefEqualsAndHashcodeObjectMethodOverride2;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) EqualsAndHashCodeObjectMethodOverrideLazyDbRefTarget dbRefEqualsAndHashcodeObjectMethodOverride1;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import static org.junit.Assert.*;
import org.springframework.aop.framework.Advised;
import org.springframework.cglib.proxy.Factory;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;
import org.springframework.test.util.ReflectionTestUtils;
/**
* Utility class to test proxy handling for lazy loading.
@@ -39,8 +40,8 @@ public class LazyLoadingTestUtils {
public static void assertProxyIsResolved(Object target, boolean expected) {
LazyLoadingInterceptor interceptor = extractInterceptor(target);
assertThat(interceptor.isResolved(), is(expected));
assertThat(interceptor.getResult(), is(expected ? notNullValue() : nullValue()));
assertThat(ReflectionTestUtils.getField(interceptor, "resolved"), is((Object) expected));
assertThat(ReflectionTestUtils.getField(interceptor, "result"), is(expected ? notNullValue() : nullValue()));
}
private static LazyLoadingInterceptor extractInterceptor(Object proxy) {

View File

@@ -28,6 +28,8 @@ import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Date;
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
@@ -35,6 +37,7 @@ import java.util.Locale;
import java.util.Map;
import java.util.Set;
import java.util.SortedMap;
import java.util.TreeMap;
import org.bson.types.ObjectId;
import org.hamcrest.Matcher;
@@ -1387,6 +1390,7 @@ public class MappingMongoConverterUnitTests {
/**
* @see DATAMONGO-812
* @see DATAMONGO-893
*/
@Test
public void convertsListToBasicDBListAndRetainsTypeInformationForComplexObjects() {
@@ -1396,7 +1400,7 @@ public class MappingMongoConverterUnitTests {
address.street = "Foo";
Object result = converter.convertToMongoType(Collections.singletonList(address),
ClassTypeInformation.from(Address.class));
ClassTypeInformation.from(InterfaceType.class));
assertThat(result, is(instanceOf(BasicDBList.class)));
@@ -1454,6 +1458,76 @@ public class MappingMongoConverterUnitTests {
assertThat(dbList.get(0), instanceOf(String.class));
}
/**
* @see DATAMONGO-833
*/
@Test
public void readsEnumSetCorrectly() {
BasicDBList enumSet = new BasicDBList();
enumSet.add("SECOND");
DBObject dbObject = new BasicDBObject("enumSet", enumSet);
ClassWithEnumProperty result = converter.read(ClassWithEnumProperty.class, dbObject);
assertThat(result.enumSet, is(instanceOf(EnumSet.class)));
assertThat(result.enumSet.size(), is(1));
assertThat(result.enumSet, hasItem(SampleEnum.SECOND));
}
/**
* @see DATAMONGO-833
*/
@Test
public void readsEnumMapCorrectly() {
BasicDBObject enumMap = new BasicDBObject("FIRST", "Dave");
ClassWithEnumProperty result = converter.read(ClassWithEnumProperty.class, new BasicDBObject("enumMap", enumMap));
assertThat(result.enumMap, is(instanceOf(EnumMap.class)));
assertThat(result.enumMap.size(), is(1));
assertThat(result.enumMap.get(SampleEnum.FIRST), is("Dave"));
}
/**
* @see DATAMONGO-887
*/
@Test
public void readsTreeMapCorrectly() {
DBObject person = new BasicDBObject("foo", "Dave");
DBObject treeMapOfPerson = new BasicDBObject("key", person);
DBObject document = new BasicDBObject("treeMapOfPersons", treeMapOfPerson);
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, document);
assertThat(result.treeMapOfPersons, is(notNullValue()));
assertThat(result.treeMapOfPersons.get("key"), is(notNullValue()));
assertThat(result.treeMapOfPersons.get("key").firstname, is("Dave"));
}
/**
* @see DATAMONGO-887
*/
@Test
public void writesTreeMapCorrectly() {
Person person = new Person();
person.firstname = "Dave";
ClassWithMapProperty source = new ClassWithMapProperty();
source.treeMapOfPersons = new TreeMap<String, Person>();
source.treeMapOfPersons.put("key", person);
DBObject result = new BasicDBObject();
converter.write(source, result);
DBObject map = getAsDBObject(result, "treeMapOfPersons");
DBObject entry = getAsDBObject(map, "key");
assertThat(entry.get("foo"), is((Object) "Dave"));
}
static class GenericType<T> {
T content;
}
@@ -1462,6 +1536,8 @@ public class MappingMongoConverterUnitTests {
SampleEnum sampleEnum;
List<SampleEnum> enums;
EnumSet<SampleEnum> enumSet;
EnumMap<SampleEnum, String> enumMap;
}
static enum SampleEnum {
@@ -1479,7 +1555,11 @@ public class MappingMongoConverterUnitTests {
abstract void method();
}
static class Address {
static interface InterfaceType {
}
static class Address implements InterfaceType {
String street;
String city;
}
@@ -1519,6 +1599,7 @@ public class MappingMongoConverterUnitTests {
Map<String, Object> mapOfObjects;
Map<String, String[]> mapOfStrings;
Map<String, Person> mapOfPersons;
TreeMap<String, Person> treeMapOfPersons;
}
static class ClassWithNestedMaps {

View File

@@ -39,6 +39,7 @@ import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
@@ -57,6 +58,7 @@ import com.mongodb.QueryBuilder;
* @author Oliver Gierke
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryMapperUnitTests {
@@ -501,10 +503,108 @@ public class QueryMapperUnitTests {
assertThat(idValuesAfter, is(idValuesBefore));
}
/**
* @see DATAMONGO-821
*/
@Test
public void queryMapperShouldNotTryToMapDBRefListPropertyIfNestedInsideDBObjectWithinDBObject() {
DBObject queryObject = query(
where("referenceList").is(new BasicDBObject("$nested", new BasicDBObject("$keys", 0L)))).getQueryObject();
DBObject mappedObject = mapper.getMappedObject(queryObject, context.getPersistentEntity(WithDBRefList.class));
DBObject referenceObject = getAsDBObject(mappedObject, "referenceList");
DBObject nestedObject = getAsDBObject(referenceObject, "$nested");
assertThat(nestedObject, is((DBObject) new BasicDBObject("$keys", 0L)));
}
/**
* @see DATAMONGO-821
*/
@Test
public void queryMapperShouldNotTryToMapDBRefPropertyIfNestedInsideDBObjectWithinDBObject() {
DBObject queryObject = query(where("reference").is(new BasicDBObject("$nested", new BasicDBObject("$keys", 0L))))
.getQueryObject();
DBObject mappedObject = mapper.getMappedObject(queryObject, context.getPersistentEntity(WithDBRef.class));
DBObject referenceObject = getAsDBObject(mappedObject, "reference");
DBObject nestedObject = getAsDBObject(referenceObject, "$nested");
assertThat(nestedObject, is((DBObject) new BasicDBObject("$keys", 0L)));
}
/**
* @see DATAMONGO-821
*/
@Test
public void queryMapperShouldMapDBRefPropertyIfNestedInDBObject() {
Reference sample = new Reference();
sample.id = 321L;
DBObject queryObject = query(where("reference").is(new BasicDBObject("$in", Arrays.asList(sample))))
.getQueryObject();
DBObject mappedObject = mapper.getMappedObject(queryObject, context.getPersistentEntity(WithDBRef.class));
DBObject referenceObject = getAsDBObject(mappedObject, "reference");
BasicDBList inObject = getAsDBList(referenceObject, "$in");
assertThat(inObject.get(0), is(instanceOf(com.mongodb.DBRef.class)));
}
/**
* @see DATAMONGO-773
*/
@Test
public void queryMapperShouldBeAbleToProcessQueriesThatIncludeDbRefFields() {
BasicMongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(WithDBRef.class);
Query qry = query(where("someString").is("abc"));
qry.fields().include("reference");
DBObject mappedFields = mapper.getMappedObject(qry.getFieldsObject(), persistentEntity);
assertThat(mappedFields, is(notNullValue()));
}
/**
* @see DATAMONGO-893
*/
@Test
public void classInformationShouldNotBePresentInDBObjectUsedInFinderMethods() {
EmbeddedClass embedded = new EmbeddedClass();
embedded.id = "1";
EmbeddedClass embedded2 = new EmbeddedClass();
embedded2.id = "2";
Query query = query(where("embedded").in(Arrays.asList(embedded, embedded2)));
DBObject dbo = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Foo.class));
assertThat(dbo.toString(), equalTo("{ \"embedded\" : { \"$in\" : [ { \"_id\" : \"1\"} , { \"_id\" : \"2\"}]}}"));
}
@Document
public class Foo {
@Id private ObjectId id;
EmbeddedClass embedded;
}
public class EmbeddedClass {
public String id;
}
class IdWrapper {
Object id;
}
class ClassWithEmbedded {
@Id String id;
Sample sample;
}
class ClassWithDefaultId {
String id;
@@ -541,6 +641,12 @@ public class QueryMapperUnitTests {
@DBRef Reference reference;
}
class WithDBRefList {
String someString;
@DBRef List<Reference> referenceList;
}
class Reference {
Long id;

View File

@@ -25,6 +25,8 @@ import java.util.Arrays;
import java.util.List;
import org.hamcrest.Matcher;
import org.hamcrest.collection.IsIterableContainingInOrder;
import org.hamcrest.core.IsEqual;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -36,6 +38,7 @@ import org.springframework.data.annotation.Id;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Update;
@@ -376,6 +379,117 @@ public class UpdateMapperUnitTests {
assertThat(setClause.get("dbRefProperty"), is((Object) new DBRef(null, "entity", entity.id)));
}
/**
* @see DATAMONGO-862
*/
@Test
public void rendersUpdateAndPreservesKeyForPathsNotPointingToProperty() {
Update update = new Update().set("listOfInterface.$.value", "expected-value");
DBObject mappedObject = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ParentClass.class));
DBObject setClause = getAsDBObject(mappedObject, "$set");
assertThat(setClause.containsField("listOfInterface.$.value"), is(true));
}
/**
* @see DATAMONGO-863
*/
@Test
public void doesNotConvertRawDbObjects() {
Update update = new Update();
update.pull("options",
new BasicDBObject("_id", new BasicDBObject("$in", converter.convertToMongoType(Arrays.asList(1L, 2L)))));
DBObject mappedObject = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ParentClass.class));
DBObject setClause = getAsDBObject(mappedObject, "$pull");
DBObject options = getAsDBObject(setClause, "options");
DBObject idClause = getAsDBObject(options, "_id");
BasicDBList inClause = getAsDBList(idClause, "$in");
assertThat(inClause, IsIterableContainingInOrder.<Object> contains(1L, 2L));
}
/**
* @see DATAMONGO-897
*/
@Test
public void updateOnDbrefPropertyOfInterfaceTypeWithoutExplicitGetterForIdShouldBeMappedCorrectly() {
Update update = new Update().set("referencedDocument", new InterfaceDocumentDefinitionImpl("1", "Foo"));
DBObject mappedObject = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithReferenceToInterfaceImpl.class));
DBObject $set = DBObjectTestUtils.getAsDBObject(mappedObject, "$set");
Object model = $set.get("referencedDocument");
DBRef expectedDBRef = new DBRef(factory.getDb(), "interfaceDocumentDefinitionImpl", "1");
assertThat(model, allOf(instanceOf(DBRef.class), IsEqual.<Object> equalTo(expectedDBRef)));
}
@org.springframework.data.mongodb.core.mapping.Document(collection = "DocumentWithReferenceToInterface")
static interface DocumentWithReferenceToInterface {
String getId();
InterfaceDocumentDefinitionWithoutId getReferencedDocument();
}
static interface InterfaceDocumentDefinitionWithoutId {
String getValue();
}
static class InterfaceDocumentDefinitionImpl implements InterfaceDocumentDefinitionWithoutId {
@Id String id;
String value;
public InterfaceDocumentDefinitionImpl(String id, String value) {
this.id = id;
this.value = value;
}
@Override
public String getValue() {
return this.value;
}
}
static class DocumentWithReferenceToInterfaceImpl implements DocumentWithReferenceToInterface {
private @Id String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
private InterfaceDocumentDefinitionWithoutId referencedDocument;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public void setModel(InterfaceDocumentDefinitionWithoutId referencedDocument) {
this.referencedDocument = referencedDocument;
}
@Override
public InterfaceDocumentDefinitionWithoutId getReferencedDocument() {
return this.referencedDocument;
}
}
static interface Model {}
static class ModelImpl implements Model {
@@ -384,6 +498,7 @@ public class UpdateMapperUnitTests {
public ModelImpl(int value) {
this.value = value;
}
}
public class ModelWrapper {
@@ -411,6 +526,9 @@ public class UpdateMapperUnitTests {
@Field("aliased")//
List<? extends AbstractChildClass> list;
@Field//
List<Model> listOfInterface;
public ParentClass(String id, List<? extends AbstractChildClass> list) {
this.id = id;
this.list = list;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import static org.junit.Assert.*;
import org.junit.Test;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.Metric;
import org.springframework.data.mongodb.core.geo.Metrics;
@@ -31,6 +32,7 @@ import org.springframework.data.mongodb.core.geo.Point;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class NearQueryUnitTests {
@@ -123,4 +125,36 @@ public class NearQueryUnitTests {
assertThat(query.getSkip(), is(pageable.getPageNumber() * pageable.getPageSize()));
assertThat((Integer) query.toDBObject().get("num"), is((pageable.getPageNumber() + 1) * pageable.getPageSize()));
}
/**
* @see DATAMONGO-829
*/
@Test
public void nearQueryShouldInoreZeroLimitFromQuery() {
NearQuery query = NearQuery.near(new Point(1, 2)).query(Query.query(Criteria.where("foo").is("bar")));
assertThat(query.toDBObject().get("num"), nullValue());
}
/**
* @see DATAMONGO-829
*/
@Test(expected = IllegalArgumentException.class)
public void nearQueryShouldThrowExceptionWhenGivenANullQuery() {
NearQuery.near(new Point(1, 2)).query(null);
}
/**
* @see DATAMONGO-829
*/
@Test
public void numShouldNotBeAlteredByQueryWithoutPageable() {
int num = 100;
NearQuery query = NearQuery.near(new Point(1, 2));
query.num(num);
query.query(Query.query(Criteria.where("foo").is("bar")));
assertThat(DBObjectTestUtils.getTypedValue(query.toDBObject(), "num", Integer.class), is(num));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -50,6 +50,7 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class AbstractPersonRepositoryIntegrationTests {
@@ -738,4 +739,53 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(result.size(), is(1));
assertThat(result.get(0), is(dave));
}
/**
* @see DATAMONGO-871
*/
@Test
public void findsPersonsByFirstnameAsArray() {
Person[] result = repository.findByThePersonsFirstnameAsArray("Leroi");
assertThat(result, is(arrayWithSize(1)));
assertThat(result, is(arrayContaining(leroi)));
}
/**
* @see DATAMONGO-821
*/
@Test
public void findUsingAnnotatedQueryOnDBRef() {
operations.remove(new org.springframework.data.mongodb.core.query.Query(), User.class);
User user = new User();
user.username = "Terria";
operations.save(user);
alicia.creator = user;
repository.save(alicia);
Page<Person> result = repository.findByHavingCreator(new PageRequest(0, 100));
assertThat(result.getNumberOfElements(), is(1));
assertThat(result.getContent().get(0), is(alicia));
}
/**
* @see DATAMONGO-893
*/
@Test
public void findByNestedPropertyInCollectionShouldFindMatchingDocuments() {
Person p = new Person("Mary", "Poppins");
Address adr = new Address("some", "2", "where");
p.setAddress(adr);
repository.save(p);
Page<Person> result = repository.findByAddressIn(Arrays.asList(adr), new PageRequest(0, 10));
assertThat(result.getContent(), hasSize(1));
}
}

View File

@@ -37,6 +37,7 @@ import org.springframework.data.querydsl.QueryDslPredicateExecutor;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public interface PersonRepository extends MongoRepository<Person, String>, QueryDslPredicateExecutor<Person> {
@@ -70,6 +71,12 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
@Query(value = "{ 'firstname' : ?0 }", fields = "{ 'firstname': 1, 'lastname': 1}")
List<Person> findByThePersonsFirstname(String firstname);
/**
* @see DATAMONGO-871
*/
@Query(value = "{ 'firstname' : ?0 }")
Person[] findByThePersonsFirstnameAsArray(String firstname);
/**
* Returns all {@link Person}s with a firstname matching the given one (*-wildcard supported).
*
@@ -245,4 +252,14 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
List<Person> findByFirstnameContainingIgnoreCase(String firstName);
/**
* @see DATAMONGO-821
*/
@Query("{ creator : { $exists : true } }")
Page<Person> findByHavingCreator(Pageable page);
/**
* @see DATAMONGO-893
*/
Page<Person> findByAddressIn(List<Address> address, Pageable page);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -45,6 +45,7 @@ import com.mongodb.DBObject;
* Unit tests for {@link StringBasedMongoQuery}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class StringBasedMongoQueryUnitTests {
@@ -126,6 +127,19 @@ public class StringBasedMongoQueryUnitTests {
assertThat(query.getQueryObject().get("address"), is(nullValue()));
}
/**
* @see DATAMONGO-821
*/
@Test
public void bindsDbrefCorrectly() throws Exception {
StringBasedMongoQuery mongoQuery = createQueryForMethod("findByHavingSizeFansNotZero");
ConvertingParameterAccessor accessor = StubParameterAccessor.getAccessor(converter, new Object[] {});
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accessor);
assertThat(query.getQueryObject(), is(new BasicQuery("{ fans : { $not : { $size : 0 } } }").getQueryObject()));
}
private StringBasedMongoQuery createQueryForMethod(String name, Class<?>... parameters) throws Exception {
Method method = SampleRepository.class.getMethod(name, parameters);
@@ -143,5 +157,9 @@ public class StringBasedMongoQueryUnitTests {
@Query("{ 'lastname' : ?0, 'address' : ?1 }")
Person findByLastnameAndAddress(String lastname, Address address);
@Query("{ fans : { $not : { $size : 0 } } }")
Person findByHavingSizeFansNotZero();
}
}

View File

@@ -0,0 +1,16 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="factory" />
<bean class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<mongo:mapping-converter />
</constructor-arg>
</bean>
</beans>
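The file above is the new fixture for the DATAMONGO-892 check: the <mongo:mapping-converter /> element is nested inside the MongoTemplate bean definition and is therefore rejected with "Mongo Converter must not be defined as nested bean." For comparison, a minimal sketch of a configuration that passes the check declares the converter as a top-level element and wires it by reference; the db-factory-ref attribute and the MongoTemplate constructor wiring are assumptions based on the namespace features exercised elsewhere in this changeset (e.g. allowsDbFactoryRefAttribute, DATAMONGO-243), not part of the diff itself.
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns:mongo="http://www.springframework.org/schema/data/mongo"
	xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
		http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
	<!-- Factory and converter are siblings at the top level, not nested bean definitions. -->
	<mongo:db-factory id="factory" />
	<mongo:mapping-converter id="converter" db-factory-ref="factory" />
	<!-- The template references both beans instead of declaring the converter inline. -->
	<bean class="org.springframework.data.mongodb.core.MongoTemplate">
		<constructor-arg ref="factory" />
		<constructor-arg ref="converter" />
	</bean>
</beans>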

View File

@@ -68,7 +68,7 @@
<xi:include href="introduction/introduction.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.7.0.RELEASE/src/docbkx/repositories.xml">
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.7.2.RELEASE/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -88,10 +88,10 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.7.0.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.7.2.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.7.0.RELEASE/src/docbkx/repository-query-keywords-reference.xml">
<xi:include href="https://raw.github.com/spring-projects/spring-data-commons/1.7.2.RELEASE/src/docbkx/repository-query-keywords-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-query-keywords-reference.xml" />
</xi:include>
</part>

View File

@@ -107,7 +107,7 @@
&lt;dependency&gt;
&lt;groupId&gt;org.springframework.data&lt;/groupId&gt;
&lt;artifactId&gt;spring-data-mongodb&lt;/artifactId&gt;
&lt;version&gt;1.3.4.RELEASE&lt;/version&gt;
&lt;version&gt;1.4.2.RELEASE&lt;/version&gt;
&lt;/dependency&gt;
&lt;/dependencies&gt;</programlisting>

View File

@@ -1,9 +1,96 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.4.0.RELEASE (2014-02-24)
Changes in version 1.4.2.RELEASE (2014-04-15)
---------------------------------------------
** Fix
* [DATAMONGO-880] - Improved handling of persistence of lazy-loaded DBRefs.
* [DATAMONGO-884] - Improved handling for Object methods in LazyLoadingInterceptor.
* [DATAMONGO-887] - Added unit tests to verify TreeMaps can be converted.
* [DATAMONGO-888] - Sorting now considers mapping information.
* [DATAMONGO-890] - Fixed Point.toString().
* [DATAMONGO-892] - Reject nested MappingMongoConverter declarations in XML.
* [DATAMONGO-893] - Converter must not write "_class" information for known types.
* [DATAMONGO-897] - Fixed potential NullPointerException in QueryMapper.
* [DATAMONGO-908] - Support for nested field references in group operations.
** Improvement
* [DATAMONGO-881] - Allow custom conversions to override default conversions.
** Task
* [DATAMONGO-895] - Use most specific type for checks against values in DBObjects.
* [DATAMONGO-896] - Assert compatibility with latest MongoDB Java driver.
* [DATAMONGO-905] - Removed obsolete dependency to CGLib from cross-store support.
* [DATAMONGO-907] - Assert compatibility with mongodb 2.6.
* [DATAMONGO-911] - Release 1.4.2
Changes in version 1.5.0.M1 (2014-03-31)
----------------------------------------
** Fix
* [DATAMONGO-471] - Update operation $addToSet does not support adding a list with $each.
* [DATAMONGO-773] - Spring Data MongoDB projection search on @DBref fields.
* [DATAMONGO-821] - MappingException for $size queries on subcollections containing dbrefs.
* [DATAMONGO-829] - NearQuery, when used in conjunction with a Query, sets num=0 unless the Query specifies otherwise.
* [DATAMONGO-833] - EnumSet is not handled correctly.
* [DATAMONGO-843] - Unable to use @EnableMongoAuditing annotation in Java config.
* [DATAMONGO-862] - Update Array Field Using Positional Operator ($) Does Not Work.
* [DATAMONGO-863] - QueryMapper.getMappedValue Fails To Handle Arrays Mapped To $in.
* [DATAMONGO-868] - findAndModify method does not increment @Version field.
* [DATAMONGO-871] - Declarative query method with array return type causes NPE.
* [DATAMONGO-877] - AbstractMongoConfiguration.getMappingBasePackage() throws NullPointerException if config class resides in default package.
* [DATAMONGO-880] - Error when trying to persist an object containing a DBRef which was lazy loaded.
* [DATAMONGO-884] - Potential NullPointerException for lazy DBRefs.
* [DATAMONGO-887] - Repository not instantiated when entity contains field of type TreeMap.
* [DATAMONGO-890] - Point class toString method is confusing.
** Improvement
* [DATAMONGO-809] - Make filename optional in GridFsOperations doc and GridFsTemplate implementation.
* [DATAMONGO-858] - Add support for common geospatial structures.
* [DATAMONGO-865] - Adjust test dependencies to avoid ClassNotFoundException during test runs.
* [DATAMONGO-881] - Cannot override default converters in CustomConversions.
* [DATAMONGO-882] - Adapt to removal of obsolete generics in BeanWrapper.
** New Feature
* [DATAMONGO-566] - Provide support for removeBy… / deleteBy… methods like for findBy… on repository interfaces.
* [DATAMONGO-870] - Add support for sliced query method execution.
** Task
* [DATAMONGO-876] - Adapt to changes introduced for property access configuration.
* [DATAMONGO-883] - Update auditing configuration to enable auditing annotations on accessors.
* [DATAMONGO-859] - Release 1.5 M1.
Changes in version 1.4.1.RELEASE (2014-03-13)
---------------------------------------------
** Fix
* [DATAMONGO-773] - Verify that @DBRef fields can be included in query.
* [DATAMONGO-821] - Fixed handling of keyword expressions for DBRefs.
* [DATAMONGO-829] - NearQuery should not default 'num' to zero.
* [DATAMONGO-833] - Support for EnumSet and EnumMap in MappingMongoConverter.
* [DATAMONGO-843] - Back-port of defaulting of the MappingContext for auditing.
* [DATAMONGO-862] - Fixed handling of unmapped paths for updates.
* [DATAMONGO-863] - UpdateMapper doesn't convert raw DBObjects anymore.
* [DATAMONGO-868] - MongoTemplate.findAndModify(…) increases version if not handled manually.
* [DATAMONGO-871] - Add support for arrays as query method return types.
* [DATAMONGO-877] - Added guard against null-package in AbstractMongoConfiguration.
** Improvement
* [DATAMONGO-865] - Adjust test dependencies to avoid ClassNotFoundException during test runs.
Changes in version 1.3.5.RELEASE (2014-03-10)
---------------------------------------------
** Fix
* [DATAMONGO-829] - NearQuery, when used in conjunction with a Query, no longer sets num=0, unless Query specifies otherwise.
* [DATAMONGO-871] - Repository queries support array return type.
** Improvement
* [DATAMONGO-865] - Avoid ClassNotFoundException during test runs.
Changes in version 1.4.0.RELEASE (2014-02-24)
---------------------------------------------
** Fix
* [DATAMONGO-354] - MongoTemplate should support multiple $pushAll in one update.
* [DATAMONGO-404] - Removing a DBRef using pull does not work.
@@ -23,7 +110,7 @@ Changes in version 1.4.0.RELEASE (2014-02-24)
* [DATAMONGO-848] - Ensure compatibility with Mongo Java driver 2.12.
* [DATAMONGO-853] - Update no longer allows null keys.
* [DATAMONGO-856] - Update documentation.
Changes in version 1.3.4.RELEASE (2014-02-17)
---------------------------------------------
** Bug

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 1.4.0.RELEASE
Spring Data MongoDB 1.4.2.RELEASE
Copyright (c) [2010-2014] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").

View File

@@ -1,4 +1,4 @@
SPRING DATA MongoDB 1.4.0.RELEASE
Spring Data MongoDB 1.4.2.RELEASE
---------------------------------
Spring Data MongoDB is released under the terms of the Apache Software License Version 2.0 (see license.txt).