Compare commits


19 Commits

Author SHA1 Message Date
Mark Paluch
479dc3a0d6 DATAMONGO-1755 - Release version 1.10.7 (Ingalls SR7). 2017-09-11 11:45:05 +02:00
Mark Paluch
59ebbd3d35 DATAMONGO-1755 - Prepare 1.10.7 (Ingalls SR7). 2017-09-11 11:44:20 +02:00
Mark Paluch
38556f522f DATAMONGO-1755 - Updated changelog. 2017-09-11 11:44:15 +02:00
Christoph Strobl
166304849a DATAMONGO-1772 - Fix UpdateMapper type key rendering for abstract list elements contained in concrete typed ones.
Original pull request: #497.
2017-09-05 10:58:55 +02:00
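The rule behind this fix can be sketched in plain Java: a `_class` type hint must be written whenever the declared element type alone cannot identify the concrete value being stored. This is a minimal stdlib sketch under that assumption — `needsTypeHint` and the nested types are illustrative stand-ins, not the actual `UpdateMapper` API:

```java
import java.lang.reflect.Modifier;

// Hypothetical sketch of the type-hint rule: a _class hint is required
// whenever the declared element type cannot identify the concrete runtime
// type on its own (interface, abstract class, or a differing subtype).
public class TypeHintRule {

    interface SomeInterfaceType {}
    static abstract class SomeAbstractType {}
    static class SomeInterfaceImpl extends SomeAbstractType implements SomeInterfaceType {}

    // True if reading the value back would be ambiguous without a hint.
    static boolean needsTypeHint(Class<?> declared, Class<?> actual) {
        return declared.isInterface()
                || Modifier.isAbstract(declared.getModifiers())
                || !declared.equals(actual);
    }

    public static void main(String[] args) {
        // Interface-typed list element: hint required.
        System.out.println(needsTypeHint(SomeInterfaceType.class, SomeInterfaceImpl.class));
        // Abstract-typed list element: hint required.
        System.out.println(needsTypeHint(SomeAbstractType.class, SomeInterfaceImpl.class));
        // Concrete declared type matching the value: no hint needed.
        System.out.println(needsTypeHint(SomeInterfaceImpl.class, SomeInterfaceImpl.class));
    }
}
```

The bug addressed here was that list elements nested inside a concretely typed element were not run through this check, so the hint was missing for abstract element types.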
Mark Paluch
f71b38b731 DATAMONGO-1768 - Polishing.
Extend javadocs. Make methods static/reorder methods where possible. Formatting.

Original pull request: #496.
2017-08-25 10:48:42 +02:00
Christoph Strobl
c4af78d81d DATAMONGO-1768 - Allow ignoring type restriction when issuing QBE.
We now allow removing the type restriction inferred by the QBE mapping via an ignored path expression on the ExampleMatcher. This makes it possible to create untyped QBE queries that return all entities matching the query, without limiting the result to types assignable to the probe itself.

Original pull request: #496.
2017-08-25 10:41:37 +02:00
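The behaviour can be sketched with plain maps standing in for `DBObject`: when the matcher's ignored paths contain the type key, the type restriction is stripped from the mapped query; otherwise it is written as usual. This is a hedged sketch of the idea, not the `MongoExampleMapper` implementation — `applyTypeRestriction` is a hypothetical helper:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Sketch of the DATAMONGO-1768 behaviour: ignoring the type key on the
// ExampleMatcher turns a typed QBE query into an untyped one.
public class UntypedQbeSketch {

    static final String TYPE_KEY = "_class"; // Spring Data's default type key

    static Map<String, Object> applyTypeRestriction(Map<String, Object> query,
                                                    Set<String> ignoredPaths,
                                                    String typeRestriction) {
        Map<String, Object> result = new LinkedHashMap<>(query);
        if (ignoredPaths.contains(TYPE_KEY)) {
            result.remove(TYPE_KEY); // untyped query: match any entity shape
        } else {
            result.put(TYPE_KEY, typeRestriction); // restrict to probe type
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> query = new LinkedHashMap<>();
        query.put("lastname", "stark");

        // Default: the probe type restricts the result.
        System.out.println(applyTypeRestriction(query, Set.of(), "Person"));
        // With "_class" ignored: any entity with a matching field qualifies.
        System.out.println(applyTypeRestriction(query, Set.of("_class"), "Person"));
    }
}
```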
Oliver Gierke
a281ec83b5 DATAMONGO-1765 - Polishing.
Formatting.
2017-08-07 17:35:11 +02:00
Oliver Gierke
407087b3a7 DATAMONGO-1765 - DefaultDbRefResolver now maps duplicate references correctly.
On bulk resolution of a DBRef array we now map the resulting documents back to their ids, to make sure that recurring identifiers are mapped to the corresponding documents.
2017-08-07 17:35:11 +02:00
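The core of the fix — keying fetched documents by `_id` and then replaying the requested id list — can be sketched with stdlib collections in place of driver types. This is an illustrative sketch only; `resolveInOrder` is a hypothetical helper, not the `DefaultDbRefResolver` API:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the DATAMONGO-1765 fix: the driver returns each matching
// document once per distinct id, so duplicates must be restored by
// replaying the requested id list against an id-keyed map.
public class BulkDbRefSketch {

    static List<Map<String, Object>> resolveInOrder(List<Object> requestedIds,
                                                    List<Map<String, Object>> fetched) {
        // Index the fetched documents by their _id.
        Map<Object, Map<String, Object>> byId = new HashMap<>();
        for (Map<String, Object> document : fetched) {
            byId.put(document.get("_id"), document);
        }
        // Replay the requested ids to restore both order and duplicates.
        List<Map<String, Object>> result = new ArrayList<>(requestedIds.size());
        for (Object id : requestedIds) {
            result.add(byId.get(id));
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = Map.of("_id", 1, "name", "a");
        // The same reference requested twice resolves to the document twice.
        List<Map<String, Object>> result = resolveInOrder(List.of(1, 1), List.of(doc));
        System.out.println(result.size());
        System.out.println(result.get(0) == result.get(1));
    }
}
```

The previous sort-by-position approach could not produce a document twice, which is why duplicate references came back misaligned.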
Mark Paluch
90411decce DATAMONGO-1756 - Polishing.
Add author tag.

Original pull request: #491.
2017-08-02 08:52:45 +02:00
Christoph Strobl
71135395c1 DATAMONGO-1756 - Fix nested field name resolution for arithmetic aggregation ops.
Original pull request: #491.
2017-08-02 08:52:45 +02:00
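The essence of the fix is that each segment of a dotted property path must be resolved through its mapped (possibly `@Field`-aliased) name, rather than rendering the raw property path. A minimal sketch under that assumption — the alias table and `render` helper are hypothetical, standing in for the aggregation context's field lookup:

```java
import java.util.Map;

// Sketch of nested field resolution (DATAMONGO-1756): each path segment is
// mapped through its Mongo field name, so aliases inside nested types are
// honoured when rendering aggregation expressions.
public class NestedFieldSketch {

    // Hypothetical alias table standing in for @Field-annotated properties.
    static final Map<String, String> ALIASES = Map.of(
            "nested2", "field2",
            "value2", "nestedValue2");

    static String render(String propertyPath) {
        StringBuilder mapped = new StringBuilder("$");
        for (String segment : propertyPath.split("\\.")) {
            if (mapped.length() > 1) {
                mapped.append('.');
            }
            // Substitute the mapped field name for each segment.
            mapped.append(ALIASES.getOrDefault(segment, segment));
        }
        return mapped.toString();
    }

    public static void main(String[] args) {
        System.out.println(render("nested1.value1"));
        System.out.println(render("nested2.value2"));
    }
}
```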
Oliver Gierke
9c43ece3a7 DATAMONGO-1750 - After release cleanups. 2017-07-27 00:21:39 +02:00
Oliver Gierke
283bfce2fe DATAMONGO-1750 - Prepare next development iteration. 2017-07-27 00:15:08 +02:00
Oliver Gierke
42cc6ff37f DATAMONGO-1750 - Release version 1.10.6 (Ingalls SR6). 2017-07-26 23:47:20 +02:00
Oliver Gierke
9ded78b13c DATAMONGO-1750 - Prepare 1.10.6 (Ingalls SR6). 2017-07-26 23:45:52 +02:00
Oliver Gierke
b0842a89fd DATAMONGO-1750 - Updated changelog. 2017-07-26 23:45:43 +02:00
Oliver Gierke
5a9eef7c96 DATAMONGO-1751 - Updated changelog. 2017-07-25 16:15:49 +02:00
Oliver Gierke
77425736e9 DATAMONGO-1717 - Updated changelog. 2017-07-25 10:04:03 +02:00
Oliver Gierke
6aa8f84428 DATAMONGO-1711 - After release cleanups. 2017-07-24 19:25:08 +02:00
Oliver Gierke
b83f2e9198 DATAMONGO-1711 - Prepare next development iteration. 2017-07-24 19:25:06 +02:00
18 changed files with 418 additions and 103 deletions

View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
-<version>1.9.5.RELEASE</version>
+<version>1.9.7.RELEASE</version>
</parent>
<modules>
@@ -28,7 +28,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
-<springdata.commons>1.13.5.RELEASE</springdata.commons>
+<springdata.commons>1.13.7.RELEASE</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<jmh.version>1.19</jmh.version>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
</dependency>
<dependency>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.10.5.RELEASE</version>
+<version>1.10.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1408,7 +1408,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
-result.add(context.getReference(getField().getName()).toString());
+result.add(context.getReference(getField()).toString());
for (Object element : values) {

View File

@@ -25,7 +25,9 @@ import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
@@ -46,7 +48,6 @@ import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
import com.mongodb.BasicDBObject;
-import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -62,6 +63,8 @@ import com.mongodb.DBRef;
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final String ID = "_id";
private final MongoDbFactory mongoDbFactory;
private final PersistenceExceptionTranslator exceptionTranslator;
private final ObjenesisStd objenesis;
@@ -144,10 +147,37 @@ public class DefaultDbRefResolver implements DbRefResolver {
ids.add(ref.getId());
}
Map<Object, DBObject> documentsById = getDocumentsById(ids, collection);
List<DBObject> result = new ArrayList<DBObject>(ids.size());
for (Object id : ids) {
result.add(documentsById.get(id));
}
return result;
}
/**
* Returns all documents with the given ids contained in the given collection mapped by their ids.
*
* @param ids must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
private Map<Object, DBObject> getDocumentsById(List<Object> ids, String collection) {
Assert.notNull(ids, "Ids must not be null!");
Assert.hasText(collection, "Collection must not be null or empty!");
DB db = mongoDbFactory.getDb();
-List<DBObject> result = db.getCollection(collection)
-.find(new BasicDBObjectBuilder().add("_id", new BasicDBObject("$in", ids)).get()).toArray();
-Collections.sort(result, new DbRefByReferencePositionComparator(ids));
+BasicDBObject query = new BasicDBObject(ID, new BasicDBObject("$in", ids));
+List<DBObject> documents = db.getCollection(collection).find(query).toArray();
Map<Object, DBObject> result = new HashMap<Object, DBObject>(documents.size());
for (DBObject document : documents) {
result.put(document.get(ID), document);
}
return result;
}
@@ -469,7 +499,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
*/
@Override
public int compare(DBObject o1, DBObject o2) {
-return Integer.compare(reference.indexOf(o1.get("_id")), reference.indexOf(o2.get("_id")));
+return Integer.compare(reference.indexOf(o1.get(ID)), reference.indexOf(o2.get(ID)));
}
}
}

View File

@@ -28,6 +28,7 @@ import java.util.Stack;
import java.util.regex.Pattern;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher;
import org.springframework.data.domain.ExampleMatcher.NullHandler;
import org.springframework.data.domain.ExampleMatcher.PropertyValueTransformer;
import org.springframework.data.domain.ExampleMatcher.StringMatcher;
@@ -41,6 +42,7 @@ import org.springframework.data.repository.core.support.ExampleMatcherAccessor;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
@@ -48,9 +50,13 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Mapper from {@link Example} to a query {@link DBObject}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.8
* @see Example
* @see org.springframework.data.domain.ExampleMatcher
*/
public class MongoExampleMapper {
@@ -58,6 +64,11 @@ public class MongoExampleMapper {
private final MongoConverter converter;
private final Map<StringMatcher, Type> stringMatcherPartMapping = new HashMap<StringMatcher, Type>();
/**
* Create a new {@link MongoTypeMapper} given {@link MongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoExampleMapper(MongoConverter converter) {
this.converter = converter;
@@ -99,8 +110,10 @@ public class MongoExampleMapper {
DBObject reference = (DBObject) converter.convertToMongoType(example.getProbe());
-if (entity.hasIdProperty() && entity.getIdentifierAccessor(example.getProbe()).getIdentifier() == null) {
-reference.removeField(entity.getIdProperty().getFieldName());
+if (entity.hasIdProperty() && ClassUtils.isAssignable(entity.getType(), example.getProbeType())) {
+if (entity.getIdentifierAccessor(example.getProbe()).getIdentifier() == null) {
+reference.removeField(entity.getIdProperty().getFieldName());
+}
}
ExampleMatcherAccessor matcherAccessor = new ExampleMatcherAccessor(example.getMatcher());
@@ -111,80 +124,7 @@ public class MongoExampleMapper {
: new BasicDBObject(SerializationUtils.flattenMap(reference));
DBObject result = example.getMatcher().isAllMatching() ? flattened : orConcatenate(flattened);
this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
return result;
}
private static DBObject orConcatenate(DBObject source) {
List<DBObject> foo = new ArrayList<DBObject>(source.keySet().size());
for (String key : source.keySet()) {
foo.add(new BasicDBObject(key, source.get(key)));
}
return new BasicDBObject("$or", foo);
}
private Set<Class<?>> getTypesToMatch(Example<?> example) {
Set<Class<?>> types = new HashSet<Class<?>>();
for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
if (example.getProbeType().isAssignableFrom(reference.getType())) {
types.add(reference.getType());
}
}
return types;
}
private String getMappedPropertyPath(String path, Class<?> probeType) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(probeType);
Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
List<String> resultParts = new ArrayList<String>();
while (parts.hasNext()) {
final String part = parts.next();
MongoPersistentProperty prop = entity.getPersistentProperty(part);
if (prop == null) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.getFieldName().equals(part)) {
stack.push(property);
}
}
});
if (stack.isEmpty()) {
return "";
}
prop = stack.pop();
}
resultParts.add(prop.getName());
if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
entity = mappingContext.getPersistentEntity(prop.getActualType());
} else {
break;
}
}
return StringUtils.collectionToDelimitedString(resultParts, ".");
return updateTypeRestrictions(result, example);
}
private void applyPropertySpecs(String path, DBObject source, Class<?> probeType,
@@ -246,7 +186,102 @@ public class MongoExampleMapper {
}
}
private boolean isEmptyIdProperty(Entry<String, Object> entry) {
private String getMappedPropertyPath(String path, Class<?> probeType) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(probeType);
Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
List<String> resultParts = new ArrayList<String>();
while (parts.hasNext()) {
final String part = parts.next();
MongoPersistentProperty prop = entity.getPersistentProperty(part);
if (prop == null) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.getFieldName().equals(part)) {
stack.push(property);
}
}
});
if (stack.isEmpty()) {
return "";
}
prop = stack.pop();
}
resultParts.add(prop.getName());
if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
entity = mappingContext.getPersistentEntity(prop.getActualType());
} else {
break;
}
}
return StringUtils.collectionToDelimitedString(resultParts, ".");
}
private DBObject updateTypeRestrictions(DBObject query, Example example) {
DBObject result = new BasicDBObject();
if (isTypeRestricting(example.getMatcher())) {
result.putAll(query);
this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
return result;
}
for (String key : query.keySet()) {
if (!this.converter.getTypeMapper().isTypeKey(key)) {
result.put(key, query.get(key));
}
}
return result;
}
private boolean isTypeRestricting(ExampleMatcher matcher) {
if (matcher.getIgnoredPaths().isEmpty()) {
return true;
}
for (String path : matcher.getIgnoredPaths()) {
if (this.converter.getTypeMapper().isTypeKey(path)) {
return false;
}
}
return true;
}
private Set<Class<?>> getTypesToMatch(Example<?> example) {
Set<Class<?>> types = new HashSet<Class<?>>();
for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
if (example.getProbeType().isAssignableFrom(reference.getType())) {
types.add(reference.getType());
}
}
return types;
}
private static boolean isEmptyIdProperty(Entry<String, Object> entry) {
return entry.getKey().equals("_id") && entry.getValue() == null;
}
@@ -272,4 +307,15 @@ public class MongoExampleMapper {
dbo.put("$options", "i");
}
}
private static DBObject orConcatenate(DBObject source) {
List<DBObject> or = new ArrayList<DBObject>(source.keySet().size());
for (String key : source.keySet()) {
or.add(new BasicDBObject(key, source.get(key)));
}
return new BasicDBObject("$or", or);
}
}

View File

@@ -1,5 +1,5 @@
/*
-* Copyright 2013-2016 the original author or authors.
+* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.util.Collection;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
@@ -161,6 +162,10 @@ public class UpdateMapper extends QueryMapper {
return info;
}
if (source instanceof Collection) {
return NESTED_DOCUMENT;
}
if (!type.equals(source.getClass())) {
return info;
}

View File

@@ -166,6 +166,32 @@ public class QueryByExampleTests {
assertThat(result, hasItems(p1, p2));
}
@Test // DATAMONGO-1768
public void typedExampleMatchesNothingIfTypesDoNotMatch() {
NotAPersonButStillMatchingFields probe = new NotAPersonButStillMatchingFields();
probe.lastname = "stark";
Query query = new Query(new Criteria().alike(Example.of(probe)));
List<Person> result = operations.find(query, Person.class);
assertThat(result, hasSize(0));
}
@Test // DATAMONGO-1768
public void untypedExampleMatchesCorrectly() {
NotAPersonButStillMatchingFields probe = new NotAPersonButStillMatchingFields();
probe.lastname = "stark";
Query query = new Query(
new Criteria().alike(Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_class"))));
List<Person> result = operations.find(query, Person.class);
assertThat(result, hasSize(2));
assertThat(result, hasItems(p1, p3));
}
@Document(collection = "dramatis-personae")
@EqualsAndHashCode
@ToString
@@ -175,4 +201,12 @@ public class QueryByExampleTests {
String firstname, middlename;
@Field("last_name") String lastname;
}
@EqualsAndHashCode
@ToString
static class NotAPersonButStillMatchingFields {
String firstname, middlename;
@Field("last_name") String lastname;
}
}

View File

@@ -40,7 +40,7 @@ import com.mongodb.util.JSON;
/**
* Unit tests for {@link Aggregation}.
*
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
@@ -564,6 +564,16 @@ public class AggregationUnitTests {
assertThat(getAsDBObject(fields, "foosum"), isBsonObject().containing("$first.$cond.else", "no-answer"));
}
@Test // DATAMONGO-1756
public void projectOperationShouldRenderNestedFieldNamesCorrectly() {
DBObject agg = newAggregation(project().and("value1.value").plus("value2.value").as("val")).toDbObject("collection",
Aggregation.DEFAULT_CONTEXT);
assertThat((BasicDBObject) extractPipelineElement(agg, 0, "$project"), is(equalTo(new BasicDBObject("val",
new BasicDBObject("$add", new BasicDbListBuilder().add("$value1.value").add("$value2.value").get())))));
}
private DBObject extractPipelineElement(DBObject agg, int index, String operation) {
List<DBObject> pipeline = (List<DBObject>) agg.get("pipeline");

View File

@@ -47,6 +47,7 @@ import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.test.util.BasicDbListBuilder;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -54,10 +55,11 @@ import com.mongodb.util.JSON;
/**
* Unit tests for {@link TypeBasedAggregationOperationContext}.
*
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class TypeBasedAggregationOperationContextUnitTests {
@@ -336,6 +338,19 @@ public class TypeBasedAggregationOperationContextUnitTests {
assertThat(age, isBsonObject().containing("$ifNull.[1]._class", Age.class.getName()));
}
@Test // DATAMONGO-1756
public void projectOperationShouldRenderNestedFieldNamesCorrectlyForTypedAggregation() {
AggregationOperationContext context = getContext(Wrapper.class);
DBObject agg = newAggregation(Wrapper.class, project().and("nested1.value1").plus("nested2.value2").as("val"))
.toDbObject("collection", context);
BasicDBObject project = (BasicDBObject) getPipelineElementFromAggregationAt(agg, 0).get("$project");
assertThat(project, is(equalTo(new BasicDBObject("val", new BasicDBObject("$add",
new BasicDbListBuilder().add("$nested1.value1").add("$field2.nestedValue2").get())))));
}
@Document(collection = "person")
public static class FooPerson {
@@ -406,4 +421,15 @@ public class TypeBasedAggregationOperationContextUnitTests {
String name;
}
static class Wrapper {
Nested nested1;
@org.springframework.data.mongodb.core.mapping.Field("field2") Nested nested2;
}
static class Nested {
String value1;
@org.springframework.data.mongodb.core.mapping.Field("nestedValue2") String value2;
}
}

View File

@@ -64,7 +64,7 @@ public class DefaultDbRefResolverUnitTests {
when(factoryMock.getDb()).thenReturn(dbMock);
when(dbMock.getCollection(anyString())).thenReturn(collectionMock);
when(collectionMock.find(Mockito.any(DBObject.class))).thenReturn(cursorMock);
-when(cursorMock.toArray()).thenReturn(Collections.<DBObject>emptyList());
+when(cursorMock.toArray()).thenReturn(Collections.<DBObject> emptyList());
resolver = new DefaultDbRefResolver(factoryMock);
}
@@ -100,7 +100,7 @@ public class DefaultDbRefResolverUnitTests {
@Test // DATAMONGO-1194
public void bulkFetchShouldReturnEarlyForEmptyLists() {
-resolver.bulkFetch(Collections.<DBRef>emptyList());
+resolver.bulkFetch(Collections.<DBRef> emptyList());
verify(collectionMock, never()).find(Mockito.any(DBObject.class));
}
@@ -118,4 +118,17 @@ public class DefaultDbRefResolverUnitTests {
assertThat(resolver.bulkFetch(Arrays.asList(ref1, ref2)), contains(o1, o2));
}
@Test // DATAMONGO-1765
public void bulkFetchContainsDuplicates() {
DBObject document = new BasicDBObject("_id", new ObjectId());
DBRef ref1 = new DBRef("collection-1", document.get("_id"));
DBRef ref2 = new DBRef("collection-1", document.get("_id"));
when(cursorMock.toArray()).thenReturn(Arrays.asList(document));
assertThat(resolver.bulkFetch(Arrays.asList(ref1, ref2)), contains(document, document));
}
}

View File

@@ -24,6 +24,7 @@ import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import java.util.regex.Pattern;
import org.bson.BSONObject;
@@ -36,8 +37,7 @@ import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher;
-import org.springframework.data.domain.ExampleMatcher.GenericPropertyMatchers;
-import org.springframework.data.domain.ExampleMatcher.StringMatcher;
+import org.springframework.data.domain.ExampleMatcher.*;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.QueryMapperUnitTests.ClassWithGeoTypes;
@@ -47,6 +47,7 @@ import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.test.util.IsBsonObject;
import org.springframework.data.util.TypeInformation;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -288,7 +289,7 @@ public class MongoExampleMapperUnitTests {
DBObject dbo = mapper.getMappedExample(of(probe), context.getPersistentEntity(WithDBRef.class));
com.mongodb.DBRef reference = getTypedValue(dbo, "referenceDocument", com.mongodb.DBRef.class);
-assertThat(reference.getId(), Is.<Object>is("200"));
+assertThat(reference.getId(), Is.<Object> is("200"));
assertThat(reference.getCollectionName(), is("refDoc"));
}
@@ -311,8 +312,8 @@ public class MongoExampleMapperUnitTests {
DBObject dbo = mapper.getMappedExample(of(probe), context.getPersistentEntity(WithDBRef.class));
-assertThat(dbo.get("legacyPoint.x"), Is.<Object>is(10D));
-assertThat(dbo.get("legacyPoint.y"), Is.<Object>is(20D));
+assertThat(dbo.get("legacyPoint.x"), Is.<Object> is(10D));
+assertThat(dbo.get("legacyPoint.y"), Is.<Object> is(20D));
}
@Test // DATAMONGO-1245
@@ -426,6 +427,52 @@ public class MongoExampleMapperUnitTests {
assertThat(mapper.getMappedExample(example), isBsonObject().containing("$or").containing("_class"));
}
@Test // DATAMONGO-1768
public void allowIgnoringTypeRestrictionBySettingUpTypeKeyAsAnIgnoredPath() {
WrapperDocument probe = new WrapperDocument();
probe.flatDoc = new FlatDocument();
probe.flatDoc.stringValue = "conflux";
DBObject dbo = mapper.getMappedExample(Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_class")));
assertThat(dbo, isBsonObject().notContaining("_class"));
}
@Test // DATAMONGO-1768
public void allowIgnoringTypeRestrictionBySettingUpTypeKeyAsAnIgnoredPathWhenUsingCustomTypeMapper() {
WrapperDocument probe = new WrapperDocument();
probe.flatDoc = new FlatDocument();
probe.flatDoc.stringValue = "conflux";
MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(new DefaultDbRefResolver(factory), context);
mappingMongoConverter.setTypeMapper(new DefaultMongoTypeMapper() {
@Override
public boolean isTypeKey(String key) {
return "_foo".equals(key);
}
@Override
public void writeTypeRestrictions(DBObject dbo, Set<Class<?>> restrictedTypes) {
dbo.put("_foo", "bar");
}
@Override
public void writeType(TypeInformation<?> info, DBObject sink) {
sink.put("_foo", "bar");
}
});
mappingMongoConverter.afterPropertiesSet();
DBObject dbo = new MongoExampleMapper(mappingMongoConverter)
.getMappedExample(Example.of(probe, ExampleMatcher.matching().withIgnorePaths("_foo")));
assertThat(dbo, isBsonObject().notContaining("_class").notContaining("_foo"));
}
static class FlatDocument {
@Id String id;

View File

@@ -901,6 +901,34 @@ public class UpdateMapperUnitTests {
}
}
@Test // DATAMONGO-1772
public void mappingShouldAddTypeKeyInListOfInterfaceTypeContainedInConcreteObjectCorrectly() {
ConcreteInner inner = new ConcreteInner();
inner.interfaceTypeList = Collections.<SomeInterfaceType> singletonList(new SomeInterfaceImpl());
List<ConcreteInner> list = Collections.singletonList(inner);
DBObject mappedUpdate = mapper.getMappedObject(new Update().set("concreteInnerList", list).getUpdateObject(),
context.getPersistentEntity(Outer.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.concreteInnerList.[0].interfaceTypeList.[0]._class")
.notContaining("$set.concreteInnerList.[0]._class"));
}
@Test // DATAMONGO-1772
public void mappingShouldAddTypeKeyInListOfAbstractTypeContainedInConcreteObjectCorrectly() {
ConcreteInner inner = new ConcreteInner();
inner.abstractTypeList = Collections.<SomeAbstractType> singletonList(new SomeInterfaceImpl());
List<ConcreteInner> list = Collections.singletonList(inner);
DBObject mappedUpdate = mapper.getMappedObject(new Update().set("concreteInnerList", list).getUpdateObject(),
context.getPersistentEntity(Outer.class));
assertThat(mappedUpdate, isBsonObject().containing("$set.concreteInnerList.[0].abstractTypeList.[0]._class")
.notContaining("$set.concreteInnerList.[0]._class"));
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
@@ -1187,4 +1215,26 @@ public class UpdateMapperUnitTests {
Integer intValue;
int primIntValue;
}
static class Outer {
List<ConcreteInner> concreteInnerList;
}
static class ConcreteInner {
List<SomeInterfaceType> interfaceTypeList;
List<SomeAbstractType> abstractTypeList;
}
interface SomeInterfaceType {
}
static abstract class SomeAbstractType {
}
static class SomeInterfaceImpl extends SomeAbstractType implements SomeInterfaceType {
}
}

View File

@@ -1,6 +1,60 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.10.7.RELEASE (2017-09-11)
----------------------------------------------
* DATAMONGO-1772 - Type hint not added when updating nested list elements with inheritance.
* DATAMONGO-1768 - QueryByExample FindOne : probe type.
* DATAMONGO-1765 - Duplicate elements in DBRefs list not correctly mapped.
* DATAMONGO-1756 - Aggregation project and arithmetic operation not working with nested fields.
* DATAMONGO-1755 - Release 1.10.7 (Ingalls SR7).
Changes in version 1.10.6.RELEASE (2017-07-26)
----------------------------------------------
* DATAMONGO-1750 - Release 1.10.6 (Ingalls SR6).
Changes in version 2.0.0.RC2 (2017-07-25)
-----------------------------------------
* DATAMONGO-1753 - IndexEnsuringQueryCreationListener should skip queries without criteria.
* DATAMONGO-1752 - Executing repository methods with closed projection fails.
* DATAMONGO-1751 - Release 2.0 RC2 (Kay).
Changes in version 2.0.0.RC1 (2017-07-25)
-----------------------------------------
* DATAMONGO-1748 - Add Kotlin extensions for Criteria API.
* DATAMONGO-1746 - Inherit Project Reactor version from dependency management.
* DATAMONGO-1744 - Improve default setup for MappingMongoConverter.
* DATAMONGO-1739 - Change TerminatingFindOperation.stream() to return a Stream directly.
* DATAMONGO-1738 - Move to fluent API for repository query execution.
* DATAMONGO-1735 - Sort and fields objects in Query should not be null.
* DATAMONGO-1734 - Add count() & exists to fluent API.
* DATAMONGO-1733 - Allow usage of projection interfaces in FluentMongoOperations.
* DATAMONGO-1730 - Adapt to API changes in mapping subsystem.
* DATAMONGO-1729 - Open projection does not fetch all properties.
* DATAMONGO-1728 - ExecutableFindOperation.find(…).first() fails with NPE.
* DATAMONGO-1726 - Add terminating findOne/findFirst methods to FluentMongoOperations returning null value instead of Optional.
* DATAMONGO-1725 - Potential NullPointerException in CloseableIterableCursorAdapter.
* DATAMONGO-1723 - Fix unit tests after API changes in Spring Data Commons.
* DATAMONGO-1721 - Fix dependency cycles.
* DATAMONGO-1720 - Add JMH benchmark module.
* DATAMONGO-1719 - Add fluent alternative for ReactiveMongoOperations.
* DATAMONGO-1718 - MongoTemplate.findAndRemoveAll(Query, String) delegates to wrong overload.
* DATAMONGO-1717 - Release 2.0 RC1 (Kay).
* DATAMONGO-1715 - Remove spring-data-mongodb-log4j module.
* DATAMONGO-1713 - MongoCredentialPropertyEditor improperly resolves the credential string.
* DATAMONGO-1705 - Deprecate cross-store support.
* DATAMONGO-1703 - Allow referencing views in object graphs containing circular dependencies.
* DATAMONGO-1702 - Switch repository implementation to use fragments.
* DATAMONGO-1697 - @Version used by @EnableMongoAuditing does not increase when using collection name in MongoTemplate's updateFirst. Unexpected, not described by reference documentation.
* DATAMONGO-1682 - Add partial index support to ReactiveIndexOperations.
* DATAMONGO-1678 - DefaultBulkOperations do not map Query and Update objects properly.
* DATAMONGO-1646 - Support reactive aggregation streaming.
* DATAMONGO-1519 - Change MongoTemplate.insertDBObjectList(…) to return List<Object> instead of List<ObjectId>.
Changes in version 1.10.5.RELEASE (2017-07-24)
----------------------------------------------
* DATAMONGO-1744 - Improve default setup for MappingMongoConverter.

View File

@@ -1,4 +1,4 @@
-Spring Data MongoDB 1.10.5
+Spring Data MongoDB 1.10.7
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").