Compare commits

...

41 Commits

Author SHA1 Message Date
Mark Paluch
dbf4990f60 DATAMONGO-1918 - Release version 2.0.7 (Kay SR7). 2018-05-08 14:15:27 +02:00
Mark Paluch
c5c43158c2 DATAMONGO-1918 - Prepare 2.0.7 (Kay SR7). 2018-05-08 14:14:32 +02:00
Mark Paluch
56ffe7913d DATAMONGO-1918 - Updated changelog. 2018-05-08 14:14:23 +02:00
Mark Paluch
eae263eebc DATAMONGO-1917 - Updated changelog. 2018-05-08 12:22:53 +02:00
Mark Paluch
0dd2fa3dce DATAMONGO-1943 - Polishing.
Reduce visibility. Use List interface instead of concrete type.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Christoph Strobl
e648ea5903 DATAMONGO-1943 - Fix ClassCastException caused by SpringDataMongodbSerializer.
We now convert List-typed predicates to BasicDBList to meet MongodbSerializer's expectations for top-level lists used for the $and operator.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
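
Illustrative sketch (not part of the change set): a condensed version of the conversion described above, modeled on the toQuerydslMongoType(…) method introduced in the SpringDataMongodbSerializer patch further down. The standalone class wrapper is hypothetical; the actual patch re-maps Document elements via BsonUtils.asMap(…), which is equivalent here since Document implements Map.

import java.util.List;

import org.bson.Document;

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;

class ListToDbListSketch {

    // Re-packs a converted List into a BasicDBList, wrapping Document elements as
    // BasicDBObjects, so MongodbSerializer receives the top-level list type it expects
    // when assembling the $and operator.
    static Object toQuerydslMongoType(Object converted) {

        if (converted instanceof List) {

            BasicDBList dbList = new BasicDBList();

            for (Object item : (List<?>) converted) {
                dbList.add(item instanceof Document ? new BasicDBObject((Document) item) : item);
            }

            return dbList;
        }

        return converted;
    }
}
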
Mark Paluch
f389812b7c DATAMONGO-1869 - Updated changelog. 2018-04-13 15:11:29 +02:00
Mark Paluch
2127ddcbb8 DATAMONGO-1893 - Polishing.
Inherit fields from the previous operation if at least one field is excluded. Extend FieldsExposingAggregationOperation to conditionally inherit fields.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
Christoph Strobl
7f9ab3bb44 DATAMONGO-1893 - Allow exclusion of fields other than _id in aggregation $project.
As of MongoDB 3.4, exclusion of fields other than _id is allowed, so we removed the limitation in our code.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
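
Illustrative sketch (not part of the change set): a minimal usage of the relaxed exclusion. Only project().andExclude(…) is the actual API; the class and the "largePayload" field name are illustrative.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class ExcludeFieldsSketch {

    // Excluding a field other than _id now renders { $project: { largePayload: 0 } };
    // the remaining fields are inherited from the previous stage.
    Aggregation excludeLargePayload() {
        return newAggregation(project().andExclude("largePayload"));
    }
}
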
Mark Paluch
aea40ca490 DATAMONGO-1888 - After release cleanups. 2018-04-04 16:42:33 +02:00
Mark Paluch
fb8d03db31 DATAMONGO-1888 - Prepare next development iteration. 2018-04-04 16:42:30 +02:00
Mark Paluch
890f08f19a DATAMONGO-1888 - Release version 2.0.6 (Kay SR6). 2018-04-04 15:53:22 +02:00
Mark Paluch
be58472777 DATAMONGO-1888 - Prepare 2.0.6 (Kay SR6). 2018-04-04 15:52:31 +02:00
Mark Paluch
b082d4ad98 DATAMONGO-1888 - Updated changelog. 2018-04-04 15:52:22 +02:00
Mark Paluch
e80b031f54 DATAMONGO-1857 - Updated changelog. 2018-04-04 15:16:20 +02:00
Mark Paluch
50b017c08b DATAMONGO-1903 - Polishing.
Remove the client-side operating system check, as operating-system-dependent constraints depend on the server. Add a check for whitespace. Add author tags. Extend tests.

Adapt check in SimpleReactiveMongoDatabaseFactory accordingly. Remove superfluous UnknownHostException declaration in reactive database factory. Replace references to legacy types in Javadoc with references to current ones.

Original pull request: #546.
2018-04-03 13:44:19 +02:00
George Moraitis
78429eb33d DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
We now validate database names against the current (3.6) MongoDB naming restrictions.

Original pull request: #546.
2018-04-03 13:44:16 +02:00
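
Illustrative sketch (not part of the change set): the essence of the new check, lifted from the patch below into a standalone helper. The class and method names are illustrative; the regex and message are taken verbatim from the diff.

import org.springframework.util.Assert;

class DatabaseNameCheckSketch {

    // Instead of whitelisting word characters, only the characters MongoDB itself
    // forbids (slashes, dots, spaces, quotes, dollar signs) are rejected.
    static void validateDatabaseName(String databaseName) {

        Assert.hasText(databaseName, "Database name must not be empty!");
        Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
                "Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
    }
}
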
Mark Paluch
3ed0bd7a18 DATAMONGO-1916 - Polishing.
Remove unused final keywords from method parameters and unused variables. Add nullable annotations to parameters that can be null. Fix generics.

Original pull request: #547.
2018-04-03 11:33:13 +02:00
Christoph Strobl
cbc923c727 DATAMONGO-1916 - Fix potential ClassCastException in MappingMongoConverter#writeInternal when writing collections.
Original pull request: #547.
2018-04-03 11:32:53 +02:00
Mark Paluch
f6ca0049b6 DATAMONGO-1834 - Polishing.
Increase visibility of Timezone factory methods. Add missing nullable annotation. Tweak Javadoc. Add tests for Timezone using expressions/field references.

Original Pull Request: #539
2018-03-28 11:40:14 +02:00
Christoph Strobl
82c9b0c662 DATAMONGO-1834 - Polishing.
Remove DateFactory and split up tests.
Introduce a dedicated Timezone abstraction and update existing factories to apply the timezone where appropriate. Update builders and align code style.

Original Pull Request: #539
2018-03-28 11:39:58 +02:00
Matt Morrissette
3ca2349ce3 DATAMONGO-1834 - Add support for MongoDB 3.6 DateOperators $dateFromString, $dateFromParts and $dateToParts including timezones.
Original Pull Request: #539
2018-03-28 11:34:57 +02:00
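
Illustrative sketch (not part of the change set): projecting a date operator with the new timezone support, mirroring the unit tests added further down. The "date", "tz", and "dayOfYear" field names are illustrative.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.DateOperators;
import org.springframework.data.mongodb.core.aggregation.DateOperators.Timezone;

class DateOperatorTimezoneSketch {

    // Renders { $project: { dayOfYear: { $dayOfYear: { date: "$date", timezone: "America/Chicago" } } } }.
    Aggregation dayOfYearInChicago() {
        return newAggregation(project()
                .and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).dayOfYear())
                .as("dayOfYear"));
    }

    // The timezone can also be taken from another field, rendering timezone: "$tz".
    Aggregation dayOfYearWithTimezoneField() {
        return newAggregation(project()
                .and(DateOperators.dateOf("date").withTimezone(Timezone.ofField("tz")).dayOfYear())
                .as("dayOfYear"));
    }
}
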
Oliver Gierke
a76f157457 DATAMONGO-1915 - Removed explicit declaration of Jackson library versions. 2018-03-27 19:35:24 +02:00
Christoph Strobl
560a6a5bc2 DATAMONGO-1911 - Polishing.
Use native MongoDB Codec facilities to render Binary and UUID values.

Original Pull Request: #544
2018-03-27 14:17:46 +02:00
Mark Paluch
51d5c52193 DATAMONGO-1911 - Fix UUID serialization in String-based queries.
We now render the correct UUID representation in String-based queries: unquoted values render to the $binary representation, while quoted UUIDs are rendered using their toString() value.

Previously we used JSON.serialize() to encode values to JSON. The com.mongodb.util.JSON serializer does not produce JSON that is compatible with Document.parse. It uses an older JSON format that preceded the MongoDB Extended JSON specification.

Original Pull Request: #544
2018-03-27 14:04:34 +02:00
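
Illustrative sketch (not part of the change set): a repository showing where the fix matters. Order, trackingId, and the repository are hypothetical; @Query with the ?0 placeholder is the regular String-based query mechanism.

import java.util.List;
import java.util.UUID;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

class Order {
    String id;
    UUID trackingId;
}

interface OrderRepository extends MongoRepository<Order, String> {

    // Unquoted placeholder: the UUID parameter is rendered to its $binary representation.
    @Query("{ 'trackingId' : ?0 }")
    List<Order> findByTrackingId(UUID trackingId);

    // Quoted placeholder: the UUID parameter is rendered via toString().
    @Query("{ 'trackingId' : '?0' }")
    List<Order> findByTrackingIdAsString(UUID trackingId);
}
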
Mark Paluch
56b6748068 DATAMONGO-1913 - Add missing nullable annotations to GridFsTemplate. 2018-03-26 14:10:53 +02:00
Felipe Zanardo Affonso
1e19f405cc DATAMONGO-1909 - Fix typo on return statement.
Original pull request: #523.
2018-03-21 16:05:25 +01:00
Mark Paluch
54d2c122eb DATAMONGO-1907 - Polishing.
Rename test method to reflect test subject.

Switch from flatMap(…) to map(…) to avoid the overhead of Mono creation.

Original pull request: #541.
2018-03-21 09:54:07 +01:00
Ruben J Garcia
b47c5704e7 DATAMONGO-1907 - Adjust SimpleReactiveMongoRepository.findOne(…) to complete without exception on empty result
We now no longer emit an exception via SimpleReactiveMongoRepository.findOne(Example) if the query completes without yielding a result. Previously findOne(Example) emitted a NoSuchElementException if the query returned no result.

Original pull request: #541.
2018-03-21 09:51:32 +01:00
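
Illustrative sketch (not part of the change set): the adjusted contract, assuming a hypothetical Person aggregate and repository. StepVerifier is used only to make the empty completion visible.

import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;

import reactor.test.StepVerifier;

class FindOneByExampleSketch {

    static class Person {
        String id;
        String firstname;
    }

    interface PersonRepository extends ReactiveMongoRepository<Person, String> {}

    // With no matching document, findOne(Example) now completes empty instead of
    // emitting a NoSuchElementException.
    void emptyResultCompletesWithoutError(PersonRepository repository) {

        Person probe = new Person();
        probe.firstname = "does-not-exist";

        StepVerifier.create(repository.findOne(Example.of(probe)))
                .verifyComplete();
    }
}
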
Oliver Gierke
6b0b1cd97d DATAMONGO-1904 - Optimizations in MappingMongoConverter.readCollectionOrArray(…).
Switched to ClassUtils.isAssignableValue(…) in getPotentiallyConvertedSimpleRead(…) as it transparently handles primitives and their wrapper types so that we can avoid the superfluous invocation of the converter infrastructure.
2018-03-15 15:03:53 +01:00
Oliver Gierke
35bbc604aa DATAMONGO-1904 - Fixed handling of nested arrays on reads in MappingMongoConverter.
We now properly forward the component type information into recursive calls to MappingMongoConverter.readCollectionOrArray(…).
2018-03-15 15:03:51 +01:00
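
Illustrative sketch (not part of the change set): a hypothetical document shape that exercises the fix; before the change the inner lists lost their component type on the recursive read.

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Maps { _id: ..., rows: [ ["a", "b"], ["c"] ] }; the nested array elements are read back
// as List<List<String>> because the component type is now forwarded into the recursive
// readCollectionOrArray(…) call.
@Document(collection = "matrices")
class Matrix {

    @Id String id;
    List<List<String>> rows;
}
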
Oliver Gierke
9ade830a10 DATAMONGO-1901 - Added project.root configuration to make JavaDoc generation work again.
Related ticket: https://github.com/spring-projects/spring-data-build/issues/527.
2018-03-14 09:37:46 +01:00
Oliver Gierke
8fbff50f4f DATAMONGO-1898 - Added unit tests for the conversion handling of enums implementing interfaces.
Related tickets: DATACMNS-1278.
2018-03-12 11:07:40 +01:00
Oliver Gierke
14b49638a0 DATAMONGO-1896 - SimpleMongoRepository.saveAll(…) now consistently uses aggregate collection for inserts.
We previously used MongoTemplate.insertAll(…), which determines the collection for each element based on its type; in cases of entity inheritance this uses dedicated collections for sub-types of the aggregate root. Subsequent lookups of those entities then fail, as they are executed against the collection the aggregate root is mapped to.

We now use ….insert(Collection, String) instead, handing over the collection of the aggregate root explicitly.
2018-03-09 00:03:44 +01:00
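
Illustrative sketch (not part of the change set): the scenario the commit describes. Contact, Employee, and the repository are hypothetical aggregate types with inheritance.

import java.util.Arrays;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

@Document(collection = "contacts")
class Contact {
    @Id String id;
}

class Employee extends Contact {
    String department;
}

interface ContactRepository extends MongoRepository<Contact, String> {}

class SaveAllSketch {

    // Both instances now end up in the "contacts" collection of the aggregate root,
    // so subsequent findById(…) calls resolve the Employee as well.
    void saveMixedTypes(ContactRepository repository) {
        repository.saveAll(Arrays.asList(new Contact(), new Employee()));
    }
}
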
Mark Paluch
dc31f4f32f DATAMONGO-1882 - After release cleanups. 2018-02-28 10:43:35 +01:00
Mark Paluch
708f9ac7b3 DATAMONGO-1882 - Prepare next development iteration. 2018-02-28 10:43:34 +01:00
Mark Paluch
17d6100426 DATAMONGO-1882 - Release version 2.0.5 (Kay SR5). 2018-02-28 10:14:58 +01:00
Mark Paluch
27a4e25880 DATAMONGO-1882 - Prepare 2.0.5 (Kay SR5). 2018-02-28 10:14:05 +01:00
Mark Paluch
d378bcb442 DATAMONGO-1882 - Updated changelog. 2018-02-28 10:13:57 +01:00
Mark Paluch
f6505c7758 DATAMONGO-1859 - After release cleanups. 2018-02-19 20:29:08 +01:00
Mark Paluch
d25f88c70e DATAMONGO-1859 - Prepare next development iteration. 2018-02-19 20:29:06 +01:00
35 changed files with 2562 additions and 330 deletions

View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
</parent>
<modules>
@@ -27,7 +27,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.0.4.RELEASE</springdata.commons>
<springdata.commons>2.0.7.RELEASE</springdata.commons>
<mongo>3.5.0</mongo>
<mongo.reactivestreams>1.6.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -17,6 +17,7 @@
<jpa>2.1.1</jpa>
<hibernate>5.2.1.Final</hibernate>
<java-module-name>spring.data.mongodb.cross.store</java-module-name>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -49,7 +50,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
</dependency>
<!-- reactive -->

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.4.RELEASE</version>
<version>2.0.7.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -19,6 +19,7 @@
<objenesis>1.3</objenesis>
<equalsverifier>1.7.8</equalsverifier>
<java-module-name>spring.data.mongodb</java-module-name>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -145,7 +146,7 @@
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
<artifactId>javax.interceptor-api</artifactId>
@@ -214,7 +215,6 @@
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
@@ -288,7 +288,7 @@
</dependencies>
<build>
<plugins>
<plugin>
@@ -340,8 +340,8 @@
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -23,16 +23,16 @@ import com.mongodb.DB;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link DB} instances.
*
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
*/
public interface MongoDbFactory {
/**
* Creates a default {@link DB} instance.
*
* Creates a default {@link MongoDatabase} instance.
*
* @return
* @throws DataAccessException
*/
@@ -40,7 +40,7 @@ public interface MongoDbFactory {
/**
* Creates a {@link DB} instance to access the database with the given name.
*
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
@@ -49,7 +49,7 @@ public interface MongoDbFactory {
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
@@ -31,12 +29,14 @@ import com.mongodb.WriteConcern;
import com.mongodb.client.MongoDatabase;
/**
* Factory to create {@link DB} instances from a {@link MongoClient} instance.
*
* Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author George Moraitis
* @author Mark Paluch
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -49,9 +49,8 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) {
@@ -60,7 +59,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
@@ -70,7 +69,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
}
/**
* @param client
* @param mongoClient
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
@@ -79,8 +78,8 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
Assert.notNull(mongoClient, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[\\w-]+"),
"Database name must only contain letters, numbers, underscores and dashes!");
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongoClient = mongoClient;
this.databaseName = databaseName;
@@ -90,7 +89,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
*
* @param writeConcern the writeConcern to set
*/
public void setWriteConcern(WriteConcern writeConcern) {
@@ -124,7 +123,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
/**
* Clean up the Mongo instance if it was created by the factory itself.
*
*
* @see DisposableBean#destroy()
*/
public void destroy() throws Exception {
@@ -133,7 +132,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
}
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/

View File

@@ -72,8 +72,8 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[\\w-]+"),
"Database name must only contain letters, numbers, underscores and dashes!");
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongo = client;
this.databaseName = databaseName;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016. the original author or authors.
* Copyright 2016-2018. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,12 +20,15 @@ import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.bson.Document;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* @author Christoph Strobl
* @author Matt Morrissette
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
@@ -46,29 +49,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
Object valueToUse;
if (value instanceof List) {
List<Object> arguments = (List<Object>) value;
List<Object> args = new ArrayList<Object>(arguments.size());
for (Object val : arguments) {
args.add(unpack(val, context));
}
valueToUse = args;
} else if (value instanceof java.util.Map) {
Document dbo = new Document();
for (java.util.Map.Entry<String, Object> entry : ((java.util.Map<String, Object>) value).entrySet()) {
dbo.put(entry.getKey(), unpack(entry.getValue(), context));
}
valueToUse = dbo;
} else {
valueToUse = unpack(value, context);
}
return new Document(getMongoMethod(), valueToUse);
return new Document(getMongoMethod(), unpack(value, context));
}
protected static List<Field> asFields(String... fieldRefs) {
@@ -94,14 +75,23 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
if (value instanceof List) {
List<Object> sourceList = (List<Object>) value;
List<Object> mappedList = new ArrayList<Object>(sourceList.size());
List<Object> mappedList = new ArrayList<>(sourceList.size());
sourceList.stream().map((item) -> unpack(item, context)).forEach(mappedList::add);
for (Object item : sourceList) {
mappedList.add(unpack(item, context));
}
return mappedList;
}
if (value instanceof Map) {
Document targetDocument = new Document();
Map<String, Object> sourceMap = (Map<String, Object>) value;
sourceMap.forEach((k, v) -> targetDocument.append(k, unpack(v, context)));
return targetDocument;
}
return value;
}
@@ -112,9 +102,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
List<Object> clone = new ArrayList<Object>((List) this.value);
if (value instanceof List) {
for (Object val : (List) value) {
clone.add(val);
}
clone.addAll((List) value);
} else {
clone.add(value);
}
@@ -127,10 +115,9 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
if (!(this.value instanceof java.util.Map)) {
throw new IllegalArgumentException("o_O");
}
java.util.Map<String, Object> clone = new LinkedHashMap<String, Object>((java.util.Map<String, Object>) this.value);
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
java.util.Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.put(key, value);
return clone;
@@ -144,7 +131,67 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<Object>(Collections.singletonList(value));
return new ArrayList<>(Collections.singletonList(value));
}
/**
* Get the value at a given index.
*
* @param index
* @param <T>
* @return
* @since 2.1
*/
@SuppressWarnings("unchecked")
protected <T> T get(int index) {
return (T) values().get(index);
}
/**
* Get the value for a given key.
*
* @param key
* @param <T>
* @return
* @since 2.1
*/
@SuppressWarnings("unchecked")
protected <T> T get(Object key) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return (T) ((java.util.Map<String, Object>) this.value).get(key);
}
/**
* Get the argument map.
*
* @since 2.1
* @return
*/
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> argumentMap() {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return Collections.unmodifiableMap((java.util.Map) value);
}
/**
* Check if the given key is available.
*
* @param key
* @return
* @since 2.1
*/
@SuppressWarnings("unchecked")
protected boolean contains(Object key) {
if (!(this.value instanceof java.util.Map)) {
return false;
}
return ((java.util.Map<String, Object>) this.value).containsKey(key);
}
protected abstract String getMongoMethod();

View File

@@ -59,7 +59,7 @@ class AggregationOperationRenderer {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
ExposedFields fields = exposedFieldsOperation.getFields();
if (operation instanceof InheritsFieldsAggregationOperation) {
if (operation instanceof InheritsFieldsAggregationOperation || exposedFieldsOperation.inheritsFields()) {
contextToUse = new InheritingExposedFieldsAggregationOperationContext(fields, contextToUse);
} else {
contextToUse = fields.exposesNoFields() ? DEFAULT_CONTEXT

View File

@@ -33,9 +33,26 @@ public interface FieldsExposingAggregationOperation extends AggregationOperation
*/
ExposedFields getFields();
/**
* @return {@literal true} to conditionally inherit fields from previous operations.
* @since 2.0.6
*/
default boolean inheritsFields() {
return false;
}
/**
* Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
*/
interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {}
interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#inheritsFields()
*/
@Override
default boolean inheritsFields() {
return true;
}
}
}

View File

@@ -140,11 +140,6 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
public ProjectionOperation andExclude(String... fieldNames) {
for (String fieldName : fieldNames) {
Assert.isTrue(Fields.UNDERSCORE_ID.equals(fieldName),
String.format(EXCLUSION_ERROR, fieldName, Fields.UNDERSCORE_ID));
}
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fieldNames), false);
return new ProjectionOperation(this.projections, excludeProjections);
}
@@ -188,6 +183,18 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return fields != null ? fields : ExposedFields.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#inheritsFields()
*/
@Override
public boolean inheritsFields() {
return projections.stream().filter(FieldProjection.class::isInstance) //
.map(FieldProjection.class::cast) //
.anyMatch(FieldProjection::isExcluded);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
@@ -1344,6 +1351,13 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return projections;
}
/**
* @return {@literal true} if this field is excluded.
*/
public boolean isExcluded() {
return Boolean.FALSE.equals(value);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)

View File

@@ -264,7 +264,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
path);
}
@Nullable
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final Document bson, final ObjectPath path) {
DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(bson, spELContext);
@@ -316,7 +315,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
for (MongoPersistentProperty prop : entity) {
if (prop.isAssociation() && !entity.isConstructorArgument(prop)) {
readAssociation(prop.getAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback);
readAssociation(prop.getRequiredAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback);
continue;
}
// we skip the id property since it was already set
@@ -329,7 +328,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (prop.isAssociation()) {
readAssociation(prop.getAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback);
readAssociation(prop.getRequiredAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback);
continue;
}
@@ -357,7 +356,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
public DBRef toDBRef(Object object, @Nullable MongoPersistentProperty referringProperty) {
org.springframework.data.mongodb.core.mapping.DBRef annotation = null;
org.springframework.data.mongodb.core.mapping.DBRef annotation;
if (referringProperty != null) {
annotation = referringProperty.getDBRef();
@@ -378,7 +377,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @see org.springframework.data.mongodb.core.convert.MongoWriter#write(java.lang.Object, com.mongodb.Document)
*/
public void write(final Object obj, final Bson bson) {
public void write(Object obj, Bson bson) {
if (null == obj) {
return;
@@ -405,9 +404,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @param obj
* @param bson
* @param typeHint
*/
@SuppressWarnings("unchecked")
protected void writeInternal(@Nullable Object obj, final Bson bson, final TypeInformation<?> typeHint) {
protected void writeInternal(@Nullable Object obj, Bson bson, @Nullable TypeInformation<?> typeHint) {
if (null == obj) {
return;
@@ -428,7 +428,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (Collection.class.isAssignableFrom(entityType)) {
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) bson);
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (Collection) bson);
return;
}
@@ -437,7 +437,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
addCustomTypeKeyIfNecessary(typeHint, obj, bson);
}
protected void writeInternal(@Nullable Object obj, final Bson bson, MongoPersistentEntity<?> entity) {
protected void writeInternal(@Nullable Object obj, Bson bson, @Nullable MongoPersistentEntity<?> entity) {
if (obj == null) {
return;
@@ -463,7 +463,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
private void writeProperties(Bson bson, MongoPersistentEntity<?> entity, PersistentPropertyAccessor accessor,
DocumentAccessor dbObjectAccessor, MongoPersistentProperty idProperty) {
DocumentAccessor dbObjectAccessor, @Nullable MongoPersistentProperty idProperty) {
// Write the properties
for (MongoPersistentProperty prop : entity) {
@@ -472,7 +472,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
continue;
}
if (prop.isAssociation()) {
writeAssociation(prop.getAssociation(), accessor, dbObjectAccessor);
writeAssociation(prop.getRequiredAssociation(), accessor, dbObjectAccessor);
continue;
}
@@ -499,7 +499,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
@SuppressWarnings({ "unchecked" })
protected void writePropertyInternal(Object obj, DocumentAccessor accessor, MongoPersistentProperty prop) {
protected void writePropertyInternal(@Nullable Object obj, DocumentAccessor accessor, MongoPersistentProperty prop) {
if (obj == null) {
return;
@@ -654,17 +654,19 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
/**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
* Populates the given {@link Collection sink} with converted values from the given {@link Collection source}.
*
* @param source the collection to create a {@link BasicDBList} for, must not be {@literal null}.
* @param source the collection to create a {@link Collection} for, must not be {@literal null}.
* @param type the {@link TypeInformation} to consider or {@literal null} if unknown.
* @param sink the {@link BasicDBList} to write to.
* @param sink the {@link Collection} to write to.
* @return
*/
private BasicDBList writeCollectionInternal(Collection<?> source, TypeInformation<?> type, BasicDBList sink) {
private List<Object> writeCollectionInternal(Collection<?> source, @Nullable TypeInformation<?> type, Collection<?> sink) {
TypeInformation<?> componentType = null;
List<Object> collection = sink instanceof List ? (List) sink : new ArrayList<>(sink);
if (type != null) {
componentType = type.getComponentType();
}
@@ -674,17 +676,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> elementType = element == null ? null : element.getClass();
if (elementType == null || conversions.isSimpleType(elementType)) {
sink.add(getPotentiallyConvertedSimpleWrite(element));
collection.add(getPotentiallyConvertedSimpleWrite(element));
} else if (element instanceof Collection || elementType.isArray()) {
sink.add(writeCollectionInternal(asCollection(element), componentType, new BasicDBList()));
collection.add(writeCollectionInternal(asCollection(element), componentType, new BasicDBList()));
} else {
Document document = new Document();
writeInternal(element, document, componentType);
sink.add(document);
collection.add(document);
}
}
return sink;
return collection;
}
/**
@@ -868,9 +870,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
@Nullable
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(@Nullable Object value, Class<?> target) {
private Object getPotentiallyConvertedSimpleRead(@Nullable Object value, @Nullable Class<?> target) {
if (value == null || target == null || target.isAssignableFrom(value.getClass())) {
if (value == null || target == null || ClassUtils.isAssignableValue(target, value)) {
return value;
}
@@ -933,58 +935,62 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Reads the given {@link BasicDBList} into a collection of the given {@link TypeInformation}.
*
* @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}.
* @param source must not be {@literal null}.
* @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object readCollectionOrArray(TypeInformation<?> targetType, List sourceValue, ObjectPath path) {
@SuppressWarnings("unchecked")
private Object readCollectionOrArray(TypeInformation<?> targetType, Collection<?> source, ObjectPath path) {
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) //
? collectionType //
: List.class;
TypeInformation<?> componentType = targetType.getComponentType() != null ? targetType.getComponentType()
TypeInformation<?> componentType = targetType.getComponentType() != null //
? targetType.getComponentType() //
: ClassTypeInformation.OBJECT;
Class<?> rawComponentType = componentType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<>(sourceValue.size())
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
Collection<Object> items = targetType.getType().isArray() //
? new ArrayList<>(source.size()) //
: CollectionFactory.createCollection(collectionType, rawComponentType, source.size());
if (sourceValue.isEmpty()) {
if (source.isEmpty()) {
return getPotentiallyConvertedSimpleRead(items, targetType.getType());
}
if (!DBRef.class.equals(rawComponentType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceValue)) {
if (!DBRef.class.equals(rawComponentType) && isCollectionOfDbRefWhereBulkFetchIsPossible(source)) {
List<Object> objects = bulkReadAndConvertDBRefs((List<DBRef>) sourceValue, componentType, path, rawComponentType);
List<Object> objects = bulkReadAndConvertDBRefs((List<DBRef>) source, componentType, path, rawComponentType);
return getPotentiallyConvertedSimpleRead(objects, targetType.getType());
}
for (Object dbObjItem : sourceValue) {
for (Object element : source) {
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem
: readAndConvertDBRef((DBRef) dbObjItem, componentType, path, rawComponentType));
} else if (dbObjItem instanceof Document) {
items.add(read(componentType, (Document) dbObjItem, path));
} else if (dbObjItem instanceof BasicDBObject) {
items.add(read(componentType, (BasicDBObject) dbObjItem, path));
if (element instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? element
: readAndConvertDBRef((DBRef) element, componentType, path, rawComponentType));
} else if (element instanceof Document) {
items.add(read(componentType, (Document) element, path));
} else if (element instanceof BasicDBObject) {
items.add(read(componentType, (BasicDBObject) element, path));
} else {
if (dbObjItem instanceof Collection) {
if (element instanceof Collection) {
if (!rawComponentType.isArray() && !ClassUtils.isAssignable(Iterable.class, rawComponentType)) {
throw new MappingException(
String.format(INCOMPATIBLE_TYPES, dbObjItem, dbObjItem.getClass(), rawComponentType, path));
String.format(INCOMPATIBLE_TYPES, element, element.getClass(), rawComponentType, path));
}
}
if (dbObjItem instanceof List) {
items.add(readCollectionOrArray(ClassTypeInformation.OBJECT, (List) dbObjItem, path));
if (element instanceof List) {
items.add(readCollectionOrArray(componentType, (Collection<Object>) element, path));
} else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
items.add(getPotentiallyConvertedSimpleRead(element, rawComponentType));
}
}
}
@@ -1045,8 +1051,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
map.put(key, DBRef.class.equals(rawValueType) ? value
: readAndConvertDBRef((DBRef) value, defaultedValueType, ObjectPath.ROOT, rawValueType));
} else if (value instanceof List) {
map.put(key,
readCollectionOrArray(valueType != null ? valueType : ClassTypeInformation.LIST, (List) value, path));
map.put(key, readCollectionOrArray(valueType != null ? valueType : ClassTypeInformation.LIST,
(List<Object>) value, path));
} else {
map.put(key, getPotentiallyConvertedSimpleRead(value, rawValueType));
}
@@ -1084,8 +1090,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
"Cannot add key/value pair to %s. as map. Given Bson must be a Document or DBObject!", bson.getClass()));
}
@SuppressWarnings("unchecked")
private static void addAllToMap(Bson bson, Map value) {
private static void addAllToMap(Bson bson, Map<String, ?> value) {
if (bson instanceof Document) {
((Document) bson).putAll(value);
@@ -1140,10 +1145,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj);
}
TypeInformation<?> typeHint = typeInformation;
if (obj instanceof List) {
return maybeConvertList((List<Object>) obj, typeHint);
return maybeConvertList((List<Object>) obj, typeInformation);
}
if (obj instanceof Document) {
@@ -1151,7 +1154,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Document newValueDocument = new Document();
for (String vk : ((Document) obj).keySet()) {
Object o = ((Document) obj).get(vk);
newValueDocument.put(vk, convertToMongoType(o, typeHint));
newValueDocument.put(vk, convertToMongoType(o, typeInformation));
}
return newValueDocument;
}
@@ -1162,7 +1165,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o, typeHint));
newValueDbo.put(vk, convertToMongoType(o, typeInformation));
}
return newValueDbo;
@@ -1173,18 +1176,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Document result = new Document();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeInformation));
}
return result;
}
if (obj.getClass().isArray()) {
return maybeConvertList(Arrays.asList((Object[]) obj), typeHint);
return maybeConvertList(Arrays.asList((Object[]) obj), typeInformation);
}
if (obj instanceof Collection) {
return maybeConvertList((Collection<?>) obj, typeHint);
return maybeConvertList((Collection<?>) obj, typeInformation);
}
Document newDocument = new Document();
@@ -1219,6 +1222,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param recursively whether to apply the removal recursively
* @return
*/
@SuppressWarnings("unchecked")
private Object removeTypeInfo(Object object, boolean recursively) {
if (!(object instanceof Document)) {
@@ -1239,7 +1243,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
removeTypeInfo(element, recursively);
}
} else if (value instanceof List) {
for (Object element : (List) value) {
for (Object element : (List<Object>) value) {
removeTypeInfo(element, recursively);
}
} else {
@@ -1379,7 +1383,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} else if (value instanceof DBRef) {
return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
} else if (value instanceof List) {
return (T) readCollectionOrArray(type, (List) value, path);
return (T) readCollectionOrArray(type, (List<Object>) value, path);
} else if (value instanceof Document) {
return (T) read(type, (Document) value, path);
} else if (value instanceof DBObject) {
@@ -1495,7 +1499,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param source must not be {@literal null}.
* @return
*/
private static boolean isCollectionOfDbRefWhereBulkFetchIsPossible(Iterable<Object> source) {
private static boolean isCollectionOfDbRefWhereBulkFetchIsPossible(Iterable<?> source) {
Assert.notNull(source, "Iterable of DBRefs must not be null!");

View File

@@ -56,7 +56,7 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
private final MongoDbFactory dbFactory;
private final String bucket;
private final @Nullable String bucket;
private final MongoConverter converter;
private final QueryMapper queryMapper;
@@ -77,7 +77,7 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
* @param converter must not be {@literal null}.
* @param bucket
*/
public GridFsTemplate(MongoDbFactory dbFactory, MongoConverter converter, String bucket) {
public GridFsTemplate(MongoDbFactory dbFactory, MongoConverter converter, @Nullable String bucket) {
Assert.notNull(dbFactory, "MongoDbFactory must not be null!");
Assert.notNull(converter, "MongoConverter must not be null!");

View File

@@ -19,17 +19,26 @@ import lombok.EqualsAndHashCode;
import lombok.Value;
import lombok.experimental.UtilityClass;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.UUID;
import java.util.function.Supplier;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import javax.xml.bind.DatatypeConverter;
import org.bson.BSON;
import org.bson.codecs.BinaryCodec;
import org.bson.codecs.Codec;
import org.bson.codecs.UuidCodec;
import org.bson.codecs.configuration.CodecConfigurationException;
import org.bson.codecs.configuration.CodecRegistry;
import org.bson.json.JsonWriter;
import org.bson.types.Binary;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBinding;
import org.springframework.data.repository.query.EvaluationContextProvider;
import org.springframework.expression.EvaluationContext;
@@ -41,6 +50,7 @@ import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;
import com.mongodb.util.JSON;
/**
@@ -57,6 +67,7 @@ class ExpressionEvaluatingParameterBinder {
private final SpelExpressionParser expressionParser;
private final EvaluationContextProvider evaluationContextProvider;
private final CodecRegistry codecRegistry;
/**
* Creates new {@link ExpressionEvaluatingParameterBinder}
@@ -72,6 +83,7 @@ class ExpressionEvaluatingParameterBinder {
this.expressionParser = expressionParser;
this.evaluationContextProvider = evaluationContextProvider;
this.codecRegistry = MongoClient.getDefaultCodecRegistry();
}
/**
@@ -212,18 +224,43 @@ class ExpressionEvaluatingParameterBinder {
if (value instanceof byte[]) {
String base64representation = DatatypeConverter.printBase64Binary((byte[]) value);
if (!binding.isQuoted()) {
return "{ '$binary' : '" + base64representation + "', '$type' : '" + BSON.B_GENERAL + "'}";
if (binding.isQuoted()) {
return DatatypeConverter.printBase64Binary((byte[]) value);
}
return base64representation;
return encode(new Binary((byte[]) value), BinaryCodec::new);
}
if (value instanceof UUID) {
if (binding.isQuoted()) {
return value.toString();
}
return encode((UUID) value, UuidCodec::new);
}
return JSON.serialize(value);
}
private <T> String encode(T value, Supplier<Codec<T>> defaultCodec) {
Codec<T> codec;
try {
codec = codecRegistry.get((Class<T>) value.getClass());
} catch (CodecConfigurationException exception) {
codec = defaultCodec.get();
}
StringWriter writer = new StringWriter();
codec.encode(new JsonWriter(writer), value, null);
writer.flush();
return writer.toString();
}
/**
* Evaluates the given {@code expressionString}.
*

View File

@@ -100,7 +100,7 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
if (allNew) {
List<S> result = source.stream().collect(Collectors.toList());
mongoOperations.insertAll(result);
mongoOperations.insert(result, entityInformation.getCollectionName());
return result;
} else {

View File

@@ -45,6 +45,7 @@ import org.springframework.util.Assert;
* @author Mark Paluch
* @author Oliver Gierke
* @author Christoph Strobl
* @author Ruben J Garcia
* @since 2.0
*/
@RequiredArgsConstructor
@@ -91,13 +92,13 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
q.limit(2);
return mongoOperations.find(q, example.getProbeType(), entityInformation.getCollectionName()).buffer(2)
.flatMap(vals -> {
.map(vals -> {
if (vals.size() > 1) {
return Mono.error(new IncorrectResultSizeDataAccessException(1));
throw new IncorrectResultSizeDataAccessException(1);
}
return Mono.just(vals.iterator().next());
}).single();
return vals.iterator().next();
}).next();
}
/*
@@ -314,10 +315,9 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(entityStream, "The given Publisher of entities must not be null!");
return Flux.from(entityStream)
.flatMap(entity -> entityInformation.isNew(entity) ? //
mongoOperations.insert(entity, entityInformation.getCollectionName()).then(Mono.just(entity)) : //
mongoOperations.save(entity, entityInformation.getCollectionName()).then(Mono.just(entity)));
return Flux.from(entityStream).flatMap(entity -> entityInformation.isNew(entity) ? //
mongoOperations.insert(entity, entityInformation.getCollectionName()).then(Mono.just(entity)) : //
mongoOperations.save(entity, entityInformation.getCollectionName()).then(Mono.just(entity)));
}
/*

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Optional;
import java.util.Set;
import java.util.regex.Pattern;
@@ -27,10 +28,12 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -93,7 +96,7 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
return super.visit(expr, context);
}
return converter.convertToMongoType(expr.getConstant());
return toQuerydslMongoType(expr.getConstant());
}
/*
@@ -128,7 +131,8 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
Document mappedIdValue = mapper.getMappedObject((BasicDBObject) superIdValue, Optional.empty());
return (DBObject) JSON.parse(mappedIdValue.toJson());
}
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
return super.asDBObject(key, value instanceof Pattern ? value : toQuerydslMongoType(value));
}
/*
@@ -231,4 +235,25 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
return property;
}
private Object toQuerydslMongoType(Object source) {
Object target = converter.convertToMongoType(source);
if (target instanceof List) {
List<Object> newList = new BasicDBList();
for (Object item : (List) target) {
if (item instanceof Document) {
newList.add(new BasicDBObject(BsonUtils.asMap((Document) item)));
} else {
newList.add(item);
}
}
return newList;
}
return target;
}
}

View File

@@ -27,7 +27,7 @@ public class PersonWithVersionPropertyOfTypeLong {
@Override
public String toString() {
return "PersonWithVersionPropertyOfTypeInteger [id=" + id + ", firstName=" + firstName + ", age=" + age
return "PersonWithVersionPropertyOfTypeLong [id=" + id + ", firstName=" + firstName + ", age=" + age
+ ", version=" + version + "]";
}
}

View File

@@ -15,31 +15,25 @@
*/
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.junit.Assert.assertThat;
import static org.springframework.test.util.ReflectionTestUtils.*;
import java.net.UnknownHostException;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
* Unit tests for {@link SimpleMongoDbFactory}.
*
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@@ -49,10 +43,15 @@ public class SimpleMongoDbFactoryUnitTests {
public @Rule ExpectedException expectedException = ExpectedException.none();
@Mock MongoClient mongo;
@Test // DATADOC-254
@Test // DATADOC-254, DATAMONGO-1903
public void rejectsIllegalDatabaseNames() {
rejectsDatabaseName("foo.bar");
rejectsDatabaseName("foo!bar");
rejectsDatabaseName("foo$bar");
rejectsDatabaseName("foo\\bar");
rejectsDatabaseName("foo//bar");
rejectsDatabaseName("foo bar");
rejectsDatabaseName("foo\"bar");
}
@Test // DATADOC-254
@@ -65,7 +64,7 @@ public class SimpleMongoDbFactoryUnitTests {
@Test // DATADOC-295
@SuppressWarnings("deprecation")
public void mongoUriConstructor() throws UnknownHostException {
public void mongoUriConstructor() {
MongoClientURI mongoURI = new MongoClientURI("mongodb://myUsername:myPassword@localhost/myDatabase.myCollection");
MongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongoURI);
@@ -74,7 +73,7 @@ public class SimpleMongoDbFactoryUnitTests {
}
@Test // DATAMONGO-1158
public void constructsMongoClientAccordingToMongoUri() throws UnknownHostException {
public void constructsMongoClientAccordingToMongoUri() {
MongoClientURI uri = new MongoClientURI("mongodb://myUserName:myPassWord@127.0.0.1:27017/myDataBase.myCollection");
SimpleMongoDbFactory factory = new SimpleMongoDbFactory(uri);
@@ -84,12 +83,7 @@ public class SimpleMongoDbFactoryUnitTests {
@SuppressWarnings("deprecation")
private void rejectsDatabaseName(String databaseName) {
try {
new SimpleMongoDbFactory(mongo, databaseName);
fail("Expected database name " + databaseName + " to be rejected!");
} catch (IllegalArgumentException ex) {
}
assertThatThrownBy(() -> new SimpleMongoDbFactory(mongo, databaseName))
.isInstanceOf(IllegalArgumentException.class);
}
}

View File

@@ -0,0 +1,54 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoDatabase;
/**
* Unit tests for {@link SimpleReactiveMongoDatabaseFactory}.
*
* @author Mark Paluch
*/
@RunWith(MockitoJUnitRunner.class)
public class SimpleReactiveMongoDatabaseFactoryUnitTests {
@Mock MongoClient mongoClient;
@Mock MongoDatabase database;
@Test // DATAMONGO-1903
public void rejectsIllegalDatabaseNames() {
rejectsDatabaseName("foo.bar");
rejectsDatabaseName("foo$bar");
rejectsDatabaseName("foo\\bar");
rejectsDatabaseName("foo//bar");
rejectsDatabaseName("foo bar");
rejectsDatabaseName("foo\"bar");
}
private void rejectsDatabaseName(String databaseName) {
assertThatThrownBy(() -> new SimpleReactiveMongoDatabaseFactory(mongoClient, databaseName))
.isInstanceOf(IllegalArgumentException.class);
}
}

View File

@@ -33,6 +33,7 @@ import org.springframework.data.mongodb.core.aggregation.ArrayOperators.Reduce.P
import org.springframework.data.mongodb.core.aggregation.ArrayOperators.Reduce.Variable;
import org.springframework.data.mongodb.core.aggregation.ArrayOperators.Slice;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Switch.CaseOperator;
import org.springframework.data.mongodb.core.aggregation.DateOperators.Timezone;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.StringOperators.Concat;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
@@ -197,10 +198,23 @@ public class ProjectionOperationUnitTests {
assertThat(oper.get(MOD)).isEqualTo((Object) Arrays.<Object> asList("$a", 3));
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-758
public void excludeShouldThrowExceptionForFieldsOtherThanUnderscoreId() {
@Test // DATAMONGO-758, DATAMONGO-1893
public void excludeShouldAllowExclusionOfFieldsOtherThanUnderscoreId/* since MongoDB 3.4 */() {
new ProjectionOperation().andExclude("foo");
ProjectionOperation projectionOp = new ProjectionOperation().andExclude("foo");
Document document = projectionOp.toDocument(Aggregation.DEFAULT_CONTEXT);
Document projectClause = DocumentTestUtils.getAsDocument(document, PROJECT);
assertThat(projectionOp.inheritsFields()).isTrue();
assertThat((Integer) projectClause.get("foo")).isEqualTo(0);
}
@Test // DATAMONGO-1893
public void includeShouldNotInheritFields() {
ProjectionOperation projectionOp = new ProjectionOperation().andInclude("foo");
assertThat(projectionOp.inheritsFields()).isFalse();
}
@Test // DATAMONGO-758
@@ -1066,6 +1080,40 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { dayOfYear: { $dayOfYear: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDayOfYearAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).dayOfYear()).as("dayOfYear")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { dayOfYear: { $dayOfYear: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderTimeZoneFromField() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.ofField("tz")).dayOfYear()).as("dayOfYear")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { dayOfYear: { $dayOfYear: { \"date\" : \"$date\", \"timezone\" : \"$tz\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderTimeZoneFromExpression() {
Document agg = project()
.and(DateOperators.dateOf("date")
.withTimezone(Timezone.ofExpression(LiteralOperators.valueOf("America/Chicago").asLiteral())).dayOfYear())
.as("dayOfYear").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { dayOfYear: { $dayOfYear: { \"date\" : \"$date\", \"timezone\" : { $literal: \"America/Chicago\"} } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderDayOfMonthAggregationExpression() {
@@ -1075,6 +1123,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { day: { $dayOfMonth: \"$date\" }} }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDayOfMonthAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).dayOfMonth()).as("day")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { day: { $dayOfMonth: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderDayOfWeekAggregationExpression() {
@@ -1084,6 +1143,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { dayOfWeek: { $dayOfWeek: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDayOfWeekAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).dayOfWeek()).as("dayOfWeek")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { dayOfWeek: { $dayOfWeek: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderYearAggregationExpression() {
@@ -1093,6 +1163,16 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { year: { $year: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderYearAggregationExpressionWithTimezone() {
Document agg = project().and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).year())
.as("year").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project: { year: { $year: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderMonthAggregationExpression() {
@@ -1102,6 +1182,16 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { month: { $month: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderMonthAggregationExpressionWithTimezone() {
Document agg = project().and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).month())
.as("month").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project: { month: { $month: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderWeekAggregationExpression() {
@@ -1111,6 +1201,16 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { week: { $week: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderWeekAggregationExpressionWithTimezone() {
Document agg = project().and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).week())
.as("week").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project: { week: { $week: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderHourAggregationExpression() {
@@ -1120,6 +1220,16 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { hour: { $hour: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderHourAggregationExpressionWithTimezone() {
Document agg = project().and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).hour())
.as("hour").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project: { hour: { $hour: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderMinuteAggregationExpression() {
@@ -1129,6 +1239,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { minute: { $minute: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderMinuteAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).minute()).as("minute")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { minute: { $minute: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderSecondAggregationExpression() {
@@ -1138,6 +1259,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { second: { $second: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderSecondAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).second()).as("second")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { second: { $second: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderMillisecondAggregationExpression() {
@@ -1147,6 +1279,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project: { msec: { $millisecond: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderMillisecondAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).millisecond()).as("msec")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { msec: { $millisecond: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderDateToString() {
@@ -1167,6 +1310,17 @@ public class ProjectionOperationUnitTests {
Document.parse("{ $project: { time: { $dateToString: { format: \"%H:%M:%S:%L\", date: \"$date\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateToStringAggregationExpressionWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).toString("%H:%M:%S:%L"))
.as("time").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project: { time: { $dateToString: { format: \"%H:%M:%S:%L\", date: \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1536
public void shouldRenderSumAggregationExpression() {
@@ -1423,11 +1577,11 @@ public class ProjectionOperationUnitTests {
.as("finalTotal").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse("{ $project:{ \"finalTotal\" : { \"$let\": {" + //
"\"vars\": {" + //
"\"total\": { \"$add\": [ \"$price\", \"$tax\" ] }," + //
"\"discounted\": { \"$cond\": { \"if\": \"$applyDiscount\", \"then\": 0.9, \"else\": 1.0 } }" + //
"}," + //
"\"in\": { \"$multiply\": [ \"$$total\", \"$$discounted\" ] }" + //
"}}}}"));
}
@@ -1446,11 +1600,11 @@ public class ProjectionOperationUnitTests {
.as("finalTotal").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse("{ $project:{ \"finalTotal\" : { \"$let\": {" + //
"\"vars\": {" + //
"\"total\": { \"$add\": [ \"$price\", \"$tax\" ] }," + //
"\"discounted\": { \"$cond\": { \"if\": \"$applyDiscount\", \"then\": 0.9, \"else\": 1.0 } }" + //
"}," + //
"\"in\": { \"$multiply\": [ \"$$total\", \"$$discounted\" ] }" + //
"}}}}"));
}
@@ -1643,6 +1797,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project : { dayOfWeek: { $isoDayOfWeek: \"$birthday\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderIsoDayOfWeekWithTimezoneCorrectly() {
Document agg = project()
.and(DateOperators.dateOf("birthday").withTimezone(Timezone.valueOf("America/Chicago")).isoDayOfWeek())
.as("dayOfWeek").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { dayOfWeek: { $isoDayOfWeek: { \"date\" : \"$birthday\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1548
public void shouldRenderIsoWeekCorrectly() {
@@ -1652,6 +1817,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project : { weekNumber: { $isoWeek: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderIsoWeekWithTimezoneCorrectly() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).isoWeek()).as("weekNumber")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { weekNumber: { $isoWeek: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1548
public void shouldRenderIsoWeekYearCorrectly() {
@@ -1661,6 +1837,17 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project : { yearNumber: { $isoWeekYear: \"$date\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderIsoWeekYearWithTimezoneCorrectly() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).isoWeekYear())
.as("yearNumber").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { yearNumber: { $isoWeekYear: { \"date\" : \"$date\", \"timezone\" : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1548
public void shouldRenderSwitchCorrectly() {
@@ -1711,6 +1898,128 @@ public class ProjectionOperationUnitTests {
assertThat(agg).isEqualTo(Document.parse("{ $project : { a: { $type: \"$a\" } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateFromPartsWithJustTheYear() {
Document agg = project().and(DateOperators.dateFromParts().year(2018)).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse("{ $project : { newDate: { $dateFromParts: { year : 2018 } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateFromParts() {
Document agg = project()
.and(DateOperators.dateFromParts().year(2018).month(3).day(23).hour(14).minute(25).second(10).milliseconds(2))
.as("newDate").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { newDate: { $dateFromParts: { year : 2018, month : 3, day : 23, hour : 14, minute : 25, second : 10, milliseconds : 2 } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateFromPartsWithTimezone() {
Document agg = project()
.and(DateOperators.dateFromParts().withTimezone(Timezone.valueOf("America/Chicago")).year(2018)).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project : { newDate: { $dateFromParts: { year : 2018, timezone : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderIsoDateFromPartsWithJustTheYear() {
Document agg = project().and(DateOperators.dateFromParts().isoWeekYear(2018)).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse("{ $project : { newDate: { $dateFromParts: { isoWeekYear : 2018 } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderIsoDateFromParts() {
Document agg = project().and(DateOperators.dateFromParts().isoWeekYear(2018).isoWeek(12).isoDayOfWeek(5).hour(14)
.minute(30).second(42).milliseconds(2)).as("newDate").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { newDate: { $dateFromParts: { isoWeekYear : 2018, isoWeek : 12, isoDayOfWeek : 5, hour : 14, minute : 30, second : 42, milliseconds : 2 } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderIsoDateFromPartsWithTimezone() {
Document agg = project()
.and(DateOperators.dateFromParts().withTimezone(Timezone.valueOf("America/Chicago")).isoWeekYear(2018))
.as("newDate").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { newDate: { $dateFromParts: { isoWeekYear : 2018, timezone : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateToParts() {
Document agg = project().and(DateOperators.dateOf("date").toParts()).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse("{ $project : { newDate: { $dateToParts: { date : \"$date\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateToIsoParts() {
Document agg = project().and(DateOperators.dateOf("date").toParts().iso8601()).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(
Document.parse("{ $project : { newDate: { $dateToParts: { date : \"$date\", iso8601 : true } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateToPartsWithTimezone() {
Document agg = project()
.and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).toParts()).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project : { newDate: { $dateToParts: { date : \"$date\", timezone : \"America/Chicago\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateFromString() {
Document agg = project().and(DateOperators.dateFromString("2017-02-08T12:10:40.787")).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document
.parse("{ $project : { newDate: { $dateFromString: { dateString : \"2017-02-08T12:10:40.787\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateFromStringWithFieldReference() {
Document agg = project().and(DateOperators.dateOf("date").fromString()).as("newDate")
.toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg)
.isEqualTo(Document.parse("{ $project : { newDate: { $dateFromString: { dateString : \"$date\" } } } }"));
}
@Test // DATAMONGO-1834
public void shouldRenderDateFromStringWithTimezone() {
Document agg = project()
.and(DateOperators.dateFromString("2017-02-08T12:10:40.787").withTimezone(Timezone.valueOf("America/Chicago")))
.as("newDate").toDocument(Aggregation.DEFAULT_CONTEXT);
assertThat(agg).isEqualTo(Document.parse(
"{ $project : { newDate: { $dateFromString: { dateString : \"2017-02-08T12:10:40.787\", timezone : \"America/Chicago\" } } } }"));
}
private static Document exctractOperation(String field, Document fromProjectClause) {
return (Document) fromProjectClause.get(field);
}

View File

@@ -22,6 +22,8 @@ import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import lombok.AllArgsConstructor;
import java.util.Arrays;
import java.util.List;
@@ -35,8 +37,8 @@ import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
@@ -188,6 +190,19 @@ public class TypeBasedAggregationOperationContextUnitTests {
.containing("age", "$age.value"));
}
@Test // DATAMONGO-1893
public void considersIncludedFieldsFromSingleExclusionsCorrectly() {
AggregationOperationContext context = getContext(FooPerson.class);
TypedAggregation<FooPerson> agg = newAggregation(FooPerson.class, project() //
.andExclude("name"), sort(Sort.by("age.value", "lastName")));
Document dbo = agg.toDocument("person", context);
Document sort = getPipelineElementFromAggregationAt(dbo, 1);
assertThat(getAsDocument(sort, "$sort"), is(equalTo(new Document("age.value", 1).append("last_name", 1))));
}
@Test // DATAMONGO-1133
public void shouldHonorAliasedFieldsInGroupExpressions() {
@@ -344,18 +359,13 @@ public class TypeBasedAggregationOperationContextUnitTests {
}
@org.springframework.data.mongodb.core.mapping.Document(collection = "person")
@AllArgsConstructor
public static class FooPerson {
final ObjectId id;
final String name;
@org.springframework.data.mongodb.core.mapping.Field("last_name") final String lastName;
final Age age;
@PersistenceConstructor
FooPerson(ObjectId id, String name, Age age) {
this.id = id;
this.name = name;
this.age = age;
}
}
public static class Age {

View File

@@ -16,8 +16,13 @@
package org.springframework.data.mongodb.core.convert;
import static java.time.ZoneId.*;
import static org.assertj.core.api.Assertions.*;
import static org.assertj.core.api.Assertions.assertThat;
import static org.hamcrest.Matchers.*;
import static org.hamcrest.Matchers.not;
import static org.junit.Assert.*;
import static org.junit.Assert.assertThat;
import static org.junit.Assert.fail;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.DocumentTestUtils.*;
@@ -61,6 +66,7 @@ import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.ConversionNotSupportedException;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.core.convert.ConverterNotFoundException;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
@@ -1821,6 +1827,65 @@ public class MappingMongoConverterUnitTests {
assertThat(converter.read(WithArrayInConstructor.class, source).array, is(nullValue()));
}
@Test // DATAMONGO-1898
public void writesInterfaceBackedEnumsToSimpleNameByDefault() {
org.bson.Document document = new org.bson.Document();
DocWithInterfacedEnum source = new DocWithInterfacedEnum();
source.property = InterfacedEnum.INSTANCE;
converter.write(source, document);
assertThat(document) //
.hasSize(2) //
.hasEntrySatisfying("_class", __ -> {}) //
.hasEntrySatisfying("property", value -> assertThat(value).isEqualTo(InterfacedEnum.INSTANCE.name()));
}
@Test // DATAMONGO-1898
public void rejectsConversionFromStringToEnumBackedInterface() {
org.bson.Document document = new org.bson.Document("property", InterfacedEnum.INSTANCE.name());
assertThatExceptionOfType(ConverterNotFoundException.class) //
.isThrownBy(() -> converter.read(DocWithInterfacedEnum.class, document));
}
@Test // DATAMONGO-1898
public void readsInterfacedEnumIfConverterIsRegistered() {
org.bson.Document document = new org.bson.Document("property", InterfacedEnum.INSTANCE.name());
Converter<String, SomeInterface> enumConverter = new Converter<String, SomeInterface>() {
@Override
public SomeInterface convert(String source) {
return InterfacedEnum.valueOf(source);
}
};
converter.setCustomConversions(new MongoCustomConversions(Arrays.asList(enumConverter)));
converter.afterPropertiesSet();
DocWithInterfacedEnum result = converter.read(DocWithInterfacedEnum.class, document);
assertThat(result.property).isEqualTo(InterfacedEnum.INSTANCE);
}
@Test // DATAMONGO-1904
public void readsNestedArraysCorrectly() {
List<List<List<Float>>> floats = Arrays.asList(Arrays.asList(Arrays.asList(1.0f, 2.0f)));
org.bson.Document document = new org.bson.Document("nestedFloats", floats);
WithNestedLists result = converter.read(WithNestedLists.class, document);
assertThat(result.nestedFloats).hasSize(1);
assertThat(result.nestedFloats).isEqualTo(new float[][][] { { { 1.0f, 2.0f } } });
}
static class GenericType<T> {
T content;
}
@@ -2182,4 +2247,23 @@ public class MappingMongoConverterUnitTests {
final String[] array;
}
// DATAMONGO-1898
// DATACMNS-1278
static interface SomeInterface {}
static enum InterfacedEnum implements SomeInterface {
INSTANCE;
}
static class DocWithInterfacedEnum {
SomeInterface property;
}
// DATAMONGO-1904
static class WithNestedLists {
float[][][] nestedFloats;
}
}

View File

@@ -27,6 +27,7 @@ import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Optional;
import java.util.UUID;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@@ -1094,6 +1095,17 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(users.get(0), is(dave));
}
@Test // DATAMONGO-1911
public void findByUUIDShouldReturnCorrectResult() {
dave.setUniqueId(UUID.randomUUID());
repository.save(dave);
Person dave = repository.findByUniqueId(this.dave.getUniqueId());
assertThat(dave, is(equalTo(this.dave)));
}
@Test // DATAMONGO-1245
public void findByExampleShouldResolveStuffCorrectly() {

View File

@@ -19,6 +19,7 @@ import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Set;
import java.util.UUID;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
@@ -34,6 +35,7 @@ import org.springframework.data.mongodb.core.mapping.Field;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
@Document
public class Person extends Contact {
@@ -56,6 +58,8 @@ public class Person extends Contact {
private @Field("add") Address address;
private Set<Address> shippingAddresses;
private UUID uniqueId;
@DBRef User creator;
@DBRef(lazy = true) User coworker;
@@ -196,6 +200,14 @@ public class Person extends Contact {
this.shippingAddresses = addresses;
}
public UUID getUniqueId() {
return uniqueId;
}
public void setUniqueId(UUID uniqueId) {
this.uniqueId = uniqueId;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.Contact#getName()
*/

View File

@@ -19,6 +19,7 @@ import java.util.Collection;
import java.util.Date;
import java.util.List;
import java.util.Optional;
import java.util.UUID;
import java.util.stream.Stream;
import org.springframework.data.domain.Page;
@@ -318,6 +319,10 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
@Query("{ firstname : :#{#firstname}}")
List<Person> findWithSpelByFirstnameForSpELExpressionWithParameterVariableOnly(@Param("firstname") String firstname);
// DATAMONGO-1911
@Query("{ uniqueId: ?0}")
Person findByUniqueId(UUID uniqueId);
/**
* Returns the count of {@link Person} with the given firstname. Uses {@link CountQuery} annotation to define the
* query to be executed.

View File

@@ -54,6 +54,7 @@ import org.springframework.util.ClassUtils;
*
* @author Mark Paluch
* @author Christoph Strobl
* @author Ruben J Garcia
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:reactive-infrastructure.xml")
@@ -441,6 +442,14 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
StepVerifier.create(repository.findOne(example)).expectError(IncorrectResultSizeDataAccessException.class);
}
@Test // DATAMONGO-1907
public void findOneByExampleWithoutResultShouldCompleteEmpty() {
Example<ReactivePerson> example = Example.of(new ReactivePerson("foo", "bar", -1));
StepVerifier.create(repository.findOne(example)).verifyComplete();
}
interface ReactivePersonRepostitory extends ReactiveMongoRepository<ReactivePerson, String> {
Flux<ReactivePerson> findByLastname(String lastname);

View File

@@ -26,6 +26,7 @@ import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import javax.xml.bind.DatatypeConverter;
@@ -321,6 +322,34 @@ public class StringBasedMongoQueryUnitTests {
assertThat(query.getQueryObject().toJson(), is(reference.getQueryObject().toJson()));
}
@Test // DATAMONGO-1911
public void shouldSupportNonQuotedUUIDReplacement() {
UUID uuid = UUID.fromString("864de43b-e3ea-f1e4-3663-fb8240b659b9");
ConvertingParameterAccessor accessor = StubParameterAccessor.getAccessor(converter, (Object) uuid);
StringBasedMongoQuery mongoQuery = createQueryForMethod("findByLastnameAsUUID", UUID.class);
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accessor);
org.springframework.data.mongodb.core.query.Query reference = new BasicQuery(
"{'lastname' : { $binary:\"5PHq4zvkTYa5WbZAgvtjNg==\", $type: \"03\"}}");
assertThat(query.getQueryObject().toJson(), is(reference.getQueryObject().toJson()));
}
@Test // DATAMONGO-1911
public void shouldSupportQuotedUUIDReplacement() {
UUID uuid = UUID.randomUUID();
ConvertingParameterAccessor accessor = StubParameterAccessor.getAccessor(converter, (Object) uuid);
StringBasedMongoQuery mongoQuery = createQueryForMethod("findByLastnameAsStringUUID", UUID.class);
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accessor);
org.springframework.data.mongodb.core.query.Query reference = new BasicQuery(
"{'lastname' : '" + uuid.toString() + "'}");
assertThat(query.getQueryObject().toJson(), is(reference.getQueryObject().toJson()));
}
@Test // DATAMONGO-1454
public void shouldSupportExistsProjection() {
@@ -551,6 +580,12 @@ public class StringBasedMongoQueryUnitTests {
@Query("{ 'lastname' : ?0 }")
Person findByLastnameAsBinary(byte[] lastname);
@Query("{ 'lastname' : ?0 }")
Person findByLastnameAsUUID(UUID lastname);
@Query("{ 'lastname' : '?0' }")
Person findByLastnameAsStringUUID(UUID lastname);
@Query("{ 'lastname' : '?0' }")
Person findByLastnameQuoted(String lastname);

View File

@@ -32,9 +32,9 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher.StringMatcher;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.ExampleMatcher.*;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
@@ -365,6 +365,24 @@ public class SimpleMongoRepositoryTests {
assertThat(repository.count(Example.of(sample))).isEqualTo(2L);
}
@Test // DATAMONGO-1896
public void saveAllUsesEntityCollection() {
Person first = new PersonExtended();
first.setEmail("foo@bar.com");
ReflectionTestUtils.setField(first, "id", null);
Person second = new PersonExtended();
second.setEmail("bar@foo.com");
ReflectionTestUtils.setField(second, "id", null);
repository.deleteAll();
repository.saveAll(Arrays.asList(first, second));
assertThat(repository.findAll()).containsExactlyInAnyOrder(first, second);
}
private void assertThatAllReferencePersonsWereStoredCorrectly(Map<String, Person> references, List<Person> saved) {
for (Person person : saved) {

View File

@@ -45,6 +45,8 @@ import org.springframework.data.mongodb.repository.QPerson;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
import com.querydsl.core.types.dsl.BooleanExpression;
import com.querydsl.core.types.dsl.BooleanOperation;
import com.querydsl.core.types.dsl.PathBuilder;
import com.querydsl.core.types.dsl.SimplePath;
@@ -184,6 +186,16 @@ public class SpringDataMongodbSerializerUnitTests {
assertThat(((DBObject) mappedPredicate).get("sex"), is((Object) "f"));
}
@Test // DATAMONGO-1943
public void shouldRemarshallListsAndDocuments() {
BooleanExpression criteria = QPerson.person.firstname.isNotEmpty()
.and(QPerson.person.firstname.containsIgnoreCase("foo")).not();
assertThat(this.serializer.handle(criteria), is(equalTo(JSON.parse("{ \"$or\" : [ { \"firstname\" : { \"$ne\" : { "
+ "\"$ne\" : \"\"}}} , { \"firstname\" : { \"$not\" : { \"$regex\" : \".*\\\\Qfoo\\\\E.*\" , \"$options\" : \"i\"}}}]}"))));
}
class Address {
String id;
String street;

View File

@@ -1833,7 +1833,7 @@ At the time of this writing we provide support for the following Aggregation Ope
| literal
| Date Aggregation Operators
| dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString, isoDayOfWeek, isoWeek, isoWeekYear
| dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString, dateFromString, dateFromParts, dateToParts, isoDayOfWeek, isoWeek, isoWeekYear
| Variable Operators
| map
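For illustration, here is a minimal sketch of how the newly listed operators (dateFromParts, dateToParts, dateFromString) can be combined in a single $project stage, using the same DateOperators API exercised in the unit tests above. The holder class, the "date" field, and the output aliases are illustrative assumptions and not part of the commits.
import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.DateOperators;
import org.springframework.data.mongodb.core.aggregation.DateOperators.Timezone;
class DateOperatorExamples { // hypothetical holder class, for illustration only
    Document renderDateProjection() {
        // Builds $dateFromParts, $dateToParts and $dateFromString expressions,
        // mirroring the assertions in ProjectionOperationUnitTests above.
        return project()
                .and(DateOperators.dateFromParts().year(2018).month(3).day(23)).as("assembled")
                .and(DateOperators.dateOf("date").withTimezone(Timezone.valueOf("America/Chicago")).toParts()).as("parts")
                .and(DateOperators.dateFromString("2017-02-08T12:10:40.787")).as("parsed")
                .toDocument(Aggregation.DEFAULT_CONTEXT);
    }
}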

View File

@@ -1,6 +1,93 @@
Spring Data MongoDB Changelog
=============================
Changes in version 2.0.7.RELEASE (2018-05-08)
---------------------------------------------
* DATAMONGO-1943 - QueryDSL functionality defect while using NOT operator for negate operation.
* DATAMONGO-1918 - Release 2.0.7 (Kay SR7).
* DATAMONGO-1893 - Exclusions in projection.
* DATAMONGO-1819 - Discovering the preferred constructor leads to InaccessibleObjectException with Java 9.
Changes in version 1.10.12.RELEASE (2018-05-08)
-----------------------------------------------
* DATAMONGO-1917 - Release 1.10.12 (Ingalls SR12).
* DATAMONGO-1893 - Exclusions in projection.
Changes in version 2.1.0.M2 (2018-04-13)
----------------------------------------
* DATAMONGO-1923 - Adapt to API changes in Spring Data Commons.
* DATAMONGO-1916 - Potential ClassCastException in MappingMongoConverter#writeInternal when writing collections.
* DATAMONGO-1915 - Remove explicit declaration of Jackson library versions.
* DATAMONGO-1913 - Missing @Nullable annotation in GridFsTemplate.bucket.
* DATAMONGO-1912 - Unable to retrieve Mongo's generated id when inserting java.util.Map.
* DATAMONGO-1911 - String-query accepting UUID renders to $uuid.
* DATAMONGO-1909 - Fix typo in tests.
* DATAMONGO-1907 - Error in method findOne(Example example) in ReactiveQueryByExampleExecutor when there are no results.
* DATAMONGO-1906 - Add support for conditional $$REMOVE operator in aggregation $project.
* DATAMONGO-1904 - MappingMongoConverter fails to properly read nested arrays.
* DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
* DATAMONGO-1901 - Declare project.root property to make sure JavaDoc generation works.
* DATAMONGO-1900 - Add section to reference documentation about mapping of enums with interfaces.
* DATAMONGO-1899 - Export composable repositories via CDI.
* DATAMONGO-1898 - Verify handling of enums in MappingMongoConverter.
* DATAMONGO-1896 - In case of inheritance, MongoRepository.saveAll(…) does not insert elements into the aggregate collection.
* DATAMONGO-1893 - Exclusions in projection.
* DATAMONGO-1891 - Improve $jsonSchema documentation.
* DATAMONGO-1881 - Upgrade MongoDB sync & reactive streams Java driver to 3.6.3 and 1.7.1.
* DATAMONGO-1880 - Add support for Client Sessions.
* DATAMONGO-1877 - JsonSchemaProperty needs static method date().
* DATAMONGO-1873 - Add value() alias to @Document(collection= "…").
* DATAMONGO-1872 - SpEL Expressions in @Document annotations are not re-evaluated for repository query executions.
* DATAMONGO-1871 - AggregationExpression rendering does not consider nested property aliasing.
* DATAMONGO-1870 - Skip parameter not working in MongoTemplate#remove(Query, Class).
* DATAMONGO-1869 - Release 2.1 M2 (Lovelace).
* DATAMONGO-1866 - Update travis build to use single node replica set.
* DATAMONGO-1865 - findFirst query method throws IncorrectResultSizeDataAccessException on non-unique result.
* DATAMONGO-1860 - Mongo count operation called twice in QuerydslMongoPredicateExecutor.findAll(Predicate, Pageable).
* DATAMONGO-1834 - Add support for aggregation operators $dateFromString, $dateFromParts and $dateToParts.
* DATAMONGO-1819 - Discovering the preferred constructor leads to InaccessibleObjectException with Java 9.
* DATAMONGO-1813 - Provide method to create GridFsResource from GridFSFile.
Changes in version 2.0.6.RELEASE (2018-04-04)
---------------------------------------------
* DATAMONGO-1916 - Potential ClassCastException in MappingMongoConverter#writeInternal when writing collections.
* DATAMONGO-1915 - Remove explicit declaration of Jackson library versions.
* DATAMONGO-1913 - Missing @Nullable annotation in GridFsTemplate.bucket.
* DATAMONGO-1911 - String-query accepting UUID renders to $uuid.
* DATAMONGO-1909 - Fix typo in tests.
* DATAMONGO-1907 - Error in method findOne(Example example) in ReactiveQueryByExampleExecutor when there are no results.
* DATAMONGO-1904 - MappingMongoConverter fails to properly read nested arrays.
* DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
* DATAMONGO-1901 - Declare project.root property to make sure JavaDoc generation works.
* DATAMONGO-1898 - Verify handling of enums in MappingMongoConverter.
* DATAMONGO-1896 - In case of inheritance, MongoRepository.saveAll(…) does not insert elements into the aggregate collection.
* DATAMONGO-1888 - Release 2.0.6 (Kay SR6).
* DATAMONGO-1834 - Add support for aggregation operators $dateFromString, $dateFromParts and $dateToParts.
Changes in version 1.10.11.RELEASE (2018-04-04)
-----------------------------------------------
* DATAMONGO-1915 - Remove explicit declaration of Jackson library versions.
* DATAMONGO-1909 - Fix typo in tests.
* DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
* DATAMONGO-1898 - Verify handling of enums in MappingMongoConverter.
* DATAMONGO-1896 - In case of inheritance, MongoRepository.saveAll(…) does not insert elements into the aggregate collection.
* DATAMONGO-1871 - AggregationExpression rendering does not consider nested property aliasing.
* DATAMONGO-1870 - Skip parameter not working in MongoTemplate#remove(Query, Class).
* DATAMONGO-1860 - Mongo count operation called twice in QuerydslMongoPredicateExecutor.findAll(Predicate, Pageable).
* DATAMONGO-1858 - Fix line endings.
* DATAMONGO-1857 - Release 1.10.11 (Ingalls SR11).
* DATAMONGO-1834 - Add support for aggregation operators $dateFromString, $dateFromParts and $dateToParts.
Changes in version 2.0.5.RELEASE (2018-02-28)
---------------------------------------------
* DATAMONGO-1882 - Release 2.0.5 (Kay SR5).
Changes in version 2.0.4.RELEASE (2018-02-19)
---------------------------------------------
* DATAMONGO-1872 - SpEL Expressions in @Document annotations are not re-evaluated for repository query executions.

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 2.0.4
Spring Data MongoDB 2.0.7
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").