Compare commits

...

19 Commits

Author SHA1 Message Date
Oliver Gierke
f4e730ce87 DATAMONGO-1793 - Release version 2.0.1 (Kay SR1). 2017-10-27 15:25:11 +02:00
Oliver Gierke
e3a83ebc42 DATAMONGO-1793 - Prepare 2.0.1 (Kay SR1). 2017-10-27 15:24:24 +02:00
Oliver Gierke
f65c1e324e DATAMONGO-1793 - Updated changelog. 2017-10-27 15:24:14 +02:00
Oliver Gierke
1dd0061f03 DATAMONGO-1815 - Adapt API changes in Property in test cases. 2017-10-27 11:13:31 +02:00
Mark Paluch
5ea860700c DATAMONGO-1814 - Update reference documentation for faceted classification.
Original pull request: #426.
Original ticket: DATAMONGO-1552.
2017-10-26 09:44:50 +02:00
Christoph Strobl
3dd653a702 DATAMONGO-1811 - Update documentation of MongoOperations.executeCommand.
Update Javadoc and reference documentation.
2017-10-24 14:59:47 +02:00
Christoph Strobl
f87847407b DATAMONGO-1805 - Update GridFsOperations documentation.
Fix return type in reference documentation and update Javadoc.
2017-10-24 14:59:40 +02:00
Christoph Strobl
433a125c9e DATAMONGO-1806 - Polishing.
Remove unused import, trailing whitespaces and update Javadoc.

Original Pull Request: #506
2017-10-24 14:59:33 +02:00
hartmut
5827cb0971 DATAMONGO-1806 - Fix Javadoc for GridFsResource.
Original Pull Request: #506
2017-10-24 14:59:24 +02:00
Mark Paluch
0109bf6858 DATAMONGO-1809 - Introduce AssertJ assertions for Document.
Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
49d1555576 DATAMONGO-1809 - Polishing.
Move tests to AssertJ.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
fdbb305b8e DATAMONGO-1809 - Fix positional parameter detection for PropertyPaths.
We now make sure to capture all digits for positional parameters.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Mark Paluch
49dd03311a DATAMONGO-1696 - Mention appropriate EnableMongoAuditing annotation in reference documentation. 2017-10-20 08:45:33 +02:00
Mark Paluch
a86a3210e1 DATAMONGO-1802 - Polishing.
Reduce converter visibility to MongoConverters' package scope. Minor alignment of Javadoc wording. Update copyright year; create empty byte array with an element count instead of an initializer.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Christoph Strobl
4b655abfb6 DATAMONGO-1802 - Add Binary to byte array converter.
We now provide and register a Binary to byte[] converter so that binary data can be converted to a byte array. MongoDB deserializes binary data through the document API to its Binary type. With this converter, we restore the previous capability to use byte arrays for binary data within domain types.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Oliver Gierke
0963e6cf77 DATAMONGO-1775 - Updated changelog. 2017-10-11 19:03:29 +02:00
Oliver Gierke
3e1b2c4bdb DATAMONGO-1795 - Removed obsolete Kotlin build setup. 2017-10-04 11:05:27 +02:00
Mark Paluch
03e0e0c431 DATAMONGO-1776 - After release cleanups. 2017-10-02 11:38:04 +02:00
Mark Paluch
51900021a1 DATAMONGO-1776 - Prepare next development iteration. 2017-10-02 11:38:03 +02:00
22 changed files with 938 additions and 305 deletions

View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
</parent>
<modules>
@@ -27,7 +27,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.0.0.RELEASE</springdata.commons>
<springdata.commons>2.0.1.RELEASE</springdata.commons>
<mongo>3.5.0</mongo>
<mongo.reactivestreams>1.6.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -49,7 +49,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
</dependency>
<!-- reactive -->

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.1.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -288,74 +288,9 @@
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>kotlin-maven-plugin</artifactId>
<groupId>org.jetbrains.kotlin</groupId>
<version>${kotlin}</version>
<configuration>
<jvmTarget>${source.level}</jvmTarget>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/main/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
<execution>
<id>test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>test-compile</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/test/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>default-compile</id>
<phase>none</phase>
</execution>
<execution>
<id>default-testCompile</id>
<phase>none</phase>
</execution>
<execution>
<id>java-compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>java-test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
@@ -384,7 +319,6 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
@@ -406,6 +340,8 @@
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -72,11 +72,11 @@ public interface MongoOperations extends FluentMongoOperations {
String getCollectionName(Class<?> entityClass);
/**
* Execute the a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* Execute the a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* obtain the {@link Document} holding the actual command. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
* @param jsonCommand a MongoDB command expressed as a JSON string. Must not be {@literal null}.
* @return a result object returned by the action.
*/
Document executeCommand(String jsonCommand);
@@ -851,8 +851,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details.
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -908,8 +908,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details.
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
*/
@@ -925,8 +925,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion"</a> for more details.
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.convert;
import reactor.core.publisher.Flux;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
@@ -29,9 +27,9 @@ import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import org.bson.Document;
import org.bson.types.Binary;
import org.bson.types.Code;
import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.ConditionalConverter;
@@ -41,6 +39,7 @@ import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils;
@@ -66,9 +65,9 @@ abstract class MongoConverters {
* @return
* @since 1.9
*/
public static Collection<Object> getConvertersToRegister() {
static Collection<Object> getConvertersToRegister() {
List<Object> converters = new ArrayList<Object>();
List<Object> converters = new ArrayList<>();
converters.add(BigDecimalToStringConverter.INSTANCE);
converters.add(StringToBigDecimalConverter.INSTANCE);
@@ -86,6 +85,7 @@ abstract class MongoConverters {
converters.add(AtomicLongToLongConverter.INSTANCE);
converters.add(LongToAtomicLongConverter.INSTANCE);
converters.add(IntegerToAtomicIntegerConverter.INSTANCE);
converters.add(BinaryToByteArrayConverter.INSTANCE);
return converters;
}
@@ -95,7 +95,7 @@ abstract class MongoConverters {
*
* @author Oliver Gierke
*/
public static enum ObjectIdToStringConverter implements Converter<ObjectId, String> {
enum ObjectIdToStringConverter implements Converter<ObjectId, String> {
INSTANCE;
public String convert(ObjectId id) {
@@ -108,7 +108,7 @@ abstract class MongoConverters {
*
* @author Oliver Gierke
*/
public static enum StringToObjectIdConverter implements Converter<String, ObjectId> {
enum StringToObjectIdConverter implements Converter<String, ObjectId> {
INSTANCE;
public ObjectId convert(String source) {
@@ -121,7 +121,7 @@ abstract class MongoConverters {
*
* @author Oliver Gierke
*/
public static enum ObjectIdToBigIntegerConverter implements Converter<ObjectId, BigInteger> {
enum ObjectIdToBigIntegerConverter implements Converter<ObjectId, BigInteger> {
INSTANCE;
public BigInteger convert(ObjectId source) {
@@ -134,7 +134,7 @@ abstract class MongoConverters {
*
* @author Oliver Gierke
*/
public static enum BigIntegerToObjectIdConverter implements Converter<BigInteger, ObjectId> {
enum BigIntegerToObjectIdConverter implements Converter<BigInteger, ObjectId> {
INSTANCE;
public ObjectId convert(BigInteger source) {
@@ -142,7 +142,7 @@ abstract class MongoConverters {
}
}
public static enum BigDecimalToStringConverter implements Converter<BigDecimal, String> {
enum BigDecimalToStringConverter implements Converter<BigDecimal, String> {
INSTANCE;
public String convert(BigDecimal source) {
@@ -150,7 +150,7 @@ abstract class MongoConverters {
}
}
public static enum StringToBigDecimalConverter implements Converter<String, BigDecimal> {
enum StringToBigDecimalConverter implements Converter<String, BigDecimal> {
INSTANCE;
public BigDecimal convert(String source) {
@@ -158,7 +158,7 @@ abstract class MongoConverters {
}
}
public static enum BigIntegerToStringConverter implements Converter<BigInteger, String> {
enum BigIntegerToStringConverter implements Converter<BigInteger, String> {
INSTANCE;
public String convert(BigInteger source) {
@@ -166,7 +166,7 @@ abstract class MongoConverters {
}
}
public static enum StringToBigIntegerConverter implements Converter<String, BigInteger> {
enum StringToBigIntegerConverter implements Converter<String, BigInteger> {
INSTANCE;
public BigInteger convert(String source) {
@@ -174,7 +174,7 @@ abstract class MongoConverters {
}
}
public static enum URLToStringConverter implements Converter<URL, String> {
enum URLToStringConverter implements Converter<URL, String> {
INSTANCE;
public String convert(URL source) {
@@ -182,7 +182,7 @@ abstract class MongoConverters {
}
}
public static enum StringToURLConverter implements Converter<String, URL> {
enum StringToURLConverter implements Converter<String, URL> {
INSTANCE;
private static final TypeDescriptor SOURCE = TypeDescriptor.valueOf(String.class);
@@ -199,7 +199,7 @@ abstract class MongoConverters {
}
@ReadingConverter
public static enum DocumentToStringConverter implements Converter<Document, String> {
enum DocumentToStringConverter implements Converter<Document, String> {
INSTANCE;
@@ -219,7 +219,7 @@ abstract class MongoConverters {
* @since 1.6
*/
@WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> {
enum TermToStringConverter implements Converter<Term, String> {
INSTANCE;
@@ -233,7 +233,7 @@ abstract class MongoConverters {
* @author Christoph Strobl
* @since 1.7
*/
public static enum DocumentToNamedMongoScriptConverter implements Converter<Document, NamedMongoScript> {
enum DocumentToNamedMongoScriptConverter implements Converter<Document, NamedMongoScript> {
INSTANCE;
@@ -255,7 +255,7 @@ abstract class MongoConverters {
* @author Christoph Strobl
* @since 1.7
*/
public static enum NamedMongoScriptToDocumentConverter implements Converter<NamedMongoScript, Document> {
enum NamedMongoScriptToDocumentConverter implements Converter<NamedMongoScript, Document> {
INSTANCE;
@@ -282,7 +282,7 @@ abstract class MongoConverters {
* @since 1.9
*/
@WritingConverter
public static enum CurrencyToStringConverter implements Converter<Currency, String> {
enum CurrencyToStringConverter implements Converter<Currency, String> {
INSTANCE;
@@ -303,7 +303,7 @@ abstract class MongoConverters {
* @since 1.9
*/
@ReadingConverter
public static enum StringToCurrencyConverter implements Converter<String, Currency> {
enum StringToCurrencyConverter implements Converter<String, Currency> {
INSTANCE;
@@ -326,7 +326,7 @@ abstract class MongoConverters {
* @since 1.9
*/
@WritingConverter
public static enum NumberToNumberConverterFactory implements ConverterFactory<Number, Number>, ConditionalConverter {
enum NumberToNumberConverterFactory implements ConverterFactory<Number, Number>, ConditionalConverter {
INSTANCE;
@@ -391,7 +391,7 @@ abstract class MongoConverters {
* @since 1.10
*/
@WritingConverter
public static enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> {
enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> {
INSTANCE;
@Override
@@ -407,7 +407,7 @@ abstract class MongoConverters {
* @since 1.10
*/
@WritingConverter
public static enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> {
enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> {
INSTANCE;
@Override
@@ -423,7 +423,7 @@ abstract class MongoConverters {
* @since 1.10
*/
@ReadingConverter
public static enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> {
enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> {
INSTANCE;
@Override
@@ -439,7 +439,7 @@ abstract class MongoConverters {
* @since 1.10
*/
@ReadingConverter
public static enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> {
enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> {
INSTANCE;
@Override
@@ -447,4 +447,22 @@ abstract class MongoConverters {
return source != null ? new AtomicInteger(source) : null;
}
}
/**
* {@link Converter} implementation converting {@link Binary} into {@code byte[]}.
*
* @author Christoph Strobl
* @since 2.0.1
*/
@ReadingConverter
enum BinaryToByteArrayConverter implements Converter<Binary, byte[]> {
INSTANCE;
@Nullable
@Override
public byte[] convert(Binary source) {
return source.getData();
}
}
}
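The new BinaryToByteArrayConverter above follows the enum-with-single-INSTANCE idiom used throughout MongoConverters. A minimal stdlib-only sketch of that idiom; the Converter interface and Binary class here are simplified stand-ins for the Spring and BSON types, not the real APIs:

```java
// Stand-in for org.springframework.core.convert.converter.Converter.
interface Converter<S, T> {
    T convert(S source);
}

// Stand-in for org.bson.types.Binary: wraps raw bytes read from MongoDB.
class Binary {
    private final byte[] data;
    Binary(byte[] data) { this.data = data; }
    byte[] getData() { return data; }
}

// An enum with a single INSTANCE constant yields a stateless,
// thread-safe singleton converter.
enum BinaryToByteArrayConverter implements Converter<Binary, byte[]> {
    INSTANCE;

    @Override
    public byte[] convert(Binary source) {
        return source.getData();
    }
}

public class ConverterSketch {
    public static void main(String[] args) {
        Binary stored = new Binary("calliope-mini".getBytes());
        byte[] restored = BinaryToByteArrayConverter.INSTANCE.convert(stored);
        System.out.println(new String(restored));
    }
}
```

In the real code, returning the converter from `getConvertersToRegister()` is what makes the mapping layer apply it when reading binary fields back into `byte[]` domain properties.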

View File

@@ -930,7 +930,7 @@ public class QueryMapper {
try {
PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d", ""), entity.getTypeInformation());
PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d+", ""), entity.getTypeInformation());
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
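The one-character change above (`\.\d` to `\.\d+`) matters as soon as a positional parameter has more than one digit: the old pattern stripped the dot plus a single digit only, leaving a stray digit in the path. A stand-alone illustration (the path value is hypothetical):

```java
public class PositionalParameterFix {
    public static void main(String[] args) {
        String pathExpression = "aliased.10.value";

        // Pre-fix pattern: \.\d consumes the dot and one digit, so the
        // second digit of "10" survives and corrupts the property path.
        String beforeFix = pathExpression.replaceAll("\\.\\d", "");

        // Fixed pattern (DATAMONGO-1809): \.\d+ consumes all digits.
        String afterFix = pathExpression.replaceAll("\\.\\d+", "");

        System.out.println(beforeFix); // aliased0.value
        System.out.println(afterFix);  // aliased.value
    }
}
```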

View File

@@ -25,11 +25,10 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import com.mongodb.client.gridfs.GridFSFindIterable;
import com.mongodb.gridfs.GridFSFile;
/**
* Collection of operations to store and read files from MongoDB GridFS.
*
*
* @author Oliver Gierke
* @author Philipp Schneider
* @author Thomas Darimont
@@ -40,98 +39,102 @@ public interface GridFsOperations extends ResourcePatternResolver {
/**
* Stores the given content into a file with the given name.
*
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, String filename);
/**
* Stores the given content into a file with the given name.
*
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable Object metadata);
/**
* Stores the given content into a file with the given name.
*
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable Document metadata);
/**
* Stores the given content into a file with the given name and content type.
*
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType);
/**
* Stores the given content into a file with the given name using the given metadata. The metadata object will be
* marshalled before writing.
*
*
* @param content must not be {@literal null}.
* @param filename can be {@literal null} or empty.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable String filename, @Nullable Object metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType, @Nullable Object metadata);
ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata);
/**
* Stores the given content into a file with the given name using the given metadata.
*
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable String filename, @Nullable Document metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata.
*
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link GridFSFile} just created
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType, @Nullable Document metadata);
ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata);
/**
* Returns all files matching the given query. Note, that currently {@link Sort} criterias defined at the
* {@link Query} will not be regarded as MongoDB does not support ordering for GridFS file access.
*
*
* @see <a href="https://jira.mongodb.org/browse/JAVA-431">MongoDB Jira: JAVA-431</a>
* @param query must not be {@literal null}.
* @return
* @return {@link GridFSFindIterable} to obtain results from. Eg. by calling
* {@link GridFSFindIterable#into(java.util.Collection)}.
*/
GridFSFindIterable find(Query query);
/**
* Returns a single file matching the given query or {@literal null} in case no file matches.
*
* Returns a single {@link com.mongodb.client.gridfs.model.GridFSFile} matching the given query or {@literal null} in
* case no file matches.
*
* @param query must not be {@literal null}.
* @return
*/
@@ -140,14 +143,14 @@ public interface GridFsOperations extends ResourcePatternResolver {
/**
* Deletes all files matching the given {@link Query}.
*
*
* @param query must not be {@literal null}.
*/
void delete(Query query);
/**
* Returns all {@link GridFsResource} with the given file name.
*
* Returns the {@link GridFsResource} with the given file name.
*
* @param filename must not be {@literal null}.
* @return the resource if it exists or {@literal null}.
* @see ResourcePatternResolver#getResource(String)
@@ -156,7 +159,7 @@ public interface GridFsOperations extends ResourcePatternResolver {
/**
* Returns all {@link GridFsResource}s matching the given file name pattern.
*
*
* @param filenamePattern must not be {@literal null}.
* @return
* @see ResourcePatternResolver#getResources(String)

View File

@@ -23,13 +23,13 @@ import org.springframework.core.io.InputStreamResource;
import org.springframework.core.io.Resource;
import com.mongodb.client.gridfs.model.GridFSFile;
import com.mongodb.gridfs.GridFSDBFile;
/**
* {@link GridFSDBFile} based {@link Resource} implementation.
*
* {@link GridFSFile} based {@link Resource} implementation.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Hartmut Lang
*/
public class GridFsResource extends InputStreamResource {
@@ -38,8 +38,8 @@ public class GridFsResource extends InputStreamResource {
private final GridFSFile file;
/**
* Creates a new {@link GridFsResource} from the given {@link GridFSDBFile}.
*
* Creates a new {@link GridFsResource} from the given {@link GridFSFile}.
*
* @param file must not be {@literal null}.
*/
public GridFsResource(GridFSFile file) {
@@ -47,8 +47,8 @@ public class GridFsResource extends InputStreamResource {
}
/**
* Creates a new {@link GridFsResource} from the given {@link GridFSDBFile} and {@link InputStream}.
*
* Creates a new {@link GridFsResource} from the given {@link GridFSFile} and {@link InputStream}.
*
* @param file must not be {@literal null}.
* @param inputStream must not be {@literal null}.
*/
@@ -87,8 +87,8 @@ public class GridFsResource extends InputStreamResource {
/**
* Returns the {@link Resource}'s id.
*
* @return
*
* @return never {@literal null}.
*/
public Object getId() {
return file.getId();
@@ -96,8 +96,10 @@ public class GridFsResource extends InputStreamResource {
/**
* Returns the {@link Resource}'s content type.
*
* @return
*
* @return never {@literal null}.
* @throws com.mongodb.MongoGridFSException in case no content type declared on {@link GridFSFile#getMetadata()} nor
* provided via {@link GridFSFile#getContentType()}.
*/
@SuppressWarnings("deprecation")
public String getContentType() {

View File

@@ -15,8 +15,7 @@
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.assertj.core.api.Assertions.*;
import java.util.Iterator;
import java.util.List;
@@ -66,9 +65,9 @@ public abstract class DocumentTestUtils {
*/
public static Document getAsDocument(List<?> source, int index) {
assertThat(source.size(), greaterThanOrEqualTo(index + 1));
assertThat(source.size()).isGreaterThanOrEqualTo(index + 1);
Object value = source.get(index);
assertThat(value, is(instanceOf(Document.class)));
assertThat(value).isInstanceOf(Document.class);
return (Document) value;
}
@@ -76,8 +75,8 @@ public abstract class DocumentTestUtils {
public static <T> T getTypedValue(Document source, String key, Class<T> type) {
Object value = source.get(key);
assertThat(value, is(notNullValue()));
assertThat(value, is(instanceOf(type)));
assertThat(value).isNotNull();
assertThat(value).isInstanceOf(type);
return (T) value;
}
@@ -92,8 +91,8 @@ public abstract class DocumentTestUtils {
while (keyIterator.hasNext()) {
String key = keyIterator.next();
if (key.equals("_class")) {
assertThat((String) document.get(key), is(equalTo(expectedTypeString)));
assertThat(keyIterator.hasNext(), is(false));
assertThat(document.get(key)).isEqualTo(expectedTypeString);
assertThat(keyIterator.hasNext()).isFalse();
return;
}
}
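The migrations above swap Hamcrest's matcher arguments (`assertThat(value, is(...))`) for AssertJ's fluent chain (`assertThat(value).isEqualTo(...)`). A stdlib-only sketch of what makes the fluent style work; the ObjectAssert class here is a toy stand-in, not AssertJ's actual implementation:

```java
public class FluentAssertSketch {

    static class ObjectAssert {
        private final Object actual;
        ObjectAssert(Object actual) { this.actual = actual; }

        ObjectAssert isNotNull() {
            if (actual == null) {
                throw new AssertionError("expected non-null");
            }
            return this; // returning this is what enables chaining
        }

        ObjectAssert isEqualTo(Object expected) {
            if (!java.util.Objects.equals(actual, expected)) {
                throw new AssertionError("expected " + expected + " but was " + actual);
            }
            return this;
        }
    }

    static ObjectAssert assertThat(Object actual) {
        return new ObjectAssert(actual);
    }

    public static void main(String[] args) {
        // Hamcrest: assertThat(value, is(equalTo("foo")))
        // Fluent:   chain assertions off the actual value instead.
        assertThat("foo").isNotNull().isEqualTo("foo");
        System.out.println("assertions passed");
    }
}
```

Chaining off the actual value is why the fluent style needs no static matcher imports per assertion and lets the IDE complete applicable checks.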

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,16 +15,22 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import java.util.UUID;
import org.bson.types.Binary;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.context.ContextConfiguration;
@@ -34,37 +40,88 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
* Integration tests for {@link MongoConverters}.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoConvertersIntegrationTests {
static final String COLLECTION = "_sample";
static final String COLLECTION = "converter-tests";
@Autowired
MongoOperations template;
@Autowired MongoOperations template;
@Before
public void setUp() {
template.dropCollection(COLLECTION);
}
@Test
@Test // DATAMONGO-422
public void writesUUIDBinaryCorrectly() {
Wrapper wrapper = new Wrapper();
wrapper.uuid = UUID.randomUUID();
template.save(wrapper);
assertThat(wrapper.id, is(notNullValue()));
assertThat(wrapper.id).isNotNull();
Wrapper result = template.findOne(Query.query(Criteria.where("id").is(wrapper.id)), Wrapper.class);
assertThat(result.uuid, is(wrapper.uuid));
assertThat(result.uuid).isEqualTo(wrapper.uuid);
}
@Test // DATAMONGO-1802
public void shouldConvertBinaryDataOnRead() {
WithBinaryDataInArray wbd = new WithBinaryDataInArray();
wbd.data = "calliope-mini".getBytes();
template.save(wbd);
assertThat(template.findOne(query(where("id").is(wbd.id)), WithBinaryDataInArray.class)).isEqualTo(wbd);
}
@Test // DATAMONGO-1802
public void shouldConvertEmptyBinaryDataOnRead() {
WithBinaryDataInArray wbd = new WithBinaryDataInArray();
wbd.data = new byte[0];
template.save(wbd);
assertThat(template.findOne(query(where("id").is(wbd.id)), WithBinaryDataInArray.class)).isEqualTo(wbd);
}
@Test // DATAMONGO-1802
public void shouldReadBinaryType() {
WithBinaryDataType wbd = new WithBinaryDataType();
wbd.data = new Binary("calliope-mini".getBytes());
template.save(wbd);
assertThat(template.findOne(query(where("id").is(wbd.id)), WithBinaryDataType.class)).isEqualTo(wbd);
}
@Document(collection = COLLECTION)
static class Wrapper {
String id;
UUID uuid;
}
@Data
@Document(collection = COLLECTION)
static class WithBinaryDataInArray {
@Id String id;
byte[] data;
}
@Data
@Document(collection = COLLECTION)
static class WithBinaryDataType {
@Id String id;
Binary data;
}
}

View File

@@ -15,11 +15,12 @@
*/
package org.springframework.data.mongodb.core.convert;
-import static org.hamcrest.Matchers.*;
-import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.DocumentTestUtils.*;
-import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
+import static org.springframework.data.mongodb.test.util.Assertions.*;
import lombok.AllArgsConstructor;
import lombok.NoArgsConstructor;
import java.time.LocalDate;
import java.util.Arrays;
@@ -29,9 +30,6 @@ import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;
import org.bson.Document;
-import org.hamcrest.collection.IsIterableContainingInOrder;
-import org.hamcrest.core.Is;
-import org.hamcrest.core.IsEqual;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -132,7 +130,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ModelWrapper.class));
Document set = getAsDocument(mappedObject, "$set");
-assertThat(set.get("_class"), nullValue());
+assertThat(set.get("_class")).isNull();
}
@Test // DATAMONGO-807
@@ -145,7 +143,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ModelWrapper.class));
Document set = getAsDocument(mappedObject, "$set");
-assertThat(set.get("_class"), nullValue());
+assertThat(set.get("_class")).isNull();
}
@Test // DATAMONGO-407
@@ -172,8 +170,8 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ParentClass.class));
Document set = getAsDocument(mappedObject, "$set");
-assertThat(set.get("aliased.$.value"), is("foo"));
-assertThat(set.get("aliased.$.otherValue"), is("bar"));
+assertThat(set.get("aliased.$.value")).isEqualTo("foo");
+assertThat(set.get("aliased.$.otherValue")).isEqualTo("bar");
}
@Test // DATAMONGO-407
@@ -186,11 +184,10 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ParentClass.class));
Document document = getAsDocument(mappedObject, "$set");
-assertThat(document.get("aliased.$.value"), is("foo"));
+assertThat(document.get("aliased.$.value")).isEqualTo("foo");
Document someObject = getAsDocument(document, "aliased.$.someObject");
-assertThat(someObject, is(notNullValue()));
-assertThat(someObject.get("value"), is("bubu"));
+assertThat(someObject).isNotNull().containsEntry("value", "bubu");
assertTypeHint(someObject, ConcreteChildClass.class);
}
@@ -205,10 +202,10 @@ public class UpdateMapperUnitTests {
Document values = getAsDocument(push, "values");
List<Object> each = getAsDBList(values, "$each");
-assertThat(push.get("_class"), nullValue());
-assertThat(values.get("_class"), nullValue());
+assertThat(push.get("_class")).isNull();
+assertThat(values.get("_class")).isNull();
-assertThat(each, IsIterableContainingInOrder.contains("spring", "data", "mongodb"));
+assertThat(each).containsExactly("spring", "data", "mongodb");
}
@Test // DATAMONGO-812
@@ -220,8 +217,8 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document values = getAsDocument(push, "values");
-assertThat(push.get("_class"), nullValue());
-assertThat(values.get("_class"), nullValue());
+assertThat(push.get("_class")).isNull();
+assertThat(values.get("_class")).isNull();
}
@SuppressWarnings({ "unchecked", "rawtypes" })
@@ -237,7 +234,7 @@ public class UpdateMapperUnitTests {
List<Object> each = getAsDBList(model, "$each");
List<Object> values = getAsDBList((Document) each.get(0), "values");
-assertThat(values, IsIterableContainingInOrder.contains("spring", "data", "mongodb"));
+assertThat(values).containsExactly("spring", "data", "mongodb");
}
@Test // DATAMONGO-812
@@ -261,8 +258,8 @@ public class UpdateMapperUnitTests {
Document mappedObject = mapper.getMappedObject(update.getUpdateObject(), context.getPersistentEntity(Object.class));
Document push = getAsDocument(mappedObject, "$push");
-assertThat(getAsDocument(push, "category").containsKey("$each"), is(true));
-assertThat(getAsDocument(push, "type").containsKey("$each"), is(true));
+assertThat(getAsDocument(push, "category")).containsKey("$each");
+assertThat(getAsDocument(push, "type")).containsKey("$each");
}
@Test // DATAMONGO-943
@@ -275,9 +272,9 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "key");
-assertThat(key.containsKey("$position"), is(true));
-assertThat(key.get("$position"), is(2));
-assertThat(getAsDocument(push, "key").containsKey("$each"), is(true));
+assertThat(key.containsKey("$position")).isTrue();
+assertThat(key.get("$position")).isEqualTo(2);
+assertThat(getAsDocument(push, "key")).containsKey("$each");
}
@Test // DATAMONGO-943
@@ -290,9 +287,9 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "key");
-assertThat(key.containsKey("$position"), is(true));
-assertThat(key.get("$position"), is(0));
-assertThat(getAsDocument(push, "key").containsKey("$each"), is(true));
+assertThat(key.containsKey("$position")).isTrue();
+assertThat(key.get("$position")).isEqualTo(0);
+assertThat(getAsDocument(push, "key")).containsKey("$each");
}
@Test // DATAMONGO-943
@@ -305,8 +302,8 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "key");
-assertThat(key.containsKey("$position"), is(false));
-assertThat(getAsDocument(push, "key").containsKey("$each"), is(true));
+assertThat(key).doesNotContainKey("$position");
+assertThat(getAsDocument(push, "key")).containsKey("$each");
}
@Test // DATAMONGO-943
@@ -319,8 +316,8 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "key");
-assertThat(key.containsKey("$position"), is(false));
-assertThat(getAsDocument(push, "key").containsKey("$each"), is(true));
+assertThat(key).doesNotContainKey("$position");
+assertThat(getAsDocument(push, "key")).containsKey("$each");
}
@Test // DATAMONGO-832
@@ -333,9 +330,8 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "key");
-assertThat(key.containsKey("$slice"), is(true));
-assertThat(key.get("$slice"), is(5));
-assertThat(key.containsKey("$each"), is(true));
+assertThat(key).containsKey("$slice").containsEntry("$slice", 5);
+assertThat(key).containsKey("$each");
}
@Test // DATAMONGO-832
@@ -349,15 +345,13 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "key");
-assertThat(key.containsKey("$slice"), is(true));
-assertThat((Integer) key.get("$slice"), is(5));
-assertThat(key.containsKey("$each"), is(true));
+assertThat(key).containsKey("$slice").containsEntry("$slice", 5);
+assertThat(key.containsKey("$each")).isTrue();
Document key2 = getAsDocument(push, "key-2");
-assertThat(key2.containsKey("$slice"), is(true));
-assertThat((Integer) key2.get("$slice"), is(-2));
-assertThat(key2.containsKey("$each"), is(true));
+assertThat(key2).containsKey("$slice").containsEntry("$slice", -2);
+assertThat(key2).containsKey("$each");
}
@Test // DATAMONGO-1141
@@ -371,9 +365,9 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "scores");
-assertThat(key.containsKey("$sort"), is(true));
-assertThat((Integer) key.get("$sort"), is(-1));
-assertThat(key.containsKey("$each"), is(true));
+assertThat(key).containsKey("$sort");
+assertThat(key).containsEntry("$sort", -1);
+assertThat(key).containsKey("$each");
}
@Test // DATAMONGO-1141
@@ -389,9 +383,9 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key = getAsDocument(push, "list");
-assertThat(key.containsKey("$sort"), is(true));
-assertThat((Document) key.get("$sort"), equalTo(new Document("renamed-value", 1).append("field", 1)));
-assertThat(key.containsKey("$each"), is(true));
+assertThat(key).containsKey("$sort");
+assertThat(key.get("$sort")).isEqualTo(new Document("renamed-value", 1).append("field", 1));
+assertThat(key).containsKey("$each");
}
@Test // DATAMONGO-1141
@@ -405,15 +399,15 @@ public class UpdateMapperUnitTests {
Document push = getAsDocument(mappedObject, "$push");
Document key1 = getAsDocument(push, "authors");
-assertThat(key1.containsKey("$sort"), is(true));
-assertThat((Integer) key1.get("$sort"), is(1));
-assertThat(key1.containsKey("$each"), is(true));
+assertThat(key1).containsKey("$sort");
+assertThat(key1).containsEntry("$sort", 1);
+assertThat(key1).containsKey("$each");
Document key2 = getAsDocument(push, "chapters");
-assertThat(key2.containsKey("$sort"), is(true));
-assertThat((Document) key2.get("$sort"), equalTo(new Document("order", 1)));
-assertThat(key2.containsKey("$each"), is(true));
+assertThat(key2).containsKey("$sort");
+assertThat(key2.get("$sort")).isEqualTo(new Document("order", 1));
+assertThat(key2.containsKey("$each")).isTrue();
}
@Test // DATAMONGO-410
@@ -438,7 +432,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(DocumentWithDBRefCollection.class));
Document pullClause = getAsDocument(mappedObject, "$pull");
-assertThat(pullClause.get("dbRefAnnotatedList"), is(new DBRef("entity", "2")));
+assertThat(pullClause.get("dbRefAnnotatedList")).isEqualTo(new DBRef("entity", "2"));
}
@Test // DATAMONGO-404
@@ -452,7 +446,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(DocumentWithDBRefCollection.class));
Document pullClause = getAsDocument(mappedObject, "$pull");
-assertThat(pullClause.get("dbRefAnnotatedList"), is(new DBRef("entity", entity.id)));
+assertThat(pullClause.get("dbRefAnnotatedList")).isEqualTo(new DBRef("entity", entity.id));
}
@Test(expected = MappingException.class) // DATAMONGO-404
@@ -470,7 +464,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(Wrapper.class));
Document pullClause = getAsDocument(mappedObject, "$pull");
-assertThat(pullClause.containsKey("mapped.dbRefAnnotatedList"), is(true));
+assertThat(pullClause.containsKey("mapped.dbRefAnnotatedList")).isTrue();
}
@Test // DATAMONGO-468
@@ -484,7 +478,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(DocumentWithDBRefCollection.class));
Document setClause = getAsDocument(mappedObject, "$set");
-assertThat(setClause.get("dbRefProperty"), is(new DBRef("entity", entity.id)));
+assertThat(setClause.get("dbRefProperty")).isEqualTo(new DBRef("entity", entity.id));
}
@Test // DATAMONGO-862
@@ -495,7 +489,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ParentClass.class));
Document setClause = getAsDocument(mappedObject, "$set");
-assertThat(setClause.containsKey("listOfInterface.$.value"), is(true));
+assertThat(setClause.containsKey("listOfInterface.$.value")).isTrue();
}
@Test // DATAMONGO-863
@@ -513,7 +507,7 @@ public class UpdateMapperUnitTests {
Document idClause = getAsDocument(options, "_id");
List<Object> inClause = getAsDBList(idClause, "$in");
-assertThat(inClause, IsIterableContainingInOrder.contains(1L, 2L));
+assertThat(inClause).containsExactly(1L, 2L);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
@@ -528,7 +522,7 @@ public class UpdateMapperUnitTests {
Document values = getAsDocument(addToSet, "values");
List<Object> each = getAsDBList(values, "$each");
-assertThat(each, IsIterableContainingInOrder.contains("spring", "data", "mongodb"));
+assertThat(each).containsExactly("spring", "data", "mongodb");
}
@Test // DATAMONG0-471
@@ -559,7 +553,7 @@ public class UpdateMapperUnitTests {
Object model = $set.get("referencedDocument");
DBRef expectedDBRef = new DBRef("interfaceDocumentDefinitionImpl", "1");
-assertThat(model, allOf(instanceOf(DBRef.class), IsEqual.equalTo(expectedDBRef)));
+assertThat(model).isInstanceOf(DBRef.class).isEqualTo(expectedDBRef);
}
@Test // DATAMONGO-847
@@ -574,7 +568,7 @@ public class UpdateMapperUnitTests {
Document value = DocumentTestUtils.getAsDocument(list, "value");
List<Object> $in = DocumentTestUtils.getAsDBList(value, "$in");
-assertThat($in, IsIterableContainingInOrder.contains("foo", "bar"));
+assertThat($in).containsExactly("foo", "bar");
}
@Test // DATAMONGO-847
@@ -587,7 +581,7 @@ public class UpdateMapperUnitTests {
Document $pull = DocumentTestUtils.getAsDocument(mappedUpdate, "$pull");
Document list = DocumentTestUtils.getAsDocument($pull, "dbRefAnnotatedList");
-assertThat(list, equalTo(new org.bson.Document().append("_id", "1")));
+assertThat(list).isEqualTo(new org.bson.Document().append("_id", "1"));
}
@Test // DATAMONGO-1077
@@ -601,7 +595,7 @@ public class UpdateMapperUnitTests {
Document $unset = DocumentTestUtils.getAsDocument(mappedUpdate, "$unset");
-assertThat($unset, equalTo(new org.bson.Document().append("dbRefAnnotatedList.$", 1)));
+assertThat($unset).isEqualTo(new org.bson.Document().append("dbRefAnnotatedList.$", 1));
}
@Test // DATAMONGO-1210
@@ -613,8 +607,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithNestedCollection.class));
-assertThat(mappedUpdate, isBsonObject().notContaining("$addToSet.nestedDocs.$each.[0]._class"));
-assertThat(mappedUpdate, isBsonObject().notContaining("$addToSet.nestedDocs.$each.[1]._class"));
+assertThat(mappedUpdate).doesNotContainKey("$addToSet.nestedDocs.$each.[0]._class")
+.doesNotContainKey("$addToSet.nestedDocs.$each.[1]._class");
}
@Test // DATAMONGO-1210
@@ -625,8 +619,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ListModelWrapper.class));
-assertThat(mappedUpdate, isBsonObject().containing("$addToSet.models.$each.[0]._class", ModelImpl.class.getName()));
-assertThat(mappedUpdate, isBsonObject().containing("$addToSet.models.$each.[1]._class", ModelImpl.class.getName()));
+assertThat(mappedUpdate).containsEntry("$addToSet.models.$each.[0]._class", ModelImpl.class.getName());
+assertThat(mappedUpdate).containsEntry("$addToSet.models.$each.[1]._class", ModelImpl.class.getName());
}
@Test // DATAMONGO-1210
@@ -638,10 +632,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ParentClass.class));
-assertThat(mappedUpdate,
-isBsonObject().containing("$addToSet.aliased.$each.[0]._class", ConcreteChildClass.class.getName()));
-assertThat(mappedUpdate,
-isBsonObject().containing("$addToSet.aliased.$each.[1]._class", ConcreteChildClass.class.getName()));
+assertThat(mappedUpdate).containsEntry("$addToSet.aliased.$each.[0]._class", ConcreteChildClass.class.getName());
+assertThat(mappedUpdate).containsEntry("$addToSet.aliased.$each.[1]._class", ConcreteChildClass.class.getName());
}
@Test // DATAMONGO-1210
@@ -654,12 +646,11 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DomainTypeWithListOfConcreteTypesHavingSingleInterfaceTypeAttribute.class));
-assertThat(mappedUpdate,
-isBsonObject().notContaining("$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0]._class"));
-assertThat(mappedUpdate,
-isBsonObject().containing(
-"$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0].interfaceType._class",
-ModelImpl.class.getName()));
+assertThat(mappedUpdate)
+.doesNotContainKey("$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0]._class");
+assertThat(mappedUpdate).containsEntry(
+"$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0].interfaceType._class",
+ModelImpl.class.getName());
}
@Test // DATAMONGO-1210
@@ -672,9 +663,26 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes.class));
-assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteTypeWithListAttributeOfInterfaceType._class"));
-assertThat(mappedUpdate, isBsonObject()
-.containing("$set.concreteTypeWithListAttributeOfInterfaceType.models.[0]._class", ModelImpl.class.getName()));
+assertThat(mappedUpdate).doesNotContainKey("$set.concreteTypeWithListAttributeOfInterfaceType._class");
+assertThat(mappedUpdate).containsEntry("$set.concreteTypeWithListAttributeOfInterfaceType.models.[0]._class",
+ModelImpl.class.getName());
}
@Test // DATAMONGO-1809
public void pathShouldIdentifyPositionalParameterWithMoreThanOneDigit() {
Document at2digitPosition = mapper.getMappedObject(new Update()
.addToSet("concreteInnerList.10.concreteTypeList", new SomeInterfaceImpl("szeth")).getUpdateObject(),
context.getPersistentEntity(Outer.class));
Document at3digitPosition = mapper.getMappedObject(new Update()
.addToSet("concreteInnerList.123.concreteTypeList", new SomeInterfaceImpl("lopen")).getUpdateObject(),
context.getPersistentEntity(Outer.class));
assertThat(at2digitPosition).isEqualTo(new Document("$addToSet",
new Document("concreteInnerList.10.concreteTypeList", new Document("value", "szeth"))));
assertThat(at3digitPosition).isEqualTo(new Document("$addToSet",
new Document("concreteInnerList.123.concreteTypeList", new Document("value", "lopen"))));
}
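The DATAMONGO-1809 test above exercises the fix that positional parameter detection now captures every digit of a path segment, not just the first. The real parsing happens inside the update mapper's PropertyPath handling; the following is only an illustrative, dependency-free sketch (class and method names are mine) of the single-digit vs. all-digits distinction:

```java
import java.util.regex.Pattern;

public class PositionalSegmentDemo {

    // Matches path segments that consist entirely of digits, e.g. "10" or "123".
    // Using "\\d" instead of "\\d+" would reject multi-digit segments like "123".
    private static final Pattern POSITIONAL = Pattern.compile("\\d+");

    // Returns true if the given dot-path segment is a positional parameter.
    public static boolean isPositional(String segment) {
        return POSITIONAL.matcher(segment).matches();
    }

    // Strips positional segments from a dot path: "list.123.value" -> "list.value".
    public static String removePositionalParameters(String path) {
        StringBuilder result = new StringBuilder();
        for (String segment : path.split("\\.")) {
            if (!isPositional(segment)) {
                if (result.length() > 0) {
                    result.append('.');
                }
                result.append(segment);
            }
        }
        return result.toString();
    }

    public static void main(String[] args) {
        System.out.println(removePositionalParameters("concreteInnerList.123.concreteTypeList"));
    }
}
```

With a single-digit pattern, the segment "123" would not be recognized as positional and the path could not be resolved against the entity model, which is the failure mode the test guards against.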
@Test // DATAMONGO-1236
@@ -684,8 +692,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObject.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.value.name", "kaladin"));
-assertThat(mappedUpdate, isBsonObject().containing("$set.value._class", NestedDocument.class.getName()));
+assertThat(mappedUpdate).containsEntry("$set.value.name", "kaladin");
+assertThat(mappedUpdate).containsEntry("$set.value._class", NestedDocument.class.getName());
}
@Test // DATAMONGO-1236
@@ -695,8 +703,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObject.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.concreteValue.name", "shallan"));
-assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteValue._class"));
+assertThat(mappedUpdate).containsEntry("$set.concreteValue.name", "shallan");
+assertThat(mappedUpdate).doesNotContainKey("$set.concreteValue._class");
}
@Test // DATAMONGO-1236
@@ -706,8 +714,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithAliasedObject.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.renamed-value.name", "adolin"));
-assertThat(mappedUpdate, isBsonObject().containing("$set.renamed-value._class", NestedDocument.class.getName()));
+assertThat(mappedUpdate).containsEntry("$set.renamed-value.name", "adolin");
+assertThat(mappedUpdate).containsEntry("$set.renamed-value._class", NestedDocument.class.getName());
}
@Test // DATAMONGO-1236
@@ -719,8 +727,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.map.szeth.name", "son-son-vallano"));
-assertThat(mappedUpdate, isBsonObject().containing("$set.map.szeth._class", NestedDocument.class.getName()));
+assertThat(mappedUpdate).containsEntry("$set.map.szeth.name", "son-son-vallano");
+assertThat(mappedUpdate).containsEntry("$set.map.szeth._class", NestedDocument.class.getName());
}
@Test // DATAMONGO-1236
@@ -732,8 +740,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.concreteMap.jasnah.name", "kholin"));
-assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteMap.jasnah._class"));
+assertThat(mappedUpdate).containsEntry("$set.concreteMap.jasnah.name", "kholin");
+assertThat(mappedUpdate).doesNotContainKey("$set.concreteMap.jasnah._class");
}
@Test // DATAMONGO-1250
@@ -757,7 +765,7 @@ public class UpdateMapperUnitTests {
Document result = mapper.getMappedObject(update.getUpdateObject(),
mappingContext.getPersistentEntity(ClassWithEnum.class));
-assertThat(result, isBsonObject().containing("$set.allocation", ClassWithEnum.Allocation.AVAILABLE.code));
+assertThat(result).containsEntry("$set.allocation", ClassWithEnum.Allocation.AVAILABLE.code);
}
@Test // DATAMONGO-1251
@@ -769,8 +777,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ConcreteChildClass.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.containsKey("value"), is(true));
-assertThat($set.get("value"), nullValue());
+assertThat($set).containsKey("value").containsEntry("value", null);
}
@Test // DATAMONGO-1251
@@ -782,8 +789,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ClassWithJava8Date.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.containsKey("date"), is(true));
-assertThat($set.get("value"), nullValue());
+assertThat($set).containsKey("date").doesNotContainKey("value");
}
@Test // DATAMONGO-1251
@@ -795,8 +801,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(ListModel.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.containsKey("values"), is(true));
-assertThat($set.get("value"), nullValue());
+assertThat($set).containsKey("values").doesNotContainKey("value");
}
@Test // DATAMONGO-1251
@@ -808,8 +813,8 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(EntityWithObject.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.containsKey("concreteValue.name"), is(true));
-assertThat($set.get("concreteValue.name"), nullValue());
+assertThat($set).containsKey("concreteValue.name");
+assertThat($set).containsEntry("concreteValue.name", null);
}
@Test // DATAMONGO-1288
@@ -820,7 +825,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(SimpleValueHolder.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.get("intValue"), Is.is(10));
+assertThat($set.get("intValue")).isEqualTo(10);
}
@Test // DATAMONGO-1288
@@ -831,7 +836,7 @@ public class UpdateMapperUnitTests {
context.getPersistentEntity(SimpleValueHolder.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.get("primIntValue"), Is.is(10));
+assertThat($set.get("primIntValue")).isEqualTo(10);
}
@Test // DATAMONGO-1404
@@ -841,7 +846,7 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(SimpleValueHolder.class));
-assertThat(mappedUpdate, isBsonObject().containing("$min", new Document("minfield", 10)));
+assertThat(mappedUpdate).containsEntry("$min", new Document("minfield", 10));
}
@Test // DATAMONGO-1404
@@ -851,7 +856,7 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(SimpleValueHolder.class));
-assertThat(mappedUpdate, isBsonObject().containing("$max", new Document("maxfield", 999)));
+assertThat(mappedUpdate).containsEntry("$max", new Document("maxfield", 999));
}
@Test // DATAMONGO-1423
@@ -876,10 +881,10 @@ public class UpdateMapperUnitTests {
mappingContext.getPersistentEntity(ClassWithEnum.class));
Document $set = DocumentTestUtils.getAsDocument(mappedUpdate, "$set");
-assertThat($set.containsKey("enumAsMapKey"), is(true));
+assertThat($set.containsKey("enumAsMapKey")).isTrue();
Document enumAsMapKey = $set.get("enumAsMapKey", Document.class);
-assertThat(enumAsMapKey.get("AVAILABLE"), is(100));
+assertThat(enumAsMapKey.get("AVAILABLE")).isEqualTo(100);
}
@Test // DATAMONGO-1176
@@ -889,8 +894,8 @@ public class UpdateMapperUnitTests {
Document mappedObject = mapper.getMappedObject(document, context.getPersistentEntity(SimpleValueHolder.class));
-assertThat(mappedObject.get("$set"), is(equalTo(new Document("a", "b").append("x", "y").append("key", "value"))));
-assertThat(mappedObject.size(), is(1));
+assertThat(mappedObject.get("$set")).isEqualTo(new Document("a", "b").append("x", "y").append("key", "value"));
+assertThat(mappedObject).hasSize(1);
}
@Test // DATAMONGO-1176
@@ -900,10 +905,10 @@ public class UpdateMapperUnitTests {
Document mappedObject = mapper.getMappedObject(document, context.getPersistentEntity(SimpleValueHolder.class));
-assertThat(mappedObject.get("key"), is(equalTo("value")));
-assertThat(mappedObject.get("a"), is(equalTo("b")));
-assertThat(mappedObject.get("x"), is(equalTo("y")));
-assertThat(mappedObject.size(), is(3));
+assertThat(mappedObject).containsEntry("key", "value");
+assertThat(mappedObject).containsEntry("a", "b");
+assertThat(mappedObject).containsEntry("x", "y");
+assertThat(mappedObject).hasSize(3);
}
@Test // DATAMONGO-1176
@@ -913,9 +918,9 @@ public class UpdateMapperUnitTests {
Document mappedObject = mapper.getMappedObject(document, context.getPersistentEntity(SimpleValueHolder.class));
-assertThat(mappedObject.get("$push"), is(equalTo(new Document("x", "y"))));
-assertThat(mappedObject.get("$set"), is(equalTo(new Document("a", "b"))));
-assertThat(mappedObject.size(), is(2));
+assertThat(mappedObject).containsEntry("$push", new Document("x", "y"));
+assertThat(mappedObject).containsEntry("$set", new Document("a", "b"));
+assertThat(mappedObject).hasSize(2);
}
@Test // DATAMONGO-1486
@@ -928,7 +933,7 @@ public class UpdateMapperUnitTests {
Document mapToSet = getAsDocument(getAsDocument(mappedUpdate, "$set"), "map");
for (Object key : mapToSet.keySet()) {
-assertThat(key, is(instanceOf(String.class)));
+assertThat(key).isInstanceOf(String.class);
}
}
@@ -942,8 +947,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(new Update().set("concreteInnerList", list).getUpdateObject(),
context.getPersistentEntity(Outer.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.concreteInnerList.[0].interfaceTypeList.[0]._class")
-.notContaining("$set.concreteInnerList.[0]._class"));
+assertThat(mappedUpdate).containsKey("$set.concreteInnerList.[0].interfaceTypeList.[0]._class")
+.doesNotContainKey("$set.concreteInnerList.[0]._class");
}
@Test // DATAMONGO-1772
@@ -956,8 +961,8 @@ public class UpdateMapperUnitTests {
Document mappedUpdate = mapper.getMappedObject(new Update().set("concreteInnerList", list).getUpdateObject(),
context.getPersistentEntity(Outer.class));
-assertThat(mappedUpdate, isBsonObject().containing("$set.concreteInnerList.[0].abstractTypeList.[0]._class")
-.notContaining("$set.concreteInnerList.[0]._class"));
+assertThat(mappedUpdate).containsKey("$set.concreteInnerList.[0].abstractTypeList.[0]._class")
+.doesNotContainKey("$set.concreteInnerList.[0]._class");
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
@@ -1254,6 +1259,7 @@ public class UpdateMapperUnitTests {
static class ConcreteInner {
List<SomeInterfaceType> interfaceTypeList;
List<SomeAbstractType> abstractTypeList;
List<SomeInterfaceImpl> concreteTypeList;
}
interface SomeInterfaceType {
@@ -1264,8 +1270,11 @@ public class UpdateMapperUnitTests {
}
@AllArgsConstructor
@NoArgsConstructor
static class SomeInterfaceImpl extends SomeAbstractType implements SomeInterfaceType {
String value;
}
}
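The bulk of the diff above mechanically rewrites Hamcrest's `assertThat(actual, matcher)` into AssertJ's fluent `assertThat(actual).isEqualTo(...)` style. The chaining works because every assertion method returns the assertion object itself. The following is a minimal, dependency-free sketch of that pattern, not AssertJ's actual implementation; all names here are hypothetical:

```java
import java.util.Map;
import java.util.Objects;

// Minimal fluent assertion in the spirit of AssertJ's MapAssert; illustrative only.
public class MapAssertDemo {

    private final Map<String, ?> actual;

    private MapAssertDemo(Map<String, ?> actual) {
        this.actual = actual;
    }

    public static MapAssertDemo assertThat(Map<String, ?> actual) {
        return new MapAssertDemo(actual);
    }

    // Returning 'this' is what allows .containsKey(...).doesNotContainKey(...) chains.
    public MapAssertDemo containsKey(String key) {
        if (!actual.containsKey(key)) {
            throw new AssertionError("Expected key <" + key + "> in " + actual);
        }
        return this;
    }

    public MapAssertDemo containsEntry(String key, Object value) {
        if (!actual.containsKey(key) || !Objects.equals(actual.get(key), value)) {
            throw new AssertionError("Expected entry <" + key + "=" + value + "> in " + actual);
        }
        return this;
    }

    public MapAssertDemo doesNotContainKey(String key) {
        if (actual.containsKey(key)) {
            throw new AssertionError("Did not expect key <" + key + "> in " + actual);
        }
        return this;
    }

    public static void main(String[] args) {
        assertThat(Map.of("$set", "value", "a", "b"))
                .containsKey("$set")
                .containsEntry("a", "b")
                .doesNotContainKey("_class");
        System.out.println("all assertions passed");
    }
}
```

The fluent style also lets the IDE offer type-appropriate assertions after `assertThat(...)`, which is part of why the migration collapses pairs like `containsKey` + `get` into a single chained call.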


@@ -31,9 +31,9 @@ import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.springframework.core.annotation.AliasFor;
import org.springframework.data.annotation.Id;
+import org.springframework.data.mapping.MappingException;
-import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.model.FieldNamingStrategy;
-import org.springframework.data.mapping.MappingException;
+import org.springframework.data.mapping.model.Property;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mapping.model.SimpleTypeHolder;
@@ -98,15 +98,16 @@ public class BasicMongoPersistentPropertyUnitTests {
@Test // DATAMONGO-607
public void usesCustomFieldNamingStrategyByDefault() throws Exception {
+ClassTypeInformation<Person> type = ClassTypeInformation.from(Person.class);
Field field = ReflectionUtils.findField(Person.class, "lastname");
-MongoPersistentProperty property = new BasicMongoPersistentProperty(Property.of(field), entity,
+MongoPersistentProperty property = new BasicMongoPersistentProperty(Property.of(type, field), entity,
SimpleTypeHolder.DEFAULT, UppercaseFieldNamingStrategy.INSTANCE);
assertThat(property.getFieldName(), is("LASTNAME"));
field = ReflectionUtils.findField(Person.class, "firstname");
-property = new BasicMongoPersistentProperty(Property.of(field), entity, SimpleTypeHolder.DEFAULT,
+property = new BasicMongoPersistentProperty(Property.of(type, field), entity, SimpleTypeHolder.DEFAULT,
UppercaseFieldNamingStrategy.INSTANCE);
assertThat(property.getFieldName(), is("foo"));
}
@@ -114,8 +115,10 @@ public class BasicMongoPersistentPropertyUnitTests {
@Test // DATAMONGO-607
public void rejectsInvalidValueReturnedByFieldNamingStrategy() {
+ClassTypeInformation<Person> type = ClassTypeInformation.from(Person.class);
Field field = ReflectionUtils.findField(Person.class, "lastname");
-MongoPersistentProperty property = new BasicMongoPersistentProperty(Property.of(field), entity,
+MongoPersistentProperty property = new BasicMongoPersistentProperty(Property.of(type, field), entity,
SimpleTypeHolder.DEFAULT, InvalidFieldNamingStrategy.INSTANCE);
exception.expect(MappingException.class);
@@ -187,17 +190,17 @@ public class BasicMongoPersistentPropertyUnitTests {
return getPropertyFor(entity, field);
}
-private <T> MongoPersistentProperty getPropertyFor(Class<T> type, String fieldname) {
+private static <T> MongoPersistentProperty getPropertyFor(Class<T> type, String fieldname) {
return getPropertyFor(new BasicMongoPersistentEntity<T>(ClassTypeInformation.from(type)), fieldname);
}
-private MongoPersistentProperty getPropertyFor(MongoPersistentEntity<?> persistentEntity, String fieldname) {
-return getPropertyFor(persistentEntity, ReflectionUtils.findField(persistentEntity.getType(), fieldname));
+private static MongoPersistentProperty getPropertyFor(MongoPersistentEntity<?> entity, String fieldname) {
+return getPropertyFor(entity, ReflectionUtils.findField(entity.getType(), fieldname));
}
-private MongoPersistentProperty getPropertyFor(MongoPersistentEntity<?> persistentEntity, Field field) {
-return new BasicMongoPersistentProperty(Property.of(field), persistentEntity, SimpleTypeHolder.DEFAULT,
-PropertyNameFieldNamingStrategy.INSTANCE);
+private static MongoPersistentProperty getPropertyFor(MongoPersistentEntity<?> entity, Field field) {
+return new BasicMongoPersistentProperty(Property.of(entity.getTypeInformation(), field), entity,
+SimpleTypeHolder.DEFAULT, PropertyNameFieldNamingStrategy.INSTANCE);
}
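The getPropertyFor helpers above resolve a Field by name via ReflectionUtils.findField, which searches the whole class hierarchy. A plain-JDK sketch of that lookup (Spring's implementation handles more cases, such as matching by type; the demo class and its nested types are mine):

```java
import java.lang.reflect.Field;

public class FindFieldDemo {

    // Walks the class hierarchy to find a declared field by name, or null if absent.
    public static Field findField(Class<?> type, String name) {
        for (Class<?> current = type; current != null; current = current.getSuperclass()) {
            for (Field field : current.getDeclaredFields()) {
                if (field.getName().equals(name)) {
                    return field;
                }
            }
        }
        return null;
    }

    static class Base {
        String lastname;
    }

    static class Person extends Base {
        String firstname;
    }

    public static void main(String[] args) {
        // Finds fields declared on the class itself as well as on superclasses.
        System.out.println(findField(Person.class, "firstname").getDeclaringClass().getSimpleName());
        System.out.println(findField(Person.class, "lastname").getDeclaringClass().getSimpleName());
    }
}
```

Walking up via getSuperclass is necessary because Class.getDeclaredFields only reports fields declared directly on the class, not inherited ones.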
class Person {


@@ -0,0 +1,80 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.test.util;
import org.assertj.core.error.BasicErrorMessageFactory;
import org.assertj.core.error.ErrorMessageFactory;
import org.assertj.core.internal.StandardComparisonStrategy;
/**
* Utility class providing factory methods for {@link ErrorMessageFactory}.
*
* @author Mark Paluch
*/
class AssertErrors {
/**
* Creates a new {@link ShouldHaveProperty}.
*
* @param actual the actual value in the failed assertion.
* @param key the key used in the failed assertion to compare the actual property key to.
* @param value the value used in the failed assertion to compare the actual property value to.
* @return the created {@link ErrorMessageFactory}.
*/
public static ErrorMessageFactory shouldHaveProperty(Object actual, String key, Object value) {
return new ShouldHaveProperty(actual, key, value);
}
/**
* Creates a new {@link ShouldNotHaveProperty}.
*
* @param actual the actual value in the failed assertion.
* @param key the key used in the failed assertion to compare the actual property key to.
* @param value the value used in the failed assertion to compare the actual property value to.
* @return the created {@link ErrorMessageFactory}.
*/
public static ErrorMessageFactory shouldNotHaveProperty(Object actual, String key, Object value) {
return new ShouldNotHaveProperty(actual, key, value);
}
private static class ShouldHaveProperty extends BasicErrorMessageFactory {
private ShouldHaveProperty(Object actual, String key, Object value) {
super("\n" + //
"Expecting:\n" + //
" <%s>\n" + //
"to have property with key:\n" + //
" <%s>\n" + //
"and value:\n" + //
" <%s>\n" + //
"%s", actual, key, value, StandardComparisonStrategy.instance());
}
}
private static class ShouldNotHaveProperty extends BasicErrorMessageFactory {
private ShouldNotHaveProperty(Object actual, String key, Object value) {
super("\n" + //
"Expecting:\n" + //
" <%s>\n" + //
"not to have property with key:\n" + //
" <%s>\n" + //
"and value:\n" + //
" <%s>\n" + //
"but actually found such property %s", actual, key, value, StandardComparisonStrategy.instance());
}
}
}


@@ -0,0 +1,41 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.test.util;
import org.bson.Document;
/**
 * The entry point for all MongoDB assertions. This class extends {@link org.assertj.core.api.Assertions} for
 * convenience, so that a single class can be statically imported.
*
* @author Mark Paluch
*/
public abstract class Assertions extends org.assertj.core.api.Assertions {
private Assertions() {
// no instances allowed.
}
/**
* Create assertion for {@link Document}.
*
 * @param document the actual value.
* @return the created assertion object.
*/
public static DocumentAssert assertThat(Document document) {
return new DocumentAssert(document);
}
}


@@ -0,0 +1,384 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.test.util;
import static org.assertj.core.error.ElementsShouldBe.*;
import static org.assertj.core.error.ShouldContain.*;
import static org.assertj.core.error.ShouldContainKeys.*;
import static org.assertj.core.error.ShouldNotContain.*;
import static org.assertj.core.error.ShouldNotContainKeys.*;
import lombok.AccessLevel;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Arrays;
import java.util.Iterator;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.function.Consumer;
import org.assertj.core.api.AbstractMapAssert;
import org.assertj.core.api.Condition;
import org.assertj.core.error.ShouldContainAnyOf;
import org.assertj.core.internal.Failures;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
 * Assertions for Mongo's {@link Document}. Assertions based on keys/entries are translated to document paths, allowing
 * assertions on nested elements.
*
* <pre>
* <code>
 * Document document = Document.parse("{ $set: { concreteInnerList: [ { foo: 'bar', _class: … }] } }");
 *
 * assertThat(document).containsKey("$set.concreteInnerList.[0].foo").doesNotContainKey("$set.concreteInnerList.[0].bar");
* </code>
* </pre>
*
* @author Mark Paluch
*/
public class DocumentAssert extends AbstractMapAssert<DocumentAssert, Map<String, Object>, String, Object> {
private final Document actual;
DocumentAssert(Document actual) {
super(actual, DocumentAssert.class);
this.actual = actual;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#containsEntry(java.lang.Object, java.lang.Object)
*/
@Override
public DocumentAssert containsEntry(String key, Object value) {
Assert.hasText(key, "The key to look for must not be empty!");
Lookup<?> lookup = lookup(key);
if (!lookup.isPathFound() || !ObjectUtils.nullSafeEquals(value, lookup.getValue())) {
throw Failures.instance().failure(info, AssertErrors.shouldHaveProperty(actual, key, value));
}
return myself;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#doesNotContainEntry(java.lang.Object, java.lang.Object)
*/
@Override
public DocumentAssert doesNotContainEntry(String key, Object value) {
Assert.hasText(key, "The key to look for must not be empty!");
Lookup<?> lookup = lookup(key);
if (lookup.isPathFound() && ObjectUtils.nullSafeEquals(value, lookup.getValue())) {
throw Failures.instance().failure(info, AssertErrors.shouldNotHaveProperty(actual, key, value));
}
return myself;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#containsKey(java.lang.Object)
*/
@Override
public DocumentAssert containsKey(String key) {
return containsKeys(key);
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#containsKeys(java.lang.Object[])
*/
@Override
public final DocumentAssert containsKeys(String... keys) {
Set<String> notFound = new LinkedHashSet<>();
for (String key : keys) {
if (!lookup(key).isPathFound()) {
notFound.add(key);
}
}
if (!notFound.isEmpty()) {
throw Failures.instance().failure(info, shouldContainKeys(actual, notFound));
}
return myself;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#doesNotContainKey(java.lang.Object)
*/
@Override
public DocumentAssert doesNotContainKey(String key) {
return doesNotContainKeys(key);
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#doesNotContainKeys(java.lang.Object[])
*/
@Override
public final DocumentAssert doesNotContainKeys(String... keys) {
Set<String> found = new LinkedHashSet<>();
for (String key : keys) {
if (lookup(key).isPathFound()) {
found.add(key);
}
}
if (!found.isEmpty()) {
throw Failures.instance().failure(info, shouldNotContainKeys(actual, found));
}
return myself;
}
// override methods to annotate them with @SafeVarargs. We unfortunately can't do that in AbstractMapAssert as it is
// used in soft assertions, which need to be able to proxy methods; @SafeVarargs requires the method to be final,
// which prevents proxying.
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#contains(java.util.Map.Entry[])
*/
@SafeVarargs
@Override
public final DocumentAssert contains(Map.Entry<? extends String, ? extends Object>... entries) {
// if both actual and values are empty, then assertion passes.
if (actual.isEmpty() && entries.length == 0) {
return myself;
}
Set<Map.Entry<? extends String, ? extends Object>> notFound = new LinkedHashSet<>();
for (Map.Entry<? extends String, ? extends Object> entry : entries) {
if (!containsEntry(entry)) {
notFound.add(entry);
}
}
if (!notFound.isEmpty()) {
throw Failures.instance().failure(info, shouldContain(actual, entries, notFound));
}
return myself;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#containsAnyOf(java.util.Map.Entry[])
*/
@SafeVarargs
@Override
public final DocumentAssert containsAnyOf(Map.Entry<? extends String, ? extends Object>... entries) {
for (Map.Entry<? extends String, ? extends Object> entry : entries) {
if (containsEntry(entry)) {
return myself;
}
}
throw Failures.instance().failure(info, ShouldContainAnyOf.shouldContainAnyOf(actual, entries));
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#containsOnly(java.util.Map.Entry[])
*/
@SafeVarargs
@Override
public final DocumentAssert containsOnly(Map.Entry<? extends String, ? extends Object>... entries) {
throw new UnsupportedOperationException();
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#doesNotContain(java.util.Map.Entry[])
*/
@SafeVarargs
@Override
public final DocumentAssert doesNotContain(Map.Entry<? extends String, ? extends Object>... entries) {
Set<Map.Entry<? extends String, ? extends Object>> found = new LinkedHashSet<>();
for (Map.Entry<? extends String, ? extends Object> entry : entries) {
if (containsEntry(entry)) {
found.add(entry);
}
}
if (!found.isEmpty()) {
throw Failures.instance().failure(info, shouldNotContain(actual, entries, found));
}
return myself;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#containsExactly(java.util.Map.Entry[])
*/
@SafeVarargs
@Override
public final DocumentAssert containsExactly(Map.Entry<? extends String, ? extends Object>... entries) {
throw new UnsupportedOperationException();
}
private boolean containsEntry(Entry<? extends String, ?> entry) {
Lookup<?> lookup = lookup(entry.getKey());
return lookup.isPathFound() && ObjectUtils.nullSafeEquals(entry.getValue(), lookup.getValue());
}
private <T> Lookup<T> lookup(String path) {
return lookup(actual, path);
}
@SuppressWarnings("unchecked")
private static <T> Lookup<T> lookup(Bson source, String path) {
String[] fragments = path.split("(?<!\\\\)\\.");
if (fragments.length == 1) {
Document document = (Document) source;
String pathToUse = path.replace("\\.", ".");
if (document.containsKey(pathToUse)) {
return Lookup.found((T) document.get(pathToUse));
}
return Lookup.notFound();
}
Iterator<String> it = Arrays.asList(fragments).iterator();
Object current = source;
while (it.hasNext()) {
String key = it.next().replace("\\.", ".");
if (!(current instanceof Bson) && !key.startsWith("[")) {
return Lookup.found(null);
}
if (key.startsWith("[")) {
String indexNumber = key.substring(1, key.indexOf("]"));
if (current instanceof List) {
current = ((List) current).get(Integer.valueOf(indexNumber));
}
if (!it.hasNext()) {
return Lookup.found((T) current);
}
} else {
if (current instanceof Document) {
Document document = (Document) current;
if (!it.hasNext() && !document.containsKey(key)) {
return Lookup.notFound();
}
current = document.get(key);
}
if (!it.hasNext()) {
return Lookup.found((T) current);
}
}
}
return Lookup.notFound();
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#hasEntrySatisfying(java.lang.Object, org.assertj.core.api.Condition)
*/
@Override
public DocumentAssert hasEntrySatisfying(String key, Condition<? super Object> valueCondition) {
Lookup<Object> value = lookup(key);
if (!value.isPathFound() || !valueCondition.matches(value.getValue())) {
throw Failures.instance().failure(info, elementsShouldBe(actual, value, valueCondition));
}
return myself;
}
/*
* (non-Javadoc)
* @see org.assertj.core.api.AbstractMapAssert#hasEntrySatisfying(java.lang.Object, java.util.function.Consumer)
*/
@Override
public DocumentAssert hasEntrySatisfying(String key, Consumer<? super Object> valueRequirements) {
containsKey(key);
valueRequirements.accept(lookup(key).getValue());
return myself;
}
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@Getter
static class Lookup<T> {
private final T value;
private final boolean pathFound;
/**
* Factory method to construct a lookup with a hit.
*
* @param value the actual value.
* @return the lookup object.
*/
static <T> Lookup<T> found(T value) {
return new Lookup<>(value, true);
}
/**
* Factory method to construct a lookup that yielded no match.
*
* @return the lookup object.
*/
static <T> Lookup<T> notFound() {
return new Lookup<>(null, false);
}
}
}
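The `lookup(…)` method above splits key paths on dots that are not escaped with a backslash, so an escaped dot can remain part of a single key name. A minimal standalone sketch of that split regex (plain Java only, no Spring Data or AssertJ types; class and method names are illustrative):

```java
public class PathSplitDemo {

	// Split a document path on dots that are not preceded by a backslash,
	// mirroring the "(?<!\\\\)\\." regex used in the lookup(...) method above.
	static String[] splitPath(String path) {
		return path.split("(?<!\\\\)\\.");
	}

	public static void main(String[] args) {
		String[] parts = splitPath("$set.concreteInnerList.[0].foo");
		// -> ["$set", "concreteInnerList", "[0]", "foo"]
		System.out.println(parts.length); // 4

		String[] escaped = splitPath("a\\.b.c");
		// the escaped dot stays inside the first fragment
		System.out.println(escaped[0].replace("\\.", ".")); // a.b
		System.out.println(escaped[1]); // c
	}
}
```

The negative lookbehind `(?<!\\)` is what keeps `a\.b` together as one fragment; the unescaping step (`replace("\\.", ".")`) then restores the literal dot, just as the assertion class does per fragment.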


@@ -27,6 +27,7 @@
* Support for `$caseSensitive` and `$diacriticSensitive` text search.
* Support for GeoJSON Polygon with hole.
* Performance improvements by bulk fetching ``DBRef``s.
* Multi-faceted aggregations using `$facet`, `$bucket` and `$bucketAuto` via `Aggregation`.
[[new-features.1-9-0]]
== What's new in Spring Data MongoDB 1.9


@@ -29,5 +29,5 @@ class Config {
----
====
If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure will pick it up automatically and use it to determine the current user to be set on domain types. If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableJpaAuditing`.
If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure will pick it up automatically and use it to determine the current user to be set on domain types. If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableMongoAuditing`.


@@ -1791,7 +1791,7 @@ At the time of this writing we provide support for the following Aggregation Ope
[cols="2*"]
|===
| Pipeline Aggregation Operators
| count, geoNear, graphLookup, group, limit, lookup, match, project, replaceRoot, skip, sort, unwind
| bucket, bucketAuto, count, facet, geoNear, graphLookup, group, limit, lookup, match, project, replaceRoot, skip, sort, unwind
| Set Aggregation Operators
| setEquals, setIntersection, setUnion, setDifference, setIsSubset, anyElementTrue, allElementsTrue
@@ -1870,10 +1870,88 @@ project().and("foo").as("bar"), sort(ASC, "foo")
More examples for project operations can be found in the `AggregationTests` class. Note that further details regarding the projection expressions can be found in the http://docs.mongodb.org/manual/reference/operator/aggregation/project/#pipe._S_project[corresponding section] of the MongoDB Aggregation Framework reference documentation.
[[mongo.aggregation.facet]]
=== Faceted classification
As of version 3.4, MongoDB supports faceted classification using the Aggregation Framework. A faceted classification uses semantic categories, either general or subject-specific, that are combined to create the full classification entry. Documents flowing through the aggregation pipeline are classified into buckets. A multi-faceted classification enables various aggregations on the same set of input documents, without needing to retrieve the input documents multiple times.
==== Buckets
Bucket operations categorize incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. Bucket operations require a grouping field or grouping expression. They can be defined via the `bucket()`/`bucketAuto()` methods of the `Aggregation` class. `BucketOperation` and `BucketAutoOperation` can expose accumulations based on aggregation expressions for input documents. The bucket operation can be extended with additional parameters through a fluent API via the `with…()` methods and the `andOutput(String)` method, and aliased via the `as(String)` method. Each bucket is represented as a document in the output.
`BucketOperation` takes a defined set of boundaries to group incoming documents into these categories. Boundaries are required to be sorted.
.Bucket operation examples
====
[source,java]
----
// will generate {$bucket: {groupBy: $price, boundaries: [0, 100, 400]}}
bucket("price").withBoundaries(0, 100, 400);
// will generate {$bucket: {groupBy: $price, default: "Other", boundaries: [0, 100]}}
bucket("price").withBoundaries(0, 100).withDefault("Other");
// will generate {$bucket: {groupBy: $price, boundaries: [0, 100], output: { count: { $sum: 1}}}}
bucket("price").withBoundaries(0, 100).andOutputCount().as("count");
// will generate {$bucket: {groupBy: $price, boundaries: [0, 100], output: { titles: { $push: "$title"}}}}
bucket("price").withBoundaries(0, 100).andOutput("title").push().as("titles");
----
====
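On the server side, `$bucket` assigns each document to the bucket whose lower boundary is inclusive and whose upper boundary is exclusive; a value outside all boundaries goes to the default bucket, if one is declared. A plain-Java sketch of that assignment rule (illustrative only, not Spring Data API; names are hypothetical):

```java
public class BucketBoundaryDemo {

	/**
	 * Returns the index of the bucket [boundaries[i], boundaries[i + 1]) that
	 * contains the value, or -1 when the value falls outside all boundaries
	 * (i.e. it would go to the $bucket "default" bucket, if declared).
	 */
	static int bucketIndex(int[] boundaries, int value) {
		for (int i = 0; i < boundaries.length - 1; i++) {
			if (value >= boundaries[i] && value < boundaries[i + 1]) {
				return i;
			}
		}
		return -1;
	}

	public static void main(String[] args) {
		int[] boundaries = { 0, 100, 400 }; // must be sorted, as noted above
		System.out.println(bucketIndex(boundaries, 50));  // 0  -> bucket [0, 100)
		System.out.println(bucketIndex(boundaries, 100)); // 1  -> lower bound is inclusive
		System.out.println(bucketIndex(boundaries, 400)); // -1 -> upper bound is exclusive
	}
}
```

This also illustrates why the boundaries must be sorted: the half-open intervals are formed from adjacent boundary pairs.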
`BucketAutoOperation` determines boundaries itself in an attempt to evenly distribute documents into a specified number of buckets. `BucketAutoOperation` optionally takes a granularity value that specifies the https://en.wikipedia.org/wiki/Preferred_number[preferred number] series to use, ensuring that the calculated boundary edges end on preferred round numbers or their powers of 10.
.Bucket auto operation examples
====
[source,java]
----
// will generate {$bucketAuto: {groupBy: $price, buckets: 5}}
bucketAuto("price", 5);
// will generate {$bucketAuto: {groupBy: $price, buckets: 5, granularity: "E24"}}
bucketAuto("price", 5).withGranularity(Granularities.E24);
// will generate {$bucketAuto: {groupBy: $price, buckets: 5, output: { titles: { $push: "$title"}}}
bucketAuto("price", 5).andOutput("title").push().as("titles");
----
====
Bucket operations can use `AggregationExpression` via `andOutput()` and <<mongo.aggregation.projection.expressions, SpEL expressions>> via `andOutputExpression()` to create output fields in buckets.
Note that further details regarding bucket expressions can be found in the http://docs.mongodb.org/manual/reference/operator/aggregation/bucket/[`$bucket` section] and
http://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/[`$bucketAuto` section] of the MongoDB Aggregation Framework reference documentation.
==== Multi-faceted aggregation
Multiple aggregation pipelines can be used to create multi-faceted aggregations which characterize data across multiple dimensions, or facets, within a single aggregation stage. Multi-faceted aggregations provide multiple filters and categorizations to guide data browsing and analysis. A common application of faceting is the way many online retailers let customers narrow down search results by applying filters on product price, manufacturer, size, and so on.
A `FacetOperation` can be defined via the `facet()` method of the `Aggregation` class. It can be customized with multiple aggregation pipelines via the `and()` method. Each sub-pipeline has its own field in the output document where its results are stored as an array of documents.
Sub-pipelines can project and filter input documents prior to grouping. Common use cases include extraction of date parts or calculations before categorization.
.Facet operation examples
====
[source,java]
----
// will generate {$facet: {categorizedByPrice: [ { $match: { price: {$exists : true}}}, { $bucketAuto: {groupBy: $price, buckets: 5}}]}}
facet(match(Criteria.where("price").exists(true)), bucketAuto("price", 5)).as("categorizedByPrice");
// will generate {$facet: {categorizedByYear: [
// { $project: { title: 1, publicationYear: { $year: "$publicationDate"}}},
// { $bucketAuto: {groupBy: $publicationYear, buckets: 5, output: { titles: {$push:"$title"}}}}
// ]}}
facet(project("title").and("publicationDate").extractYear().as("publicationYear"),
bucketAuto("publicationYear", 5).andOutput("title").push().as("titles"))
.as("categorizedByYear");
----
====
Note that further details regarding facet operation can be found in the http://docs.mongodb.org/manual/reference/operator/aggregation/facet/[`$facet` section] of the MongoDB Aggregation Framework reference documentation.
[[mongo.aggregation.projection.expressions]]
==== Spring Expression Support in Projection Expressions
As of Version 1.4.0 we support the use of SpEL expression in projection expressions via the `andExpression` method of the `ProjectionOperation` class. This allows you to define the desired expression as a SpEL expression which is translated into a corresponding MongoDB projection expression part on query execution. This makes it much easier to express complex calculations.
We support the use of SpEL expression in projection expressions via the `andExpression` method of the `ProjectionOperation` and `BucketOperation` classes. This allows you to define the desired expression as a SpEL expression which is translated into a corresponding MongoDB projection expression part on query execution. This makes it much easier to express complex calculations.
===== Complex calculations with SpEL expressions
@@ -2419,13 +2497,14 @@ NOTE: Collection creation allows customization via `CollectionOptions` and suppo
[[mongo-template.commands]]
== Executing Commands
You can also get at the MongoDB driver's `DB.command( )` method using the `executeCommand(…)` methods on `MongoTemplate`. These will also perform exception translation into Spring's `DataAccessException` hierarchy.
You can also get at the MongoDB driver's `MongoDatabase.runCommand( )` method using the `executeCommand(…)` methods on `MongoTemplate`. These will also perform exception translation into Spring's `DataAccessException` hierarchy.
[[mongo-template.commands.execution]]
=== Methods for executing commands
* `CommandResult` *executeCommand* `(Document command)` Execute a MongoDB command.
* `CommandResult` *executeCommand* `(String jsonCommand)` Execute the a MongoDB command expressed as a JSON string.
* `Document` *executeCommand* `(Document command)` Execute a MongoDB command.
* `Document` *executeCommand* `(Document command, ReadPreference readPreference)` Execute a MongoDB command using the given nullable MongoDB `ReadPreference`.
* `Document` *executeCommand* `(String jsonCommand)` Execute a MongoDB command expressed as a JSON string.
[[mongodb.mapping-usage.events]]
== Lifecycle Events
@@ -2602,7 +2681,7 @@ class GridFsClient {
@Test
public void findFilesInGridFs() {
List<GridFSDBFile> result = operations.find(query(whereFilename().is("filename.txt")))
GridFSFindIterable result = operations.find(query(whereFilename().is("filename.txt")));
}
}
----


@@ -1,6 +1,27 @@
Spring Data MongoDB Changelog
=============================
Changes in version 2.0.1.RELEASE (2017-10-27)
---------------------------------------------
* DATAMONGO-1815 - Adapt API changes in Property in test cases.
* DATAMONGO-1814 - Missing documentation on Faceted classification.
* DATAMONGO-1811 - Reference Documentation doesn't match with API Documentation 2.X version.
* DATAMONGO-1809 - Type hint usage broken when using positional parameters with more than one digit.
* DATAMONGO-1806 - GridFsResource wrong type in javaDoc.
* DATAMONGO-1805 - Documentation for operations.find uses wrong result type.
* DATAMONGO-1802 - No converter found capable of converting from type org.bson.types.Binary to type byte[].
* DATAMONGO-1795 - Remove obsolete Kotlin build configuration.
* DATAMONGO-1793 - Release 2.0.1 (Kay SR1).
* DATAMONGO-1696 - Reference documentation uses JPA Annotations.
Changes in version 1.10.8.RELEASE (2017-10-11)
----------------------------------------------
* DATAMONGO-1784 - Add support for AggregationExpression in GroupOperation.sum.
* DATAMONGO-1782 - CyclicPropertyReferenceException on index resolution.
* DATAMONGO-1775 - Release 1.10.8 (Ingalls SR8).
Changes in version 2.0.0.RELEASE (2017-10-02)
---------------------------------------------
* DATAMONGO-1791 - Adapt to changed Spring Framework 5 documentation structure.


@@ -1,4 +1,4 @@
Spring Data MongoDB 2.0 GA
Spring Data MongoDB 2.0.1
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").