Compare commits


17 Commits

Author SHA1 Message Date
Mark Paluch
cec6edfa26 DATAMONGO-1859 - Release version 2.0.4 (Kay SR4). 2018-02-19 19:46:53 +01:00
Mark Paluch
3261936e8a DATAMONGO-1859 - Prepare 2.0.4 (Kay SR4). 2018-02-19 19:46:05 +01:00
Mark Paluch
d2d471d135 DATAMONGO-1859 - Updated changelog. 2018-02-19 19:45:58 +01:00
Mark Paluch
bcd2de000c DATAMONGO-1870 - Polishing.
Extend copyright license years. Slightly reword documentation. Use IntStream and insertAll to create test fixture.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:14 +01:00
Christoph Strobl
c873e49d71 DATAMONGO-1870 - Consider skip/limit on MongoOperations.remove(Query, Class).
We now use an _id lookup for remove operations that query with limit or skip parameters. This allows more fine-grained control over the documents removed.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:05 +01:00
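The two-phase strategy from DATAMONGO-1870 can be sketched in plain Java: when a remove carries skip/limit, first run the query with an _id-only projection, then delete everything whose _id is in the collected set. This is an illustrative stand-in using in-memory maps, not the actual MongoTemplate code; names like `removeWithSkipLimit` are made up for the sketch.

```java
import java.util.*;
import java.util.stream.*;

public class LimitedRemoveSketch {

    // Hypothetical in-memory stand-in for a MongoDB collection; each
    // document is a map carrying an "_id" field.
    static List<Map<String, Object>> collection = new ArrayList<>();

    // Sketch of the fix: apply skip/limit to an _id-only projection first,
    // then issue the equivalent of deleteMany({_id: {$in: ids}}).
    static long removeWithSkipLimit(int skip, int limit) {
        Set<Object> ids = collection.stream()
                .skip(skip)
                .limit(limit)
                .map(doc -> doc.get("_id"))   // _id-only projection
                .collect(Collectors.toCollection(LinkedHashSet::new));
        long before = collection.size();
        collection.removeIf(doc -> ids.contains(doc.get("_id")));
        return before - collection.size();
    }

    public static void main(String[] args) {
        for (int i = 1; i <= 5; i++) {
            collection.add(new HashMap<>(Map.of("_id", i)));
        }
        System.out.println(removeWithSkipLimit(1, 2)); // deletes ids 2 and 3 -> 2
        System.out.println(collection.size());         // -> 3
    }
}
```

Without the _id indirection, `deleteMany` would ignore skip/limit entirely, which is why the commit routes limited removes through a query first.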
Christoph Strobl
4ebcac19bc DATAMONGO-1860 - Polishing.
Fix references to QuerydslPredicateExecutor.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
78212948bc DATAMONGO-1860 - Polishing.
Fix type references in Javadoc. Change lambdas to method references where applicable.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
38575baec1 DATAMONGO-1860 - Retrieve result count via QuerydslMongoPredicateExecutor only for paging.
We now use AbstractMongodbQuery.fetch() instead of AbstractMongodbQuery.fetchResults() to execute MongoDB queries. fetchResults() executes both a find(…) and a count(…) query. Retrieving the record count is an expensive operation in MongoDB and the count is not always required. For regular find(…) methods the count is ignored; for paging, the count(…) is only required in certain result/request scenarios.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
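The "count only in certain scenarios" idea behind DATAMONGO-1860 can be sketched as a small decision function: after fetching one page, a separate count query is only issued when the total cannot be derived from the page itself. The names below (`resolveTotal`, `countQuery`) are illustrative, not the actual Spring Data API.

```java
import java.util.function.LongSupplier;

public class PagedCountSketch {

    // Returns the total element count for a page request, running the
    // expensive count query only when the page result does not already
    // determine the total.
    static long resolveTotal(long offset, int pageSize, int fetched, LongSupplier countQuery) {
        // A partially filled page pins down the total, except for an empty
        // page beyond the first (which reveals nothing about the total).
        if (fetched < pageSize && (offset == 0 || fetched > 0)) {
            return offset + fetched;
        }
        return countQuery.getAsLong();   // expensive count(...) needed
    }

    public static void main(String[] args) {
        // Partially filled first page: no count query required.
        System.out.println(resolveTotal(0, 10, 7,
                () -> { throw new AssertionError("count should not run"); })); // -> 7
        // Completely filled page: the real total must come from count(...).
        System.out.println(resolveTotal(0, 10, 10, () -> 42)); // -> 42
    }
}
```

This is why switching from `fetchResults()` (which always counts) to `fetch()` plus an on-demand count saves a round trip for most executions.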
Mark Paluch
f1a3c37a79 DATAMONGO-1865 - Polishing.
Adapt to collection name retrieval during query execution. Slightly reword documentation and JavaDoc.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
Christoph Strobl
c668a47243 DATAMONGO-1865 - Avoid IncorrectResultSizeDataAccessException for derived findFirst/findTop queries.
We now return the first result when executing findFirst/findTop queries. This fixes a glitch introduced in the Kay release that threw IncorrectResultSizeDataAccessException for single-entity executions returning more than one result, which is explicitly not the desired behavior in this case.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
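The DATAMONGO-1865 distinction — a "limiting" derived query (findFirst/findTop) takes the first match, while a plain single-entity query insists on uniqueness — can be sketched as follows. This simulates the behavior with a plain list and an unchecked exception; it is not the Spring Data internals.

```java
import java.util.List;

public class SingleResultSketch {

    // limiting == true models findFirst/findTop (firstValue() semantics);
    // limiting == false models a unique lookup (oneValue() semantics).
    static <T> T execute(List<T> results, boolean limiting) {
        if (limiting) {
            return results.isEmpty() ? null : results.get(0);
        }
        if (results.size() > 1) {
            // stands in for IncorrectResultSizeDataAccessException
            throw new IllegalStateException("expected 1 result, got " + results.size());
        }
        return results.isEmpty() ? null : results.get(0);
    }

    public static void main(String[] args) {
        List<String> two = List.of("a", "b");
        System.out.println(execute(two, true));   // findFirst...  -> a
        try {
            execute(two, false);                  // findBy... must be unique
        } catch (IllegalStateException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

The pre-fix glitch was, in effect, always taking the `limiting == false` branch, so a `findFirstByLastname(…)` over multiple matches blew up instead of returning the first one.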
Mark Paluch
6a20ddf5a2 DATAMONGO-1871 - Polishing.
Migrate test to AssertJ.

Original pull request: #533.
2018-02-14 11:05:20 +01:00
Christoph Strobl
cec6526543 DATAMONGO-1871 - Fix AggregationExpression aliasing.
We now make sure to allow a nested property alias by setting the target.

Original pull request: #533.
2018-02-14 11:05:17 +01:00
Oliver Gierke
46ea58f3b9 DATAMONGO-1872 - Polishing.
Fixed @since tag for newly introduced method in MongoEntityMetadata.
2018-02-13 12:24:09 +01:00
Oliver Gierke
ebaea8d22f DATAMONGO-1872 - Repository query execution doesn't prematurely fix collection to be queried.
We now avoid calling ….inCollection(…) with a fixed, one-time calculated collection name to make sure we dynamically resolve the collections. That's necessary to make sure SpEL expressions in @Document are evaluated for every query execution.
2018-02-13 12:18:19 +01:00
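The essence of DATAMONGO-1872 is the difference between computing the collection name once at repository creation time and re-resolving it on every execution, which is what SpEL expressions in @Document require. The tenant-holder example below is entirely made up to illustrate the distinction; it is not Spring Data code.

```java
import java.util.function.Supplier;

public class DynamicCollectionSketch {

    // Hypothetical runtime state a SpEL expression might read,
    // e.g. @Document(collection = "#{...}") resolving per tenant.
    static String tenant = "acme";

    // Broken pattern: the name is fixed once, when the repository is built.
    static String cachedName = "persons_" + tenant;

    // Fixed pattern: the name is re-evaluated on each query execution.
    static Supplier<String> dynamicName = () -> "persons_" + tenant;

    public static void main(String[] args) {
        tenant = "globex";                       // state changes after setup
        System.out.println(cachedName);          // stale -> persons_acme
        System.out.println(dynamicName.get());   // fresh -> persons_globex
    }
}
```

Dropping the eager `….inCollection(…)` call in the diff below has exactly this effect: collection resolution moves from construction time to execution time.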
Christoph Strobl
ed6aaeed25 DATAMONGO-1794 - Updated changelog. 2018-02-06 11:14:01 +01:00
Mark Paluch
89b1b6fbb2 DATAMONGO-1830 - After release cleanups. 2018-01-24 13:46:10 +01:00
Mark Paluch
23769301b5 DATAMONGO-1830 - Prepare next development iteration. 2018-01-24 13:46:08 +01:00
36 changed files with 609 additions and 299 deletions


@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
</parent>
<modules>
@@ -27,7 +27,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.0.3.RELEASE</springdata.commons>
<springdata.commons>2.0.4.RELEASE</springdata.commons>
<mongo>3.5.0</mongo>
<mongo.reactivestreams>1.6.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -49,7 +49,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
</dependency>
<!-- reactive -->


@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.3.RELEASE</version>
<version>2.0.4.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -710,8 +710,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. Must not be {@literal null}.
@@ -723,8 +723,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. Must not be {@literal null}.
@@ -737,8 +737,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -754,8 +754,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -1083,6 +1083,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param query the query document that specifies the criteria used to remove a record.
* @param entityClass class that determines the collection to use.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal entityClass} is {@literal null}.
*/
DeleteResult remove(Query query, Class<?> entityClass);
@@ -1094,6 +1095,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param entityClass class of the pojo to be operated on. Can be {@literal null}.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query}, {@literal entityClass} or {@literal collectionName} is
* {@literal null}.
*/
DeleteResult remove(Query query, Class<?> entityClass, String collectionName);
@@ -1106,6 +1109,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param query the query document that specifies the criteria used to remove a record.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal collectionName} is {@literal null}.
*/
DeleteResult remove(Query query, String collectionName);


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,26 +18,14 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import com.mongodb.*;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.*;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Scanner;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.bson.Document;
@@ -128,6 +116,14 @@ import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Cursor;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.AggregateIterable;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MapReduceIterable;
@@ -1587,13 +1583,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
protected <T> DeleteResult doRemove(final String collectionName, final Query query,
@Nullable final Class<T> entityClass) {
Assert.notNull(query, "Query must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
}
final Document queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
final Document queryObject = queryMapper.getMappedObject(query.getQueryObject(), entity);
return execute(collectionName, new CollectionCallback<DeleteResult>() {
@@ -1602,7 +1596,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass, collectionName));
Document mappedQuery = queryMapper.getMappedObject(queryObject, entity);
Document removeQuery = queryObject;
DeleteOptions options = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
@@ -1615,14 +1609,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
DeleteResult dr = null;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(mappedQuery), collectionName });
new Object[] { serializeToJsonSafely(removeQuery), collectionName });
}
if (query.getLimit() > 0 || query.getSkip() > 0) {
MongoCursor<Document> cursor = new QueryCursorPreparer(query, entityClass)
.prepare(collection.find(removeQuery).projection(new Document(ID_FIELD, 1))).iterator();
Set<Object> ids = new LinkedHashSet<>();
while (cursor.hasNext()) {
ids.add(cursor.next().get(ID_FIELD));
}
removeQuery = new Document(ID_FIELD, new Document("$in", ids));
}
if (writeConcernToUse == null) {
dr = collection.deleteMany(mappedQuery, options);
dr = collection.deleteMany(removeQuery, options);
} else {
dr = collection.withWriteConcern(writeConcernToUse).deleteMany(mappedQuery, options);
dr = collection.withWriteConcern(writeConcernToUse).deleteMany(removeQuery, options);
}
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName));
@@ -3097,8 +3104,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
Document aggregate(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
Document command = prepareAggregationCommand(collectionName, aggregation,
context, batchSize);
Document command = prepareAggregationCommand(collectionName, aggregation, context, batchSize);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));


@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -125,6 +125,7 @@ import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
@@ -1611,27 +1612,39 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return execute(collectionName, collection -> {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass, collectionName));
Document removeQuey = queryMapper.getMappedObject(queryObject, entity);
Document dboq = queryMapper.getMappedObject(queryObject, entity);
maybeEmitEvent(new BeforeDeleteEvent<T>(removeQuey, entityClass, collectionName));
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName, entityClass,
null, queryObject);
null, removeQuey);
final DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
MongoCollection<Document> collectionToUse = prepareCollection(collection, writeConcernToUse);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(dboq), collectionName });
new Object[] { serializeToJsonSafely(removeQuey), collectionName });
}
query.getCollation().ifPresent(val -> {
if (query.getLimit() > 0 || query.getSkip() > 0) {
// TODO: add collation support as soon as it's there! See https://jira.mongodb.org/browse/JAVARS-27
throw new IllegalArgumentException("DeleteMany does currently not accept collation settings.");
});
FindPublisher<Document> cursor = new QueryFindPublisherPreparer(query, entityClass)
.prepare(collection.find(removeQuey)) //
.projection(new Document(ID_FIELD, 1));
return collectionToUse.deleteMany(dboq);
return Flux.from(cursor) //
.map(doc -> doc.get(ID_FIELD)) //
.collectList() //
.flatMapMany(val -> {
return collectionToUse.deleteMany(new Document(ID_FIELD, new Document("$in", val)), deleteOptions);
});
} else {
return collectionToUse.deleteMany(removeQuey, deleteOptions);
}
}).doOnNext(deleteResult -> maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName)))
.next();


@@ -455,7 +455,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
}
if (value instanceof AggregationExpression) {
return this.operation.and(new ExpressionProjection(Fields.field(alias), (AggregationExpression) value));
return this.operation.and(new ExpressionProjection(Fields.field(alias, alias), (AggregationExpression) value));
}
return this.operation.and(new FieldProjection(Fields.field(alias, getRequiredName()), null));


@@ -15,8 +15,9 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithProjection;
import org.springframework.data.mongodb.core.ExecutableFindOperation.ExecutableFind;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.MongoQueryExecution.DeleteExecution;
@@ -27,7 +28,6 @@ import org.springframework.data.mongodb.repository.query.MongoQueryExecution.Sli
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.repository.query.ReturnedType;
import org.springframework.util.Assert;
/**
@@ -42,7 +42,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private final MongoQueryMethod method;
private final MongoOperations operations;
private final FindWithProjection<?> findOperationWithProjection;
private final ExecutableFind<?> executableFind;
/**
* Creates a new {@link AbstractMongoQuery} from the given {@link MongoQueryMethod} and {@link MongoOperations}.
@@ -58,11 +58,10 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
this.method = method;
this.operations = operations;
ReturnedType returnedType = method.getResultProcessor().getReturnedType();
MongoEntityMetadata<?> metadata = method.getEntityInformation();
Class<?> type = metadata.getCollectionEntity().getType();
this.findOperationWithProjection = operations//
.query(returnedType.getDomainType())//
.inCollection(method.getEntityInformation().getCollectionName());
this.executableFind = operations.query(type);
}
/*
@@ -90,8 +89,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Class<?> typeToRead = processor.getReturnedType().getTypeToRead();
FindWithQuery<?> find = typeToRead == null //
? findOperationWithProjection //
: findOperationWithProjection.as(typeToRead);
? executableFind //
: executableFind.as(typeToRead);
MongoQueryExecution execution = getExecution(accessor, find);
@@ -119,7 +118,11 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
} else if (isExistsQuery()) {
return q -> operation.matching(q).exists();
} else {
return q -> operation.matching(q).oneValue();
return q -> {
TerminatingFind<?> find = operation.matching(q);
return isLimiting() ? find.firstValue() : find.oneValue();
};
}
}
@@ -174,4 +177,12 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* @since 1.5
*/
protected abstract boolean isDeleteQuery();
/**
* Return whether the query has an explicit limit set.
*
* @return
* @since 2.0.4
*/
protected abstract boolean isLimiting();
}


@@ -22,18 +22,20 @@ import org.reactivestreams.Publisher;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.CollectionExecution;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.DeleteExecution;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.GeoNearExecution;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.ResultProcessingConverter;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.ResultProcessingExecution;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.SingleEntityExecution;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.TailExecution;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.repository.query.ReturnedType;
import org.springframework.util.Assert;
/**
@@ -48,6 +50,7 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
private final ReactiveMongoQueryMethod method;
private final ReactiveMongoOperations operations;
private final EntityInstantiators instantiators;
private final FindWithProjection<?> findOperationWithProjection;
/**
* Creates a new {@link AbstractReactiveMongoQuery} from the given {@link MongoQueryMethod} and
@@ -64,6 +67,11 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
this.method = method;
this.operations = operations;
this.instantiators = new EntityInstantiators();
MongoEntityMetadata<?> metadata = method.getEntityInformation();
Class<?> type = metadata.getCollectionEntity().getType();
this.findOperationWithProjection = operations.query(type);
}
/*
@@ -103,10 +111,16 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
applyQueryMetaAttributesWhenPresent(query);
ResultProcessor processor = method.getResultProcessor().withDynamicProjection(parameterAccessor);
Class<?> typeToRead = processor.getReturnedType().getTypeToRead();
FindWithQuery<?> find = typeToRead == null //
? findOperationWithProjection //
: findOperationWithProjection.as(typeToRead);
String collection = method.getEntityInformation().getCollectionName();
ReactiveMongoQueryExecution execution = getExecution(query, parameterAccessor,
new ResultProcessingConverter(processor, operations, instantiators));
new ResultProcessingConverter(processor, operations, instantiators), find);
return execution.execute(query, processor.getReturnedType().getDomainType(), collection);
}
@@ -120,11 +134,11 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @return
*/
private ReactiveMongoQueryExecution getExecution(Query query, MongoParameterAccessor accessor,
Converter<Object, Object> resultProcessing) {
return new ResultProcessingExecution(getExecutionToWrap(accessor), resultProcessing);
Converter<Object, Object> resultProcessing, FindWithQuery<?> operation) {
return new ResultProcessingExecution(getExecutionToWrap(accessor, operation), resultProcessing);
}
private ReactiveMongoQueryExecution getExecutionToWrap(MongoParameterAccessor accessor) {
private ReactiveMongoQueryExecution getExecutionToWrap(MongoParameterAccessor accessor, FindWithQuery<?> operation) {
if (isDeleteQuery()) {
return new DeleteExecution(operations, method);
@@ -133,9 +147,20 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
} else if (isTailable(method)) {
return new TailExecution(operations, accessor.getPageable());
} else if (method.isCollectionQuery()) {
return new CollectionExecution(operations, accessor.getPageable());
return (q, t, c) -> operation.matching(q.with(accessor.getPageable())).all();
} else if (isCountQuery()) {
return (q, t, c) -> operation.matching(q).count();
} else {
return new SingleEntityExecution(operations, isCountQuery());
return (q, t, c) -> {
TerminatingFind<?> find = operation.matching(q);
if (isCountQuery()) {
return find.count();
}
return isLimiting() ? find.first() : find.one();
};
}
}
@@ -186,4 +211,12 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @since 1.5
*/
protected abstract boolean isDeleteQuery();
/**
* Return whether the query has an explicit limit set.
*
* @return
* @since 2.0.4
*/
protected abstract boolean isLimiting();
}


@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.repository.core.EntityMetadata;
/**
@@ -30,4 +31,12 @@ public interface MongoEntityMetadata<T> extends EntityMetadata<T> {
* @return
*/
String getCollectionName();
/**
* Returns the {@link MongoPersistentEntity} that supposed to determine the collection to be queried.
*
* @return
* @since 2.0.4
*/
MongoPersistentEntity<?> getCollectionEntity();
}


@@ -160,4 +160,13 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
protected boolean isDeleteQuery() {
return tree.isDelete();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isLimiting()
*/
@Override
protected boolean isLimiting() {
return tree.isLimiting();
}
}


@@ -44,29 +44,13 @@ import com.mongodb.client.result.DeleteResult;
* various flavors.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
interface ReactiveMongoQueryExecution {
Object execute(Query query, Class<?> type, String collection);
/**
* {@link ReactiveMongoQueryExecution} for collection returning queries.
*
* @author Mark Paluch
*/
@RequiredArgsConstructor
final class CollectionExecution implements ReactiveMongoQueryExecution {
private final @NonNull ReactiveMongoOperations operations;
private final Pageable pageable;
@Override
public Object execute(Query query, Class<?> type, String collection) {
return operations.find(query.with(pageable), type, collection);
}
}
/**
* {@link ReactiveMongoQueryExecution} for collection returning queries using tailable cursors.
*
@@ -84,23 +68,6 @@ interface ReactiveMongoQueryExecution {
}
}
/**
* {@link ReactiveMongoQueryExecution} to return a single entity.
*
* @author Mark Paluch
*/
@RequiredArgsConstructor
final class SingleEntityExecution implements ReactiveMongoQueryExecution {
private final ReactiveMongoOperations operations;
private final boolean countProjection;
@Override
public Object execute(Query query, Class<?> type, String collection) {
return countProjection ? operations.count(query, type, collection) : operations.findOne(query, type, collection);
}
}
/**
* {@link MongoQueryExecution} to execute geo-near queries.
*


@@ -118,7 +118,7 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createCountQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createCountQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
@@ -127,7 +127,7 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
@@ -136,10 +136,19 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isDeleteQuery()
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isDeleteQuery()
*/
@Override
protected boolean isDeleteQuery() {
return tree.isDelete();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isLimiting()
*/
@Override
protected boolean isLimiting() {
return tree.isLimiting();
}
}


@@ -104,7 +104,7 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
@@ -125,7 +125,7 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
@@ -134,11 +134,20 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isDeleteQuery()
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isDeleteQuery()
*/
@Override
protected boolean isDeleteQuery() {
return this.isDeleteQuery;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isLimiting()
*/
@Override
protected boolean isLimiting() {
return false;
}
}


@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.Getter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
@@ -26,7 +28,7 @@ import org.springframework.util.Assert;
class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
private final Class<T> type;
private final MongoPersistentEntity<?> collectionEntity;
private final @Getter MongoPersistentEntity<?> collectionEntity;
/**
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and {@link MongoPersistentEntity} to use for

View File

@@ -174,6 +174,15 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return countBooleanValues(isCountQuery, isExistsQuery, isDeleteQuery) > 1;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isLimiting()
*/
@Override
protected boolean isLimiting() {
return false;
}
private static int countBooleanValues(boolean... values) {
int count = 0;

View File
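The `countBooleanValues(isCountQuery, isExistsQuery, isDeleteQuery) > 1` guard above rejects `@Query` methods flagged as more than one of count, exists, or delete. A minimal standalone sketch of that varargs counting check (class name is illustrative, not the Spring Data source):

```java
// Sketch of the boolean-counting guard used to reject conflicting
// count/exists/delete query flags. Names here are illustrative.
public class QueryFlagGuard {

    // Counts how many of the given flags are set.
    static int countBooleanValues(boolean... values) {
        int count = 0;
        for (boolean value : values) {
            if (value) {
                count++;
            }
        }
        return count;
    }

    // A query may be at most one of count, exists, or delete.
    static boolean hasConflictingFlags(boolean isCount, boolean isExists, boolean isDelete) {
        return countBooleanValues(isCount, isExists, isDelete) > 1;
    }

    public static void main(String[] args) {
        System.out.println(hasConflictingFlags(true, false, false)); // single flag: fine
        System.out.println(hasConflictingFlags(true, true, false));  // conflict: rejected
    }
}
```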

@@ -59,7 +59,7 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
/**
* Creates a new {@link QuerydslMongoPredicateExecutor} for the given {@link MongoEntityInformation} and
* {@link MongoTemplate}. Uses the {@link SimpleEntityPathResolver} to create an {@link EntityPath} for the given
* {@link MongoOperations}. Uses the {@link SimpleEntityPathResolver} to create an {@link EntityPath} for the given
* domain class.
*
* @param entityInformation must not be {@literal null}.
@@ -72,7 +72,7 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
/**
* Creates a new {@link QuerydslMongoPredicateExecutor} for the given {@link MongoEntityInformation},
* {@link MongoTemplate} and {@link EntityPathResolver}.
* {@link MongoOperations} and {@link EntityPathResolver}.
*
* @param entityInformation must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
@@ -108,19 +108,19 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate)
*/
@Override
public List<T> findAll(Predicate predicate) {
Assert.notNull(predicate, "Predicate must not be null!");
return createQueryFor(predicate).fetchResults().getResults();
return createQueryFor(predicate).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, com.mysema.query.types.OrderSpecifier<?>[])
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate, com.querydsl.core.types.OrderSpecifier<?>[])
*/
@Override
public List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
@@ -128,12 +128,12 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
Assert.notNull(predicate, "Predicate must not be null!");
Assert.notNull(orders, "Order specifiers must not be null!");
return createQueryFor(predicate).orderBy(orders).fetchResults().getResults();
return createQueryFor(predicate).orderBy(orders).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, org.springframework.data.domain.Sort)
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate, org.springframework.data.domain.Sort)
*/
@Override
public List<T> findAll(Predicate predicate, Sort sort) {
@@ -141,24 +141,24 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
Assert.notNull(predicate, "Predicate must not be null!");
Assert.notNull(sort, "Sort must not be null!");
return applySorting(createQueryFor(predicate), sort).fetchResults().getResults();
return applySorting(createQueryFor(predicate), sort).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.OrderSpecifier[])
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#findAll(com.querydsl.core.types.OrderSpecifier[])
*/
@Override
public Iterable<T> findAll(OrderSpecifier<?>... orders) {
Assert.notNull(orders, "Order specifiers must not be null!");
return createQuery().orderBy(orders).fetchResults().getResults();
return createQuery().orderBy(orders).fetch();
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#findAll(com.mysema.query.types.Predicate, org.springframework.data.domain.Pageable)
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#findAll(com.querydsl.core.types.Predicate, org.springframework.data.domain.Pageable)
*/
@Override
public Page<T> findAll(Predicate predicate, Pageable pageable) {
@@ -168,13 +168,12 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
AbstractMongodbQuery<T, SpringDataMongodbQuery<T>> query = createQueryFor(predicate);
return PageableExecutionUtils.getPage(applyPagination(query, pageable).fetchResults().getResults(), pageable,
() -> createQueryFor(predicate).fetchCount());
return PageableExecutionUtils.getPage(applyPagination(query, pageable).fetch(), pageable, query::fetchCount);
}
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#count(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#count(com.querydsl.core.types.Predicate)
*/
@Override
public long count(Predicate predicate) {
@@ -186,7 +185,7 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
/*
* (non-Javadoc)
* @see org.springframework.data.querydsl.QueryDslPredicateExecutor#exists(com.mysema.query.types.Predicate)
* @see org.springframework.data.querydsl.QuerydslPredicateExecutor#exists(com.querydsl.core.types.Predicate)
*/
@Override
public boolean exists(Predicate predicate) {
@@ -197,7 +196,7 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
}
/**
* Creates a {@link MongodbQuery} for the given {@link Predicate}.
* Creates an {@link AbstractMongodbQuery} for the given {@link Predicate}.
*
* @param predicate
* @return
@@ -207,12 +206,12 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
}
/**
* Creates a {@link MongodbQuery}.
* Creates an {@link AbstractMongodbQuery}.
*
* @return
*/
private AbstractMongodbQuery<T, SpringDataMongodbQuery<T>> createQuery() {
return new SpringDataMongodbQuery<T>(mongoOperations, entityInformation.getJavaType());
return new SpringDataMongodbQuery<>(mongoOperations, entityInformation.getJavaType());
}
/**
@@ -248,13 +247,13 @@ public class QuerydslMongoPredicateExecutor<T> implements QuerydslPredicateExecu
return query;
}
sort.stream().map(this::toOrder).forEach(it -> query.orderBy(it));
sort.stream().map(this::toOrder).forEach(query::orderBy);
return query;
}
/**
* Transforms a plain {@link Order} into a QueryDsl specific {@link OrderSpecifier}.
* Transforms a plain {@link Order} into a Querydsl specific {@link OrderSpecifier}.
*
* @param order
* @return

View File
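The switch from `fetchResults()` to `fetch()` above avoids an unconditional `count(…)` round-trip: for plain `findAll(…)` no count is needed at all, and for paging `PageableExecutionUtils.getPage(…)` invokes `query::fetchCount` only when the page contents cannot determine the total by themselves. A simplified model of that lazy-count decision (method and class names are illustrative, not the Spring Data API):

```java
import java.util.List;
import java.util.function.LongSupplier;

// Simplified model of the lazy-count decision behind
// PageableExecutionUtils.getPage: the count supplier runs only when the
// fetched page cannot determine the total on its own.
public class LazyCountSketch {

    // Resolves the total element count, calling countSupplier only if needed.
    static long resolveTotal(List<?> content, long offset, int pageSize, LongSupplier countSupplier) {
        if (offset == 0 && content.size() < pageSize) {
            return content.size(); // first page, not full: total == content size
        }
        if (!content.isEmpty() && content.size() < pageSize) {
            return offset + content.size(); // partially filled later page: offset + rows
        }
        return countSupplier.getAsLong(); // full page: must issue a count(...) query
    }

    public static void main(String[] args) {
        // First page holding 3 of up to 10 requested items: no count query needed.
        long total = resolveTotal(List.of(1, 2, 3), 0, 10, () -> {
            throw new AssertionError("count should not run");
        });
        System.out.println(total);
    }
}
```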

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,18 +33,9 @@ import lombok.NoArgsConstructor;
import java.lang.reflect.InvocationTargetException;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.UUID;
import java.util.*;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import org.bson.types.ObjectId;
import org.hamcrest.collection.IsMapContaining;
@@ -108,6 +99,7 @@ import com.mongodb.client.FindIterable;
import com.mongodb.client.ListIndexesIterable;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
/**
@@ -3286,6 +3278,38 @@ public class MongoTemplateTests {
assertThat(template.find(new Query().limit(1), Sample.class)).hasSize(1);
}
@Test // DATAMONGO-1870
public void removeShouldConsiderLimit() {
List<Sample> samples = IntStream.range(0, 100) //
.mapToObj(i -> new Sample("id-" + i, i % 2 == 0 ? "stark" : "lannister")) //
.collect(Collectors.toList());
template.insertAll(samples);
DeleteResult wr = template.remove(query(where("field").is("lannister")).limit(25), Sample.class);
assertThat(wr.getDeletedCount()).isEqualTo(25L);
assertThat(template.count(new Query(), Sample.class)).isEqualTo(75L);
}
@Test // DATAMONGO-1870
public void removeShouldConsiderSkipAndSort() {
List<Sample> samples = IntStream.range(0, 100) //
.mapToObj(i -> new Sample("id-" + i, i % 2 == 0 ? "stark" : "lannister")) //
.collect(Collectors.toList());
template.insertAll(samples);
DeleteResult wr = template.remove(new Query().skip(25).with(Sort.by("field")), Sample.class);
assertThat(wr.getDeletedCount()).isEqualTo(75L);
assertThat(template.count(new Query(), Sample.class)).isEqualTo(25L);
assertThat(template.count(query(where("field").is("lannister")), Sample.class)).isEqualTo(25L);
assertThat(template.count(query(where("field").is("stark")), Sample.class)).isEqualTo(0L);
}
static class TypeWithNumbers {
@Id String id;

View File
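The `removeShouldConsiderLimit`/`removeShouldConsiderSkipAndSort` tests above exercise the DATAMONGO-1870 change: MongoDB's delete has no native skip/limit, so the template first runs a `find` with the skip/limit applied, projecting only `_id`, then removes `{ _id: { $in: [...] } }`. This in-memory simulation sketches that two-step flow (the collection model and method names are illustrative, not the template internals):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of the _id-lookup remove strategy: find matching ids with the
// limit applied, then delete by id. Simulated over an in-memory map of
// _id -> field value; names are illustrative.
public class RemoveWithLimitSketch {

    static long removeWithLimit(Map<String, String> collection, String fieldValue, int limit) {
        // Step 1: find(query).limit(limit), projecting only _id
        List<String> ids = collection.entrySet().stream()
                .filter(e -> e.getValue().equals(fieldValue))
                .limit(limit)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
        // Step 2: remove({ _id: { $in: ids } })
        ids.forEach(collection::remove);
        return ids.size();
    }

    public static void main(String[] args) {
        Map<String, String> collection = new LinkedHashMap<>();
        for (int i = 0; i < 100; i++) {
            collection.put("id-" + i, i % 2 == 0 ? "stark" : "lannister");
        }
        System.out.println(removeWithLimit(collection, "lannister", 25)); // 25 deleted
        System.out.println(collection.size());                            // 75 remain
    }
}
```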

@@ -58,7 +58,6 @@ import org.springframework.data.domain.Sort;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
@@ -71,6 +70,7 @@ import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -155,7 +155,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
new MongoTemplate(null, "database");
}
@Test(expected = DataAccessException.class)
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1870
public void removeHandlesMongoExceptionProperly() throws Exception {
MongoTemplate template = mockOutGetDb();
@@ -484,7 +484,6 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
when(output.iterator()).thenReturn(cursor);
when(cursor.hasNext()).thenReturn(false);
when(collection.mapReduce(anyString(), anyString())).thenReturn(output);
template.mapReduce("collection", "function(){}", "function(key,values){}", new MapReduceOptions().limit(1000),

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,7 +34,10 @@ import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import org.assertj.core.api.Assertions;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.junit.After;
@@ -298,9 +301,12 @@ public class ReactiveMongoTemplateTests {
public void updateFirstByEntityTypeShouldUpdateObject() {
Person person = new Person("Oliver2", 25);
StepVerifier.create(template.insert(person) //
.then(template.updateFirst(new Query(where("age").is(25)), new Update().set("firstName", "Sven"), Person.class)) //
.flatMapMany(p -> template.find(new Query(where("age").is(25)), Person.class))).consumeNextWith(actual -> {
StepVerifier
.create(template.insert(person) //
.then(template.updateFirst(new Query(where("age").is(25)), new Update().set("firstName", "Sven"),
Person.class)) //
.flatMapMany(p -> template.find(new Query(where("age").is(25)), Person.class)))
.consumeNextWith(actual -> {
assertThat(actual.getFirstName(), is(equalTo("Sven")));
}).verifyComplete();
@@ -481,7 +487,7 @@ public class ReactiveMongoTemplateTests {
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1774
public void removeWithNullShouldThrowError() {
template.remove((Object)null).subscribe();
template.remove((Object) null).subscribe();
}
@Test // DATAMONGO-1774
@@ -924,6 +930,37 @@ public class ReactiveMongoTemplateTests {
assertThat(documents.poll(1, TimeUnit.SECONDS), is(nullValue()));
}
@Test // DATAMONGO-1870
public void removeShouldConsiderLimit() {
List<Sample> samples = IntStream.range(0, 100) //
.mapToObj(i -> new Sample("id-" + i, i % 2 == 0 ? "stark" : "lannister")) //
.collect(Collectors.toList());
StepVerifier.create(template.insertAll(samples)).expectNextCount(100).verifyComplete();
StepVerifier.create(template.remove(query(where("field").is("lannister")).limit(25), Sample.class))
.assertNext(wr -> Assertions.assertThat(wr.getDeletedCount()).isEqualTo(25L)).verifyComplete();
}
@Test // DATAMONGO-1870
public void removeShouldConsiderSkipAndSort() {
List<Sample> samples = IntStream.range(0, 100) //
.mapToObj(i -> new Sample("id-" + i, i % 2 == 0 ? "stark" : "lannister")) //
.collect(Collectors.toList());
StepVerifier.create(template.insertAll(samples)).expectNextCount(100).verifyComplete();
StepVerifier.create(template.remove(new Query().skip(25).with(Sort.by("field")), Sample.class))
.assertNext(wr -> Assertions.assertThat(wr.getDeletedCount()).isEqualTo(75L)).verifyComplete();
StepVerifier.create(template.count(query(where("field").is("lannister")), Sample.class)).expectNext(25L)
.verifyComplete();
StepVerifier.create(template.count(query(where("field").is("stark")), Sample.class)).expectNext(0L)
.verifyComplete();
}
private PersonWithAList createPersonWithAList(String firstname, int age) {
PersonWithAList p = new PersonWithAList();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -168,16 +168,16 @@ public class ReactiveMongoTemplateUnitTests {
assertThat(options.getValue().getCollation().getLocale(), is("fr"));
}
@Ignore("see https://jira.mongodb.org/browse/JAVARS-27")
@Test // DATAMONGO-1518
public void findAndRemoveManyShouldUseCollationWhenPresent() {
when(collection.deleteMany(any(), any())).thenReturn(Mono.empty());
template.doRemove("collection-1", new BasicQuery("{}").collation(Collation.of("fr")), AutogenerateableId.class)
.subscribe();
ArgumentCaptor<DeleteOptions> options = ArgumentCaptor.forClass(DeleteOptions.class);
// the current mongodb-driver-reactivestreams:1.4.0 driver does not offer deleteMany with options.
// verify(collection).deleteMany(Mockito.any(), options.capture());
verify(collection).deleteMany(Mockito.any(), options.capture());
assertThat(options.getValue().getCollation().getLocale(), is("fr"));
}

View File

@@ -15,12 +15,10 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.DocumentTestUtils.*;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import static org.springframework.data.mongodb.test.util.Assertions.*;
import java.util.ArrayList;
import java.util.Arrays;
@@ -33,7 +31,6 @@ import org.junit.rules.ExpectedException;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.test.util.BasicDbListBuilder;
/**
* Unit tests for {@link Aggregation}.
@@ -119,10 +116,9 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document unwind = ((List<Document>) agg.get("pipeline")).get(0);
assertThat((Document) unwind.get("$unwind"),
isBsonObject(). //
containing("includeArrayIndex", "x").//
containing("preserveNullAndEmptyArrays", true));
assertThat(unwind.get("$unwind", Document.class)). //
containsEntry("includeArrayIndex", "x"). //
containsEntry("preserveNullAndEmptyArrays", true);
}
@Test // DATAMONGO-1391
@@ -133,8 +129,9 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document unwind = ((List<Document>) agg.get("pipeline")).get(0);
assertThat(unwind,
isBsonObject().notContaining("includeArrayIndex").containing("preserveNullAndEmptyArrays", true));
assertThat(unwind) //
.doesNotContainKey("$unwind.includeArrayIndex") //
.containsEntry("$unwind.preserveNullAndEmptyArrays", true);
}
@Test // DATAMONGO-1550
@@ -146,7 +143,7 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document unwind = ((List<Document>) agg.get("pipeline")).get(0);
assertThat(unwind, isBsonObject().containing("$replaceRoot.newRoot", new Document("field", "value")));
assertThat(unwind).containsEntry("$replaceRoot.newRoot", new Document("field", "value"));
}
@Test // DATAMONGO-753
@@ -171,8 +168,8 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document secondProjection = ((List<Document>) agg.get("pipeline")).get(2);
Document fields = getAsDocument(secondProjection, "$project");
assertThat(fields.get("aCnt"), is((Object) 1));
assertThat(fields.get("a"), is((Object) "$_id.a"));
assertThat(fields.get("aCnt")).isEqualTo(1);
assertThat(fields.get("a")).isEqualTo("$_id.a");
}
@Test // DATAMONGO-791
@@ -188,14 +185,14 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document secondProjection = ((List<Document>) agg.get("pipeline")).get(2);
Document fields = getAsDocument(secondProjection, "$project");
assertThat(fields.get("aCnt"), is((Object) 1));
assertThat(fields.get("a"), is((Object) "$_id.a"));
assertThat(fields.get("aCnt")).isEqualTo(1);
assertThat(fields.get("a")).isEqualTo("$_id.a");
}
@Test // DATAMONGO-791
public void allowTypedAggregationOperationsToBePassedAsIterable() {
List<AggregationOperation> ops = new ArrayList<AggregationOperation>();
List<AggregationOperation> ops = new ArrayList<>();
ops.add(project("a"));
ops.add(group("a").count().as("aCnt"));
ops.add(project("aCnt", "a"));
@@ -205,8 +202,8 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document secondProjection = ((List<Document>) agg.get("pipeline")).get(2);
Document fields = getAsDocument(secondProjection, "$project");
assertThat(fields.get("aCnt"), is((Object) 1));
assertThat(fields.get("a"), is((Object) "$_id.a"));
assertThat(fields.get("aCnt")).isEqualTo((Object) 1);
assertThat(fields.get("a")).isEqualTo((Object) "$_id.a");
}
@Test // DATAMONGO-838
@@ -220,7 +217,7 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document secondProjection = ((List<Document>) agg.get("pipeline")).get(1);
Document fields = getAsDocument(secondProjection, "$group");
assertThat(fields.get("foosum"), is((Object) new Document("$sum", "$foo")));
assertThat(fields.get("foosum")).isEqualTo(new Document("$sum", "$foo"));
}
@Test // DATAMONGO-908
@@ -232,13 +229,13 @@ public class AggregationUnitTests {
group("cmsParameterId", "rules.ruleType").count().as("totol") //
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg, is(notNullValue()));
assertThat(agg).isNotNull();
Document group = ((List<Document>) agg.get("pipeline")).get(2);
Document fields = getAsDocument(group, "$group");
Document id = getAsDocument(fields, "_id");
assertThat(id.get("ruleType"), is((Object) "$rules.ruleType"));
assertThat(id.get("ruleType")).isEqualTo("$rules.ruleType");
}
@Test // DATAMONGO-1585
@@ -249,11 +246,12 @@ public class AggregationUnitTests {
sort(Direction.ASC, "cmsParameterId", "titles") //
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg, is(notNullValue()));
assertThat(agg).isNotNull();
Document sort = ((List<Document>) agg.get("pipeline")).get(1);
assertThat(getAsDocument(sort, "$sort"), is(Document.parse("{ \"_id.cmsParameterId\" : 1 , \"titles\" : 1}")));
assertThat(getAsDocument(sort, "$sort"))
.isEqualTo(Document.parse("{ \"_id.cmsParameterId\" : 1 , \"titles\" : 1}"));
}
@Test // DATAMONGO-1585
@@ -266,14 +264,13 @@ public class AggregationUnitTests {
sort(Direction.ASC, "cmsParameterId", "titles", "alias") //
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg, is(notNullValue()));
assertThat(agg).isNotNull();
Document sort = ((List<Document>) agg.get("pipeline")).get(1);
assertThat(getAsDocument(sort, "$sort"),
isBsonObject().containing("cmsParameterId", 1) //
.containing("titles", 1) //
.containing("alias", 1));
assertThat(getAsDocument(sort, "$sort")).containsEntry("cmsParameterId", 1) //
.containsEntry("titles", 1) //
.containsEntry("alias", 1);
}
@Test // DATAMONGO-924
@@ -285,10 +282,10 @@ public class AggregationUnitTests {
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document projection0 = extractPipelineElement(agg, 0, "$project");
assertThat(projection0, is((Document) new Document("ba", "$foo.bar")));
assertThat(projection0).isEqualTo(new Document("ba", "$foo.bar"));
Document projection1 = extractPipelineElement(agg, 1, "$project");
assertThat(projection1, is((Document) new Document("b", "$ba")));
assertThat(projection1).isEqualTo(new Document("b", "$ba"));
}
@Test // DATAMONGO-960
@@ -298,8 +295,8 @@ public class AggregationUnitTests {
project().and("a").as("aa") //
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg,
is(Document.parse("{ \"aggregate\" : \"foo\" , \"pipeline\" : [ { \"$project\" : { \"aa\" : \"$a\"}}]}")));
assertThat(agg).isEqualTo(
Document.parse("{ \"aggregate\" : \"foo\" , \"pipeline\" : [ { \"$project\" : { \"aa\" : \"$a\"}}]}"));
}
@Test // DATAMONGO-960
@@ -314,13 +311,12 @@ public class AggregationUnitTests {
.withOptions(aggregationOptions) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
assertThat(agg,
is(Document.parse("{ \"aggregate\" : \"foo\" , " //
+ "\"pipeline\" : [ { \"$project\" : { \"aa\" : \"$a\"}}] , " //
+ "\"allowDiskUse\" : true , " //
+ "\"explain\" : true , " //
+ "\"cursor\" : { \"foo\" : 1}}") //
));
assertThat(agg).isEqualTo(Document.parse("{ \"aggregate\" : \"foo\" , " //
+ "\"pipeline\" : [ { \"$project\" : { \"aa\" : \"$a\"}}] , " //
+ "\"allowDiskUse\" : true , " //
+ "\"explain\" : true , " //
+ "\"cursor\" : { \"foo\" : 1}}") //
);
}
@Test // DATAMONGO-954, DATAMONGO-1585
@@ -335,13 +331,13 @@ public class AggregationUnitTests {
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document projection0 = extractPipelineElement(agg, 0, "$project");
assertThat(projection0, is((Document) new Document("someKey", 1).append("a1", "$a").append("a2", "$$CURRENT.a")));
assertThat(projection0).isEqualTo(new Document("someKey", 1).append("a1", "$a").append("a2", "$$CURRENT.a"));
Document sort = extractPipelineElement(agg, 1, "$sort");
assertThat(sort, is((Document) new Document("a1", -1)));
assertThat(sort).isEqualTo(new Document("a1", -1));
Document group = extractPipelineElement(agg, 2, "$group");
assertThat(group, is((Document) new Document("_id", "$someKey").append("doc", new Document("$first", "$$ROOT"))));
assertThat(group).isEqualTo(new Document("_id", "$someKey").append("doc", new Document("$first", "$$ROOT")));
}
@Test // DATAMONGO-1254
@@ -355,7 +351,7 @@ public class AggregationUnitTests {
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document group = extractPipelineElement(agg, 1, "$group");
assertThat(getAsDocument(group, "count"), is(new Document().append("$sum", "$tags_count")));
assertThat(getAsDocument(group, "count")).isEqualTo(new Document().append("$sum", "$tags_count"));
}
@Test // DATAMONGO-1254
@@ -369,7 +365,7 @@ public class AggregationUnitTests {
).toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document group = extractPipelineElement(agg, 1, "$group");
assertThat(getAsDocument(group, "count"), is(new Document().append("$sum", "$tags_count")));
assertThat(getAsDocument(group, "count")).isEqualTo(new Document().append("$sum", "$tags_count"));
}
@Test // DATAMONGO-861
@@ -385,9 +381,9 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document secondProjection = ((List<Document>) agg.get("pipeline")).get(1);
Document fields = getAsDocument(secondProjection, "$group");
assertThat(getAsDocument(fields, "foosum"), isBsonObject().containing("$first"));
assertThat(getAsDocument(fields, "foosum"), isBsonObject().containing("$first.$cond.then", "$answer"));
assertThat(getAsDocument(fields, "foosum"), isBsonObject().containing("$first.$cond.else", "no-answer"));
assertThat(getAsDocument(fields, "foosum")).containsKey("$first");
assertThat(getAsDocument(fields, "foosum")).containsEntry("$first.$cond.then", "$answer");
assertThat(getAsDocument(fields, "foosum")).containsEntry("$first.$cond.else", "no-answer");
}
@Test // DATAMONGO-861
@@ -406,18 +402,17 @@ public class AggregationUnitTests {
.append("then", "bright") //
.append("else", "dark");
assertThat(getAsDocument(project, "color"), isBsonObject().containing("$cond", expectedCondition));
assertThat(getAsDocument(project, "color")).containsEntry("$cond", expectedCondition);
}
@Test // DATAMONGO-861
public void shouldRenderProjectionConditionalCorrectly() {
Document agg = Aggregation.newAggregation(//
project().and("color")
.applyCondition(ConditionalOperators.Cond.newBuilder() //
.when("isYellow") //
.then("bright") //
.otherwise("dark")))
project().and("color").applyCondition(ConditionalOperators.Cond.newBuilder() //
.when("isYellow") //
.then("bright") //
.otherwise("dark")))
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 0, "$project");
@@ -426,17 +421,16 @@ public class AggregationUnitTests {
.append("then", "bright") //
.append("else", "dark");
assertThat(getAsDocument(project, "color"), isBsonObject().containing("$cond", expectedCondition));
assertThat(getAsDocument(project, "color")).containsEntry("$cond", expectedCondition);
}
@Test // DATAMONGO-861
public void shouldRenderProjectionConditionalWithCriteriaCorrectly() {
Document agg = Aggregation
.newAggregation(project()//
.and("color")//
.applyCondition(ConditionalOperators.Cond.newBuilder().when(Criteria.where("key").gt(5)) //
.then("bright").otherwise("dark"))) //
Document agg = Aggregation.newAggregation(project()//
.and("color")//
.applyCondition(ConditionalOperators.Cond.newBuilder().when(Criteria.where("key").gt(5)) //
.then("bright").otherwise("dark"))) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 0, "$project");
@@ -445,20 +439,18 @@ public class AggregationUnitTests {
.append("then", "bright") //
.append("else", "dark");
assertThat(getAsDocument(project, "color"), isBsonObject().containing("$cond", expectedCondition));
assertThat(getAsDocument(project, "color")).containsEntry("$cond", expectedCondition);
}
@Test // DATAMONGO-861
public void referencingProjectionAliasesShouldRenderProjectionConditionalWithFieldReferenceCorrectly() {
Document agg = Aggregation
.newAggregation(//
project().and("color").as("chroma"),
project().and("luminosity") //
.applyCondition(ConditionalOperators //
.when("chroma") //
.thenValueOf("bright") //
.otherwise("dark"))) //
Document agg = Aggregation.newAggregation(//
project().and("color").as("chroma"), project().and("luminosity") //
.applyCondition(ConditionalOperators //
.when("chroma") //
.thenValueOf("bright") //
.otherwise("dark"))) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 1, "$project");
@@ -467,20 +459,17 @@ public class AggregationUnitTests {
.append("then", "bright") //
.append("else", "dark");
assertThat(getAsDocument(project, "luminosity"), isBsonObject().containing("$cond", expectedCondition));
assertThat(getAsDocument(project, "luminosity")).containsEntry("$cond", expectedCondition);
}
@Test // DATAMONGO-861
public void referencingProjectionAliasesShouldRenderProjectionConditionalWithCriteriaReferenceCorrectly() {
Document agg = Aggregation
.newAggregation(//
project().and("color").as("chroma"),
project().and("luminosity") //
.applyCondition(ConditionalOperators.Cond.newBuilder()
.when(Criteria.where("chroma") //
.is(100)) //
.then("bright").otherwise("dark"))) //
Document agg = Aggregation.newAggregation(//
project().and("color").as("chroma"), project().and("luminosity") //
.applyCondition(ConditionalOperators.Cond.newBuilder().when(Criteria.where("chroma") //
.is(100)) //
.then("bright").otherwise("dark"))) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 1, "$project");
@@ -489,56 +478,50 @@ public class AggregationUnitTests {
.append("then", "bright") //
.append("else", "dark");
assertThat(getAsDocument(project, "luminosity"), isBsonObject().containing("$cond", expectedCondition));
assertThat(getAsDocument(project, "luminosity")).containsEntry("$cond", expectedCondition);
}
@Test // DATAMONGO-861
public void shouldRenderProjectionIfNullWithFieldReferenceCorrectly() {
Document agg = Aggregation
.newAggregation(//
project().and("color"), //
project().and("luminosity") //
.applyCondition(ConditionalOperators //
.ifNull("chroma") //
.then("unknown"))) //
Document agg = Aggregation.newAggregation(//
project().and("color"), //
project().and("luminosity") //
.applyCondition(ConditionalOperators //
.ifNull("chroma") //
.then("unknown"))) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 1, "$project");
assertThat(getAsDocument(project, "luminosity"),
isBsonObject().containing("$ifNull", Arrays.<Object> asList("$chroma", "unknown")));
assertThat(getAsDocument(project, "luminosity")).containsEntry("$ifNull", Arrays.asList("$chroma", "unknown"));
}
@Test // DATAMONGO-861
public void shouldRenderProjectionIfNullWithFallbackFieldReferenceCorrectly() {
Document agg = Aggregation
.newAggregation(//
project("fallback").and("color").as("chroma"),
project().and("luminosity") //
.applyCondition(ConditionalOperators.ifNull("chroma") //
.thenValueOf("fallback"))) //
Document agg = Aggregation.newAggregation(//
project("fallback").and("color").as("chroma"), project().and("luminosity") //
.applyCondition(ConditionalOperators.ifNull("chroma") //
.thenValueOf("fallback"))) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 1, "$project");
assertThat(getAsDocument(project, "luminosity"),
isBsonObject().containing("$ifNull", Arrays.asList("$chroma", "$fallback")));
assertThat(getAsDocument(project, "luminosity")).containsEntry("$ifNull", Arrays.asList("$chroma", "$fallback"));
}
@Test // DATAMONGO-1552
public void shouldHonorDefaultCountField() {
Document agg = Aggregation
.newAggregation(//
bucket("year"), //
project("count")) //
Document agg = Aggregation.newAggregation(//
bucket("year"), //
project("count")) //
.toDocument("foo", Aggregation.DEFAULT_CONTEXT);
Document project = extractPipelineElement(agg, 1, "$project");
assertThat(project).containsEntry("count", 1);
}
@Test // DATAMONGO-1533
@@ -552,11 +535,11 @@ public class AggregationUnitTests {
@SuppressWarnings("unchecked")
Document secondProjection = ((List<Document>) agg.get("pipeline")).get(1);
Document fields = getAsDocument(secondProjection, "$group");
assertThat(getAsDocument(fields, "foosum")).containsKey("$first");
assertThat(getAsDocument(fields, "foosum")).containsEntry("$first.$cond.if",
new Document("$gte", Arrays.asList("$a", 42)));
assertThat(getAsDocument(fields, "foosum")).containsEntry("$first.$cond.then", "answer");
assertThat(getAsDocument(fields, "foosum")).containsEntry("$first.$cond.else", "no-answer");
}
@Test // DATAMONGO-1756
@@ -565,8 +548,25 @@ public class AggregationUnitTests {
Document agg = newAggregation(project().and("value1.value").plus("value2.value").as("val")).toDocument("collection",
Aggregation.DEFAULT_CONTEXT);
Document expected = new Document("val", new Document("$add", Arrays.asList("$value1.value", "$value2.value")));
assertThat(extractPipelineElement(agg, 0, "$project")).isEqualTo(expected);
}
@Test // DATAMONGO-1871
public void providedAliasShouldAllowNestingExpressionWithAliasCorrectly() {
Document condition = new Document("$and",
Arrays.asList(new Document("$gte", Arrays.asList("$$est.dt", "2015-12-29")), //
new Document("$lte", Arrays.asList("$$est.dt", "2017-12-29")) //
));
Aggregation agg = newAggregation(project("_id", "dId", "aId", "cty", "cat", "plts.plt")
.and(ArrayOperators.arrayOf("plts.ests").filter().as("est").by(condition)).as("plts.ests"));
Document $project = extractPipelineElement(agg.toDocument("collection-1", Aggregation.DEFAULT_CONTEXT), 0,
"$project");
assertThat($project.containsKey("plts.ests")).isTrue();
}
private Document extractPipelineElement(Document agg, int index, String operation) {

View File

@@ -38,6 +38,7 @@ import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
@@ -1177,4 +1178,19 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
public void readsClosedProjection() {
assertThat(repository.findClosedProjectionBy()).isNotEmpty();
}
@Test // DATAMONGO-1865
public void findFirstEntityReturnsFirstResultEvenForNonUniqueMatches() {
assertThat(repository.findFirstBy()).isNotNull();
}
@Test(expected = IncorrectResultSizeDataAccessException.class) // DATAMONGO-1865
public void findSingleEntityThrowsErrorWhenNotUnique() {
repository.findPersonByLastnameLike(dave.getLastname());
}
@Test(expected = IncorrectResultSizeDataAccessException.class) // DATAMONGO-1865
public void findOptionalSingleEntityThrowsErrorWhenNotUnique() {
repository.findOptionalPersonByLastnameLike(dave.getLastname());
}
}

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.repository;
import java.util.Collection;
import java.util.Date;
import java.util.List;
import java.util.Optional;
import java.util.stream.Stream;
import org.springframework.data.domain.Page;
@@ -286,6 +287,15 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
// DATAMONGO-950
Page<Person> findTop3ByLastnameStartingWith(String lastname, Pageable pageRequest);
// DATAMONGO-1865
Person findFirstBy(); // limits the result to 1; if there are more matches, just return the first one
// DATAMONGO-1865
Person findPersonByLastnameLike(String lastname); // single person, error if more than one
// DATAMONGO-1865
Optional<Person> findOptionalPersonByLastnameLike(String lastname); // still optional, error when more than one
// DATAMONGO-1030
PersonSummaryDto findSummaryByLastname(String lastname);

View File

@@ -39,6 +39,7 @@ import org.springframework.beans.factory.BeanClassLoaderAware;
import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.BeanFactoryAware;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -300,6 +301,17 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
.verifyComplete();
}
@Test // DATAMONGO-1865
public void shouldErrorOnFindOneWithNonUniqueResult() {
StepVerifier.create(repository.findOneByLastname(dave.getLastname()))
.expectError(IncorrectResultSizeDataAccessException.class).verify();
}
@Test // DATAMONGO-1865
public void shouldReturnFirstFindFirstWithMoreResults() {
StepVerifier.create(repository.findFirstByLastname(dave.getLastname())).expectNextCount(1).verifyComplete();
}
interface ReactivePersonRepository extends ReactiveMongoRepository<Person, String> {
Flux<Person> findByLastname(String lastname);
@@ -326,6 +338,8 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
Flux<GeoResult<Person>> findByLocationNear(Point point, Distance maxDistance, Pageable pageable);
Flux<Person> findPersonByLocationNear(Point point, Distance maxDistance);
Mono<Person> findFirstByLastname(String lastname);
}
interface ReactiveCappedCollectionRepository extends Repository<Capped, String> {

View File

@@ -17,8 +17,8 @@ package org.springframework.data.mongodb.repository.query;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.*;
import java.lang.reflect.Method;
@@ -26,6 +26,7 @@ import java.util.List;
import java.util.Optional;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -40,7 +41,6 @@ import org.springframework.data.domain.Slice;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.ExecutableFindOperation.ExecutableFind;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithProjection;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.Person;
@@ -55,6 +55,7 @@ import org.springframework.data.mongodb.repository.Meta;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.DefaultRepositoryMetadata;
import com.mongodb.client.result.DeleteResult;
@@ -71,8 +72,7 @@ import com.mongodb.client.result.DeleteResult;
public class AbstractMongoQueryUnitTests {
@Mock MongoOperations mongoOperationsMock;
@Mock ExecutableFind<?> executableFind;
@Mock FindWithQuery<?> withQueryMock;
@Mock BasicMongoPersistentEntity<?> persitentEntityMock;
@Mock MongoMappingContext mappingContextMock;
@@ -91,9 +91,8 @@ public class AbstractMongoQueryUnitTests {
converter.afterPropertiesSet();
doReturn(converter).when(mongoOperationsMock).getConverter();
doReturn(executableFind).when(mongoOperationsMock).query(any());
doReturn(withQueryMock).when(executableFind).as(any());
doReturn(withQueryMock).when(withQueryMock).matching(any());
}
@@ -144,9 +143,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind).as(Person.class);
verify(withQueryMock).matching(captor.capture());
assertThat(captor.getValue().getMeta().getComment(), nullValue());
}
@@ -159,9 +157,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind).as(Person.class);
verify(withQueryMock).matching(captor.capture());
assertThat(captor.getValue().getMeta().getComment(), is("comment"));
}
@@ -174,9 +171,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind).as(Person.class);
verify(withQueryMock).matching(captor.capture());
assertThat(captor.getValue().getMeta().getComment(), is("comment"));
}
@@ -189,9 +185,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind).as(Person.class);
verify(withQueryMock).matching(captor.capture());
assertThat(captor.getValue().getMeta().getComment(), is("comment"));
}
@@ -208,9 +203,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind, times(2)).as(Person.class);
verify(withQueryMock, times(2)).matching(captor.capture());
assertThat(captor.getAllValues().get(0).getSkip(), is(0L));
assertThat(captor.getAllValues().get(1).getSkip(), is(10L));
@@ -228,9 +222,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind, times(2)).as(Person.class);
verify(withQueryMock, times(2)).matching(captor.capture());
assertThat(captor.getAllValues().get(0).getLimit(), is(11));
assertThat(captor.getAllValues().get(1).getLimit(), is(11));
@@ -248,9 +241,8 @@ public class AbstractMongoQueryUnitTests {
ArgumentCaptor<Query> captor = ArgumentCaptor.forClass(Query.class);
verify(executableFind, times(2)).as(Person.class);
verify(withQueryMock, times(2)).matching(captor.capture());
Document expectedSortObject = new Document().append("bar", -1);
assertThat(captor.getAllValues().get(0).getSortObject(), is(expectedSortObject));
@@ -269,17 +261,43 @@ public class AbstractMongoQueryUnitTests {
assertThat(query.execute(new Object[] { "lastname" }), is(reference));
}
@Test // DATAMONGO-1865
public void limitingSingleEntityQueryCallsFirst() {
Person reference = new Person();
doReturn(reference).when(withQueryMock).firstValue();
AbstractMongoQuery query = createQueryForMethod("findFirstByLastname", String.class).setLimitingQuery(true);
assertThat(query.execute(new Object[] { "lastname" }), is(reference));
}
@Test // DATAMONGO-1872
public void doesNotFixCollectionOnPreparation() {
AbstractMongoQuery query = createQueryForMethod(DynamicallyMappedRepository.class, "findBy");
query.execute(new Object[0]);
verify(executableFind, never()).inCollection(anyString());
verify(executableFind).as(DynamicallyMapped.class);
}
private MongoQueryFake createQueryForMethod(String methodName, Class<?>... paramTypes) {
return createQueryForMethod(Repo.class, methodName, paramTypes);
}
private MongoQueryFake createQueryForMethod(Class<?> repository, String methodName, Class<?>... paramTypes) {
try {
Method method = repository.getMethod(methodName, paramTypes);
ProjectionFactory factory = new SpelAwareProxyProjectionFactory();
MongoQueryMethod queryMethod = new MongoQueryMethod(method, new DefaultRepositoryMetadata(repository), factory,
mappingContextMock);
return new MongoQueryFake(queryMethod, mongoOperationsMock);
} catch (Exception e) {
throw new IllegalArgumentException(e.getMessage(), e);
}
@@ -288,6 +306,7 @@ public class AbstractMongoQueryUnitTests {
private static class MongoQueryFake extends AbstractMongoQuery {
private boolean isDeleteQuery;
private boolean isLimitingQuery;
public MongoQueryFake(MongoQueryMethod method, MongoOperations operations) {
super(method, operations);
@@ -313,10 +332,21 @@ public class AbstractMongoQueryUnitTests {
return isDeleteQuery;
}
@Override
protected boolean isLimiting() {
return isLimitingQuery;
}
public MongoQueryFake setDeleteQuery(boolean isDeleteQuery) {
this.isDeleteQuery = isDeleteQuery;
return this;
}
public MongoQueryFake setLimitingQuery(boolean limitingQuery) {
isLimitingQuery = limitingQuery;
return this;
}
}
private interface Repo extends MongoRepository<Person, Long> {
@@ -338,5 +368,16 @@ public class AbstractMongoQueryUnitTests {
Slice<Person> findByLastname(String lastname, Pageable page);
Optional<Person> findByLastname(String lastname);
Person findFirstByLastname(String lastname);
}
// DATAMONGO-1872
@org.springframework.data.mongodb.core.mapping.Document(collection = "#{T(java.lang.Math).random()}")
static class DynamicallyMapped {}
interface DynamicallyMappedRepository extends Repository<DynamicallyMapped, ObjectId> {
DynamicallyMapped findBy();
}
}

View File

@@ -185,6 +185,16 @@ public class PartTreeMongoQueryUnitTests {
assertThat(query.getFieldsObject(), is(new Document()));
}
@Test // DATAMONGO-1865
public void limitingReturnsTrueIfTreeIsLimiting() {
assertThat(createQueryForMethod("findFirstBy").isLimiting(), is(true));
}
@Test // DATAMONGO-1865
public void limitingReturnsFalseIfTreeIsNotLimiting() {
assertThat(createQueryForMethod("findPersonBy").isLimiting(), is(false));
}
private org.springframework.data.mongodb.core.query.Query deriveQueryFromMethod(String method, Object... args) {
Class<?>[] types = new Class<?>[args.length];
@@ -245,6 +255,8 @@ public class PartTreeMongoQueryUnitTests {
List<Person> findBySex(Sex sex);
OpenProjection findAllBy();
Person findFirstBy();
}
interface PersonProjection {

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.repository.query;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.mockito.Mockito.any;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -35,6 +36,7 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.data.mongodb.core.ReactiveFindOperation.ReactiveFind;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
@@ -56,6 +58,7 @@ import org.springframework.expression.spel.standard.SpelExpressionParser;
* Unit tests for {@link ReactiveStringBasedMongoQuery}.
*
* @author Mark Paluch
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class ReactiveStringBasedMongoQueryUnitTests {
@@ -64,11 +67,15 @@ public class ReactiveStringBasedMongoQueryUnitTests {
@Mock ReactiveMongoOperations operations;
@Mock DbRefResolver factory;
@Mock ReactiveFind reactiveFind;
MongoConverter converter;
@Before
public void setUp() {
when(operations.query(any())).thenReturn(reactiveFind);
this.converter = new MappingMongoConverter(factory, new MongoMappingContext());
}

View File

@@ -6,7 +6,7 @@ Mark Pollack; Thomas Risberg; Oliver Gierke; Costin Leau; Jon Brisbin; Thomas Da
:toc-placement!:
:spring-data-commons-docs: ../../../../spring-data-commons/src/main/asciidoc
(C) 2008-2018 The original authors.
NOTE: _Copies of this document may be made for your own use and for distribution to others, provided that you do not charge any fee for such copies and further provided that each copy contains this Copyright Notice, whether distributed in print or electronically._

View File

@@ -140,17 +140,18 @@ public interface PersonRepository extends PagingAndSortingRepository<Person, Str
Person findByShippingAddresses(Address address); <3>
Person findFirstByLastname(String lastname); <4>
Stream<Person> findAllBy(); <5>
}
----
<1> The method shows a query for all people with the given lastname. The query is derived by parsing the method name for constraints that can be concatenated with `And` and `Or`. Thus the method name results in a query expression of `{"lastname" : lastname}`.
<2> Applies pagination to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance and we will automatically page the query accordingly.
<3> Shows that you can query based on properties which are not a primitive type. Throws `IncorrectResultSizeDataAccessException` if more than one match is found.
<4> Uses the `First` keyword to restrict the query to the very first result. Unlike <3>, this method does not throw an exception if more than one match is found.
<5> Uses a Java 8 `Stream` which reads and converts individual elements while iterating the stream.
====
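The difference between the unique lookup in <3> and the `First`-restricted lookup in <4> can be sketched in plain Java (the helper names below are hypothetical, not Spring Data API): a unique-result query fails when more than one candidate matches, while a `First` query simply returns the first element.

```java
import java.util.Arrays;
import java.util.List;

public class ResultSemantics {

    // Mimics a unique-result query method: at most one match expected.
    static <T> T findUnique(List<T> matches) {
        if (matches.size() > 1) {
            // Spring Data raises IncorrectResultSizeDataAccessException in this situation.
            throw new IllegalStateException("Expected a unique result but found " + matches.size());
        }
        return matches.isEmpty() ? null : matches.get(0);
    }

    // Mimics a findFirstBy... method: silently limit to the first result.
    static <T> T findFirst(List<T> matches) {
        return matches.isEmpty() ? null : matches.get(0);
    }

    public static void main(String[] args) {
        List<String> matches = Arrays.asList("Dave Matthews", "Oliver August Matthews");

        System.out.println(findFirst(matches)); // prints "Dave Matthews"
        try {
            findUnique(matches); // two matches: throws
        } catch (IllegalStateException e) {
            System.out.println("non-unique: " + e.getMessage());
        }
    }
}
```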
NOTE: Note that for version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class.
[cols="1,2,3", options="header"]

View File

@@ -951,7 +951,25 @@ assertThat(p.getAge(), is(1));
You can use several overloaded methods to remove an object from the database.
* *remove* Remove the given document based on one of the following: a specific object instance, a query document criteria combined with a class or a query document criteria combined with a specific collection name.
====
[source,java]
----
template.remove(tywin, "GOT"); <1>
template.remove(query(where("lastname").is("lannister")), "GOT"); <2>
template.remove(new Query().limit(3), "GOT"); <3>
template.findAllAndRemove(query(where("lastname").is("lannister")), "GOT"); <4>
template.findAllAndRemove(new Query().limit(3), "GOT"); <5>
----
<1> Remove a single entity via its `_id` from the associated collection.
<2> Remove all documents matching the criteria of the query from the `GOT` collection.
<3> Remove the first 3 documents in the `GOT` collection. Unlike <2>, the documents to remove are first identified via their `_id` by executing the given query (applying the `sort`, `limit`, and `skip` options), and are then removed all at once in a separate step.
<4> Remove all documents matching the criteria of the query from the `GOT` collection. Unlike <3>, documents do not get deleted in a batch but one by one.
<5> Remove the first 3 documents in the `GOT` collection. Unlike <3>, documents do not get deleted in a batch but one by one.
====
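The two-step strategy behind <3> can be sketched in plain Java (an in-memory stand-in for the collection, not the actual template internals): first evaluate the query with `skip`/`limit` applied and collect the matching `_id` values, then remove all documents with those ids in one batch.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class LimitedRemoveSketch {

    // Step 1: evaluate the query (here reduced to skip/limit) and collect the _id values.
    static List<Object> idsToRemove(List<Map<String, Object>> collection, int skip, int limit) {
        return collection.stream()
                .skip(skip)
                .limit(limit)
                .map(doc -> doc.get("_id"))
                .collect(Collectors.toList());
    }

    // Step 2: remove every document whose _id was collected, all at once.
    static int removeByIds(List<Map<String, Object>> collection, List<Object> ids) {
        Set<Object> idSet = new HashSet<>(ids);
        int before = collection.size();
        collection.removeIf(doc -> idSet.contains(doc.get("_id")));
        return before - collection.size();
    }

    public static void main(String[] args) {
        List<Map<String, Object>> got = new ArrayList<>();
        for (int i = 1; i <= 5; i++) {
            Map<String, Object> doc = new HashMap<>();
            doc.put("_id", i);
            got.add(doc);
        }
        int removed = removeByIds(got, idsToRemove(got, 0, 3));
        System.out.println(removed + " removed, " + got.size() + " left"); // 3 removed, 2 left
    }
}
```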
[[mongo-template.optimistic-locking]]
=== Optimistic locking

View File

@@ -52,15 +52,22 @@ We have a quite simple domain object here. Note that it has a property named `id
----
public interface ReactivePersonRepository extends ReactiveSortingRepository<Person, Long> {
Flux<Person> findByFirstname(String firstname); <1>
Flux<Person> findByFirstname(Publisher<String> firstname); <2>
Flux<Person> findByFirstnameOrderByLastname(String firstname, Pageable pageable); <3>
Mono<Person> findByFirstnameAndLastname(String firstname, String lastname); <4>
Mono<Person> findFirstByLastname(String lastname); <5>
}
----
<1> The method shows a query for all people with the given firstname. The query is derived by parsing the method name for constraints that can be concatenated with `And` and `Or`. Thus the method name results in a query expression of `{"firstname" : firstname}`.
<2> The method shows a query for all people with the given firstname once the firstname is emitted via the given `Publisher`.
<3> Use `Pageable` to pass on offset and sorting parameters to the database.
<4> Find a single entity for the given criteria. Completes with `IncorrectResultSizeDataAccessException` on non-unique results.
<5> Unlike <4>, the first entity is always emitted even if the query yields more result documents.
====
For JavaConfig use the `@EnableReactiveMongoRepositories` annotation. The annotation carries the very same attributes as the namespace element. If no base package is configured, the infrastructure scans the package of the annotated configuration class.
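A minimal JavaConfig class for this setup might look as follows (the database name is illustrative, and the exact base configuration class may differ by version):

[source,java]
----
@Configuration
@EnableReactiveMongoRepositories // no base package set: scans the package of this class
class ApplicationConfig extends AbstractReactiveMongoConfiguration {

  @Override
  public MongoClient reactiveMongoClient() {
    return MongoClients.create();
  }

  @Override
  protected String getDatabaseName() {
    return "reactive"; // illustrative database name
  }
}
----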

View File

@@ -1,6 +1,50 @@
Spring Data MongoDB Changelog
=============================
Changes in version 2.0.4.RELEASE (2018-02-19)
---------------------------------------------
* DATAMONGO-1872 - SpEL Expressions in @Document annotations are not re-evaluated for repository query executions.
* DATAMONGO-1871 - AggregationExpression rendering does not consider nested property aliasing.
* DATAMONGO-1870 - Skip parameter not working in MongoTemplate#remove(Query, Class).
* DATAMONGO-1865 - findFirst query method throws IncorrectResultSizeDataAccessException on non-unique result.
* DATAMONGO-1860 - Mongo count operation called twice in QuerydslMongoPredicateExecutor.findAll(Predicate, Pageable).
* DATAMONGO-1859 - Release 2.0.4 (Kay SR4).
Changes in version 2.1.0.M1 (2018-02-06)
----------------------------------------
* DATAMONGO-1864 - Upgrade to MongoDB Java Driver 3.6.2.
* DATAMONGO-1858 - Fix line endings.
* DATAMONGO-1850 - GridFsResource.getContentType() throws NullPointerException on absent metadata.
* DATAMONGO-1846 - Upgrade to MongoDB Java Driver 3.6.
* DATAMONGO-1844 - Update copyright years to 2018.
* DATAMONGO-1843 - Aggregation operator $reduce with ArrayOperators.Reduce produce a wrong Document.
* DATAMONGO-1835 - Add support for $jsonSchema to Criteria API.
* DATAMONGO-1831 - Failure to read Scala collection types in MappingMongoConverter.
* DATAMONGO-1824 - Assert compatibility with MongoDB Server 3.6.
* DATAMONGO-1823 - AfterConvertEvent is not published when using custom methods in repository interface.
* DATAMONGO-1822 - Adapt repository readme to changed configuration support.
* DATAMONGO-1821 - Fix method ambiguity in tests when compiling against MongoDB 3.6.
* DATAMONGO-1820 - Investigate failing TravisCI build.
* DATAMONGO-1818 - Reference documentation mentions @TailableCursor instead of @Tailable.
* DATAMONGO-1817 - Kotlin extensions should return nullable types.
* DATAMONGO-1815 - Adapt API changes in Property in test cases.
* DATAMONGO-1814 - Missing documentation on Faceted classification.
* DATAMONGO-1812 - Temporarily add milestone repository to plugin repositories.
* DATAMONGO-1811 - Reference Documentation doesn't match with API Documentation 2.X vesrion.
* DATAMONGO-1809 - Type hint usage broken when using positional parameters with more than one digit.
* DATAMONGO-1806 - GridFsResource wrong type in javaDoc.
* DATAMONGO-1805 - Documentation for operations.find uses wrong result type.
* DATAMONGO-1803 - Add support for MongoDB 3.6 change streams.
* DATAMONGO-1802 - No converter found capable of converting from type org.bson.types.Binary to type byte[].
* DATAMONGO-1795 - Remove obsolete Kotlin build configuration.
* DATAMONGO-1794 - Release 2.1 M1 (Lovelace).
* DATAMONGO-1761 - Add distinct operation to MongoTemplate.
* DATAMONGO-1696 - Reference documentation uses JPA Annotations.
* DATAMONGO-1553 - Add $sortByCount aggregation stage.
* DATAMONGO-1322 - Add support for validator when creating collection.
Changes in version 2.0.3.RELEASE (2018-01-24)
---------------------------------------------
* DATAMONGO-1858 - Fix line endings.

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 2.0.4
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").