Compare commits

...

18 Commits

Author SHA1 Message Date
Mark Paluch
63dfb59a3f DATAMONGO-2335 - Release version 2.2 RC3 (Moore). 2019-09-06 10:10:42 +02:00
Mark Paluch
d33ee2ffac DATAMONGO-2335 - Prepare 2.2 RC3 (Moore). 2019-09-06 10:10:12 +02:00
Mark Paluch
59388d99cc DATAMONGO-2335 - Updated changelog. 2019-09-06 10:10:07 +02:00
Christoph Strobl
ee6048e289 DATAMONGO-2357 - Fix read/write for MongoDB client.model GeoJSON types.
We now consider native GeoJSON types of the MongoDB client during conversion, passing the raw values on to the driver when writing and using the configured MongoDB codecs when reading.

Original pull request: #786.
2019-09-05 15:46:19 +02:00
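The pass-through idea behind this fix can be sketched in plain Java. The types and names below (`NativeTypePassThroughSketch`, `NativePoint`, `writeValue`) are illustrative stand-ins, not the actual Spring Data or driver API: values the driver already understands natively are handed over unchanged, while everything else goes through the regular mapping path.

```java
public class NativeTypePassThroughSketch {

    // Stand-in for a driver-native type (think com.mongodb.client.model.geojson.Point).
    static final class NativePoint {
        final double x, y;
        NativePoint(double x, double y) { this.x = x; this.y = y; }
    }

    // Sketch of the fix: driver-native values are passed through as raw values,
    // so the driver's own codec serializes them; other values take the mapping path.
    static Object writeValue(Object value) {
        if (value instanceof NativePoint) {
            return value; // raw pass-through, same instance reaches the driver
        }
        return value.toString(); // placeholder for the regular mapping conversion
    }

    public static void main(String[] args) {
        NativePoint point = new NativePoint(1, 2);
        System.out.println(writeValue(point) == point); // same instance, not converted
    }
}
```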
Christoph Strobl
9a062d53f3 DATAMONGO-2310 - Update documentation for TypedAggregation. 2019-09-05 13:02:01 +02:00
Christoph Strobl
a3c5b07eb7 DATAMONGO-2348 - Update documentation of version property handling. 2019-09-05 10:29:53 +02:00
Christoph Strobl
40d30a230d DATAMONGO-2354 - Polishing.
Same as with FindPublisherPreparer, the CursorPreparer needs to be public because it is used in one of the protected methods of MongoTemplate.

Original Pull Request: #784
2019-09-04 13:03:52 +02:00
kostya05983
10116f7c93 DATAMONGO-2354 - Change visibility of FindPublisherPreparer.
The FindPublisherPreparer is used in a protected method of ReactiveMongoTemplate and needs to be public to allow overriding.

Original Pull Request: #784
2019-09-04 13:03:44 +02:00
Mark Paluch
705203c898 DATAMONGO-2358 - Polishing.
Inherit dependency-management for Kotlin Coroutines.

Original pull request: #785.
2019-09-04 11:52:39 +02:00
Sebastien Deleuze
8fb9d9e5f4 DATAMONGO-2358 - Upgrade to Coroutines 1.3.0 and fix warnings.
Original pull request: #785.
2019-09-04 11:40:10 +02:00
Mark Paluch
c23c5ae6c6 DATAMONGO-2356 - Move off deprecated Flux/Mono.usingWhen to their replacement overloads. 2019-09-04 09:37:59 +02:00
Mark Paluch
e67bacf66c DATAMONGO-2352 - Polishing.
Apply typo fixes also to ReactiveMongoOperations.

Original pull request: #782.
2019-09-03 11:28:20 +02:00
Ryan Cloherty
617dbdac3f DATAMONGO-2352 - Fix documentation typos.
Original pull request: #782.
2019-09-03 11:26:47 +02:00
Mark Paluch
8ad4f4b71b DATAMONGO-2344 - Polishing.
Remove generics from FindPublisherPreparer. Rename ReadPreferenceAware.hasReadPreferences to hasReadPreference.

Original pull request: #779.
2019-09-03 11:23:37 +02:00
Christoph Strobl
9048ec83af DATAMONGO-2344 - Fix slaveOK query option not applied correctly.
Since MongoDB 3.6 the slaveOk option translates to the primaryPreferred ReadPreference, which is now applied again when executing a find operation.

Original pull request: #779.
2019-09-03 11:23:31 +02:00
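The translation this commit restores can be sketched with plain-Java stand-ins. `SlaveOkMappingSketch` and its nested enums are simplified illustrations, not the actual `Meta.CursorOption` or `com.mongodb.ReadPreference` types; the logic mirrors the `getReadPreference()` override added to `QueryCursorPreparer` in the diff below.

```java
import java.util.EnumSet;
import java.util.Set;

public class SlaveOkMappingSketch {

    // Simplified stand-ins for Meta.CursorOption and com.mongodb.ReadPreference.
    enum CursorOption { NO_TIMEOUT, PARTIAL, SLAVE_OK }
    enum ReadPreference { PRIMARY_PREFERRED }

    // Mirrors the fix: the legacy slaveOk cursor flag no longer maps to a wire
    // flag but to a primaryPreferred read preference; without the flag, no
    // preference is forced (null).
    static ReadPreference readPreferenceFor(Set<CursorOption> flags) {
        return flags.contains(CursorOption.SLAVE_OK) ? ReadPreference.PRIMARY_PREFERRED : null;
    }

    public static void main(String[] args) {
        System.out.println(readPreferenceFor(EnumSet.of(CursorOption.SLAVE_OK)));      // PRIMARY_PREFERRED
        System.out.println(readPreferenceFor(EnumSet.noneOf(CursorOption.class)));     // null
    }
}
```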
Christoph Strobl
b2e3e3fb8e DATAMONGO-2346 - Fix (reactive)auditing of immutable versioned entities.
We now check if the source is still the same object after potentially applying auditing modifications and make sure to pass the audited object on to the mapping layer.

Limited bean inspection scope of event listener tests to avoid side effects in index creation.

Original pull request: #780.
2019-09-03 10:43:29 +02:00
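The essence of the fix is that auditing an immutable entity produces a new instance, and the template must carry that instance forward instead of the original. A minimal sketch with a hypothetical immutable entity (`Person`, `applyAuditing` are illustrative names, not framework types):

```java
public class ImmutableAuditingSketch {

    // Hypothetical immutable entity: setting the auditor returns a copy.
    static final class Person {
        final String name;
        final String modifiedBy;

        Person(String name, String modifiedBy) {
            this.name = name;
            this.modifiedBy = modifiedBy;
        }

        Person withModifiedBy(String auditor) {
            return new Person(name, auditor);
        }
    }

    // Mirrors the identity check added to MongoTemplate (source.getBean() != toSave):
    // if auditing produced a new instance, that instance is passed to the mapping layer.
    static Person applyAuditing(Person source, String auditor) {
        Person audited = source.withModifiedBy(auditor);
        return audited != source ? audited : source;
    }

    public static void main(String[] args) {
        Person original = new Person("Ada", null);
        Person saved = applyAuditing(original, "system");
        System.out.println(saved != original);  // true: a new instance was produced
        System.out.println(saved.modifiedBy);   // system
    }
}
```

Before the fix, the original (unaudited) instance would have been mapped and persisted, silently dropping the auditing metadata.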
Mark Paluch
e9a2b84af5 DATAMONGO-2349 - Polishing.
Reformat code. Remove duplicate simple types.

Original pull request: #783.
2019-09-03 08:57:55 +02:00
Christoph Strobl
5b8be281fb DATAMONGO-2349 - Fix converter registration for java.time types.
The MongoDB Java Driver does not handle java.time types. Therefore those must not be considered simple types.
The behavior was changed by DATACMNS-1294 forcing usage of Reading & WritingConverter annotations to disambiguate converter direction.
This commit restores the converter registration to the state before the change in Spring Data Commons.

Original pull request: #783.
2019-09-03 08:57:41 +02:00
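Why this matters: the driver persists `java.util.Date` but not `java.time` types, so explicit reading and writing conversions are needed, and each converter's direction must be unambiguous. A minimal pure-Java sketch of the two conversion directions (`TimeConverterSketch` is an illustrative name; the real registrations live in Spring Data's converter infrastructure):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.util.Date;

public class TimeConverterSketch {

    // Writing direction: the driver cannot persist LocalDateTime natively,
    // so it is converted to Date before hitting the wire.
    static Date write(LocalDateTime value) {
        return Date.from(value.toInstant(ZoneOffset.UTC));
    }

    // Reading direction: stored Dates are converted back when materializing entities.
    static LocalDateTime read(Date stored) {
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(stored.getTime()), ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        LocalDateTime value = LocalDateTime.of(2019, 9, 3, 8, 57);
        System.out.println(read(write(value)).equals(value)); // round-trip preserves the value
    }
}
```

Treating `java.time` types as "simple" would skip the writing conversion entirely, which is exactly the regression this commit reverts.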
56 changed files with 1037 additions and 224 deletions

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.BUILD-SNAPSHOT</version>
<version>2.2.0.RC3</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.2.0.BUILD-SNAPSHOT</version>
<version>2.2.0.RC3</version>
</parent>
<modules>
@@ -26,7 +26,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.2.0.BUILD-SNAPSHOT</springdata.commons>
<springdata.commons>2.2.0.RC3</springdata.commons>
<mongo>3.11.0</mongo>
<mongo.reactivestreams>1.12.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
@@ -134,8 +134,8 @@
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</repository>
</repositories>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.BUILD-SNAPSHOT</version>
<version>2.2.0.RC3</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.BUILD-SNAPSHOT</version>
<version>2.2.0.RC3</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.BUILD-SNAPSHOT</version>
<version>2.2.0.RC3</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -276,14 +276,12 @@
<dependency>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-core</artifactId>
<version>${kotlin-coroutines}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-reactor</artifactId>
<version>${kotlin-coroutines}</version>
<optional>true</optional>
</dependency>


@@ -105,6 +105,7 @@ public abstract class AbstractMongoClientConfiguration extends MongoConfiguratio
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(mongoDbFactory());
return converter;
}


@@ -111,6 +111,7 @@ public abstract class AbstractMongoConfiguration extends MongoConfigurationSuppo
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(mongoDbFactory());
return converter;
}


@@ -83,6 +83,7 @@ public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurat
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(reactiveMongoDbFactory());
return converter;
}


@@ -15,9 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import java.util.function.Function;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ReadPreference;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MongoCollection;
/**
* Simple callback interface to allow customization of a {@link FindIterable}.
@@ -25,7 +31,14 @@ import com.mongodb.client.FindIterable;
* @author Oliver Gierke
* @author Christoph Strobl
*/
interface CursorPreparer {
public interface CursorPreparer extends ReadPreferenceAware {
/**
* Default {@link CursorPreparer} just passing on the given {@link FindIterable}.
*
* @since 2.2
*/
CursorPreparer NO_OP_PREPARER = (iterable -> iterable);
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
@@ -33,4 +46,37 @@ interface CursorPreparer {
* @param cursor
*/
FindIterable<Document> prepare(FindIterable<Document> cursor);
/**
* Apply query specific settings to {@link MongoCollection} and initiate a find operation returning a
* {@link FindIterable} via the given {@link Function find} function.
*
* @param collection must not be {@literal null}.
* @param find must not be {@literal null}.
* @return
* @throws IllegalArgumentException if one of the required arguments is {@literal null}.
* @since 2.2
*/
default FindIterable<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindIterable<Document>> find) {
Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null!");
if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference());
}
return prepare(find.apply(collection));
}
/**
* @return the {@link ReadPreference} to apply or {@literal null} if none defined.
* @since 2.2
*/
@Override
@Nullable
default ReadPreference getReadPreference() {
return null;
}
}
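The default-method pattern added above (switch the collection's read preference, run the find function, then prepare the result) can be exercised with simplified stand-ins. `PreparerPatternSketch`, `SketchCollection`, and `Preparer` below are illustrative types, not the driver or Spring Data API; only the shape of `initiateFind` mirrors the real code:

```java
import java.util.List;
import java.util.function.Function;

public class PreparerPatternSketch {

    // Minimal stand-in for MongoCollection; find() returns a stand-in for FindIterable.
    static final class SketchCollection {
        final String readPreference;
        SketchCollection(String readPreference) { this.readPreference = readPreference; }
        SketchCollection withReadPreference(String rp) { return new SketchCollection(rp); }
        List<String> find() { return List.of("doc@" + readPreference); }
    }

    interface Preparer {
        List<String> prepare(List<String> iterable);

        default String getReadPreference() { return null; }

        default boolean hasReadPreference() { return getReadPreference() != null; }

        // Same shape as CursorPreparer.initiateFind: apply the read preference
        // to the collection first, then run the find function and prepare the result.
        default List<String> initiateFind(SketchCollection collection,
                Function<SketchCollection, List<String>> find) {
            if (hasReadPreference()) {
                collection = collection.withReadPreference(getReadPreference());
            }
            return prepare(find.apply(collection));
        }
    }

    public static void main(String[] args) {
        Preparer preparer = new Preparer() {
            @Override public List<String> prepare(List<String> it) { return it; }
            @Override public String getReadPreference() { return "primaryPreferred"; }
        };
        List<String> result = preparer.initiateFind(new SketchCollection("primary"), SketchCollection::find);
        System.out.println(result); // [doc@primaryPreferred]
    }
}
```

Because the read preference is applied before the find function runs, implementors only override `getReadPreference()` and the template code never has to special-case it.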


@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.ReadPreference;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
@@ -267,6 +268,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.limit = Optional.of(limit);
return this;
}
@Override
public ReadPreference getReadPreference() {
return delegate.getReadPreference();
}
}
/**


@@ -15,19 +15,69 @@
*/
package org.springframework.data.mongodb.core;
import java.util.function.Function;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ReadPreference;
import com.mongodb.reactivestreams.client.FindPublisher;
import com.mongodb.reactivestreams.client.MongoCollection;
/**
* Simple callback interface to allow customization of a {@link FindPublisher}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @author Konstantin Volivach
*/
interface FindPublisherPreparer {
public interface FindPublisherPreparer extends ReadPreferenceAware {
/**
* Default {@link FindPublisherPreparer} just passing on the given {@link FindPublisher}.
*
* @since 2.2
*/
FindPublisherPreparer NO_OP_PREPARER = (findPublisher -> findPublisher);
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param findPublisher must not be {@literal null}.
*/
<T> FindPublisher<T> prepare(FindPublisher<T> findPublisher);
FindPublisher<Document> prepare(FindPublisher<Document> findPublisher);
/**
* Apply query specific settings to {@link MongoCollection} and initiate a find operation returning a
* {@link FindPublisher} via the given {@link Function find} function.
*
* @param collection must not be {@literal null}.
* @param find must not be {@literal null}.
* @return
* @throws IllegalArgumentException if one of the required arguments is {@literal null}.
* @since 2.2
*/
default FindPublisher<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindPublisher<Document>> find) {
Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null!");
if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference());
}
return prepare(find.apply(collection));
}
/**
* @return the {@link ReadPreference} to apply or {@literal null} if none defined.
* @since 2.2
*/
@Override
@Nullable
default ReadPreference getReadPreference() {
return null;
}
}


@@ -1190,7 +1190,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
@@ -1252,7 +1252,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
@@ -1270,7 +1270,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type


@@ -105,6 +105,7 @@ import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Meta.CursorOption;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
@@ -447,8 +448,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Document mappedFields = getMappedFieldsObject(query.getFieldsObject(), persistentEntity, returnType);
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), persistentEntity);
FindIterable<Document> cursor = new QueryCursorPreparer(query, entityType)
.prepare(collection.find(mappedQuery, Document.class).projection(mappedFields));
FindIterable<Document> cursor = new QueryCursorPreparer(query, entityType).initiateFind(collection,
col -> col.find(mappedQuery, Document.class).projection(mappedFields));
return new CloseableIterableCursorAdapter<>(cursor, exceptionTranslator,
new ProjectingReadCallback<>(mongoConverter, entityType, returnType, collectionName));
@@ -538,8 +539,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
sortObject, fieldsObject, collectionName);
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject, null), preparer, documentCallbackHandler,
collectionName);
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject, null),
preparer != null ? preparer : CursorPreparer.NO_OP_PREPARER, documentCallbackHandler, collectionName);
}
/*
@@ -777,8 +778,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(mode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
DefaultBulkOperations operations = new DefaultBulkOperations(this, collectionName, new BulkOperationContext(mode,
Optional.ofNullable(getPersistentEntity(entityType)), queryMapper, updateMapper, eventPublisher, entityCallbacks));
DefaultBulkOperations operations = new DefaultBulkOperations(this, collectionName,
new BulkOperationContext(mode, Optional.ofNullable(getPersistentEntity(entityType)), queryMapper, updateMapper,
eventPublisher, entityCallbacks));
operations.setExceptionTranslator(exceptionTranslator);
operations.setDefaultWriteConcern(writeConcern);
@@ -814,8 +816,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
if (ObjectUtils.isEmpty(query.getSortObject())) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(),
operations.forType(entityClass).getCollation(query).map(Collation::toMongoCollation).orElse(null),
entityClass);
new QueryCursorPreparer(query, entityClass), entityClass);
} else {
query.limit(1);
List<T> results = find(query, entityClass, collectionName);
@@ -933,6 +934,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
serializeToJsonSafely(mappedQuery), field, collectionName);
}
QueryCursorPreparer preparer = new QueryCursorPreparer(query, entityClass);
if (preparer.hasReadPreference()) {
collection = collection.withReadPreference(preparer.getReadPreference());
}
DistinctIterable<T> iterable = collection.distinct(mappedFieldName, mappedQuery, mongoDriverCompatibleType);
return operations.forType(entityClass) //
@@ -1007,8 +1013,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(collectionName, "CollectionName must not be null!");
Assert.notNull(returnType, "ReturnType must not be null!");
String collection = StringUtils.hasText(collectionName) ? collectionName
: getCollectionName(domainType);
String collection = StringUtils.hasText(collectionName) ? collectionName : getCollectionName(domainType);
String distanceField = operations.nearQueryDistanceFieldName(domainType);
Aggregation $geoNear = TypedAggregation.newAggregation(domainType, Aggregation.geoNear(near, distanceField))
@@ -1039,8 +1044,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Nullable
@Override
public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass,
getCollectionName(entityClass));
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, getCollectionName(entityClass));
}
@Nullable
@@ -1296,8 +1300,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(batchToSave, "BatchToSave must not be null!");
return (Collection<T>) doInsertBatch(getCollectionName(entityClass), batchToSave,
this.mongoConverter);
return (Collection<T>) doInsertBatch(getCollectionName(entityClass), batchToSave, this.mongoConverter);
}
@Override
@@ -1430,6 +1433,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
toSave = maybeEmitEvent(new BeforeConvertEvent<T>(toSave, collectionName)).getSource();
toSave = maybeCallBeforeConvert(toSave, collectionName);
if (source.getBean() != toSave) {
source = operations.forEntity(toSave, mongoConverter.getConversionService());
}
source.assertUpdateableIdIfNotSet();
MappedDocument mapped = source.toMappedDocument(mongoConverter);
@@ -1791,7 +1798,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return executeFindMultiInternal(
new FindCallback(new Document(), new Document(),
operations.forType(entityClass).getCollation().map(Collation::toMongoCollation).orElse(null)),
null, new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName), collectionName);
CursorPreparer.NO_OP_PREPARER, new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName),
collectionName);
}
@Override
@@ -2431,7 +2439,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @return the {@link List} of converted objects.
*/
protected <T> T doFindOne(String collectionName, Document query, Document fields, Class<T> entityClass) {
return doFindOne(collectionName, query, fields, null, entityClass);
return doFindOne(collectionName, query, fields, CursorPreparer.NO_OP_PREPARER, entityClass);
}
/**
@@ -2442,12 +2450,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @param preparer the preparer used to modify the cursor on execution.
* @return the {@link List} of converted objects.
* @since 2.2
*/
@SuppressWarnings("ConstantConditions")
protected <T> T doFindOne(String collectionName, Document query, Document fields,
@Nullable com.mongodb.client.model.Collation collation, Class<T> entityClass) {
protected <T> T doFindOne(String collectionName, Document query, Document fields, CursorPreparer preparer,
Class<T> entityClass) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
Document mappedQuery = queryMapper.getMappedObject(query, entity);
@@ -2458,7 +2467,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
mappedFields, entityClass, collectionName);
}
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields, collation),
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields, preparer),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
}
@@ -2509,8 +2518,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName);
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields, null), preparer, objectCallback,
collectionName);
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields, null),
preparer != null ? preparer : CursorPreparer.NO_OP_PREPARER, objectCallback, collectionName);
}
/**
@@ -2732,6 +2741,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
DocumentCallback<T> objectCallback, String collectionName) {
try {
T result = objectCallback
.doWith(collectionCallback.doInCollection(getAndPrepareCollection(doGetDatabase(), collectionName)));
return result;
@@ -2759,7 +2769,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @return
*/
private <T> List<T> executeFindMultiInternal(CollectionCallback<FindIterable<Document>> collectionCallback,
@Nullable CursorPreparer preparer, DocumentCallback<T> objectCallback, String collectionName) {
CursorPreparer preparer, DocumentCallback<T> objectCallback, String collectionName) {
try {
@@ -2767,14 +2777,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
try {
FindIterable<Document> iterable = collectionCallback
.doInCollection(getAndPrepareCollection(doGetDatabase(), collectionName));
if (preparer != null) {
iterable = preparer.prepare(iterable);
}
cursor = iterable.iterator();
cursor = preparer
.initiateFind(getAndPrepareCollection(doGetDatabase(), collectionName), collectionCallback::doInCollection)
.iterator();
List<T> result = new ArrayList<>();
@@ -2796,21 +2801,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
private void executeQueryInternal(CollectionCallback<FindIterable<Document>> collectionCallback,
@Nullable CursorPreparer preparer, DocumentCallbackHandler callbackHandler, String collectionName) {
CursorPreparer preparer, DocumentCallbackHandler callbackHandler, String collectionName) {
try {
MongoCursor<Document> cursor = null;
try {
FindIterable<Document> iterable = collectionCallback
.doInCollection(getAndPrepareCollection(doGetDatabase(), collectionName));
if (preparer != null) {
iterable = preparer.prepare(iterable);
}
cursor = iterable.iterator();
cursor = preparer
.initiateFind(getAndPrepareCollection(doGetDatabase(), collectionName), collectionCallback::doInCollection)
.iterator();
while (cursor.hasNext()) {
callbackHandler.processDocument(cursor.next());
@@ -2845,6 +2846,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);
converter.setCustomConversions(conversions);
converter.setCodecRegistryProvider(factory);
converter.afterPropertiesSet();
return converter;
@@ -2904,21 +2906,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Document query;
private final Optional<Document> fields;
private final @Nullable com.mongodb.client.model.Collation collation;
private final CursorPreparer cursorPreparer;
public FindOneCallback(Document query, Document fields, @Nullable com.mongodb.client.model.Collation collation) {
FindOneCallback(Document query, Document fields, CursorPreparer preparer) {
this.query = query;
this.fields = Optional.of(fields).filter(it -> !ObjectUtils.isEmpty(fields));
this.collation = collation;
this.cursorPreparer = preparer;
}
@Override
public Document doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
FindIterable<Document> iterable = collection.find(query, Document.class);
if (collation != null) {
iterable = iterable.collation(collation);
}
FindIterable<Document> iterable = cursorPreparer.initiateFind(collection, col -> col.find(query, Document.class));
if (LOGGER.isDebugEnabled()) {
@@ -3314,6 +3314,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
case PARTIAL:
cursorToUse = cursorToUse.partial(true);
break;
case SLAVE_OK:
break;
default:
throw new IllegalArgumentException(String.format("%s is no supported flag.", option));
}
@@ -3326,6 +3328,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return cursorToUse;
}
@Override
public ReadPreference getReadPreference() {
return query.getMeta().getFlags().contains(CursorOption.SLAVE_OK) ? ReadPreference.primaryPreferred() : null;
}
}
/**


@@ -23,6 +23,7 @@ import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -31,8 +32,6 @@ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.reactivestreams.client.FindPublisher;
/**
* Implementation of {@link ReactiveFindOperation}.
*
@@ -120,12 +119,7 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
public Mono<T> first() {
FindPublisherPreparer preparer = getCursorPreparer(query);
Flux<T> result = doFind(new FindPublisherPreparer() {
@Override
public <D> FindPublisher<D> prepare(FindPublisher<D> publisher) {
return preparer.prepare(publisher).limit(1);
}
});
Flux<T> result = doFind(publisher -> preparer.prepare(publisher).limit(1));
return result.next();
}
@@ -138,12 +132,7 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
public Mono<T> one() {
FindPublisherPreparer preparer = getCursorPreparer(query);
Flux<T> result = doFind(new FindPublisherPreparer() {
@Override
public <D> FindPublisher<D> prepare(FindPublisher<D> publisher) {
return preparer.prepare(publisher).limit(2);
}
});
Flux<T> result = doFind(publisher -> preparer.prepare(publisher).limit(2));
return result.collectList().flatMap(it -> {


@@ -970,7 +970,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
@@ -1030,7 +1030,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
@@ -1078,7 +1078,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
@@ -1096,7 +1096,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
@@ -1115,7 +1115,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
@@ -1133,7 +1133,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
- * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
+ * If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type


@@ -102,6 +102,7 @@ import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeSaveCal
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Meta.CursorOption;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
@@ -572,7 +573,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return Flux.usingWhen(Mono.just(session), //
s -> ReactiveMongoTemplate.this.withSession(action, s), //
ClientSession::commitTransaction, //
- ClientSession::abortTransaction) //
+ (sess, err) -> sess.abortTransaction(), //
+ ClientSession::commitTransaction) //
.doFinally(signalType -> doFinally.accept(session));
});
}
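The `usingWhen` migration above (DATAMONGO-2356) swaps the deprecated overload for the replacement whose error-cleanup callback receives both the resource and the failure. The shape of that contract can be sketched without Reactor; the following minimal, synchronous stand-in (all names hypothetical, a plain `SessionSketch` in place of `ClientSession`) shows commit-on-success and abort-with-error:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiConsumer;
import java.util.function.Consumer;
import java.util.function.Function;

// Hypothetical stand-in for a ClientSession-like transactional resource.
class SessionSketch {
    final List<String> calls = new ArrayList<>();

    void commit() { calls.add("commit"); }

    void abort(Throwable err) { calls.add("abort:" + err.getMessage()); }

    // Minimal analogue of the non-deprecated usingWhen contract: the error
    // callback is a BiConsumer receiving the resource AND the failure,
    // instead of a plain per-resource cleanup.
    static <R, T> T using(R resource,
                          Function<R, T> action,
                          Consumer<R> onComplete,
                          BiConsumer<R, Throwable> onError) {
        try {
            T result = action.apply(resource);
            onComplete.accept(resource); // success path: commit
            return result;
        } catch (RuntimeException e) {
            onError.accept(resource, e); // failure path: abort with the error
            throw e;
        }
    }
}
```

The real reactive version additionally takes a cancel callback; only the complete/error split is illustrated here.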
@@ -803,7 +805,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (ObjectUtils.isEmpty(query.getSortObject())) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
- operations.forType(entityClass).getCollation(query).orElse(null));
+ new QueryFindPublisherPreparer(query, entityClass));
}
query.limit(1);
@@ -891,7 +893,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
String idKey = operations.getIdPropertyName(entityClass);
- return doFindOne(collectionName, new Document(idKey, id), null, entityClass, null);
+ return doFindOne(collectionName, new Document(idKey, id), null, entityClass, (Collation) null);
}
/*
@@ -932,6 +934,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
serializeToJsonSafely(mappedQuery), field, collectionName);
}
FindPublisherPreparer preparer = new QueryFindPublisherPreparer(query, entityClass);
if (preparer.hasReadPreference()) {
collection = collection.withReadPreference(preparer.getReadPreference());
}
DistinctPublisher<T> publisher = collection.distinct(mappedFieldName, mappedQuery, mongoDriverCompatibleType);
return operations.forType(entityClass).getCollation(query) //
.map(Collation::toMongoCollation) //
@@ -1542,7 +1549,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return maybeCallBeforeConvert(afterEvent, collectionName).flatMap(toConvert -> {
- MappedDocument mapped = operations.forEntity(toSave).toMappedDocument(mongoConverter);
+ MappedDocument mapped = operations.forEntity(toConvert).toMappedDocument(mongoConverter);
Document document = mapped.getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(toConvert, document, collectionName));
@@ -1981,7 +1988,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findAll(java.lang.Class, java.lang.String)
*/
public <T> Flux<T> findAll(Class<T> entityClass, String collectionName) {
- return executeFindMultiInternal(new FindCallback(null), null,
+ return executeFindMultiInternal(new FindCallback(null), FindPublisherPreparer.NO_OP_PREPARER,
new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName), collectionName);
}
@@ -2035,8 +2042,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
LOGGER.debug(String.format("find for class: %s in collection: %s", entityClass, collectionName));
return executeFindMultiInternal(
- collection -> new FindCallback(null).doInCollection(collection).cursorType(CursorType.TailableAwait), null,
- new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName), collectionName);
+ collection -> new FindCallback(null).doInCollection(collection).cursorType(CursorType.TailableAwait),
+ FindPublisherPreparer.NO_OP_PREPARER, new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName),
+ collectionName);
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
@@ -2327,6 +2335,25 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
protected <T> Mono<T> doFindOne(String collectionName, Document query, @Nullable Document fields,
Class<T> entityClass, @Nullable Collation collation) {
return doFindOne(collectionName, query, fields, entityClass,
findPublisher -> collation != null ? findPublisher.collation(collation.toMongoCollation()) : findPublisher);
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
* The query document is specified as a standard {@link Document} and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @param preparer the preparer modifying collection and publisher to fit the needs.
* @return the {@link List} of converted objects.
* @since 2.2
*/
protected <T> Mono<T> doFindOne(String collectionName, Document query, @Nullable Document fields,
Class<T> entityClass, FindPublisherPreparer preparer) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
Document mappedQuery = queryMapper.getMappedObject(query, entity);
Document mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
@@ -2336,7 +2363,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
}
- return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields, collation),
+ return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields, preparer),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
}
@@ -2706,13 +2733,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
@Nullable FindPublisherPreparer preparer, DocumentCallback<T> objectCallback, String collectionName) {
return createFlux(collectionName, collection -> {
- FindPublisher<Document> findPublisher = collectionCallback.doInCollection(collection);
- if (preparer != null) {
- findPublisher = preparer.prepare(findPublisher);
- }
- return Flux.from(findPublisher).map(objectCallback::doWith);
+ return Flux.from(preparer.initiateFind(collection, collectionCallback::doInCollection))
+ .map(objectCallback::doWith);
});
}
@@ -2752,7 +2774,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return type == null ? null : mappingContext.getPersistentEntity(type);
}
- private static MappingMongoConverter getDefaultMongoConverter() {
+ private MappingMongoConverter getDefaultMongoConverter() {
MongoCustomConversions conversions = new MongoCustomConversions(Collections.emptyList());
@@ -2762,6 +2784,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
MappingMongoConverter converter = new MappingMongoConverter(NO_OP_REF_RESOLVER, context);
converter.setCustomConversions(conversions);
converter.setCodecRegistryProvider(this.mongoDatabaseFactory);
converter.afterPropertiesSet();
return converter;
@@ -2784,37 +2807,36 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*
* @author Oliver Gierke
* @author Thomas Risberg
* @author Christoph Strobl
*/
private static class FindOneCallback implements ReactiveCollectionCallback<Document> {
private final Document query;
private final Optional<Document> fields;
- private final Optional<Collation> collation;
+ private final FindPublisherPreparer preparer;
- FindOneCallback(Document query, @Nullable Document fields, @Nullable Collation collation) {
+ FindOneCallback(Document query, @Nullable Document fields, FindPublisherPreparer preparer) {
this.query = query;
this.fields = Optional.ofNullable(fields);
- this.collation = Optional.ofNullable(collation);
+ this.preparer = preparer;
}
@Override
public Publisher<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
- FindPublisher<Document> publisher = collection.find(query, Document.class);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: {} fields: {} in db.collection: {}", serializeToJsonSafely(query),
serializeToJsonSafely(fields.orElseGet(Document::new)), collection.getNamespace().getFullName());
}
+ FindPublisher<Document> publisher = preparer.initiateFind(collection, col -> col.find(query, Document.class));
if (fields.isPresent()) {
publisher = publisher.projection(fields.get());
}
- publisher = collation.map(Collation::toMongoCollation).map(publisher::collation).orElse(publisher);
return publisher.limit(1).first();
}
}
@@ -3159,9 +3181,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
@SuppressWarnings("deprecation")
- public <T> FindPublisher<T> prepare(FindPublisher<T> findPublisher) {
+ public FindPublisher<Document> prepare(FindPublisher<Document> findPublisher) {
- FindPublisher<T> findPublisherToUse = operations.forType(type) //
+ FindPublisher<Document> findPublisherToUse = operations.forType(type) //
.getCollation(query) //
.map(Collation::toMongoCollation) //
.map(findPublisher::collation) //
@@ -3221,6 +3243,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return findPublisherToUse;
}
@Override
public ReadPreference getReadPreference() {
return query.getMeta().getFlags().contains(CursorOption.SLAVE_OK) ? ReadPreference.primaryPreferred() : null;
}
}
class TailingQueryFindPublisherPreparer extends QueryFindPublisherPreparer {
@@ -3230,7 +3257,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
@Override
- public <T> FindPublisher<T> prepare(FindPublisher<T> findPublisher) {
+ public FindPublisher<Document> prepare(FindPublisher<Document> findPublisher) {
return super.prepare(findPublisher.cursorType(CursorType.TailableAwait));
}
}
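`TailingQueryFindPublisherPreparer` shows the preparer chain in miniature: apply the tailable-cursor option first, then delegate to the base preparation (collation, sort, limits). That decoration order can be sketched without the driver types; here a `List<String>` of applied options stands in for `FindPublisher`, and all names are hypothetical:

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the preparer decoration used above: the tailing
// preparer adds its cursor option, then hands off to the base preparer.
class PreparerSketch {
    // Base preparation, e.g. applying a collation.
    static final UnaryOperator<List<String>> BASE = publisher -> {
        publisher.add("collation");
        return publisher;
    };

    // Decorating preparation: option applied before delegating, mirroring
    // super.prepare(findPublisher.cursorType(CursorType.TailableAwait)).
    static final UnaryOperator<List<String>> TAILING = publisher -> {
        publisher.add("cursorType=TailableAwait");
        return BASE.apply(publisher);
    };
}
```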


@@ -0,0 +1,45 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.lang.Nullable;
import com.mongodb.ReadPreference;
/**
* Interface to be implemented by any object that wishes to expose the {@link ReadPreference}.
* <p>
* Typically implemented by cursor or query preparer objects.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
*/
public interface ReadPreferenceAware {
/**
* @return {@literal true} if a {@link ReadPreference} is set.
*/
default boolean hasReadPreference() {
return getReadPreference() != null;
}
/**
* @return the {@link ReadPreference} to apply or {@literal null} if none set.
*/
@Nullable
ReadPreference getReadPreference();
}
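The default method makes `hasReadPreference` free for every implementor, so preparers only supply `getReadPreference` (the diff's `QueryFindPublisherPreparer`, for instance, maps the `SLAVE_OK` cursor flag to `primaryPreferred`). A stand-alone sketch of the pattern, with `String` standing in for `com.mongodb.ReadPreference` and all names hypothetical:

```java
// Hypothetical stand-in for ReadPreferenceAware: one abstract accessor plus
// a default null-check, so the interface doubles as a functional interface.
interface ReadPreferenceAwareSketch {

    // The real contract returns com.mongodb.ReadPreference; String here.
    String getReadPreference();

    default boolean hasReadPreference() {
        return getReadPreference() != null;
    }

    // Convenience factory for the sketch; implementors supply the accessor.
    static ReadPreferenceAwareSketch of(String preference) {
        return () -> preference;
    }
}
```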


@@ -30,11 +30,13 @@ import java.util.Optional;
import java.util.Set;
import org.bson.Document;
import org.bson.codecs.Codec;
import org.bson.codecs.DecoderContext;
import org.bson.conversions.Bson;
import org.bson.json.JsonReader;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -57,6 +59,7 @@ import org.springframework.data.mapping.model.PropertyValueProvider;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELExpressionParameterValueProvider;
import org.springframework.data.mongodb.CodecRegistryProvider;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -103,6 +106,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected @Nullable ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper;
protected @Nullable String mapKeyDotReplacement = null;
protected @Nullable CodecRegistryProvider codecRegistryProvider;
private SpELContext spELContext;
@@ -141,6 +145,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public MappingMongoConverter(MongoDbFactory mongoDbFactory,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this(new DefaultDbRefResolver(mongoDbFactory), mappingContext);
setCodecRegistryProvider(mongoDbFactory);
}
/**
@@ -178,6 +183,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.mapKeyDotReplacement = mapKeyDotReplacement;
}
/**
* Configure a {@link CodecRegistryProvider} that provides native MongoDB {@link org.bson.codecs.Codec codecs} for
* reading values.
*
* @param codecRegistryProvider can be {@literal null}.
* @since 2.2
*/
public void setCodecRegistryProvider(@Nullable CodecRegistryProvider codecRegistryProvider) {
this.codecRegistryProvider = codecRegistryProvider;
}
/*
* (non-Javadoc)
* @see org.springframework.data.convert.EntityConverter#getMappingContext()
@@ -253,6 +269,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(typeToUse);
if (entity == null) {
if (codecRegistryProvider != null) {
Optional<? extends Codec<? extends S>> codec = codecRegistryProvider.getCodecFor(rawType);
if(codec.isPresent()) {
return codec.get().decode(new JsonReader(target.toJson()),
DecoderContext.builder().build());
}
}
throw new MappingException(String.format(INVALID_TYPE_TO_READ, target, typeToUse.getType()));
}
@@ -1650,6 +1676,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
target.spELContext = spELContext;
target.setInstantiators(instantiators);
target.typeMapper = typeMapper;
target.setCodecRegistryProvider(dbFactory);
target.afterPropertiesSet();
return target;
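The hunks above wire a `CodecRegistryProvider` into `MappingMongoConverter` so that `read` can fall back to a native MongoDB codec when no persistent entity is mapped for the requested type (DATAMONGO-2357). The lookup-then-fallback shape can be sketched without the BSON types; `String` stands in for the raw document and all names are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

// Hypothetical sketch of the codec fallback: a registry of per-type decoders
// consulted only when the mapping layer has no entity for the target type.
class CodecFallbackSketch {

    private final Map<Class<?>, Function<String, ?>> codecs = new HashMap<>();

    <T> void register(Class<T> type, Function<String, T> codec) {
        codecs.put(type, codec);
    }

    // Mirrors codecRegistryProvider.getCodecFor(rawType): empty means the
    // caller should raise its usual MappingException.
    <T> Optional<T> readWithCodec(Class<T> type, String json) {
        Function<String, ?> codec = codecs.get(type);
        return codec == null ? Optional.empty() : Optional.of(type.cast(codec.apply(json)));
    }
}
```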


@@ -33,6 +33,14 @@ import org.bson.types.Symbol;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import com.mongodb.DBRef;
import com.mongodb.client.model.geojson.Geometry;
import com.mongodb.client.model.geojson.GeometryCollection;
import com.mongodb.client.model.geojson.LineString;
import com.mongodb.client.model.geojson.MultiLineString;
import com.mongodb.client.model.geojson.MultiPoint;
import com.mongodb.client.model.geojson.MultiPolygon;
import com.mongodb.client.model.geojson.Point;
import com.mongodb.client.model.geojson.Polygon;
/**
* Simple constant holder for a {@link SimpleTypeHolder} enriched with Mongo specific simple types.
@@ -45,15 +53,14 @@ public abstract class MongoSimpleTypes {
public static final Set<Class<?>> AUTOGENERATED_ID_TYPES;
static {
- Set<Class<?>> classes = new HashSet<Class<?>>();
+ Set<Class<?>> classes = new HashSet<>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
AUTOGENERATED_ID_TYPES = Collections.unmodifiableSet(classes);
- Set<Class<?>> simpleTypes = new HashSet<Class<?>>();
+ Set<Class<?>> simpleTypes = new HashSet<>();
simpleTypes.add(Binary.class);
simpleTypes.add(BsonObjectId.class);
simpleTypes.add(DBRef.class);
simpleTypes.add(Decimal128.class);
simpleTypes.add(org.bson.Document.class);
@@ -71,7 +78,6 @@ public abstract class MongoSimpleTypes {
simpleTypes.add(BsonDbPointer.class);
simpleTypes.add(BsonDecimal128.class);
simpleTypes.add(BsonDocument.class);
- simpleTypes.add(BsonDocument.class);
simpleTypes.add(BsonDouble.class);
simpleTypes.add(BsonInt32.class);
simpleTypes.add(BsonInt64.class);
@@ -82,11 +88,32 @@ public abstract class MongoSimpleTypes {
simpleTypes.add(BsonString.class);
simpleTypes.add(BsonTimestamp.class);
simpleTypes.add(Geometry.class);
simpleTypes.add(GeometryCollection.class);
simpleTypes.add(LineString.class);
simpleTypes.add(MultiLineString.class);
simpleTypes.add(MultiPoint.class);
simpleTypes.add(MultiPolygon.class);
simpleTypes.add(Point.class);
simpleTypes.add(Polygon.class);
MONGO_SIMPLE_TYPES = Collections.unmodifiableSet(simpleTypes);
}
private static final Set<Class<?>> MONGO_SIMPLE_TYPES;
- public static final SimpleTypeHolder HOLDER = new SimpleTypeHolder(MONGO_SIMPLE_TYPES, true);
+ public static final SimpleTypeHolder HOLDER = new SimpleTypeHolder(MONGO_SIMPLE_TYPES, true) {
+ @Override
+ public boolean isSimpleType(Class<?> type) {
+ if (type.getName().startsWith("java.time")) {
+ return false;
+ }
+ return super.isSimpleType(type);
+ }
+ };
private MongoSimpleTypes() {}
}
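The anonymous `SimpleTypeHolder` subclass vetoes all `java.time` types before delegating, so JSR-310 values keep going through the conversion service even if they would otherwise qualify as simple. A stand-alone sketch of that check (class name and constructor hypothetical):

```java
import java.util.Set;

// Hypothetical sketch of the override above: a name-prefix veto in front of
// the regular simple-type lookup.
class SimpleTypeHolderSketch {

    private final Set<Class<?>> simpleTypes;

    SimpleTypeHolderSketch(Set<Class<?>> simpleTypes) {
        this.simpleTypes = simpleTypes;
    }

    boolean isSimpleType(Class<?> type) {
        if (type.getName().startsWith("java.time")) {
            return false; // never treat JSR-310 types as store-native
        }
        return simpleTypes.contains(type);
    }
}
```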


@@ -68,7 +68,7 @@ class DataBufferPublisherAdapter {
state.request(sink, n);
});
});
- }, AsyncInputStream::close, AsyncInputStream::close, AsyncInputStream::close) //
+ }, AsyncInputStream::close, (it, err) -> it.close(), AsyncInputStream::close) //
.concatMap(Flux::just, 1);
}


@@ -24,8 +24,6 @@ import org.springframework.data.mongodb.core.BulkOperations.BulkMode
import org.springframework.data.mongodb.core.aggregation.Aggregation
import org.springframework.data.mongodb.core.aggregation.AggregationResults
import org.springframework.data.mongodb.core.index.IndexOperations
- import org.springframework.data.mongodb.core.mapreduce.GroupBy
- import org.springframework.data.mongodb.core.mapreduce.GroupByResults
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults
import org.springframework.data.mongodb.core.query.Criteria
@@ -200,7 +198,9 @@ inline fun <reified T : Any> MongoOperations.findAll(collectionName: String? = n
* @author Sebastien Deleuze
* @since 2.0
*/
- inline fun <reified T : Any> MongoOperations.group(inputCollectionName: String, groupBy: GroupBy): GroupByResults<T> =
+ @Suppress("DEPRECATION")
+ @Deprecated("since 2.2, the `group` command has been removed in MongoDB Server 4.2.0.", replaceWith = ReplaceWith("aggregate<T>()"))
+ inline fun <reified T : Any> MongoOperations.group(inputCollectionName: String, groupBy: org.springframework.data.mongodb.core.mapreduce.GroupBy): org.springframework.data.mongodb.core.mapreduce.GroupByResults<T> =
group(inputCollectionName, groupBy, T::class.java)
/**
@@ -209,7 +209,9 @@ inline fun <reified T : Any> MongoOperations.group(inputCollectionName: String,
* @author Sebastien Deleuze
* @since 2.0
*/
- inline fun <reified T : Any> MongoOperations.group(criteria: Criteria, inputCollectionName: String, groupBy: GroupBy): GroupByResults<T> =
+ @Suppress("DEPRECATION")
+ @Deprecated("since 2.2, the `group` command has been removed in MongoDB Server 4.2.0.", replaceWith = ReplaceWith("aggregate<T>()"))
+ inline fun <reified T : Any> MongoOperations.group(criteria: Criteria, inputCollectionName: String, groupBy: org.springframework.data.mongodb.core.mapreduce.GroupBy): org.springframework.data.mongodb.core.mapreduce.GroupByResults<T> =
group(criteria, inputCollectionName, groupBy, T::class.java)
/**
@@ -276,6 +278,8 @@ inline fun <reified T : Any> MongoOperations.mapReduce(query: Query, collectionN
* @author Sebastien Deleuze
* @since 2.0
*/
+ @Suppress("DEPRECATION")
+ @Deprecated("Since 2.2, the `geoNear` command has been removed in MongoDB Server 4.2.0. Use Aggregations with `Aggregation.geoNear(NearQuery, String)` instead.", replaceWith = ReplaceWith("aggregate<T>()"))
inline fun <reified T : Any> MongoOperations.geoNear(near: NearQuery, collectionName: String? = null): GeoResults<T> =
if (collectionName != null) geoNear(near, T::class.java, collectionName)
else geoNear(near, T::class.java)
@@ -339,7 +343,7 @@ inline fun <reified T : Any> MongoOperations.findById(id: Any, collectionName: S
*/
@Deprecated("Since 2.2, use the reified variant", replaceWith = ReplaceWith("findDistinct<T, E>(field)"))
inline fun <reified T : Any> MongoOperations.findDistinct(field: String, entityClass: KClass<*>): List<T> =
- findDistinct(field, entityClass.java, T::class.java);
+ findDistinct(field, entityClass.java, T::class.java)
/**
* Extension for [MongoOperations.findDistinct] leveraging reified type parameters.


@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core
- import kotlinx.coroutines.FlowPreview
+ import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
- import kotlinx.coroutines.reactive.flow.asFlow
+ import kotlinx.coroutines.reactive.asFlow
import kotlin.reflect.KClass
/**
@@ -42,12 +42,9 @@ inline fun <reified T : Any> ReactiveAggregationOperation.aggregateAndReturn():
/**
* Coroutines [Flow] variant of [ReactiveAggregationOperation.TerminatingAggregationOperation.all].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
* @since 2.2
*/
- @FlowPreview
- fun <T : Any> ReactiveAggregationOperation.TerminatingAggregationOperation<T>.flow(batchSize: Int = 1): Flow<T> =
- all().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveAggregationOperation.TerminatingAggregationOperation<T>.flow(): Flow<T> =
+ all().asFlow()


@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core
- import kotlinx.coroutines.FlowPreview
+ import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
- import kotlinx.coroutines.reactive.flow.asFlow
+ import kotlinx.coroutines.reactive.asFlow
/**
* Extension for [RactiveChangeStreamOperation.changeStream] leveraging reified type parameters.
@@ -40,13 +40,11 @@ inline fun <reified T : Any> ReactiveChangeStreamOperation.ChangeStreamWithFilte
/**
* Coroutines [Flow] variant of [ReactiveChangeStreamOperation.TerminatingChangeStream.listen].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Christoph Strobl
* @author Sebastien Deleuze
* @since 2.2
*/
- @FlowPreview
- fun <T : Any> ReactiveChangeStreamOperation.TerminatingChangeStream<T>.flow(batchSize: Int = 1): Flow<ChangeStreamEvent<T>> =
- listen().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveChangeStreamOperation.TerminatingChangeStream<T>.flow(): Flow<ChangeStreamEvent<T>> =
+ listen().asFlow()


@@ -15,11 +15,11 @@
*/
package org.springframework.data.mongodb.core
- import kotlinx.coroutines.FlowPreview
+ import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
+ import kotlinx.coroutines.reactive.asFlow
import kotlinx.coroutines.reactive.awaitFirstOrNull
import kotlinx.coroutines.reactive.awaitSingle
- import kotlinx.coroutines.reactive.flow.asFlow
import org.springframework.data.geo.GeoResult
import kotlin.reflect.KClass
@@ -137,48 +137,36 @@ suspend fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.awaitExists(): Bo
/**
* Coroutines [Flow] variant of [ReactiveFindOperation.TerminatingFind.all].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
*/
- @FlowPreview
- fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.flow(batchSize: Int = 1): Flow<T> =
- all().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.flow(): Flow<T> =
+ all().asFlow()
/**
* Coroutines [Flow] variant of [ReactiveFindOperation.TerminatingFind.tail].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
*/
- @FlowPreview
- fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.tailAsFlow(batchSize: Int = 1): Flow<T> =
- tail().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveFindOperation.TerminatingFind<T>.tailAsFlow(): Flow<T> =
+ tail().asFlow()
/**
* Coroutines [Flow] variant of [ReactiveFindOperation.TerminatingFindNear.all].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
*/
- @FlowPreview
- fun <T : Any> ReactiveFindOperation.TerminatingFindNear<T>.flow(batchSize: Int = 1): Flow<GeoResult<T>> =
- all().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveFindOperation.TerminatingFindNear<T>.flow(): Flow<GeoResult<T>> =
+ all().asFlow()
/**
* Coroutines [Flow] variant of [ReactiveFindOperation.TerminatingDistinct.all].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
* @since 2.2
*/
- @FlowPreview
- fun <T : Any> ReactiveFindOperation.TerminatingDistinct<T>.flow(batchSize: Int = 1): Flow<T> =
- all().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveFindOperation.TerminatingDistinct<T>.flow(): Flow<T> =
+ all().asFlow()


@@ -15,10 +15,10 @@
*/
package org.springframework.data.mongodb.core
- import kotlinx.coroutines.FlowPreview
+ import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
+ import kotlinx.coroutines.reactive.asFlow
import kotlinx.coroutines.reactive.awaitSingle
- import kotlinx.coroutines.reactive.flow.asFlow
import kotlin.reflect.KClass
/**
@@ -53,12 +53,9 @@ suspend inline fun <reified T: Any> ReactiveInsertOperation.TerminatingInsert<T>
/**
* Coroutines [Flow] variant of [ReactiveInsertOperation.TerminatingInsert.all].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
* @since 2.2
*/
- @FlowPreview
- fun <T : Any> ReactiveInsertOperation.TerminatingInsert<T>.flow(objects: Collection<T>, batchSize: Int = 1): Flow<T> =
- all(objects).asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveInsertOperation.TerminatingInsert<T>.flow(objects: Collection<T>): Flow<T> =
+ all(objects).asFlow()


@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core
- import kotlinx.coroutines.FlowPreview
+ import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
- import kotlinx.coroutines.reactive.flow.asFlow
+ import kotlinx.coroutines.reactive.asFlow
import kotlin.reflect.KClass
/**
@@ -62,12 +62,9 @@ inline fun <reified T : Any> ReactiveMapReduceOperation.MapReduceWithProjection<
/**
* Coroutines [Flow] variant of [ReactiveMapReduceOperation.TerminatingMapReduce.all].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
* @since 2.2
*/
- @FlowPreview
- fun <T : Any> ReactiveMapReduceOperation.TerminatingMapReduce<T>.flow(batchSize: Int = 1): Flow<T> =
- all().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveMapReduceOperation.TerminatingMapReduce<T>.flow(): Flow<T> =
+ all().asFlow()


@@ -216,6 +216,8 @@ inline fun <reified T : Any, reified E : Any> ReactiveMongoOperations.findDistin
* @author Sebastien Deleuze
* @since 2.0
*/
+ @Suppress("DEPRECATION")
+ @Deprecated("Since 2.2, the `geoNear` command has been removed in MongoDB Server 4.2.0. Use Aggregations with `Aggregation.geoNear(NearQuery, String)` instead.", replaceWith = ReplaceWith("aggregate<T>()"))
inline fun <reified T : Any> ReactiveMongoOperations.geoNear(near: NearQuery, collectionName: String? = null): Flux<GeoResult<T>> =
if (collectionName != null) geoNear(near, T::class.java, collectionName) else geoNear(near, T::class.java)


@@ -16,10 +16,10 @@
package org.springframework.data.mongodb.core
import com.mongodb.client.result.DeleteResult
- import kotlinx.coroutines.FlowPreview
+ import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
+ import kotlinx.coroutines.reactive.asFlow
import kotlinx.coroutines.reactive.awaitSingle
- import kotlinx.coroutines.reactive.flow.asFlow
import kotlin.reflect.KClass
/**
@@ -54,12 +54,9 @@ suspend fun <T : Any> ReactiveRemoveOperation.TerminatingRemove<T>.allAndAwait()
/**
* Coroutines [Flow] variant of [ReactiveRemoveOperation.TerminatingRemove.findAndRemove].
*
- * Backpressure is controlled by [batchSize] parameter that controls the size of in-flight elements
- * and [org.reactivestreams.Subscription.request] size.
- *
* @author Sebastien Deleuze
* @since 2.2
*/
- @FlowPreview
- fun <T : Any> ReactiveRemoveOperation.TerminatingRemove<T>.findAndRemoveAsFlow(batchSize: Int = 1): Flow<T> =
- findAndRemove().asFlow(batchSize)
+ @ExperimentalCoroutinesApi
+ fun <T : Any> ReactiveRemoveOperation.TerminatingRemove<T>.findAndRemoveAsFlow(): Flow<T> =
+ findAndRemove().asFlow()


@@ -16,20 +16,30 @@
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import java.util.Arrays;
import org.bson.Document;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.dao.DataAccessException;
import org.springframework.data.annotation.Id;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.geojson.Geometry;
import com.mongodb.client.model.geojson.MultiPolygon;
import com.mongodb.client.model.geojson.PolygonCoordinates;
import com.mongodb.client.model.geojson.Position;
/**
* Integration test for {@link MongoTemplate}.
@@ -65,6 +75,46 @@ public class MongoTemplateMappingTests {
checkPersonPersisted(template2);
}
@Test // DATAMONGO-2357
public void writesAndReadsEntityWithNativeMongoGeoJsonTypesCorrectly() {
WithMongoGeoJson source = new WithMongoGeoJson();
source.id = "id-2";
source.multiPolygon = new MultiPolygon(Arrays.asList(new PolygonCoordinates(Arrays.asList(new Position(0, 0),
new Position(0, 1), new Position(1, 1), new Position(1, 0), new Position(0, 0)))));
template1.save(source);
assertThat(template1.findOne(query(where("id").is(source.id)), WithMongoGeoJson.class)).isEqualTo(source);
}
@Test // DATAMONGO-2357
public void writesAndReadsEntityWithOpenNativeMongoGeoJsonTypesCorrectly() {
WithOpenMongoGeoJson source = new WithOpenMongoGeoJson();
source.id = "id-2";
source.geometry = new MultiPolygon(Arrays.asList(new PolygonCoordinates(Arrays.asList(new Position(0, 0),
new Position(0, 1), new Position(1, 1), new Position(1, 0), new Position(0, 0)))));
template1.save(source);
assertThat(template1.findOne(query(where("id").is(source.id)), WithOpenMongoGeoJson.class)).isEqualTo(source);
}
@Data
static class WithMongoGeoJson {
@Id String id;
MultiPolygon multiPolygon;
}
@Data
static class WithOpenMongoGeoJson {
@Id String id;
Geometry geometry;
}
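The `MultiPolygon` built in the tests above relies on a GeoJSON convention: each linear ring must contain at least four positions, and the first position must equal the last. A standalone sketch of that invariant (plain Java, no MongoDB driver types; `isClosedRing` is a hypothetical helper, not part of the driver API) using the same ring coordinates as the tests:

```java
import java.util.Arrays;
import java.util.List;

public class RingCheck {

	// A GeoJSON linear ring is valid only if it has at least four
	// positions and the first position equals the last one.
	static boolean isClosedRing(List<double[]> ring) {
		return ring.size() >= 4 && Arrays.equals(ring.get(0), ring.get(ring.size() - 1));
	}

	public static void main(String[] args) {
		// Same ring as in writesAndReadsEntityWithNativeMongoGeoJsonTypesCorrectly.
		List<double[]> ring = Arrays.asList(new double[] { 0, 0 }, new double[] { 0, 1 },
				new double[] { 1, 1 }, new double[] { 1, 0 }, new double[] { 0, 0 });
		System.out.println(isClosedRing(ring)); // prints true
	}
}
```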
private void addAndRetrievePerson(MongoTemplate template) {
Person person = new Person("Oliver");
person.setAge(25);


@@ -935,7 +935,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733
public void appliesFieldsWhenInterfaceProjectionIsClosedAndQueryDoesNotDefineFields() {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonProjection.class, null);
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonProjection.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document("firstname", 1)));
}
@@ -943,7 +944,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733
public void doesNotApplyFieldsWhenInterfaceProjectionIsClosedAndQueryDefinesFields() {
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, PersonProjection.class, null);
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, PersonProjection.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document("bar", 1)));
}
@@ -951,7 +953,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733
public void doesNotApplyFieldsWhenInterfaceProjectionIsOpen() {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonSpELProjection.class, null);
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonSpELProjection.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document()));
}
@@ -959,7 +962,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733, DATAMONGO-2041
public void appliesFieldsToDtoProjection() {
template.doFind("star-wars", new Document(), new Document(), Person.class, Jedi.class, null);
template.doFind("star-wars", new Document(), new Document(), Person.class, Jedi.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document("firstname", 1)));
}
@@ -967,7 +971,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733
public void doesNotApplyFieldsToDtoProjectionWhenQueryDefinesFields() {
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, Jedi.class, null);
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, Jedi.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document("bar", 1)));
}
@@ -975,7 +980,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733
public void doesNotApplyFieldsWhenTargetIsNotAProjection() {
template.doFind("star-wars", new Document(), new Document(), Person.class, Person.class, null);
template.doFind("star-wars", new Document(), new Document(), Person.class, Person.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document()));
}
@@ -983,7 +989,8 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1733
public void doesNotApplyFieldsWhenTargetExtendsDomainType() {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonExtended.class, null);
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonExtended.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document()));
}
@@ -1637,6 +1644,37 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
Assertions.assertThat(ReflectionTestUtils.getField(template, "entityCallbacks")).isSameAs(callbacks);
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForFind() {
template.find(new Query().slaveOk(), AutogenerateableId.class);
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForFindOne() {
template.findOne(new Query().slaveOk(), AutogenerateableId.class);
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForFindDistinct() {
template.findDistinct(new Query().slaveOk(), "name", AutogenerateableId.class, String.class);
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForStream() {
template.stream(new Query().slaveOk(), AutogenerateableId.class);
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
class AutogenerateableId {
@Id BigInteger id;


@@ -40,7 +40,6 @@ import org.mockito.ArgumentCaptor;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.reactivestreams.Publisher;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.StaticApplicationContext;
@@ -65,6 +64,7 @@ import org.springframework.lang.Nullable;
import org.springframework.test.util.ReflectionTestUtils;
import org.springframework.util.CollectionUtils;
import com.mongodb.ReadPreference;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
@@ -121,6 +121,7 @@ public class ReactiveMongoTemplateUnitTests {
when(db.getCollection(any(), any())).thenReturn(collection);
when(db.runCommand(any(), any(Class.class))).thenReturn(runCommandPublisher);
when(db.createCollection(any(), any(CreateCollectionOptions.class))).thenReturn(runCommandPublisher);
when(collection.withReadPreference(any())).thenReturn(collection);
when(collection.find(any(Class.class))).thenReturn(findPublisher);
when(collection.find(any(Document.class), any(Class.class))).thenReturn(findPublisher);
when(collection.aggregate(anyList())).thenReturn(aggregatePublisher);
@@ -324,8 +325,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719
public void appliesFieldsWhenInterfaceProjectionIsClosedAndQueryDoesNotDefineFields() {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonProjection.class, null)
.subscribe();
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonProjection.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher).projection(eq(new Document("firstname", 1)));
}
@@ -333,8 +334,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719
public void doesNotApplyFieldsWhenInterfaceProjectionIsClosedAndQueryDefinesFields() {
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, PersonProjection.class, null)
.subscribe();
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, PersonProjection.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher).projection(eq(new Document("bar", 1)));
}
@@ -342,8 +343,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719
public void doesNotApplyFieldsWhenInterfaceProjectionIsOpen() {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonSpELProjection.class, null)
.subscribe();
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonSpELProjection.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher, never()).projection(any());
}
@@ -351,7 +352,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719, DATAMONGO-2041
public void appliesFieldsToDtoProjection() {
template.doFind("star-wars", new Document(), new Document(), Person.class, Jedi.class, null).subscribe();
template.doFind("star-wars", new Document(), new Document(), Person.class, Jedi.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher).projection(eq(new Document("firstname", 1)));
}
@@ -359,7 +361,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719
public void doesNotApplyFieldsToDtoProjectionWhenQueryDefinesFields() {
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, Jedi.class, null).subscribe();
template.doFind("star-wars", new Document(), new Document("bar", 1), Person.class, Jedi.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher).projection(eq(new Document("bar", 1)));
}
@@ -367,7 +370,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719
public void doesNotApplyFieldsWhenTargetIsNotAProjection() {
template.doFind("star-wars", new Document(), new Document(), Person.class, Person.class, null).subscribe();
template.doFind("star-wars", new Document(), new Document(), Person.class, Person.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher, never()).projection(any());
}
@@ -375,7 +379,8 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1719
public void doesNotApplyFieldsWhenTargetExtendsDomainType() {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonExtended.class, null).subscribe();
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonExtended.class,
FindPublisherPreparer.NO_OP_PREPARER).subscribe();
verify(findPublisher, never()).projection(any());
}
@@ -804,6 +809,30 @@ public class ReactiveMongoTemplateUnitTests {
Assertions.assertThat(ReflectionTestUtils.getField(template, "entityCallbacks")).isSameAs(callbacks);
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForFind() {
template.find(new Query().slaveOk(), AutogenerateableId.class).subscribe();
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForFindOne() {
template.findOne(new Query().slaveOk(), AutogenerateableId.class).subscribe();
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
@Test // DATAMONGO-2344
public void slaveOkQueryOptionShouldApplyPrimaryPreferredReadPreferenceForFindDistinct() {
template.findDistinct(new Query().slaveOk(), "name", AutogenerateableId.class, String.class).subscribe();
verify(collection).withReadPreference(eq(ReadPreference.primaryPreferred()));
}
@Data
@org.springframework.data.mongodb.core.mapping.Document(collection = "star-wars")
static class Person {


@@ -407,6 +407,7 @@ public class SessionBoundMongoTemplateTests {
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);
converter.setCustomConversions(conversions);
converter.setCodecRegistryProvider(factory);
converter.afterPropertiesSet();
return converter;


@@ -0,0 +1,132 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.auditing;
import static org.assertj.core.api.Assertions.*;
import java.time.Instant;
import java.util.concurrent.TimeUnit;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.config.EnableMongoAuditing;
import org.springframework.data.mongodb.core.KAuditableVersionedEntity;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.MongoClient;
/**
* @author Christoph Strobl
*/
@RunWith(SpringJUnit4ClassRunner.class)
public class MongoTemplateAuditingTests {
@Configuration
@EnableMongoAuditing
static class Conf extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
return "mongo-template-audit-tests";
}
}
@Autowired MongoTemplate template;
@Test // DATAMONGO-2346
public void auditingSetsLastModifiedDateCorrectlyForImmutableVersionedEntityOnSave() throws InterruptedException {
template.remove(new Query(), ImmutableAuditableEntityWithVersion.class);
ImmutableAuditableEntityWithVersion entity = new ImmutableAuditableEntityWithVersion("id-1", "value", null, null);
ImmutableAuditableEntityWithVersion inserted = template.save(entity);
TimeUnit.MILLISECONDS.sleep(500);
ImmutableAuditableEntityWithVersion modified = inserted.withValue("changed-value");
ImmutableAuditableEntityWithVersion updated = template.save(modified);
ImmutableAuditableEntityWithVersion fetched = template.findOne(Query.query(Criteria.where("id").is(entity.id)),
ImmutableAuditableEntityWithVersion.class);
assertThat(updated.modificationDate).isAfter(inserted.modificationDate);
assertThat(fetched.modificationDate).isAfter(inserted.modificationDate);
assertThat(fetched.modificationDate).isEqualTo(updated.modificationDate);
}
@Test // DATAMONGO-2346
public void auditingSetsLastModifiedDateCorrectlyForImmutableVersionedKotlinEntityOnSave()
throws InterruptedException {
template.remove(new Query(), KAuditableVersionedEntity.class);
KAuditableVersionedEntity entity = new KAuditableVersionedEntity("kId-1", "value", null, null);
KAuditableVersionedEntity inserted = template.save(entity);
TimeUnit.MILLISECONDS.sleep(500);
KAuditableVersionedEntity updated = template.save(inserted.withValue("changed-value"));
KAuditableVersionedEntity fetched = template.findOne(Query.query(Criteria.where("id").is(entity.getId())),
KAuditableVersionedEntity.class);
assertThat(updated.getModificationDate()).isAfter(inserted.getModificationDate());
assertThat(fetched.getModificationDate()).isAfter(inserted.getModificationDate());
assertThat(fetched.getModificationDate()).isEqualTo(updated.getModificationDate());
}
static class ImmutableAuditableEntityWithVersion {
final @Id String id;
final String value;
final @Version Integer version;
final @LastModifiedDate Instant modificationDate;
ImmutableAuditableEntityWithVersion(String id, String value, Integer version, Instant modificationDate) {
this.id = id;
this.value = value;
this.version = version;
this.modificationDate = modificationDate;
}
ImmutableAuditableEntityWithVersion withValue(String value) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
ImmutableAuditableEntityWithVersion withModificationDate(Instant modificationDate) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
ImmutableAuditableEntityWithVersion withVersion(Integer version) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
}
}
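The entity above uses the immutable "wither" style that the auditing support exercises: each `with*` method returns a fresh instance instead of mutating state, so `inserted` and `updated` can be compared after the second `save`. A minimal standalone sketch of that pattern (hypothetical `Entity` class, no Spring types):

```java
import java.time.Instant;

public class WitherSketch {

	// Immutable entity: every "mutation" produces a new instance.
	static final class Entity {
		final String value;
		final Instant modificationDate;

		Entity(String value, Instant modificationDate) {
			this.value = value;
			this.modificationDate = modificationDate;
		}

		Entity withValue(String value) {
			return new Entity(value, this.modificationDate);
		}
	}

	public static void main(String[] args) {
		Entity inserted = new Entity("value", Instant.EPOCH);
		Entity updated = inserted.withValue("changed-value");
		System.out.println(inserted.value); // prints value
		System.out.println(updated.value); // prints changed-value
	}
}
```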


@@ -0,0 +1,158 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.auditing;
import static org.assertj.core.api.Assertions.*;
import reactor.test.StepVerifier;
import reactor.util.function.Tuples;
import java.time.Duration;
import java.time.Instant;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.config.AbstractReactiveMongoConfiguration;
import org.springframework.data.mongodb.config.EnableMongoAuditing;
import org.springframework.data.mongodb.core.KAuditableVersionedEntity;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
/**
* @author Christoph Strobl
*/
@RunWith(SpringJUnit4ClassRunner.class)
public class ReactiveMongoTemplateAuditingTests {
static final String DB_NAME = "mongo-template-audit-tests";
@Configuration
@EnableMongoAuditing
static class Conf extends AbstractReactiveMongoConfiguration {
@Bean
@Override
public MongoClient reactiveMongoClient() {
return MongoClients.create();
}
@Override
protected String getDatabaseName() {
return DB_NAME;
}
}
@Autowired ReactiveMongoTemplate template;
@Autowired MongoClient client;
@Before
public void setUp() {
MongoTestUtils.flushCollection(DB_NAME, template.getCollectionName(ImmutableAuditableEntityWithVersion.class),
client);
MongoTestUtils.flushCollection(DB_NAME, template.getCollectionName(KAuditableVersionedEntity.class), client);
}
@Test // DATAMONGO-2346
public void auditingSetsLastModifiedDateCorrectlyForImmutableVersionedEntityOnSave() {
ImmutableAuditableEntityWithVersion entity = new ImmutableAuditableEntityWithVersion(null, "value", null, null);
template.save(entity).delayElement(Duration.ofMillis(500)) //
.flatMap(inserted -> template.save(inserted.withValue("changed-value")) //
.map(updated -> Tuples.of(inserted, updated))) //
.flatMap(tuple2 -> template
.findOne(Query.query(Criteria.where("id").is(tuple2.getT1().id)), ImmutableAuditableEntityWithVersion.class)
.map(fetched -> Tuples.of(tuple2.getT1(), tuple2.getT2(), fetched))) //
.as(StepVerifier::create) //
.consumeNextWith(tuple3 -> {
assertThat(tuple3.getT2().modificationDate).isAfter(tuple3.getT1().modificationDate);
assertThat(tuple3.getT3().modificationDate).isAfter(tuple3.getT1().modificationDate);
assertThat(tuple3.getT3().modificationDate).isEqualTo(tuple3.getT2().modificationDate);
}) //
.verifyComplete();
}
@Test // DATAMONGO-2346
public void auditingSetsLastModifiedDateCorrectlyForImmutableVersionedKotlinEntityOnSave() {
KAuditableVersionedEntity entity = new KAuditableVersionedEntity(null, "value", null, null);
template.save(entity).delayElement(Duration.ofMillis(500)) //
.flatMap(inserted -> template.save(inserted.withValue("changed-value")) //
.map(updated -> Tuples.of(inserted, updated))) //
.flatMap(tuple2 -> template
.findOne(Query.query(Criteria.where("id").is(tuple2.getT1().getId())), KAuditableVersionedEntity.class)
.map(fetched -> Tuples.of(tuple2.getT1(), tuple2.getT2(), fetched))) //
.as(StepVerifier::create) //
.consumeNextWith(tuple3 -> {
assertThat(tuple3.getT2().getModificationDate()).isAfter(tuple3.getT1().getModificationDate());
assertThat(tuple3.getT3().getModificationDate()).isAfter(tuple3.getT1().getModificationDate());
assertThat(tuple3.getT3().getModificationDate()).isEqualTo(tuple3.getT2().getModificationDate());
}) //
.verifyComplete();
}
@Document("versioned-auditable")
static class ImmutableAuditableEntityWithVersion {
final @Id String id;
final String value;
final @Version Integer version;
final @LastModifiedDate Instant modificationDate;
ImmutableAuditableEntityWithVersion(String id, String value, Integer version, Instant modificationDate) {
this.id = id;
this.value = value;
this.version = version;
this.modificationDate = modificationDate;
}
ImmutableAuditableEntityWithVersion withId(String id) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
ImmutableAuditableEntityWithVersion withValue(String value) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
ImmutableAuditableEntityWithVersion withModificationDate(Instant modificationDate) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
ImmutableAuditableEntityWithVersion withVersion(Integer version) {
return new ImmutableAuditableEntityWithVersion(id, value, version, modificationDate);
}
}
}


@@ -0,0 +1,51 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.assertj.core.api.Assertions.*;
import java.time.ZonedDateTime;
import java.util.Collections;
import java.util.Date;
import org.junit.Test;
import org.springframework.core.convert.converter.Converter;
/**
* Unit tests for {@link MongoCustomConversions}.
*
* @author Christoph Strobl
*/
public class MongoCustomConversionsUnitTests {
@Test // DATAMONGO-2349
public void nonAnnotatedConverterForJavaTimeTypeShouldOnlyBeRegisteredAsReadingConverter() {
MongoCustomConversions conversions = new MongoCustomConversions(
Collections.singletonList(new DateToZonedDateTimeConverter()));
assertThat(conversions.hasCustomReadTarget(Date.class, ZonedDateTime.class)).isTrue();
assertThat(conversions.hasCustomWriteTarget(Date.class)).isFalse();
}
static class DateToZonedDateTimeConverter implements Converter<Date, ZonedDateTime> {
@Override
public ZonedDateTime convert(Date source) {
return ZonedDateTime.now();
}
}
}
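The converter under test maps `Date` to `ZonedDateTime`; because it is not annotated and touches a JSR-310 type, `MongoCustomConversions` registers it for reading only. A standalone sketch of that conversion direction (plain Java; `SimpleConverter` is a hypothetical stand-in for Spring's `Converter` interface):

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.util.Date;

public class DateConversionSketch {

	// Hypothetical stand-in for org.springframework.core.convert.converter.Converter.
	interface SimpleConverter<S, T> {
		T convert(S source);
	}

	public static void main(String[] args) {
		// Convert the epoch Date into a ZonedDateTime at UTC.
		SimpleConverter<Date, ZonedDateTime> dateToZonedDateTime =
				source -> source.toInstant().atZone(ZoneOffset.UTC);

		ZonedDateTime result = dateToZonedDateTime.convert(new Date(0));
		System.out.println(result.getYear()); // prints 1970
	}
}
```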


@@ -82,7 +82,7 @@ public class MongoTestUtils {
return Mono.from(database.getCollection(collectionName).drop()) //
.delayElement(Duration.ofMillis(10)) // server replication time
.then(Mono.from(database.createCollection(collectionName)))
.then(Mono.from(database.createCollection(collectionName))) //
.delayElement(Duration.ofMillis(10)); // server replication time
}
@@ -124,6 +124,26 @@ public class MongoTestUtils {
.verifyComplete();
}
/**
* Remove all documents from the {@link MongoCollection} with the given name in the given {@link MongoDatabase
* database}.
*
* @param dbName must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param client must not be {@literal null}.
*/
public static void flushCollection(String dbName, String collectionName,
com.mongodb.reactivestreams.client.MongoClient client) {
com.mongodb.reactivestreams.client.MongoDatabase database = client.getDatabase(dbName)
.withWriteConcern(WriteConcern.MAJORITY).withReadPreference(ReadPreference.primary());
Mono.from(database.getCollection(collectionName).deleteMany(new Document())) //
.then() //
.as(StepVerifier::create) //
.verifyComplete();
}
/**
* Create a new {@link com.mongodb.MongoClient} with defaults suitable for replica set usage.
*


@@ -0,0 +1,32 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core
import org.springframework.data.annotation.Id
import org.springframework.data.annotation.LastModifiedDate
import org.springframework.data.annotation.Version
import org.springframework.data.mongodb.core.mapping.Document
import java.time.Instant
@Document("versioned-auditable")
data class KAuditableVersionedEntity(
@Id val id: String?,
val value: String,
@Version val version: Long?,
@LastModifiedDate val modificationDate: Instant?
) {
fun withValue(value: String) = copy(value = value)
}


@@ -29,6 +29,7 @@ class ExecutableAggregationOperationExtensionsTests {
val operation = mockk<ExecutableAggregationOperation>(relaxed = true)
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `aggregateAndReturn(KClass) extension should call its Java counterpart`() {
operation.aggregateAndReturn(First::class)


@@ -33,6 +33,7 @@ class ExecutableFindOperationExtensionsTests {
val distinctWithProjection = mockk<ExecutableFindOperation.DistinctWithProjection>(relaxed = true)
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `ExecutableFindOperation#query(KClass) extension should call its Java counterpart`() {
operation.query(First::class)
@@ -47,6 +48,7 @@ class ExecutableFindOperationExtensionsTests {
}
@Test // DATAMONGO-1689, DATAMONGO-2086
@Suppress("DEPRECATION")
fun `ExecutableFindOperation#FindOperationWithProjection#asType(KClass) extension should call its Java counterpart`() {
operationWithProjection.asType(User::class)
@@ -61,6 +63,7 @@ class ExecutableFindOperationExtensionsTests {
}
@Test // DATAMONGO-1761, DATAMONGO-2086
@Suppress("DEPRECATION")
fun `ExecutableFindOperation#DistinctWithProjection#asType(KClass) extension should call its Java counterpart`() {
distinctWithProjection.asType(User::class)


@@ -29,6 +29,7 @@ class ExecutableInsertOperationExtensionsTests {
val operation = mockk<ExecutableInsertOperation>(relaxed = true)
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `insert(KClass) extension should call its Java counterpart`() {
operation.insert(First::class)


@@ -31,6 +31,7 @@ class ExecutableMapReduceOperationExtensionsTests {
val operationWithProjection = mockk<ExecutableMapReduceOperation.MapReduceWithProjection<First>>(relaxed = true)
@Test // DATAMONGO-1929
@Suppress("DEPRECATION")
fun `ExecutableMapReduceOperation#mapReduce(KClass) extension should call its Java counterpart`() {
operation.mapReduce(First::class)
@@ -45,6 +46,7 @@ class ExecutableMapReduceOperationExtensionsTests {
}
@Test // DATAMONGO-1929, DATAMONGO-2086
@Suppress("DEPRECATION")
fun `ExecutableMapReduceOperation#MapReduceWithProjection#asType(KClass) extension should call its Java counterpart`() {
operationWithProjection.asType(User::class)


@@ -29,6 +29,7 @@ class ExecutableRemoveOperationExtensionsTests {
val operation = mockk<ExecutableRemoveOperation>(relaxed = true)
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `remove(KClass) extension should call its Java counterpart`() {
operation.remove(First::class)


@@ -31,6 +31,7 @@ class ExecutableUpdateOperationExtensionsTests {
val operation = mockk<ExecutableUpdateOperation>(relaxed = true)
@Test // DATAMONGO-1719
@Suppress("DEPRECATION")
fun `update(KClass) extension should call its Java counterpart`() {
operation.update(First::class)


@@ -22,7 +22,6 @@ import io.mockk.verify
import org.junit.Test
import org.springframework.data.mongodb.core.BulkOperations.BulkMode
import org.springframework.data.mongodb.core.aggregation.Aggregation
import org.springframework.data.mongodb.core.mapreduce.GroupBy
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions
import org.springframework.data.mongodb.core.query.Criteria
import org.springframework.data.mongodb.core.query.NearQuery
@@ -39,6 +38,7 @@ class MongoOperationsExtensionsTests {
val operations = mockk<MongoOperations>(relaxed = true)
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `getCollectionName(KClass) extension should call its Java counterpart`() {
operations.getCollectionName(First::class)
@@ -78,6 +78,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `createCollection(KClass) extension should call its Java counterpart`() {
operations.createCollection(First::class)
@@ -85,6 +86,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `createCollection(KClass, CollectionOptions) extension should call its Java counterpart`() {
val collectionOptions = mockk<CollectionOptions>()
@@ -109,6 +111,7 @@ class MongoOperationsExtensionsTests {
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `collectionExists(KClass) extension should call its Java counterpart`() {
operations.collectionExists(First::class)
@@ -123,6 +126,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `dropCollection(KClass) extension should call its Java counterpart`() {
operations.dropCollection(First::class)
@@ -137,6 +141,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `indexOps(KClass) extension should call its Java counterpart`() {
operations.indexOps(First::class)
@@ -151,6 +156,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `bulkOps(BulkMode, KClass) extension should call its Java counterpart`() {
val bulkMode = BulkMode.ORDERED
@@ -160,6 +166,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `bulkOps(BulkMode, KClass, String) extension should call its Java counterpart`() {
val bulkMode = BulkMode.ORDERED
@@ -205,27 +212,30 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `group(String, GroupBy) with reified type parameter extension should call its Java counterpart`() {
val collectionName = "foo"
val groupBy = mockk<GroupBy>()
val groupBy = mockk<org.springframework.data.mongodb.core.mapreduce.GroupBy>()
operations.group<First>(collectionName, groupBy)
verify { operations.group(collectionName, groupBy, First::class.java) }
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `group(Criteria, String, GroupBy) with reified type parameter extension should call its Java counterpart`() {
val criteria = mockk<Criteria>()
val collectionName = "foo"
val groupBy = mockk<GroupBy>()
val groupBy = mockk<org.springframework.data.mongodb.core.mapreduce.GroupBy>()
operations.group<First>(criteria, collectionName, groupBy)
verify { operations.group(criteria, collectionName, groupBy, First::class.java) }
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `aggregate(Aggregation, KClass) with reified type parameter extension should call its Java counterpart`() {
val aggregation = mockk<Aggregation>()
@@ -245,6 +255,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `aggregateStream(Aggregation, KClass) with reified type parameter extension should call its Java counterpart`() {
val aggregation = mockk<Aggregation>()
@@ -312,6 +323,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `geoNear(Query) with reified type parameter extension should call its Java counterpart`() {
val query = NearQuery.near(0.0, 0.0)
@@ -321,6 +333,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `geoNear(Query, String) with reified type parameter extension should call its Java counterpart`() {
val collectionName = "foo"
@@ -350,6 +363,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `exists(Query, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -474,6 +488,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `count(Query, KClass) with reified type parameter extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -483,6 +498,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `count(Query, KClass, String) with reified type parameter extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -493,6 +509,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `insert(Collection, KClass) extension should call its Java counterpart`() {
val collection = listOf(First(), First())
@@ -511,6 +528,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `upsert(Query, Update, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -521,6 +539,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `upsert(Query, Update, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -553,6 +572,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateFirst(Query, Update, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -563,6 +583,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateFirst(Query, Update, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -595,6 +616,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateMulti(Query, Update, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -605,6 +627,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateMulti(Query, Update, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -637,6 +660,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `remove(Query, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -646,6 +670,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `remove(Query, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -684,6 +709,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(String, KClass) should call java counterpart`() {
operations.findDistinct<String>("field", First::class)
@@ -691,6 +717,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, KClass) should call java counterpart`() {
val query = mockk<Query>()
@@ -700,6 +727,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, String, KClass) should call java counterpart`() {
val query = mockk<Query>()
@@ -727,6 +755,7 @@ class MongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, KClass) should call java counterpart`() {
val query = mockk<Query>()


@@ -19,7 +19,7 @@ import example.first.First
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import kotlinx.coroutines.FlowPreview
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
import org.assertj.core.api.Assertions.assertThat
@@ -35,6 +35,7 @@ class ReactiveAggregationOperationExtensionsTests {
val operation = mockk<ReactiveAggregationOperation>(relaxed = true)
@Test // DATAMONGO-1719
@Suppress("DEPRECATION")
fun `aggregateAndReturn(KClass) extension should call its Java counterpart`() {
operation.aggregateAndReturn(First::class)
@@ -49,7 +50,7 @@ class ReactiveAggregationOperationExtensionsTests {
}
@Test // DATAMONGO-2255
@FlowPreview
@ExperimentalCoroutinesApi
fun terminatingAggregationOperationAllAsFlow() {
val spec = mockk<ReactiveAggregationOperation.TerminatingAggregationOperation<String>>()


@@ -19,7 +19,7 @@ import example.first.First
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import kotlinx.coroutines.FlowPreview
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
import org.assertj.core.api.Assertions.assertThat
@@ -44,7 +44,7 @@ class ReactiveChangeStreamOperationExtensionsTests {
}
@Test // DATAMONGO-2089
@FlowPreview
@ExperimentalCoroutinesApi
fun `TerminatingChangeStream#listen() flow extension`() {
val doc1 = mockk<ChangeStreamEvent<Document>>()


@@ -19,7 +19,7 @@ import example.first.First
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import kotlinx.coroutines.FlowPreview
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.take
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
@@ -35,7 +35,7 @@ import reactor.core.publisher.Mono
* @author Mark Paluch
* @author Sebastien Deleuze
*/
@FlowPreview
@ExperimentalCoroutinesApi
class ReactiveFindOperationExtensionsTests {
val operation = mockk<ReactiveFindOperation>(relaxed = true)
@@ -45,6 +45,7 @@ class ReactiveFindOperationExtensionsTests {
val distinctWithProjection = mockk<ReactiveFindOperation.DistinctWithProjection>(relaxed = true)
@Test // DATAMONGO-1719
@Suppress("DEPRECATION")
fun `ReactiveFind#query(KClass) extension should call its Java counterpart`() {
operation.query(First::class)
@@ -59,6 +60,7 @@ class ReactiveFindOperationExtensionsTests {
}
@Test // DATAMONGO-1719, DATAMONGO-2086
@Suppress("DEPRECATION")
fun `ReactiveFind#FindOperatorWithProjection#asType(KClass) extension should call its Java counterpart`() {
operationWithProjection.asType(User::class)
@@ -73,6 +75,7 @@ class ReactiveFindOperationExtensionsTests {
}
@Test // DATAMONGO-1761, DATAMONGO-2086
@Suppress("DEPRECATION")
fun `ReactiveFind#DistinctWithProjection#asType(KClass) extension should call its Java counterpart`() {
distinctWithProjection.asType(User::class)


@@ -19,7 +19,7 @@ import example.first.First
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import kotlinx.coroutines.FlowPreview
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
import org.assertj.core.api.Assertions.assertThat
@@ -36,6 +36,7 @@ class ReactiveInsertOperationExtensionsTests {
val operation = mockk<ReactiveInsertOperation>(relaxed = true)
@Test // DATAMONGO-1719
@Suppress("DEPRECATION")
fun `insert(KClass) extension should call its Java counterpart`() {
operation.insert(First::class)
@@ -65,7 +66,7 @@ class ReactiveInsertOperationExtensionsTests {
}
@Test // DATAMONGO-2255
@FlowPreview
@ExperimentalCoroutinesApi
fun terminatingInsertAllAsFlow() {
val insert = mockk<ReactiveInsertOperation.TerminatingInsert<String>>()


@@ -19,7 +19,7 @@ import example.first.First
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import kotlinx.coroutines.FlowPreview
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
import org.assertj.core.api.Assertions.assertThat
@@ -37,6 +37,7 @@ class ReactiveMapReduceOperationExtensionsTests {
val operationWithProjection = mockk<ReactiveMapReduceOperation.MapReduceWithProjection<First>>(relaxed = true)
@Test // DATAMONGO-1929
@Suppress("DEPRECATION")
fun `ReactiveMapReduceOperation#mapReduce(KClass) extension should call its Java counterpart`() {
operation.mapReduce(First::class)
@@ -51,6 +52,7 @@ class ReactiveMapReduceOperationExtensionsTests {
}
@Test // DATAMONGO-1929, DATAMONGO-2086
@Suppress("DEPRECATION")
fun `ReactiveMapReduceOperation#MapReduceWithProjection#asType(KClass) extension should call its Java counterpart`() {
operationWithProjection.asType(User::class)
@@ -65,7 +67,7 @@ class ReactiveMapReduceOperationExtensionsTests {
}
@Test // DATAMONGO-2255
@FlowPreview
@ExperimentalCoroutinesApi
fun terminatingMapReduceAllAsFlow() {
val spec = mockk<ReactiveMapReduceOperation.TerminatingMapReduce<String>>()


@@ -34,6 +34,7 @@ class ReactiveMongoOperationsExtensionsTests {
val operations = mockk<ReactiveMongoOperations>(relaxed = true)
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `indexOps(KClass) extension should call its Java counterpart`() {
operations.indexOps(First::class)
@@ -57,6 +58,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `createCollection(KClass) extension should call its Java counterpart`() {
operations.createCollection(First::class)
@@ -64,6 +66,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `createCollection(KClass, CollectionOptions) extension should call its Java counterpart`() {
val collectionOptions = mockk<CollectionOptions>()
@@ -89,6 +92,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `collectionExists(KClass) extension should call its Java counterpart`() {
operations.collectionExists(First::class)
@@ -103,6 +107,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `dropCollection(KClass) extension should call its Java counterpart`() {
operations.dropCollection(First::class)
@@ -152,6 +157,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `exists(Query, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -208,6 +214,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `geoNear(Query) with reified type parameter extension should call its Java counterpart`() {
val query = NearQuery.near(0.0, 0.0)
@@ -217,6 +224,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `geoNear(Query, String) with reified type parameter extension should call its Java counterpart`() {
val collectionName = "foo"
@@ -295,6 +303,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `count(Query, KClass) with reified type parameter extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -304,6 +313,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `count(Query, KClass, String) with reified type parameter extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -314,6 +324,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `insert(Collection, KClass) extension should call its Java counterpart`() {
val collection = listOf(First(), First())
@@ -332,6 +343,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `insertAll(Mono, KClass) extension should call its Java counterpart`() {
val collection = Mono.just(listOf(First(), First()))
@@ -341,6 +353,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `upsert(Query, Update, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -351,6 +364,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `upsert(Query, Update, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -383,6 +397,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateFirst(Query, Update, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -393,6 +408,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateFirst(Query, Update, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -425,6 +441,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateMulti(Query, Update, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -435,6 +452,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `updateMulti(Query, Update, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -467,6 +485,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `remove(Query, KClass) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -476,6 +495,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1689
@Suppress("DEPRECATION")
fun `remove(Query, KClass, String) extension should call its Java counterpart`() {
val query = mockk<Query>()
@@ -533,6 +553,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(String, KClass) should call java counterpart`() {
operations.findDistinct<String>("field", First::class)
@@ -540,6 +561,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, KClass) should call java counterpart`() {
val query = mockk<Query>()
@@ -549,6 +571,7 @@ class ReactiveMongoOperationsExtensionsTests {
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, String, KClass) should call java counterpart`() {
val query = mockk<Query>()
@@ -577,6 +600,7 @@ class ReactiveMongoOperationsExtensionsTests {
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, KClass) should call java counterpart`() {
val query = mockk<Query>()


@@ -20,7 +20,7 @@ import example.first.First
import io.mockk.every
import io.mockk.mockk
import io.mockk.verify
import kotlinx.coroutines.FlowPreview
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
import org.assertj.core.api.Assertions.assertThat
@@ -37,6 +37,7 @@ class ReactiveRemoveOperationExtensionsTests {
val operation = mockk<ReactiveRemoveOperation>(relaxed = true)
@Test // DATAMONGO-1719
@Suppress("DEPRECATION")
fun `remove(KClass) extension should call its Java counterpart`() {
operation.remove(First::class)
@@ -67,7 +68,7 @@ class ReactiveRemoveOperationExtensionsTests {
}
@Test // DATAMONGO-2255
@FlowPreview
@ExperimentalCoroutinesApi
fun terminatingRemoveFindAndRemoveAsFlow() {
val spec = mockk<ReactiveRemoveOperation.TerminatingRemove<String>>()


@@ -37,6 +37,7 @@ class ReactiveUpdateOperationExtensionsTests {
val operation = mockk<ReactiveUpdateOperation>(relaxed = true)
@Test // DATAMONGO-1719
@Suppress("DEPRECATION")
fun `update(KClass) extension should call its Java counterpart`() {
operation.update(First::class)


@@ -9,7 +9,7 @@
<mongo:db-factory dbname="validation" />
<mongo:mapping-converter base-package="org.springframework.data.mongodb.core" />
<mongo:mapping-converter base-package="org.springframework.data.mongodb.core.mapping.event" />
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />


@@ -420,7 +420,7 @@ The MappingMongoConverter can use metadata to drive the mapping of objects to do
* `@PersistenceConstructor`: Marks a given constructor - even a package-protected one - to be used when instantiating the object from the database. Constructor arguments are mapped by name to the key values in the retrieved Document.
* `@Value`: This annotation is part of the Spring Framework. Within the mapping framework it can be applied to constructor arguments. This lets you use a Spring Expression Language statement to transform a key's value retrieved in the database before it is used to construct a domain object. In order to reference a property of a given document, use an expression such as `@Value("#root.myProperty")`, where `root` refers to the root of the given document.
* `@Field`: Applied at the field level, it allows you to describe the name and type of the field as it will be represented in the MongoDB BSON document, thus allowing the name and type to be different from the field name and property type of the class.
* `@Version`: Applied at field level is used for optimistic locking and checked for modification on save operations. The initial value is `zero` which is bumped automatically on every update.
* `@Version`: Applied at field level is used for optimistic locking and checked for modification on save operations. The initial value is `zero` (`one` for primitive types) which is bumped automatically on every update.
The mapping metadata infrastructure is defined in the separate, technology-agnostic spring-data-commons project. Specific subclasses are used in the MongoDB support to handle annotation-based metadata. Other strategies can also be put in place if there is demand.
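The optimistic-locking behavior described for `@Version` can be sketched in plain Java. This is a hypothetical illustration of the bookkeeping only (class and method names are invented); Spring Data MongoDB performs the equivalent internally during save operations:

```java
/**
 * Hypothetical sketch of the @Version bookkeeping described above,
 * not Spring Data code. Wrapper-typed version fields start out null
 * and are written as zero on the first save.
 */
class VersionedEntity {

    Long version; // null until the entity is first persisted

    /** Simulates a save: initializes the version, then bumps it on every update. */
    long save() {
        if (version == null) {
            version = 0L;          // documented initial value for wrapper types
        } else {
            version = version + 1; // bumped automatically on every update
        }
        return version;
    }
}
```

A primitive `long` field cannot be `null`, which is why the documentation notes an initial value of `one` for primitive types.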


@@ -2391,6 +2391,12 @@ An `Aggregation` represents a MongoDB `aggregate` operation and holds the descri
+
The actual aggregate operation is executed by the `aggregate` method of the `MongoTemplate`, which takes the desired output class as a parameter.
+
* `TypedAggregation`
+
A `TypedAggregation`, just like an `Aggregation`, holds the instructions of the aggregation pipeline and a reference to the input type that is used for mapping domain properties to actual document fields.
+
On execution, field references are checked against the given input type, considering potential `@Field` annotations and raising errors when non-existent properties are referenced.
+
* `AggregationOperation`
+
An `AggregationOperation` represents a MongoDB aggregation pipeline operation and describes the processing that should be performed in this aggregation step. Although you could manually create an `AggregationOperation`, we recommend using the static factory methods provided by the `Aggregation` class to construct an `AggregationOperation`.
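The typed/untyped distinction above can be illustrated with the `newAggregation` factory methods. This is a sketch that assumes a hypothetical `Person` domain class with an `age` property; it is an API fragment, not a runnable program, since it requires the Spring Data MongoDB artifacts on the classpath:

```java
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

class AggregationExamples {

    // Untyped: field references are passed on as-is, with no validation.
    Aggregation untyped = newAggregation(
            match(Criteria.where("age").gte(21)),
            group("age").count().as("total"));

    // Typed: field references are checked against Person on execution,
    // honoring potential @Field annotations on its properties.
    TypedAggregation<Person> typed = newAggregation(Person.class,
            match(Criteria.where("age").gte(21)),
            group("age").count().as("total"));
}
```

Both variants are executed via `MongoTemplate.aggregate(..)`; only the typed one can raise errors for references to non-existent properties.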


@@ -1,6 +1,26 @@
Spring Data MongoDB Changelog
=============================
Changes in version 2.2.0.RC3 (2019-09-06)
-----------------------------------------
* DATAMONGO-2358 - Upgrade to Kotlin Coroutines 1.3.0.
* DATAMONGO-2357 - com.mongodb.client.model.geojson types should be considered as store specific simple ones.
* DATAMONGO-2356 - Switch to newly introduced usingWhen methods.
* DATAMONGO-2354 - Protected function exposes package-private property.
* DATAMONGO-2352 - Documentation Typo Fix.
* DATAMONGO-2351 - UnsupportedOperationException is thrown during delete with UNACKNOWLEDGED write concern.
* DATAMONGO-2349 - ZonedDateTime to/from Date not recognized without @Reading/@WritingConverter.
* DATAMONGO-2348 - Initial value of @Version annotated fields not described correctly in reference documentation.
* DATAMONGO-2346 - Field annotated with @LastModifiedDate is not persisted in Kotlin data classes.
* DATAMONGO-2344 - Add SLAVE_OK as a cursor option throws an exception.
* DATAMONGO-2342 - Upgrade to MongoDB Java Driver 3.11.
* DATAMONGO-2339 - Id property containing underscore not mapped correctly.
* DATAMONGO-2338 - Unable to specialise MongoRepositoryFactoryBean due to final method.
* DATAMONGO-2337 - Add HTTPS entries into spring.schemas.
* DATAMONGO-2335 - Release 2.2 RC3 (Moore).
* DATAMONGO-2310 - Improve TypedAggregation vs. untyped aggregation documentation.
Changes in version 2.2.0.RC2 (2019-08-05)
-----------------------------------------
* DATAMONGO-2330 - Regression: DefaultBulkOperations ignores defaultWriteConcern.


@@ -1,4 +1,4 @@
Spring Data MongoDB 2.2 RC2
Spring Data MongoDB 2.2 RC3
Copyright (c) [2010-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").