Compare commits

..

34 Commits

Author SHA1 Message Date
Christoph Strobl
a96e75bb64 Release version 3.2 M2 (2021.0.0).
See #3521
2021-01-13 15:34:04 +01:00
Christoph Strobl
dacb88e97f Prepare 3.2 M2 (2021.0.0).
See #3521
2021-01-13 15:33:34 +01:00
Christoph Strobl
59f276f2b5 Updated changelog.
See #3521
2021-01-13 15:33:23 +01:00
Christoph Strobl
8ecb09b142 Updated changelog.
See #3477
2021-01-13 15:16:18 +01:00
Christoph Strobl
c1c25b88e7 Update issue tracker references after GitHub issues migration.
See: #3529
2021-01-12 13:42:57 +01:00
Christoph Strobl
b5effeb4d8 Deprecate KPropertyPath in favor of Spring Data Common's KPropertyPath.
relates to: spring-projects/spring-data-commons#478

Original Pull Request: #3533
Closes: #3515
2021-01-12 13:42:35 +01:00
Mark Paluch
ad6d2c97b7 Update copyright year to 2021.
Closes #3534
2021-01-12 11:50:17 +01:00
Mark Paluch
8b0ecf17c4 DATAMONGO-2651 - Polishing.
Update since tag. Reduce test class/method visibility, update license headers.

Original pull request: #898.
2021-01-11 14:49:12 +01:00
Christoph Strobl
19e62787b8 DATAMONGO-2651 - Support $accumulator in GroupOperationBuilder.
Original pull request: #898.
2021-01-11 14:49:06 +01:00
Mark Paluch
b8298ed23d DATAMONGO-2671 - Polishing.
Fix copyright header.

Original pull request: #897.
2021-01-11 12:14:24 +01:00
Mark Paluch
3277673e39 DATAMONGO-2671 - Polishing.
Fix copyright header. Add since tags.

Original pull request: #897.
2021-01-11 12:14:24 +01:00
Christoph Strobl
b56e17e0eb DATAMONGO-2671 - Fix dateFromParts millisecond field name.
Use millisecond instead of milliseconds field name.

Related to: https://jira.mongodb.org/browse/DOCS-10652
Original pull request: #897.
2021-01-11 12:14:20 +01:00
Mark Paluch
4d10962d12 #3529 - Add GitHub actions workflow for issue management. 2020-12-31 10:37:23 +01:00
Greg L. Turnquist
b7310fd1ae DATAMONGO-2665 - Use Docker hub credentials for all CI jobs, 2020-12-14 16:43:15 -06:00
Mark Paluch
c66ffeaa09 DATAMONGO-2653 - Updated changelog. 2020-12-09 16:47:38 +01:00
Mark Paluch
d1c6b0cd19 DATAMONGO-2649 - After release cleanups. 2020-12-09 15:32:19 +01:00
Mark Paluch
b5d5485196 DATAMONGO-2649 - Prepare next development iteration. 2020-12-09 15:32:15 +01:00
Mark Paluch
3982536301 DATAMONGO-2649 - Release version 3.2 M1 (2021.0.0). 2020-12-09 15:21:55 +01:00
Mark Paluch
1e84f379b2 DATAMONGO-2649 - Prepare 3.2 M1 (2021.0.0). 2020-12-09 15:21:28 +01:00
Mark Paluch
d605a227fc DATAMONGO-2649 - Updated changelog. 2020-12-09 15:21:25 +01:00
Mark Paluch
8dea071270 DATAMONGO-2647 - Updated changelog. 2020-12-09 12:42:22 +01:00
Mark Paluch
fece1e99cb DATAMONGO-2646 - Updated changelog. 2020-12-09 09:59:08 +01:00
Mark Paluch
8918c97189 DATAMONGO-2663 - Document Spring Data to MongoDB compatibility.
Original Pull Request: #895
2020-12-07 14:39:20 +01:00
Mark Paluch
3f5cc897da DATAMONGO-2659 - Polishing.
Update Javadoc to reflect find and aggregation nature. Use primitive boolean on Query.allowDiskUse to avoid nullable type usage. Update ReactiveMongoTemplate to consider allowDiskUse.

Original pull request: #891.
2020-12-01 09:42:09 +01:00
abarkan
9d5b72db49 DATAMONGO-2659 - Allow disk use on query.
Original pull request: #891.
2020-12-01 09:41:34 +01:00
Mark Paluch
65401bf4c3 DATAMONGO-2661 - Polishing.
Add ticket reference.

Original pull request: #894.
2020-11-26 11:48:32 +01:00
Yoann de Martino
c8b64601db DATAMONGO-2661 - Handle nullable types for KPropertyPath.
Original pull request: #894.
2020-11-26 11:48:32 +01:00
Mark Paluch
ab4fe5cb0b DATAMONGO-2652 - Polishing.
Reorder implementation methods. Reduce visibility of test methods according to JUnit 5 requirements.

Original pull request: #892.
2020-11-25 14:38:11 +01:00
Jens Schauder
c1a8ffec96 DATAMONGO-2652 - Implements CrudRepository and ReactiveCrudRepository.delete(Iterable<ID> ids).
See also: DATACMNS-800.
Original pull request: #892.
2020-11-25 11:34:03 +01:00
Mark Paluch
5c3bb00b24 DATAMONGO-2648 - Updated changelog. 2020-11-11 12:34:36 +01:00
Christoph Strobl
07c728bb32 DATAMONGO-2644 - ProjectOperation no longer errors on inclusion of default _id field.
Original pull request: #890.
2020-11-10 09:39:13 +01:00
Christoph Strobl
c7e1ca5863 DATAMONGO-2635 - Enforce aggregation pipeline mapping.
Avoid using the Aggregation.DEFAULT_CONTEXT which does not map contained values to the according MongoDB representation. We now use a relaxed aggregation context, preserving given field names, where possible.

Original pull request: #890.
2020-11-10 09:39:05 +01:00
Mark Paluch
6ab43c2391 DATAMONGO-2639 - After release cleanups. 2020-10-28 16:10:23 +01:00
Mark Paluch
96f389e580 DATAMONGO-2639 - Prepare next development iteration. 2020-10-28 16:10:20 +01:00
40 changed files with 689 additions and 809 deletions


@@ -6,7 +6,6 @@ Make sure that:
-->
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Dont submit any formatting related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).

.github/workflows/project.yml

@@ -0,0 +1,47 @@
# GitHub Actions to automate GitHub issues for Spring Data Project Management
name: Spring Data GitHub Issues
on:
issues:
types: [opened, edited, reopened]
issue_comment:
types: [created]
pull_request_target:
types: [opened, edited, reopened]
jobs:
Inbox:
runs-on: ubuntu-latest
if: github.repository_owner == 'spring-projects' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request == null
steps:
- name: Create or Update Issue Card
uses: peter-evans/create-or-update-project-card@v1.1.2
with:
project-name: 'Spring Data'
column-name: 'Inbox'
project-location: 'spring-projects'
token: ${{ secrets.GH_ISSUES_TOKEN_SPRING_DATA }}
Pull-Request:
runs-on: ubuntu-latest
if: github.repository_owner == 'spring-projects' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request != null
steps:
- name: Create or Update Pull Request Card
uses: peter-evans/create-or-update-project-card@v1.1.2
with:
project-name: 'Spring Data'
column-name: 'Review pending'
project-location: 'spring-projects'
issue-number: ${{ github.event.pull_request.number }}
token: ${{ secrets.GH_ISSUES_TOKEN_SPRING_DATA }}
Feedback-Provided:
runs-on: ubuntu-latest
if: github.repository_owner == 'spring-projects' && github.event.action == 'created' && contains(join(github.event.issue.labels.*.name, ', '), 'waiting-for-feedback')
steps:
- name: Update Project Card
uses: peter-evans/create-or-update-project-card@v1.1.2
with:
project-name: 'Spring Data'
column-name: 'Feedback provided'
project-location: 'spring-projects'
token: ${{ secrets.GH_ISSUES_TOKEN_SPRING_DATA }}

Jenkinsfile

@@ -3,7 +3,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
- upstream(upstreamProjects: "spring-data-commons/2.4.x", threshold: hudson.model.Result.SUCCESS)
+ upstream(upstreamProjects: "spring-data-commons/master", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -68,7 +68,7 @@ pipeline {
stage("test: baseline (jdk8)") {
when {
anyOf {
- branch '3.1.x'
+ branch 'master'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -95,7 +95,7 @@ pipeline {
stage("Test other configurations") {
when {
allOf {
- branch '3.1.x'
+ branch 'master'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -168,7 +168,7 @@ pipeline {
stage('Release to artifactory') {
when {
anyOf {
- branch '3.1.x'
+ branch 'master'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -201,7 +201,7 @@ pipeline {
stage('Publish documentation') {
when {
- branch '3.1.x'
+ branch 'master'
}
agent {
label 'data'

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
- <version>3.1.4</version>
+ <version>3.2.0-M2</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
- <version>2.4.4</version>
+ <version>2.5.0-M2</version>
</parent>
<modules>
@@ -26,7 +26,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
- <springdata.commons>2.4.4</springdata.commons>
+ <springdata.commons>2.5.0-M2</springdata.commons>
<mongo>4.1.1</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
@@ -134,8 +134,8 @@
<repositories>
<repository>
- <id>spring-libs-release</id>
- <url>https://repo.spring.io/libs-release</url>
+ <id>spring-libs-milestone</id>
+ <url>https://repo.spring.io/libs-milestone</url>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
- <version>3.1.4</version>
+ <version>3.2.0-M2</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
- <version>3.1.4</version>
+ <version>3.2.0-M2</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
- <version>3.1.4</version>
+ <version>3.2.0-M2</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -125,11 +125,6 @@ public interface ExecutableFindOperation {
/**
* Get the number of matching elements.
* <p />
* This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation
* execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard,
* session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link MongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return total number of matching elements.
*/


@@ -1160,12 +1160,6 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -1182,12 +1176,6 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
@@ -1199,9 +1187,6 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return the estimated number of documents.
@@ -1215,9 +1200,6 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return the estimated number of documents.
@@ -1232,12 +1214,6 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.


@@ -28,6 +28,7 @@ import org.bson.Document;
import org.bson.conversions.Bson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -155,6 +156,7 @@ import com.mongodb.client.result.UpdateResult;
* @author Michael J. Simons
* @author Roman Puchkovskiy
* @author Yadhukrishna S Pai
* @author Anton Barkan
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware, IndexOperationsProvider {
@@ -3288,6 +3290,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
cursorToUse = cursorToUse.batchSize(meta.getCursorBatchSize());
}
if (meta.getAllowDiskUse() != null) {
cursorToUse = cursorToUse.allowDiskUse(meta.getAllowDiskUse());
}
for (Meta.CursorOption option : meta.getFlags()) {
switch (option) {
@@ -3453,20 +3459,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
}
/**
* @deprecated since 3.1.4. Use {@link #getMongoDatabaseFactory()} instead.
* @return the {@link MongoDatabaseFactory} in use.
*/
@Deprecated
public MongoDatabaseFactory getMongoDbFactory() {
return getMongoDatabaseFactory();
}
/**
* @return the {@link MongoDatabaseFactory} in use.
* @since 3.1.4
*/
public MongoDatabaseFactory getMongoDatabaseFactory() {
return mongoDbFactory;
}


@@ -106,12 +106,6 @@ public interface ReactiveFindOperation {
/**
* Get the number of matching elements.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but
* guarantees shard, session and transaction compliance. In case an inaccurate count satisfies the applications
* needs use {@link ReactiveMongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return {@link Mono} emitting total number of matching elements. Never {@literal null}.
*/


@@ -940,12 +940,6 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -962,12 +956,6 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
@@ -983,12 +971,6 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -1001,9 +983,6 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
@@ -1017,9 +996,6 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.


@@ -2730,14 +2730,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return potentiallyForceAcknowledgedWrite(wc);
}
/**
* @return the {@link MongoDatabaseFactory} in use.
* @since 3.1.4
*/
public ReactiveMongoDatabaseFactory getMongoDatabaseFactory() {
return mongoDatabaseFactory;
}
@Nullable
private WriteConcern potentiallyForceAcknowledgedWrite(@Nullable WriteConcern wc) {
@@ -3337,6 +3329,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (meta.getCursorBatchSize() != null) {
findPublisherToUse = findPublisherToUse.batchSize(meta.getCursorBatchSize());
}
if (meta.getAllowDiskUse() != null) {
findPublisherToUse = findPublisherToUse.allowDiskUse(meta.getAllowDiskUse());
}
}
} catch (RuntimeException e) {


@@ -1835,7 +1835,7 @@ public class DateOperators {
* @param millisecond must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal millisecond} is {@literal null}
- * @deprecated since 3.1.3, use {@link #millisecond(Object)} instead.
+ * @deprecated since 3.2, use {@link #millisecond(Object)} instead.
*/
@Deprecated
default T milliseconds(Object millisecond) {
@@ -1849,7 +1849,7 @@ public class DateOperators {
* @param millisecond must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal millisecond} is {@literal null}
- * @since 3.1.3
+ * @since 3.2
*/
T millisecond(Object millisecond);
@@ -1859,7 +1859,7 @@ public class DateOperators {
* @param fieldReference must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal fieldReference} is {@literal null}.
- * @deprecated since 3.1.3,use {@link #millisecondOf(String)} instead.
+ * @deprecated since 3.2,use {@link #millisecondOf(String)} instead.
*/
@Deprecated
default T millisecondsOf(String fieldReference) {
@@ -1872,7 +1872,7 @@ public class DateOperators {
* @param fieldReference must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal fieldReference} is {@literal null}.
- * @since 3.1.3
+ * @since 3.2
*/
default T millisecondOf(String fieldReference) {
return milliseconds(Fields.field(fieldReference));
@@ -1884,7 +1884,7 @@ public class DateOperators {
* @param expression must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal expression} is {@literal null}.
- * @deprecated since 3.1.3, use {@link #millisecondOf(AggregationExpression)} instead.
+ * @deprecated since 3.2, use {@link #millisecondOf(AggregationExpression)} instead.
*/
@Deprecated
default T millisecondsOf(AggregationExpression expression) {
@@ -1897,7 +1897,7 @@ public class DateOperators {
* @param expression must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal expression} is {@literal null}.
- * @since 3.1.3
+ * @since 3.2
*/
default T millisecondOf(AggregationExpression expression) {
return milliseconds(expression);
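For context on the DATAMONGO-2671 fix above: the server's `$dateFromParts` operator expects the field name `millisecond` (singular), which is what commit b56e17e0eb now emits. A minimal sketch of the corrected expression shape, using plain JDK maps in place of the driver's `org.bson.Document` (field values are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DateFromPartsShape {

	// Builds the document shape emitted for $dateFromParts after the fix:
	// the fractional-seconds field is "millisecond", not "milliseconds".
	static Map<String, Object> dateFromParts(int year, int month, int day, int ms) {
		Map<String, Object> parts = new LinkedHashMap<>();
		parts.put("year", year);
		parts.put("month", month);
		parts.put("day", day);
		parts.put("millisecond", ms); // was "milliseconds" before DATAMONGO-2671
		Map<String, Object> expr = new LinkedHashMap<>();
		expr.put("$dateFromParts", parts);
		return expr;
	}
}
```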


@@ -23,6 +23,7 @@ import java.util.List;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -375,6 +376,17 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.STD_DEV_POP, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $accumulator}-expression.
*
* @param accumulator must not be {@literal null}.
* @return never {@literal null}.
* @since 3.2
*/
public GroupOperationBuilder accumulate(Accumulator accumulator) {
return new GroupOperationBuilder(this, new Operation(accumulator));
}
private GroupOperationBuilder newBuilder(Keyword keyword, @Nullable String reference, @Nullable Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
@@ -465,12 +477,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
static class Operation implements AggregationOperation {
private final Keyword op;
private final @Nullable Keyword op;
private final @Nullable String key;
private final @Nullable String reference;
private final @Nullable Object value;
public Operation(Keyword op, @Nullable String key, @Nullable String reference, @Nullable Object value) {
Operation(AggregationExpression expression) {
this(null, null, null, expression);
}
public Operation(@Nullable Keyword op, @Nullable String key, @Nullable String reference, @Nullable Object value) {
this.op = op;
this.key = key;
@@ -487,7 +503,12 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
public Document toDocument(AggregationOperationContext context) {
return new Document(key, new Document(op.toString(), getValue(context)));
Object value = getValue(context);
if(op == null && value instanceof Document) {
return new Document(key, value);
}
return new Document(key, new Document(op.toString(), value));
}
public Object getValue(AggregationOperationContext context) {
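The new `accumulate(Accumulator)` builder above feeds the `op == null` branch in `Operation.toDocument`, so the rendered expression is written directly under the field name instead of being wrapped in a group keyword. A rough sketch of the resulting `$group` stage, built with plain JDK maps rather than the library's `Document` type; the group key, field name, and JavaScript bodies are illustrative, not taken from the commit:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccumulatorStageShape {

	// Approximate shape of a $group stage with a custom $accumulator,
	// as rendered for something like group("author").accumulate(...).as("stats").
	static Map<String, Object> groupWithAccumulator() {
		Map<String, Object> accumulator = new LinkedHashMap<>();
		accumulator.put("init", "function() { return { count: 0 }; }");
		accumulator.put("accumulate", "function(state) { state.count += 1; return state; }");
		accumulator.put("merge", "function(a, b) { return { count: a.count + b.count }; }");
		accumulator.put("lang", "js");

		Map<String, Object> stats = new LinkedHashMap<>();
		stats.put("$accumulator", accumulator);

		Map<String, Object> groupFields = new LinkedHashMap<>();
		groupFields.put("_id", "$author");
		groupFields.put("stats", stats); // no keyword wrapper: the op == null branch applies

		Map<String, Object> stage = new LinkedHashMap<>();
		stage.put("$group", groupFields);
		return stage;
	}
}
```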


@@ -32,9 +32,6 @@ import java.util.concurrent.atomic.AtomicLong;
import org.bson.BsonTimestamp;
import org.bson.Document;
import org.bson.UuidRepresentation;
import org.bson.codecs.Codec;
import org.bson.internal.CodecRegistryHelper;
import org.bson.types.Binary;
import org.bson.types.Code;
import org.bson.types.Decimal128;
@@ -48,12 +45,11 @@ import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoClientSettings;
/**
* Wrapper class to contain useful converters for the usage with Mongo.
*
@@ -240,13 +236,9 @@ abstract class MongoConverters {
INSTANCE;
private final Codec<Document> codec = CodecRegistryHelper
.createRegistry(MongoClientSettings.getDefaultCodecRegistry(), UuidRepresentation.JAVA_LEGACY)
.get(Document.class);
@Override
public String convert(Document source) {
return source.toJson(codec);
return source.toJson();
}
}


@@ -41,7 +41,7 @@ public @interface Document {
* The collection the document representing the entity is supposed to be stored in. If not configured, a default
* collection name will be derived from the type's name. The attribute supports SpEL expressions to dynamically
* calculate the collection to based on a per operation basis.
*
*
* @return the name of the collection to be used.
*/
@AliasFor("collection")
@@ -51,7 +51,7 @@ public @interface Document {
* The collection the document representing the entity is supposed to be stored in. If not configured, a default
* collection name will be derived from the type's name. The attribute supports SpEL expressions to dynamically
* calculate the collection to based on a per operation basis.
*
*
* @return the name of the collection to be used.
*/
@AliasFor("value")


@@ -77,7 +77,7 @@ public class MongoMappingEvent<T> extends ApplicationEvent {
/**
* Allows client code to change the underlying source instance by applying the given {@link Function}.
*
*
* @param mapper the {@link Function} to apply, will only be applied if the source is not {@literal null}.
* @since 2.1
*/


@@ -124,12 +124,7 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Static factory method to create a {@link Criteria} matching an example object. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125; </code>. <br />
* To avoid the above mentioned type restriction use an {@link UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
* Static factory method to create a {@link Criteria} matching an example object.
*
* @param example must not be {@literal null}.
* @return new instance of {@link Criteria}.
@@ -620,15 +615,8 @@ public class Criteria implements CriteriaDefinition {
*/
public Criteria alike(Example<?> sample) {
if (StringUtils.hasText(this.getKey())) {
criteria.put("$example", sample);
return this;
}
Criteria exampleCriteria = new Criteria();
exampleCriteria.criteria.put("$example", sample);
return registerCriteriaChainElement(exampleCriteria);
criteria.put("$example", sample);
return this;
}
/**


@@ -181,7 +181,11 @@ public class Meta {
}
/**
* Set to {@literal true}, to allow aggregation stages to write data to disk.
* Enables writing to temporary files for aggregation stages and queries. When set to {@literal true}, aggregation
* stages can write data to the {@code _tmp} subdirectory in the {@code dbPath} directory.
* <p>
* Starting in MongoDB 4.2, the profiler log messages and diagnostic log messages includes a {@code usedDisk}
* indicator if any aggregation stage wrote data to temporary files due to memory restrictions.
*
* @param allowDiskUse use {@literal null} for server defaults.
* @since 3.0


@@ -46,6 +46,7 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author Anton Barkan
*/
public class Query {
@@ -372,6 +373,24 @@ public class Query {
return this;
}
/**
* Enables writing to temporary files for aggregation stages and queries. When set to {@literal true}, aggregation
* stages can write data to the {@code _tmp} subdirectory in the {@code dbPath} directory.
* <p>
* Starting in MongoDB 4.2, the profiler log messages and diagnostic log messages includes a {@code usedDisk}
* indicator if any aggregation stage wrote data to temporary files due to memory restrictions.
*
* @param allowDiskUse
* @return this.
* @see Meta#setAllowDiskUse(Boolean)
* @since 3.2
*/
public Query allowDiskUse(boolean allowDiskUse) {
meta.setAllowDiskUse(allowDiskUse);
return this;
}
/**
* Set the number of documents to return in each response batch. <br />
* Use {@literal 0 (zero)} for no limit. A <strong>negative limit</strong> closes the cursor after returning a single
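The `allowDiskUse` flag added above is carried via `Meta` and, per the `MongoTemplate`/`ReactiveMongoTemplate` hunks earlier, forwarded to the driver's find cursor. A sketch of (roughly) the server command this produces, using plain JDK maps; collection and filter values are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AllowDiskUseShape {

	// Rough shape of the find command issued after Query.allowDiskUse(true);
	// the flag originates from Meta.getAllowDiskUse().
	static Map<String, Object> findCommand(String collection, Map<String, Object> filter) {
		Map<String, Object> cmd = new LinkedHashMap<>();
		cmd.put("find", collection);
		cmd.put("filter", filter);
		cmd.put("allowDiskUse", true); // lets the server spill to _tmp under dbPath
		return cmd;
	}
}
```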


@@ -78,31 +78,16 @@ public interface MongoRepository<T, ID> extends PagingAndSortingRepository<T, ID
*/
<S extends T> List<S> insert(Iterable<S> entities);
/**
* Returns all entities matching the given {@link Example}. In case no match could be found an empty {@link List} is
* returned. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above mentioned type restriction use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example)
*/
@Override
<S extends T> List<S> findAll(Example<S> example);
/**
* Returns all entities matching the given {@link Example}, applying the given {@link Sort}. In case no match could be
* found, an empty {@link List} is returned. <br />
* By default the {@link Example} uses typed matching, restricting matches to types assignable to the probe's type.
* For example, when sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above-mentioned type restriction, use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example,
* org.springframework.data.domain.Sort)
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example, org.springframework.data.domain.Sort)
*/
@Override
<S extends T> List<S> findAll(Example<S> example, Sort sort);

View File

@@ -64,33 +64,4 @@ public interface ReactiveMongoRepository<T, ID> extends ReactiveSortingRepositor
*/
<S extends T> Flux<S> insert(Publisher<S> entities);
/**
* Returns all entities matching the given {@link Example}. In case no match could be found, an empty {@link Flux} is
* returned. <br />
* By default the {@link Example} uses typed matching, restricting matches to types assignable to the probe's type.
* For example, when sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above-mentioned type restriction, use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#findAll(org.springframework.data.domain.Example)
*/
@Override
<S extends T> Flux<S> findAll(Example<S> example);
/**
* Returns all entities matching the given {@link Example}, applying the given {@link Sort}. In case no match could be
* found, an empty {@link Flux} is returned. <br />
* By default the {@link Example} uses typed matching, restricting matches to types assignable to the probe's type.
* For example, when sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above-mentioned type restriction, use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#findAll(org.springframework.data.domain.Example,
* org.springframework.data.domain.Sort)
*/
@Override
<S extends T> Flux<S> findAll(Example<S> example, Sort sort);
}

View File

@@ -51,6 +51,7 @@ import com.mongodb.client.result.DeleteResult;
* @author Thomas Darimont
* @author Mark Paluch
* @author Mehran Behnam
* @author Jens Schauder
*/
public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
@@ -72,6 +73,10 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
this.mongoOperations = mongoOperations;
}
// -------------------------------------------------------------------------
// Methods from CrudRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#save(java.lang.Object)
@@ -135,6 +140,27 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#findAll()
*/
@Override
public List<T> findAll() {
return findAll(new Query());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#findAllById(java.lang.Iterable)
*/
@Override
public Iterable<T> findAllById(Iterable<ID> ids) {
Assert.notNull(ids, "The given Ids of entities must not be null!");
return findAll(getIdQuery(ids));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#count()
@@ -175,6 +201,19 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#deleteAllById(java.lang.Iterable)
*/
@Override
public void deleteAllById(Iterable<? extends ID> ids) {
Assert.notNull(ids, "The given Iterable of ids must not be null!");
mongoOperations.remove(getIdQuery(ids), entityInformation.getJavaType(),
entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#delete(java.lang.Iterable)
@@ -182,7 +221,7 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
@Override
public void deleteAll(Iterable<? extends T> entities) {
Assert.notNull(entities, "The given Iterable of entities not be null!");
Assert.notNull(entities, "The given Iterable of entities must not be null!");
entities.forEach(this::delete);
}
@@ -196,27 +235,9 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
mongoOperations.remove(new Query(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#findAll()
*/
@Override
public List<T> findAll() {
return findAll(new Query());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.CrudRepository#findAllById(java.lang.Iterable)
*/
@Override
public Iterable<T> findAllById(Iterable<ID> ids) {
Assert.notNull(ids, "The given Ids of entities must not be null!");
return findAll(new Query(new Criteria(entityInformation.getIdAttribute())
.in(Streamable.of(ids).stream().collect(StreamUtils.toUnmodifiableList()))));
}
// -------------------------------------------------------------------------
// Methods from PagingAndSortingRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
@@ -245,6 +266,10 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
return findAll(new Query().with(sort));
}
// -------------------------------------------------------------------------
// Methods from MongoRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#insert(java.lang.Object)
@@ -275,23 +300,33 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
return new ArrayList<>(mongoOperations.insertAll(list));
}
// -------------------------------------------------------------------------
// Methods from QueryByExampleExecutor
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#findAllByExample(org.springframework.data.domain.Example, org.springframework.data.domain.Pageable)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findOne(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Page<S> findAll(final Example<S> example, Pageable pageable) {
public <S extends T> Optional<S> findOne(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Assert.notNull(pageable, "Pageable must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation()).with(pageable); //
.collation(entityInformation.getCollation());
List<S> list = mongoOperations.find(query, example.getProbeType(), entityInformation.getCollectionName());
return Optional
.ofNullable(mongoOperations.findOne(query, example.getProbeType(), entityInformation.getCollectionName()));
}
return PageableExecutionUtils.getPage(list, pageable,
() -> mongoOperations.count(Query.of(query).limit(-1).skip(-1), example.getProbeType(), entityInformation.getCollectionName()));
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#findAllByExample(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> List<S> findAll(Example<S> example) {
return findAll(example, Sort.unsorted());
}
/*
@@ -313,27 +348,21 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.MongoRepository#findAllByExample(org.springframework.data.domain.Example)
* @see org.springframework.data.mongodb.repository.MongoRepository#findAllByExample(org.springframework.data.domain.Example, org.springframework.data.domain.Pageable)
*/
@Override
public <S extends T> List<S> findAll(Example<S> example) {
return findAll(example, Sort.unsorted());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findOne(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Optional<S> findOne(Example<S> example) {
public <S extends T> Page<S> findAll(Example<S> example, Pageable pageable) {
Assert.notNull(example, "Sample must not be null!");
Assert.notNull(pageable, "Pageable must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation());
.collation(entityInformation.getCollation()).with(pageable); //
return Optional
.ofNullable(mongoOperations.findOne(query, example.getProbeType(), entityInformation.getCollectionName()));
List<S> list = mongoOperations.find(query, example.getProbeType(), entityInformation.getCollectionName());
return PageableExecutionUtils.getPage(list, pageable,
() -> mongoOperations.count(Query.of(query).limit(-1).skip(-1), example.getProbeType(), entityInformation.getCollectionName()));
}
/*
@@ -366,6 +395,10 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
return mongoOperations.exists(query, example.getProbeType(), entityInformation.getCollectionName());
}
// -------------------------------------------------------------------------
// Utility methods
// -------------------------------------------------------------------------
private Query getIdQuery(Object id) {
return new Query(getIdCriteria(id));
}
@@ -374,6 +407,11 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
return where(entityInformation.getIdAttribute()).is(id);
}
private Query getIdQuery(Iterable<? extends ID> ids) {
return new Query(new Criteria(entityInformation.getIdAttribute())
.in(Streamable.of(ids).stream().collect(StreamUtils.toUnmodifiableList())));
}
private List<T> findAll(@Nullable Query query) {
if (query == null) {
@@ -382,4 +420,5 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
}

View File

@@ -21,10 +21,12 @@ import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.io.Serializable;
import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;
import org.reactivestreams.Publisher;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.domain.Example;
@@ -47,6 +49,7 @@ import com.mongodb.client.result.DeleteResult;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Ruben J Garcia
* @author Jens Schauder
* @since 2.0
*/
public class SimpleReactiveMongoRepository<T, ID extends Serializable> implements ReactiveMongoRepository<T, ID> {
@@ -64,232 +67,9 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findById(java.lang.Object)
*/
@Override
public Mono<T> findById(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.findById(id, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findById(org.reactivestreams.Publisher)
*/
@Override
public Mono<T> findById(Publisher<ID> publisher) {
Assert.notNull(publisher, "The given id must not be null!");
return Mono.from(publisher).flatMap(
id -> mongoOperations.findById(id, entityInformation.getJavaType(), entityInformation.getCollectionName()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#findOne(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Mono<S> findOne(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation()) //
.limit(2);
return mongoOperations.find(query, example.getProbeType(), entityInformation.getCollectionName()).buffer(2)
.map(vals -> {
if (vals.size() > 1) {
throw new IncorrectResultSizeDataAccessException(1);
}
return vals.iterator().next();
}).next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#existsById(java.lang.Object)
*/
@Override
public Mono<Boolean> existsById(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.exists(getIdQuery(id), entityInformation.getJavaType(),
entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#existsById(org.reactivestreams.Publisher)
*/
@Override
public Mono<Boolean> existsById(Publisher<ID> publisher) {
Assert.notNull(publisher, "The given id must not be null!");
return Mono.from(publisher).flatMap(id -> mongoOperations.exists(getIdQuery(id), entityInformation.getJavaType(),
entityInformation.getCollectionName()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#exists(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Mono<Boolean> exists(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation());
return mongoOperations.exists(query, example.getProbeType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveSortingRepository#findAll()
*/
@Override
public Flux<T> findAll() {
return findAll(new Query());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findAllById(java.lang.Iterable)
*/
@Override
public Flux<T> findAllById(Iterable<ID> ids) {
Assert.notNull(ids, "The given Iterable of Ids must not be null!");
return findAll(new Query(new Criteria(entityInformation.getIdAttribute())
.in(Streamable.of(ids).stream().collect(StreamUtils.toUnmodifiableList()))));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findAllById(org.reactivestreams.Publisher)
*/
@Override
public Flux<T> findAllById(Publisher<ID> ids) {
Assert.notNull(ids, "The given Publisher of Ids must not be null!");
return Flux.from(ids).buffer().flatMap(this::findAllById);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveSortingRepository#findAll(org.springframework.data.domain.Sort)
*/
@Override
public Flux<T> findAll(Sort sort) {
Assert.notNull(sort, "Sort must not be null!");
return findAll(new Query().with(sort));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#findAll(org.springframework.data.domain.Example, org.springframework.data.domain.Sort)
*/
@Override
public <S extends T> Flux<S> findAll(Example<S> example, Sort sort) {
Assert.notNull(example, "Sample must not be null!");
Assert.notNull(sort, "Sort must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation()) //
.with(sort);
return mongoOperations.find(query, example.getProbeType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#findAll(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Flux<S> findAll(Example<S> example) {
Assert.notNull(example, "Example must not be null!");
return findAll(example, Sort.unsorted());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#count()
*/
@Override
public Mono<Long> count() {
return mongoOperations.count(new Query(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#count(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Mono<Long> count(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation());
return mongoOperations.count(query, example.getProbeType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#insert(java.lang.Object)
*/
@Override
public <S extends T> Mono<S> insert(S entity) {
Assert.notNull(entity, "Entity must not be null!");
return mongoOperations.insert(entity, entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#insert(java.lang.Iterable)
*/
@Override
public <S extends T> Flux<S> insert(Iterable<S> entities) {
Assert.notNull(entities, "The given Iterable of entities must not be null!");
List<S> source = Streamable.of(entities).stream().collect(StreamUtils.toUnmodifiableList());
return source.isEmpty() ? Flux.empty() : Flux.from(mongoOperations.insertAll(source));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#insert(org.reactivestreams.Publisher)
*/
@Override
public <S extends T> Flux<S> insert(Publisher<S> entities) {
Assert.notNull(entities, "The given Publisher of entities must not be null!");
return Flux.from(entities).flatMap(entity -> mongoOperations.insert(entity, entityInformation.getCollectionName()));
}
// -------------------------------------------------------------------------
// Methods from ReactiveCrudRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
@@ -337,6 +117,100 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
mongoOperations.save(entity, entityInformation.getCollectionName()).then(Mono.just(entity)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findById(java.lang.Object)
*/
@Override
public Mono<T> findById(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.findById(id, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findById(org.reactivestreams.Publisher)
*/
@Override
public Mono<T> findById(Publisher<ID> publisher) {
Assert.notNull(publisher, "The given id must not be null!");
return Mono.from(publisher).flatMap(
id -> mongoOperations.findById(id, entityInformation.getJavaType(), entityInformation.getCollectionName()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#existsById(java.lang.Object)
*/
@Override
public Mono<Boolean> existsById(ID id) {
Assert.notNull(id, "The given id must not be null!");
return mongoOperations.exists(getIdQuery(id), entityInformation.getJavaType(),
entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#existsById(org.reactivestreams.Publisher)
*/
@Override
public Mono<Boolean> existsById(Publisher<ID> publisher) {
Assert.notNull(publisher, "The given id must not be null!");
return Mono.from(publisher).flatMap(id -> mongoOperations.exists(getIdQuery(id), entityInformation.getJavaType(),
entityInformation.getCollectionName()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveSortingRepository#findAll()
*/
@Override
public Flux<T> findAll() {
return findAll(new Query());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findAllById(java.lang.Iterable)
*/
@Override
public Flux<T> findAllById(Iterable<ID> ids) {
Assert.notNull(ids, "The given Iterable of Ids must not be null!");
return findAll(getIdQuery(ids));
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#findAllById(org.reactivestreams.Publisher)
*/
@Override
public Flux<T> findAllById(Publisher<ID> ids) {
Assert.notNull(ids, "The given Publisher of Ids must not be null!");
return Flux.from(ids).buffer().flatMap(this::findAllById);
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#count()
*/
@Override
public Mono<Long> count() {
return mongoOperations.count(new Query(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#deleteById(java.lang.Object)
@@ -392,6 +266,19 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
return remove.then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#deleteAllById(java.lang.Iterable)
*/
@Override
public Mono<Void> deleteAllById(Iterable<? extends ID> ids) {
Assert.notNull(ids, "The given Iterable of Ids must not be null!");
return mongoOperations
.remove(getIdQuery(ids), entityInformation.getJavaType(), entityInformation.getCollectionName()).then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveCrudRepository#deleteAll(java.lang.Iterable)
@@ -401,7 +288,14 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(entities, "The given Iterable of entities must not be null!");
return Flux.fromIterable(entities).flatMap(this::delete).then();
Collection<?> idCollection = StreamUtils.createStreamFromIterator(entities.iterator()).map(entityInformation::getId)
.collect(Collectors.toList());
Criteria idsInCriteria = where(entityInformation.getIdAttribute()).in(idCollection);
return mongoOperations
.remove(new Query(idsInCriteria), entityInformation.getJavaType(), entityInformation.getCollectionName())
.then();
}
/*
@@ -428,6 +322,151 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
return mongoOperations.remove(new Query(), entityInformation.getCollectionName()).then(Mono.empty());
}
// -------------------------------------------------------------------------
// Methods from ReactiveSortingRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
* @see org.springframework.data.repository.reactive.ReactiveSortingRepository#findAll(org.springframework.data.domain.Sort)
*/
@Override
public Flux<T> findAll(Sort sort) {
Assert.notNull(sort, "Sort must not be null!");
return findAll(new Query().with(sort));
}
// -------------------------------------------------------------------------
// Methods from ReactiveMongoRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#insert(java.lang.Object)
*/
@Override
public <S extends T> Mono<S> insert(S entity) {
Assert.notNull(entity, "Entity must not be null!");
return mongoOperations.insert(entity, entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#insert(java.lang.Iterable)
*/
@Override
public <S extends T> Flux<S> insert(Iterable<S> entities) {
Assert.notNull(entities, "The given Iterable of entities must not be null!");
List<S> source = Streamable.of(entities).stream().collect(StreamUtils.toUnmodifiableList());
return source.isEmpty() ? Flux.empty() : Flux.from(mongoOperations.insertAll(source));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#insert(org.reactivestreams.Publisher)
*/
@Override
public <S extends T> Flux<S> insert(Publisher<S> entities) {
Assert.notNull(entities, "The given Publisher of entities must not be null!");
return Flux.from(entities).flatMap(entity -> mongoOperations.insert(entity, entityInformation.getCollectionName()));
}
// -------------------------------------------------------------------------
// Methods from ReactiveMongoRepository
// -------------------------------------------------------------------------
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#findOne(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Mono<S> findOne(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation()) //
.limit(2);
return mongoOperations.find(query, example.getProbeType(), entityInformation.getCollectionName()).buffer(2)
.map(vals -> {
if (vals.size() > 1) {
throw new IncorrectResultSizeDataAccessException(1);
}
return vals.iterator().next();
}).next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#findAll(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Flux<S> findAll(Example<S> example) {
Assert.notNull(example, "Example must not be null!");
return findAll(example, Sort.unsorted());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.ReactiveMongoRepository#findAll(org.springframework.data.domain.Example, org.springframework.data.domain.Sort)
*/
@Override
public <S extends T> Flux<S> findAll(Example<S> example, Sort sort) {
Assert.notNull(example, "Sample must not be null!");
Assert.notNull(sort, "Sort must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation()) //
.with(sort);
return mongoOperations.find(query, example.getProbeType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#count(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Mono<Long> count(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation());
return mongoOperations.count(query, example.getProbeType(), entityInformation.getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#exists(org.springframework.data.domain.Example)
*/
@Override
public <S extends T> Mono<Boolean> exists(Example<S> example) {
Assert.notNull(example, "Sample must not be null!");
Query query = new Query(new Criteria().alike(example)) //
.collation(entityInformation.getCollation());
return mongoOperations.exists(query, example.getProbeType(), entityInformation.getCollectionName());
}
private Query getIdQuery(Object id) {
return new Query(getIdCriteria(id));
}
@@ -436,6 +475,13 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
return where(entityInformation.getIdAttribute()).is(id);
}
private Query getIdQuery(Iterable<? extends ID> ids) {
Collection<?> idCollection = StreamUtils.createStreamFromIterator(ids.iterator()).collect(Collectors.toList());
Criteria idsInCriteria = where(entityInformation.getIdAttribute()).in(idCollection);
return new Query(idsInCriteria);
}
private Flux<T> findAll(Query query) {
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());

View File

@@ -20,8 +20,6 @@ import com.mongodb.client.result.UpdateResult
import com.mongodb.reactivestreams.client.MongoCollection
import org.bson.Document
import org.springframework.data.geo.GeoResult
import org.springframework.data.mongodb.core.aggregation.Aggregation
import org.springframework.data.mongodb.core.aggregation.TypedAggregation
import org.springframework.data.mongodb.core.index.ReactiveIndexOperations
import org.springframework.data.mongodb.core.query.NearQuery
import org.springframework.data.mongodb.core.query.Query
@@ -212,52 +210,6 @@ inline fun <reified T : Any, reified E : Any> ReactiveMongoOperations.findDistin
if (collectionName != null) findDistinct(query, field, collectionName, E::class.java, T::class.java)
else findDistinct(query, field, E::class.java, T::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @since 3.1.4
*/
inline fun <reified O : Any> ReactiveMongoOperations.aggregate(
aggregation: TypedAggregation<*>,
collectionName: String
): Flux<O> =
this.aggregate(aggregation, collectionName, O::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @since 3.1.4
*/
inline fun <reified O : Any> ReactiveMongoOperations.aggregate(aggregation: TypedAggregation<*>): Flux<O> =
this.aggregate(aggregation, O::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @author Mark Paluch
* @since 3.1.4
*/
inline fun <reified I : Any, reified O : Any> ReactiveMongoOperations.aggregate(
aggregation: Aggregation
): Flux<O> =
this.aggregate(aggregation, I::class.java, O::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @since 3.1.4
*/
inline fun <reified O : Any> ReactiveMongoOperations.aggregate(
aggregation: Aggregation,
collectionName: String
): Flux<O> =
this.aggregate(aggregation, collectionName, O::class.java)
/**
* Extension for [ReactiveMongoOperations.geoNear] leveraging reified type parameters.
*

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query
import org.springframework.data.mapping.toDotPath
import kotlin.reflect.KProperty
import kotlin.reflect.KProperty1
@@ -25,7 +26,9 @@ import kotlin.reflect.KProperty1
* @author Mark Paluch
* @author Yoann de Martino
* @since 2.2
* @deprecated since 3.2, use KPropertyPath from Spring Data Commons instead.
*/
@Deprecated("use KPropertyPath from Spring Data Commons", replaceWith = ReplaceWith("KPropertyPath", "org.springframework.data.mapping.KPropertyPath"))
class KPropertyPath<T, U>(
internal val parent: KProperty<U?>,
internal val child: KProperty1<U, T>
@@ -39,7 +42,7 @@ internal fun asString(property: KProperty<*>): String {
return when (property) {
is KPropertyPath<*, *> ->
"${asString(property.parent)}.${property.child.name}"
else -> property.name
else -> property.toDotPath()
}
}
@@ -54,6 +57,8 @@ internal fun asString(property: KProperty<*>): String {
* @author Tjeu Kayim
* @author Yoann de Martino
* @since 2.2
* @deprecated since 3.2, use KPropertyPath.div from Spring Data Commons instead.
*/
@Deprecated("use KPropertyPath.div from Spring Data Commons", replaceWith = ReplaceWith("this / other", "org.springframework.data.mapping.div"))
operator fun <T, U> KProperty<T?>.div(other: KProperty1<T, U>) =
KPropertyPath(this, other)

View File

@@ -139,14 +139,16 @@ public class QueryByExampleTests {
assertThat(result).containsExactlyInAnyOrder(p1, p2, p3);
}
@Test // DATAMONGO-1245, GH-3544
@Test // DATAMONGO-1245
public void findByExampleWithCriteria() {
Person sample = new Person();
sample.lastname = "stark";
Query query = new Query(new Criteria().alike(Example.of(sample)).and("firstname").regex(".*n.*"));
assertThat(operations.find(query, Person.class)).containsExactly(p1);
Query query = new Query(new Criteria().alike(Example.of(sample)).and("firstname").regex("^ary*"));
List<Person> result = operations.find(query, Person.class);
assertThat(result).hasSize(1);
}
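The combined criteria above (`alike(Example.of(sample))` plus a `regex` on `firstname`) map to a single query document. A rough sketch of its plausible shape — illustrative only, not output generated by Spring Data's `QueryMapper`:

```python
# Plausible shape of the mapped query: the Example probe contributes the
# lastname match, the additional Criteria contributes the firstname regex.
query = {
    "lastname": "stark",
    "firstname": {"$regex": "^ary*"},
}
assert query["firstname"]["$regex"] == "^ary*"
```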
@Test // DATAMONGO-1459


@@ -45,6 +45,7 @@ import com.mongodb.client.FindIterable;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @author Anton Barkan
*/
@ExtendWith(MockitoExtension.class)
@MockitoSettings(strictness = Strictness.LENIENT)
@@ -61,6 +62,7 @@ class QueryCursorPreparerUnitTests {
when(factory.getCodecRegistry()).thenReturn(MongoClientSettings.getDefaultCodecRegistry());
when(cursor.batchSize(anyInt())).thenReturn(cursor);
when(cursor.comment(anyString())).thenReturn(cursor);
when(cursor.allowDiskUse(anyBoolean())).thenReturn(cursor);
when(cursor.maxTime(anyLong(), any())).thenReturn(cursor);
when(cursor.hint(any())).thenReturn(cursor);
when(cursor.noCursorTimeout(anyBoolean())).thenReturn(cursor);
@@ -94,27 +96,6 @@ class QueryCursorPreparerUnitTests {
verify(cursor).hint(new Document("age", 1));
}
// TODO
// @Test // DATAMONGO-957
// public void doesNotApplyMetaWhenEmpty() {
//
// Query query = query(where("foo").is("bar"));
// query.setMeta(new Meta());
//
// prepare(query);
//
// verify(cursor, never()).modifiers(any(Document.class));
// }
// @Test // DATAMONGO-957
// public void appliesMaxScanCorrectly() {
//
// Query query = query(where("foo").is("bar")).maxScan(100);
// prepare(query);
//
// verify(cursor).maxScan(100);
// }
@Test // DATAMONGO-957
void appliesMaxTimeCorrectly() {
@@ -133,15 +114,14 @@ class QueryCursorPreparerUnitTests {
verify(cursor).comment("spring data");
}
// TODO
// @Test // DATAMONGO-957
// public void appliesSnapshotCorrectly() {
//
// Query query = query(where("foo").is("bar")).useSnapshot();
// prepare(query);
//
// verify(cursor).snapshot(true);
// }
@Test // DATAMONGO-2659
void appliesAllowDiskUseCorrectly() {
Query query = query(where("foo").is("bar")).allowDiskUse(true);
prepare(query);
verify(cursor).allowDiskUse(true);
}
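The DATAMONGO-2659 change verified above forwards `Query.allowDiskUse(true)` to the driver cursor, which ultimately sets `allowDiskUse` on the server-side `find` command (supported by MongoDB 4.4+). A sketch of the resulting command document; the collection name is a placeholder:

```python
# Sketch of the find command the allowDiskUse(true) option translates to.
# "person" is a hypothetical collection name for illustration.
find_command = {
    "find": "person",
    "filter": {"foo": "bar"},
    "allowDiskUse": True,
}
assert find_command["allowDiskUse"] is True
```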
@Test // DATAMONGO-1480
void appliesNoCursorTimeoutCorrectly() {


@@ -180,6 +180,7 @@ public class ReactiveMongoTemplateUnitTests {
when(findPublisher.limit(anyInt())).thenReturn(findPublisher);
when(findPublisher.collation(any())).thenReturn(findPublisher);
when(findPublisher.first()).thenReturn(findPublisher);
when(findPublisher.allowDiskUse(anyBoolean())).thenReturn(findPublisher);
when(aggregatePublisher.allowDiskUse(anyBoolean())).thenReturn(aggregatePublisher);
when(aggregatePublisher.collation(any())).thenReturn(aggregatePublisher);
when(aggregatePublisher.maxTime(anyLong(), any())).thenReturn(aggregatePublisher);
@@ -231,6 +232,17 @@ public class ReactiveMongoTemplateUnitTests {
verify(findPublisher).batchSize(1234);
}
@Test // DATAMONGO-2659
void executeQueryShouldUseAllowDiskUseWhenPresent() {
when(findPublisher.batchSize(anyInt())).thenReturn(findPublisher);
Query query = new Query().allowDiskUse(true);
template.find(query, Person.class).subscribe();
verify(findPublisher).allowDiskUse(true);
}
@Test // DATAMONGO-1518
void findShouldUseCollationWhenPresent() {


@@ -20,10 +20,10 @@ import static org.springframework.data.mongodb.core.aggregation.AggregationFunct
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import java.util.Arrays;
import java.util.Collections;
import org.bson.Document;
import org.junit.jupiter.api.Test;
import org.springframework.data.mongodb.core.DocumentTestUtils;
import org.springframework.data.mongodb.core.query.Criteria;
@@ -34,15 +34,15 @@ import org.springframework.data.mongodb.core.query.Criteria;
* @author Thomas Darimont
* @author Gustavo de Geus
*/
public class GroupOperationUnitTests {
class GroupOperationUnitTests {
@Test
public void rejectsNullFields() {
void rejectsNullFields() {
assertThatIllegalArgumentException().isThrownBy(() -> new GroupOperation((Fields) null));
}
@Test // DATAMONGO-759
public void groupOperationWithNoGroupIdFieldsShouldGenerateNullAsGroupId() {
void groupOperationWithNoGroupIdFieldsShouldGenerateNullAsGroupId() {
GroupOperation operation = new GroupOperation(Fields.from());
ExposedFields fields = operation.getFields();
@@ -54,7 +54,7 @@ public class GroupOperationUnitTests {
}
@Test // DATAMONGO-759
public void groupOperationWithNoGroupIdFieldsButAdditionalFieldsShouldGenerateNullAsGroupId() {
void groupOperationWithNoGroupIdFieldsButAdditionalFieldsShouldGenerateNullAsGroupId() {
GroupOperation operation = new GroupOperation(Fields.from()).count().as("cnt").last("foo").as("foo");
ExposedFields fields = operation.getFields();
@@ -68,62 +68,62 @@ public class GroupOperationUnitTests {
}
@Test
public void createsGroupOperationWithSingleField() {
void createsGroupOperationWithSingleField() {
GroupOperation operation = new GroupOperation(fields("a"));
Document groupClause = extractDocumentFromGroupOperation(operation);
assertThat(groupClause.get(UNDERSCORE_ID)).isEqualTo((Object) "$a");
assertThat(groupClause).containsEntry(UNDERSCORE_ID, "$a");
}
@Test
public void createsGroupOperationWithMultipleFields() {
void createsGroupOperationWithMultipleFields() {
GroupOperation operation = new GroupOperation(fields("a").and("b", "c"));
Document groupClause = extractDocumentFromGroupOperation(operation);
Document idClause = DocumentTestUtils.getAsDocument(groupClause, UNDERSCORE_ID);
assertThat(idClause.get("a")).isEqualTo((Object) "$a");
assertThat(idClause.get("b")).isEqualTo((Object) "$c");
assertThat(idClause).containsEntry("a", "$a")
.containsEntry("b", "$c");
}
@Test
public void groupFactoryMethodWithMultipleFieldsAndSumOperation() {
void groupFactoryMethodWithMultipleFieldsAndSumOperation() {
GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
.sum("e").as("e");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document eOp = DocumentTestUtils.getAsDocument(groupClause, "e");
assertThat(eOp).isEqualTo((Document) new Document("$sum", "$e"));
assertThat(eOp).isEqualTo(new Document("$sum", "$e"));
}
@Test
public void groupFactoryMethodWithMultipleFieldsAndSumOperationWithAlias() {
void groupFactoryMethodWithMultipleFieldsAndSumOperationWithAlias() {
GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
.sum("e").as("ee");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document eOp = DocumentTestUtils.getAsDocument(groupClause, "ee");
assertThat(eOp).isEqualTo((Document) new Document("$sum", "$e"));
assertThat(eOp).isEqualTo(new Document("$sum", "$e"));
}
@Test
public void groupFactoryMethodWithMultipleFieldsAndCountOperationWithout() {
void groupFactoryMethodWithMultipleFieldsAndCountOperationWithout() {
GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
.count().as("count");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document eOp = DocumentTestUtils.getAsDocument(groupClause, "count");
assertThat(eOp).isEqualTo((Document) new Document("$sum", 1));
assertThat(eOp).isEqualTo(new Document("$sum", 1));
}
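As the assertion above shows, `count()` is rendered as a `{"$sum": 1}` accumulator. A sketch of the full `$group` stage produced for `group(fields("a", "b").and("c")).count().as("count")`; the exact `_id` shape is an assumption based on the multi-field id tests earlier in this file:

```python
# Sketch of the rendered $group stage: a count is implemented as summing 1
# per document. The _id sub-document aliases each grouping field.
group_stage = {
    "$group": {
        "_id": {"a": "$a", "b": "$b", "c": "$c"},
        "count": {"$sum": 1},
    }
}
assert group_stage["$group"]["count"] == {"$sum": 1}
```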
@Test
public void groupFactoryMethodWithMultipleFieldsAndMultipleAggregateOperationsWithAlias() {
void groupFactoryMethodWithMultipleFieldsAndMultipleAggregateOperationsWithAlias() {
GroupOperation groupOperation = Aggregation.group(fields("a", "b").and("c")) //
.sum("e").as("sum") //
@@ -131,58 +131,58 @@ public class GroupOperationUnitTests {
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document sum = DocumentTestUtils.getAsDocument(groupClause, "sum");
assertThat(sum).isEqualTo((Document) new Document("$sum", "$e"));
assertThat(sum).isEqualTo(new Document("$sum", "$e"));
Document min = DocumentTestUtils.getAsDocument(groupClause, "min");
assertThat(min).isEqualTo((Document) new Document("$min", "$e"));
assertThat(min).isEqualTo(new Document("$min", "$e"));
}
@Test
public void groupOperationPushWithValue() {
void groupOperationPushWithValue() {
GroupOperation groupOperation = Aggregation.group("a", "b").push(1).as("x");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document push = DocumentTestUtils.getAsDocument(groupClause, "x");
assertThat(push).isEqualTo((Document) new Document("$push", 1));
assertThat(push).isEqualTo(new Document("$push", 1));
}
@Test
public void groupOperationPushWithReference() {
void groupOperationPushWithReference() {
GroupOperation groupOperation = Aggregation.group("a", "b").push("ref").as("x");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document push = DocumentTestUtils.getAsDocument(groupClause, "x");
assertThat(push).isEqualTo((Document) new Document("$push", "$ref"));
assertThat(push).isEqualTo(new Document("$push", "$ref"));
}
@Test
public void groupOperationAddToSetWithReference() {
void groupOperationAddToSetWithReference() {
GroupOperation groupOperation = Aggregation.group("a", "b").addToSet("ref").as("x");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document push = DocumentTestUtils.getAsDocument(groupClause, "x");
assertThat(push).isEqualTo((Document) new Document("$addToSet", "$ref"));
assertThat(push).isEqualTo(new Document("$addToSet", "$ref"));
}
@Test
public void groupOperationAddToSetWithValue() {
void groupOperationAddToSetWithValue() {
GroupOperation groupOperation = Aggregation.group("a", "b").addToSet(42).as("x");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document push = DocumentTestUtils.getAsDocument(groupClause, "x");
assertThat(push).isEqualTo((Document) new Document("$addToSet", 42));
assertThat(push).isEqualTo(new Document("$addToSet", 42));
}
@Test // DATAMONGO-979
public void shouldRenderSizeExpressionInGroup() {
void shouldRenderSizeExpressionInGroup() {
GroupOperation groupOperation = Aggregation //
.group("username") //
@@ -192,11 +192,13 @@ public class GroupOperationUnitTests {
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document tagsCount = DocumentTestUtils.getAsDocument(groupClause, "tags_count");
assertThat(tagsCount.get("$first")).isEqualTo((Object) new Document("$size", Arrays.asList("$tags")));
assertThat(tagsCount)
.containsEntry("$first", new Document("$size", Collections
.singletonList("$tags")));
}
@Test // DATAMONGO-1327
public void groupOperationStdDevSampWithValue() {
void groupOperationStdDevSampWithValue() {
GroupOperation groupOperation = Aggregation.group("a", "b").stdDevSamp("field").as("fieldStdDevSamp");
@@ -207,7 +209,7 @@ public class GroupOperationUnitTests {
}
@Test // DATAMONGO-1327
public void groupOperationStdDevPopWithValue() {
void groupOperationStdDevPopWithValue() {
GroupOperation groupOperation = Aggregation.group("a", "b").stdDevPop("field").as("fieldStdDevPop");
@@ -218,7 +220,7 @@ public class GroupOperationUnitTests {
}
@Test // DATAMONGO-1784
public void shouldRenderSumWithExpressionInGroup() {
void shouldRenderSumWithExpressionInGroup() {
GroupOperation groupOperation = Aggregation //
.group("username") //
@@ -231,16 +233,30 @@ public class GroupOperationUnitTests {
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document foobar = DocumentTestUtils.getAsDocument(groupClause, "foobar");
assertThat(foobar.get("$sum")).isEqualTo(new Document("$cond",
assertThat(foobar).containsEntry("$sum", new Document("$cond",
new Document("if", new Document("$eq", Arrays.asList("$foo", "bar"))).append("then", 1).append("else", -1)));
}
@Test // DATAMONGO-1784
public void sumWithNullExpressionShouldThrowException() {
void sumWithNullExpressionShouldThrowException() {
assertThatIllegalArgumentException()
.isThrownBy(() -> Aggregation.group("username").sum((AggregationExpression) null));
}
@Test // DATAMONGO-2651
void accumulatorShouldBeAllowedOnGroupOperation() {
GroupOperation groupOperation = Aggregation.group("id")
.accumulate(
ScriptOperators.accumulatorBuilder().init("init").accumulate("acc").merge("merge").finalize("finalize"))
.as("accumulated-value");
Document groupClause = extractDocumentFromGroupOperation(groupOperation);
Document accumulatedValue = DocumentTestUtils.getAsDocument(groupClause, "accumulated-value");
assertThat(accumulatedValue).containsKey("$accumulator");
}
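The DATAMONGO-2651 test above only asserts that an `$accumulator` key is rendered. A sketch of what such a `$group` stage looks like, with field names following MongoDB's documented `$accumulator` operator; the JavaScript function bodies are placeholders, not values produced by the builder:

```python
# Sketch of a $group stage using the $accumulator operator (MongoDB 4.4+).
# init/accumulate/merge/finalize/lang are the operator's documented fields;
# the JS bodies below are illustrative placeholders.
group_stage = {
    "$group": {
        "_id": "$id",
        "accumulated-value": {
            "$accumulator": {
                "init": "function() { return { count: 0 }; }",
                "accumulate": "function(state) { state.count += 1; return state; }",
                "merge": "function(a, b) { return { count: a.count + b.count }; }",
                "finalize": "function(state) { return state.count; }",
                "lang": "js",
            }
        },
    }
}
assert "$accumulator" in group_stage["$group"]["accumulated-value"]
```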
private Document extractDocumentFromGroupOperation(GroupOperation groupOperation) {
Document document = groupOperation.toDocument(Aggregation.DEFAULT_CONTEXT);
Document groupClause = DocumentTestUtils.getAsDocument(document, "$group");


@@ -30,8 +30,6 @@ import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;
import java.util.*;
import javax.persistence.metamodel.EmbeddableType;
import org.assertj.core.api.Assertions;
import org.bson.types.Code;
import org.bson.types.Decimal128;
@@ -2181,15 +2179,6 @@ public class MappingMongoConverterUnitTests {
assertThat(((LinkedHashMap) result.get("cluster")).get("_id")).isEqualTo(100L);
}
@Test // GH-3546
void readFlattensNestedDocumentToStringIfNecessary() {
org.bson.Document source = new org.bson.Document("street", new org.bson.Document("json", "string").append("_id", UUID.randomUUID()));
Address target = converter.read(Address.class, source);
assertThat(target.street).isNotNull();
}
static class GenericType<T> {
T content;
}


@@ -771,19 +771,6 @@ public class QueryMapperUnitTests {
assertThat(document).containsEntry("legacyPoint.y", 20D);
}
@Test // GH-3544
void exampleWithCombinedCriteriaShouldBeMappedCorrectly() {
Foo probe = new Foo();
probe.embedded = new EmbeddedClass();
probe.embedded.id = "conflux";
Query query = query(byExample(probe).and("listOfItems").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Foo.class));
assertThat(document).containsEntry("embedded\\._id", "conflux").containsEntry("my_items", new org.bson.Document("$exists", true));
}
@Test // DATAMONGO-1988
void mapsStringObjectIdRepresentationToObjectIdWhenReferencingIdProperty() {


@@ -66,7 +66,6 @@ import org.springframework.data.mongodb.test.util.MongoClientExtension;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.data.querydsl.ReactiveQuerydslPredicateExecutor;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.test.context.junit.jupiter.SpringExtension;
@@ -77,11 +76,13 @@ import com.mongodb.reactivestreams.client.MongoClient;
*
* @author Mark Paluch
* @author Christoph Strobl
* @author Jens Schauder
*/
@ExtendWith({ MongoClientExtension.class, SpringExtension.class })
public class ReactiveMongoRepositoryTests {
class ReactiveMongoRepositoryTests {
static @Client MongoClient mongoClient;
private static final int PERSON_COUNT = 7;
private static @Client MongoClient mongoClient;
@Autowired ReactiveMongoTemplate template;
@@ -89,8 +90,8 @@ public class ReactiveMongoRepositoryTests {
@Autowired ReactiveContactRepository contactRepository;
@Autowired ReactiveCappedCollectionRepository cappedRepository;
Person dave, oliver, carter, boyd, stefan, leroi, alicia;
QPerson person = QPerson.person;
private Person dave, oliver, carter, boyd, stefan, leroi, alicia;
private QPerson person = QPerson.person;
@Configuration
static class Config extends AbstractReactiveMongoConfiguration {
@@ -140,14 +141,14 @@ public class ReactiveMongoRepositoryTests {
}
@BeforeAll
public static void cleanDb() {
static void cleanDb() {
MongoTestUtils.createOrReplaceCollectionNow("reactive", "person", mongoClient);
MongoTestUtils.createOrReplaceCollectionNow("reactive", "capped", mongoClient);
}
@BeforeEach
public void setUp() throws Exception {
void setUp() throws Exception {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -164,35 +165,35 @@ public class ReactiveMongoRepositoryTests {
alicia = new Person("Alicia", "Keys", 30, Sex.FEMALE);
repository.saveAll(Arrays.asList(oliver, carter, boyd, stefan, leroi, alicia, dave)).as(StepVerifier::create) //
.expectNextCount(7) //
.expectNextCount(PERSON_COUNT) //
.verifyComplete();
}
@Test // DATAMONGO-1444
public void shouldFindByLastName() {
void shouldFindByLastName() {
repository.findByLastname(dave.getLastname()).as(StepVerifier::create).expectNextCount(2).verifyComplete();
}
@Test // DATAMONGO-1444
public void shouldFindOneByLastName() {
void shouldFindOneByLastName() {
repository.findOneByLastname(carter.getLastname()).as(StepVerifier::create).expectNext(carter).verifyComplete();
}
@Test // DATAMONGO-1444
public void shouldFindOneByPublisherOfLastName() {
void shouldFindOneByPublisherOfLastName() {
repository.findByLastname(Mono.just(carter.getLastname())).as(StepVerifier::create).expectNext(carter)
.verifyComplete();
}
@Test // DATAMONGO-1444
public void shouldFindByPublisherOfLastNameIn() {
void shouldFindByPublisherOfLastNameIn() {
repository.findByLastnameIn(Flux.just(carter.getLastname(), dave.getLastname())).as(StepVerifier::create) //
.expectNextCount(3) //
.verifyComplete();
}
@Test // DATAMONGO-1444
public void shouldFindByPublisherOfLastNameInAndAgeGreater() {
void shouldFindByPublisherOfLastNameInAndAgeGreater() {
repository.findByLastnameInAndAgeGreaterThan(Flux.just(carter.getLastname(), dave.getLastname()), 41)
.as(StepVerifier::create) //
@@ -201,7 +202,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void shouldFindUsingPublishersInStringQuery() {
void shouldFindUsingPublishersInStringQuery() {
repository.findStringQuery(Flux.just("Beauford", "Matthews"), Mono.just(41)).as(StepVerifier::create) //
.expectNextCount(2) //
@@ -209,7 +210,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void shouldFindByLastNameAndSort() {
void shouldFindByLastNameAndSort() {
repository.findByLastname("Matthews", Sort.by(ASC, "age")).as(StepVerifier::create) //
.expectNext(oliver, dave) //
@@ -221,7 +222,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void shouldUseTailableCursor() throws Exception {
void shouldUseTailableCursor() throws Exception {
template.dropCollection(Capped.class) //
.then(template.createCollection(Capped.class, //
@@ -246,7 +247,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void shouldUseTailableCursorWithProjection() throws Exception {
void shouldUseTailableCursorWithProjection() throws Exception {
template.dropCollection(Capped.class) //
.then(template.createCollection(Capped.class, //
@@ -277,7 +278,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2080
public void shouldUseTailableCursorWithDtoProjection() {
void shouldUseTailableCursorWithDtoProjection() {
template.dropCollection(Capped.class) //
.then(template.createCollection(Capped.class, //
@@ -290,7 +291,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void findsPeopleByLocationWithinCircle() {
void findsPeopleByLocationWithinCircle() {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
@@ -302,7 +303,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void findsPeopleByPageableLocationWithinCircle() {
void findsPeopleByPageableLocationWithinCircle() {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
@@ -315,7 +316,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void findsPeopleGeoresultByLocationWithinBox() {
void findsPeopleGeoresultByLocationWithinBox() {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
@@ -330,7 +331,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void findsPeoplePageableGeoresultByLocationWithinBox() throws InterruptedException {
void findsPeoplePageableGeoresultByLocationWithinBox() throws InterruptedException {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
@@ -350,7 +351,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1444
public void findsPeopleByLocationWithinBox() throws InterruptedException {
void findsPeopleByLocationWithinBox() throws InterruptedException {
Point point = new Point(-73.99171, 40.738868);
dave.setLocation(point);
@@ -366,23 +367,23 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1865
public void shouldErrorOnFindOneWithNonUniqueResult() {
void shouldErrorOnFindOneWithNonUniqueResult() {
repository.findOneByLastname(dave.getLastname()).as(StepVerifier::create)
.expectError(IncorrectResultSizeDataAccessException.class).verify();
}
@Test // DATAMONGO-1865
public void shouldReturnFirstFindFirstWithMoreResults() {
void shouldReturnFirstFindFirstWithMoreResults() {
repository.findFirstByLastname(dave.getLastname()).as(StepVerifier::create).expectNextCount(1).verifyComplete();
}
@Test // DATAMONGO-2030
public void shouldReturnExistsBy() {
void shouldReturnExistsBy() {
repository.existsByLastname(dave.getLastname()).as(StepVerifier::create).expectNext(true).verifyComplete();
}
@Test // DATAMONGO-1979
public void findAppliesAnnotatedSort() {
void findAppliesAnnotatedSort() {
repository.findByAgeGreaterThan(40).collectList().as(StepVerifier::create).consumeNextWith(result -> {
assertThat(result).containsSequence(carter, boyd, dave, leroi);
@@ -390,7 +391,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1979
public void findWithSortOverwritesAnnotatedSort() {
void findWithSortOverwritesAnnotatedSort() {
repository.findByAgeGreaterThan(40, Sort.by(Direction.ASC, "age")).collectList().as(StepVerifier::create)
.consumeNextWith(result -> {
@@ -399,7 +400,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2181
public void considersRepositoryCollectionName() {
void considersRepositoryCollectionName() {
repository.deleteAll() //
.as(StepVerifier::create) //
@@ -428,7 +429,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2182
public void shouldFindPersonsWhenUsingQueryDslPredicatedOnIdProperty() {
void shouldFindPersonsWhenUsingQueryDslPredicatedOnIdProperty() {
repository.findAll(person.id.in(Arrays.asList(dave.id, carter.id))) //
.collectList() //
@@ -439,24 +440,19 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void findListOfSingleValue() {
void findListOfSingleValue() {
repository.findAllLastnames() //
.collectList() //
.as(StepVerifier::create) //
.assertNext(actual -> {
assertThat(actual) //
.contains("Lessard") //
.contains("Keys") //
.contains("Tinsley") //
.contains("Beauford") //
.contains("Moore") //
.contains("Matthews");
assertThat(actual)
.contains("Lessard", "Keys", "Tinsley", "Beauford", "Moore", "Matthews");
}).verifyComplete();
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithPlaceholderValue() {
void annotatedAggregationWithPlaceholderValue() {
repository.groupByLastnameAnd("firstname") //
.collectList() //
@@ -473,7 +469,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithSort() {
void annotatedAggregationWithSort() {
repository.groupByLastnameAnd("firstname", Sort.by("lastname")) //
.collectList() //
@@ -492,7 +488,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithPageable() {
void annotatedAggregationWithPageable() {
repository.groupByLastnameAnd("firstname", PageRequest.of(1, 2, Sort.by("lastname"))) //
.collectList() //
@@ -507,7 +503,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithSingleSimpleResult() {
void annotatedAggregationWithSingleSimpleResult() {
repository.sumAge() //
.as(StepVerifier::create) //
@@ -516,7 +512,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithAggregationResultAsReturnType() {
void annotatedAggregationWithAggregationResultAsReturnType() {
repository.sumAgeAndReturnRawResult() //
.as(StepVerifier::create) //
@@ -525,7 +521,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithAggregationResultAsReturnTypeAndProjection() {
void annotatedAggregationWithAggregationResultAsReturnTypeAndProjection() {
repository.sumAgeAndReturnSumWrapper() //
.as(StepVerifier::create) //
@@ -534,7 +530,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2374
public void findsWithNativeProjection() {
void findsWithNativeProjection() {
repository.findDocumentById(dave.getId()) //
.as(StepVerifier::create) //
@@ -544,7 +540,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2153
public void annotatedAggregationWithAggregationResultAsMap() {
void annotatedAggregationWithAggregationResultAsMap() {
repository.sumAgeAndReturnSumAsMap() //
.as(StepVerifier::create) //
@@ -554,7 +550,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2403
public void annotatedAggregationExtractingSimpleValueIsEmptyForEmptyDocument() {
void annotatedAggregationExtractingSimpleValueIsEmptyForEmptyDocument() {
Person p = new Person("project-on-lastname", null);
repository.save(p).then().as(StepVerifier::create).verifyComplete();
@@ -565,7 +561,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2403
public void annotatedAggregationSkipsEmptyDocumentsWhenExtractingSimpleValue() {
void annotatedAggregationSkipsEmptyDocumentsWhenExtractingSimpleValue() {
String firstname = "project-on-lastname";
@@ -584,7 +580,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-2406
public void deleteByShouldHandleVoidResultTypeCorrectly() {
void deleteByShouldHandleVoidResultTypeCorrectly() {
repository.deleteByLastname(dave.getLastname()) //
.as(StepVerifier::create) //
@@ -596,7 +592,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1997
public void deleteByShouldAllowDeletedCountAsResult() {
void deleteByShouldAllowDeletedCountAsResult() {
repository.deleteCountByLastname(dave.getLastname()) //
.as(StepVerifier::create) //
@@ -605,7 +601,7 @@ public class ReactiveMongoRepositoryTests {
}
@Test // DATAMONGO-1997
public void deleteByShouldAllowSingleDocumentRemovalCorrectly() {
void deleteByShouldAllowSingleDocumentRemovalCorrectly() {
repository.deleteSinglePersonByLastname(carter.getLastname()) //
.as(StepVerifier::create) //
@@ -617,6 +613,18 @@ public class ReactiveMongoRepositoryTests {
.verifyComplete();
}
@Test // DATAMONGO-2652
void deleteAllById() {
repository.deleteAllById(Arrays.asList(carter.id, dave.id)) //
.as(StepVerifier::create) //
.verifyComplete();
repository.count().as(StepVerifier::create) //
.expectNext(PERSON_COUNT - 2L) //
.verifyComplete();
}
interface ReactivePersonRepository
extends ReactiveMongoRepository<Person, String>, ReactiveQuerydslPredicateExecutor<Person> {
@@ -717,7 +725,7 @@ public class ReactiveMongoRepositoryTests {
String key;
double random;
public Capped(String key, double random) {
Capped(String key, double random) {
this.key = key;
this.random = random;
}


@@ -15,11 +15,11 @@
*/
package org.springframework.data.mongodb.repository.support;
import static java.util.Arrays.*;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.domain.ExampleMatcher.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
@@ -58,12 +58,13 @@ import org.springframework.transaction.support.TransactionTemplate;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author Jens Schauder
*/
@ExtendWith({ MongoTemplateExtension.class, MongoServerCondition.class })
public class SimpleMongoRepositoryTests {
class SimpleMongoRepositoryTests {
@Template(initialEntitySet = Person.class) //
static MongoTestTemplate template;
private static MongoTestTemplate template;
private Person oliver, dave, carter, boyd, stefan, leroi, alicia;
private List<Person> all;
@@ -73,7 +74,7 @@ public class SimpleMongoRepositoryTests {
template);
@BeforeEach
public void setUp() {
void setUp() {
repository.deleteAll();
@@ -85,21 +86,21 @@ public class SimpleMongoRepositoryTests {
leroi = new Person("Leroi", "Moore", 41);
alicia = new Person("Alicia", "Keys", 30, Sex.FEMALE);
all = repository.saveAll(Arrays.asList(oliver, dave, carter, boyd, stefan, leroi, alicia));
all = repository.saveAll(asList(oliver, dave, carter, boyd, stefan, leroi, alicia));
}
@Test
public void findALlFromCustomCollectionName() {
assertThat(repository.findAll()).hasSize(all.size());
void findAllFromCustomCollectionName() {
assertThat(repository.findAll()).hasSameSizeAs(all);
}
@Test
public void findOneFromCustomCollectionName() {
assertThat(repository.findById(dave.getId()).get()).isEqualTo(dave);
void findOneFromCustomCollectionName() {
assertThat(repository.findById(dave.getId())).contains(dave);
}
@Test
public void deleteFromCustomCollectionName() {
void deleteFromCustomCollectionName() {
repository.delete(dave);
@@ -107,7 +108,7 @@ public class SimpleMongoRepositoryTests {
}
@Test
public void deleteByIdFromCustomCollectionName() {
void deleteByIdFromCustomCollectionName() {
repository.deleteById(dave.getId());
@@ -115,7 +116,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1054
public void shouldInsertSingle() {
void shouldInsertSingle() {
String randomId = UUID.randomUUID().toString();
@@ -126,7 +127,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1054
public void shouldInsertMultipleFromList() {
void shouldInsertMultipleFromList() {
String randomId = UUID.randomUUID().toString();
Map<String, Person> idToPerson = new HashMap<String, Person>();
@@ -140,12 +141,12 @@ public class SimpleMongoRepositoryTests {
List<Person> saved = repository.insert(persons);
assertThat(saved).hasSize(persons.size());
assertThat(saved).hasSameSizeAs(persons);
assertThatAllReferencePersonsWereStoredCorrectly(idToPerson, saved);
}
@Test // DATAMONGO-1054
public void shouldInsertMultipleFromSet() {
void shouldInsertMultipleFromSet() {
String randomId = UUID.randomUUID().toString();
Map<String, Person> idToPerson = new HashMap<String, Person>();
@@ -159,12 +160,12 @@ public class SimpleMongoRepositoryTests {
List<Person> saved = repository.insert(persons);
assertThat(saved).hasSize(persons.size());
assertThat(saved).hasSameSizeAs(persons);
assertThatAllReferencePersonsWereStoredCorrectly(idToPerson, saved);
}
@Test // DATAMONGO-1245, DATAMONGO-1464
public void findByExampleShouldLookUpEntriesCorrectly() {
void findByExampleShouldLookUpEntriesCorrectly() {
Person sample = new Person();
sample.setLastname("Matthews");
@@ -177,7 +178,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1464
public void findByExampleMultiplePagesShouldLookUpEntriesCorrectly() {
void findByExampleMultiplePagesShouldLookUpEntriesCorrectly() {
Person sample = new Person();
sample.setLastname("Matthews");
@@ -190,7 +191,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldLookUpEntriesCorrectly() {
void findAllByExampleShouldLookUpEntriesCorrectly() {
Person sample = new Person();
sample.setLastname("Matthews");
@@ -200,7 +201,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldLookUpEntriesCorrectlyWhenUsingNestedObject() {
void findAllByExampleShouldLookUpEntriesCorrectlyWhenUsingNestedObject() {
dave.setAddress(new Address("1600 Pennsylvania Ave NW", "20500", "Washington"));
repository.save(dave);
@@ -216,7 +217,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldLookUpEntriesCorrectlyWhenUsingPartialNestedObject() {
void findAllByExampleShouldLookUpEntriesCorrectlyWhenUsingPartialNestedObject() {
dave.setAddress(new Address("1600 Pennsylvania Ave NW", "20500", "Washington"));
repository.save(dave);
@@ -232,7 +233,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldNotFindEntriesWhenUsingPartialNestedObjectInStrictMode() {
void findAllByExampleShouldNotFindEntriesWhenUsingPartialNestedObjectInStrictMode() {
dave.setAddress(new Address("1600 Pennsylvania Ave NW", "20500", "Washington"));
repository.save(dave);
@@ -247,7 +248,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldLookUpEntriesCorrectlyWhenUsingNestedObjectInStrictMode() {
void findAllByExampleShouldLookUpEntriesCorrectlyWhenUsingNestedObjectInStrictMode() {
dave.setAddress(new Address("1600 Pennsylvania Ave NW", "20500", "Washington"));
repository.save(dave);
@@ -262,7 +263,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldRespectStringMatchMode() {
void findAllByExampleShouldRespectStringMatchMode() {
Person sample = new Person();
sample.setLastname("Mat");
@@ -274,7 +275,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldResolveDbRefCorrectly() {
void findAllByExampleShouldResolveDbRefCorrectly() {
User user = new User();
user.setId("c0nf1ux");
@@ -294,7 +295,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldResolveLegacyCoordinatesCorrectly() {
void findAllByExampleShouldResolveLegacyCoordinatesCorrectly() {
Person megan = new Person("megan", "tarash");
megan.setLocation(new Point(41.85003D, -87.65005D));
@@ -309,7 +310,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldResolveGeoJsonCoordinatesCorrectly() {
void findAllByExampleShouldResolveGeoJsonCoordinatesCorrectly() {
Person megan = new Person("megan", "tarash");
megan.setLocation(new GeoJsonPoint(41.85003D, -87.65005D));
@@ -324,7 +325,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findAllByExampleShouldProcessInheritanceCorrectly() {
void findAllByExampleShouldProcessInheritanceCorrectly() {
PersonExtended reference = new PersonExtended();
reference.setLastname("Matthews");
@@ -340,7 +341,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void findOneByExampleShouldLookUpEntriesCorrectly() {
void findOneByExampleShouldLookUpEntriesCorrectly() {
Person sample = new Person();
sample.setFirstname("Dave");
@@ -351,7 +352,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void existsByExampleShouldLookUpEntriesCorrectly() {
void existsByExampleShouldLookUpEntriesCorrectly() {
Person sample = new Person();
sample.setFirstname("Dave");
@@ -362,7 +363,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1245
public void countByExampleShouldLookUpEntriesCorrectly() {
void countByExampleShouldLookUpEntriesCorrectly() {
Person sample = new Person();
sample.setLastname("Matthews");
@@ -372,7 +373,7 @@ public class SimpleMongoRepositoryTests {
}
@Test // DATAMONGO-1896
public void saveAllUsesEntityCollection() {
void saveAllUsesEntityCollection() {
Person first = new PersonExtended();
first.setEmail("foo@bar.com");
@@ -384,7 +385,7 @@ public class SimpleMongoRepositoryTests {
repository.deleteAll();
repository.saveAll(Arrays.asList(first, second));
repository.saveAll(asList(first, second));
assertThat(repository.findAll()).containsExactlyInAnyOrder(first, second);
}
@@ -392,7 +393,7 @@ public class SimpleMongoRepositoryTests {
@Test // DATAMONGO-2130
@EnableIfReplicaSetAvailable
@EnableIfMongoServerVersion(isGreaterThanEqual = "4.0")
public void countShouldBePossibleInTransaction() {
void countShouldBePossibleInTransaction() {
MongoTransactionManager txmgr = new MongoTransactionManager(template.getMongoDbFactory());
TransactionTemplate tt = new TransactionTemplate(txmgr);
@@ -416,7 +417,7 @@ public class SimpleMongoRepositoryTests {
@Test // DATAMONGO-2130
@EnableIfReplicaSetAvailable
@EnableIfMongoServerVersion(isGreaterThanEqual = "4.0")
public void existsShouldBePossibleInTransaction() {
void existsShouldBePossibleInTransaction() {
MongoTransactionManager txmgr = new MongoTransactionManager(template.getMongoDbFactory());
TransactionTemplate tt = new TransactionTemplate(txmgr);
@@ -435,6 +436,15 @@ public class SimpleMongoRepositoryTests {
assertThat(exists).isTrue();
}
@Test // DATAMONGO-2652
void deleteAllByIds() {
repository.deleteAllById(asList(dave.getId(), carter.getId()));
assertThat(repository.findAll()) //
.hasSize(all.size() - 2).doesNotContain(dave, carter);
}
private void assertThatAllReferencePersonsWereStoredCorrectly(Map<String, Person> references, List<Person> saved) {
for (Person person : saved) {


@@ -19,8 +19,6 @@ import example.first.First
import io.mockk.mockk
import io.mockk.verify
import org.junit.Test
import org.springframework.data.mongodb.core.aggregation.Aggregation
import org.springframework.data.mongodb.core.aggregation.TypedAggregation
import org.springframework.data.mongodb.core.query.NearQuery
import org.springframework.data.mongodb.core.query.Query
import org.springframework.data.mongodb.core.query.Update
@@ -30,7 +28,6 @@ import reactor.core.publisher.Mono
* @author Sebastien Deleuze
* @author Christoph Strobl
* @author Mark Paluch
* @author Wonwoo Lee
*/
class ReactiveMongoOperationsExtensionsTests {
@@ -601,6 +598,7 @@ class ReactiveMongoOperationsExtensionsTests {
verify { operations.findDistinct(query, "field", "collection", First::class.java, String::class.java) }
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, KClass) should call java counterpart`() {
@@ -608,55 +606,6 @@ class ReactiveMongoOperationsExtensionsTests {
val query = mockk<Query>()
operations.findDistinct<String>(query, "field", First::class)
verify {
operations.findDistinct(
query,
"field",
First::class.java,
String::class.java
)
}
}
@Test // #893
fun `aggregate(TypedAggregation, String, KClass) should call java counterpart`() {
val aggregation = mockk<TypedAggregation<String>>()
operations.aggregate<First>(aggregation, "foo")
verify { operations.aggregate(aggregation, "foo", First::class.java) }
}
@Test // #893
fun `aggregate(TypedAggregation, KClass) should call java counterpart`() {
val aggregation = mockk<TypedAggregation<String>>()
operations.aggregate<First>(aggregation)
verify { operations.aggregate(aggregation, First::class.java) }
}
@Test // #893
fun `aggregate(Aggregation, KClass) should call java counterpart`() {
val aggregation = mockk<Aggregation>()
operations.aggregate<String, First>(aggregation)
verify {
operations.aggregate(
aggregation,
String::class.java,
First::class.java
)
}
}
@Test // #893
fun `aggregate(Aggregation, String) should call java counterpart`() {
val aggregation = mockk<Aggregation>()
operations.aggregate<First>(aggregation, "foo")
verify { operations.aggregate(aggregation, "foo", First::class.java) }
verify { operations.findDistinct(query, "field", First::class.java, String::class.java) }
}
}


@@ -1446,7 +1446,6 @@ The geo-near operations return a `GeoResults` wrapper object that encapsulates `
MongoDB supports https://geojson.org/[GeoJSON] and simple (legacy) coordinate pairs for geospatial data. Both formats can be used for storing as well as for querying data. See the https://docs.mongodb.org/manual/core/2dsphere/#geospatial-indexes-store-geojson/[MongoDB manual on GeoJSON support] to learn about requirements and restrictions.
[[mongo.geo-json.domain.classes]]
==== GeoJSON Types in Domain Classes
Usage of https://geojson.org/[GeoJSON] types in domain classes is straightforward. The `org.springframework.data.mongodb.core.geo` package contains types such as `GeoJsonPoint`, `GeoJsonPolygon`, and others. These types extend the existing `org.springframework.data.geo` types. The following example uses a `GeoJsonPoint`:
@@ -1470,7 +1469,6 @@ public class Store {
----
====
[[mongo.geo-json.query-methods]]
==== GeoJSON Types in Repository Query Methods
Using GeoJSON types as repository query parameters forces usage of the `$geometry` operator when creating the query, as the following example shows:
@@ -1531,7 +1529,6 @@ repo.findByLocationWithin( <4>
<4> Use the legacy format `$polygon` operator.
====
[[mongo.geo-json.metrics]]
==== Metrics and Distance calculation
The MongoDB `$geoNear` operator allows usage of a GeoJSON Point or legacy coordinate pairs.
@@ -1703,29 +1700,6 @@ Returning the 3 Documents just like the GeoJSON variant:
<4> Distance from the center point in _Kilometers_ - multiply it by 1000 to match the _Meters_ of the GeoJSON variant.
====
[[mongo.geo-json.jackson-modules]]
==== GeoJSON Jackson Modules
By using the <<core.web>>, Spring Data registers additional Jackson ``Module``s with the `ObjectMapper` for deserializing common Spring Data domain types.
Please refer to the <<core.web.basic.jackson-mappers>> section to learn more about the infrastructure setup of this feature.
The MongoDB module additionally registers ``JsonDeserializer``s for the following GeoJSON types via its `GeoJsonConfiguration`, which exposes the `GeoJsonModule`.
----
org.springframework.data.mongodb.core.geo.GeoJsonPoint
org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint
org.springframework.data.mongodb.core.geo.GeoJsonLineString
org.springframework.data.mongodb.core.geo.GeoJsonMultiLineString
org.springframework.data.mongodb.core.geo.GeoJsonPolygon
org.springframework.data.mongodb.core.geo.GeoJsonMultiPolygon
----
[NOTE]
====
The `GeoJsonModule` only registers ``JsonDeserializer``s!
The next major version (`4.0`) will register both ``JsonDeserializer``s and ``JsonSerializer``s for GeoJSON types by default.
====
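Outside of the web infrastructure, the module can also be registered by hand. The following is a minimal sketch (assuming Jackson is on the classpath) of reading a `GeoJsonPoint` from plain JSON:

[source,java]
----
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new GeoJsonModule()); // registers the JsonDeserializers listed above

GeoJsonPoint point = mapper.readValue(
    "{ \"type\": \"Point\", \"coordinates\": [ 41.85003, -87.65005 ] }",
    GeoJsonPoint.class);
----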
[[mongo.textsearch]]
=== Full-text Queries
@@ -1757,7 +1731,7 @@ A query searching for `coffee cake` can be defined and run as follows:
[source,java]
----
Query query = TextQuery
.queryText(new TextCriteria().matchingAny("coffee", "cake"));
.searching(new TextCriteria().matchingAny("coffee", "cake"));
List<Document> page = template.find(query, Document.class);
----
@@ -1770,7 +1744,7 @@ To sort results by relevance according to the `weights` use `TextQuery.sortBySco
[source,java]
----
Query query = TextQuery
.queryText(new TextCriteria().matchingAny("coffee", "cake"))
.searching(new TextCriteria().matchingAny("coffee", "cake"))
.sortByScore() <1>
.includeScore(); <2>
@@ -1785,8 +1759,8 @@ You can exclude search terms by prefixing the term with `-` or by using `notMatc
[source,java]
----
// search for 'coffee' and not 'cake'
TextQuery.queryText(new TextCriteria().matching("coffee").matching("-cake"));
TextQuery.queryText(new TextCriteria().matching("coffee").notMatching("cake"));
TextQuery.searching(new TextCriteria().matching("coffee").matching("-cake"));
TextQuery.searching(new TextCriteria().matching("coffee").notMatching("cake"));
----
`TextCriteria.matching` takes the provided term as is. Therefore, you can define phrases by putting them between double quotation marks (for example, `\"coffee cake\"`) or by using `TextCriteria.phrase`. The following example shows both ways of defining a phrase:
@@ -1794,8 +1768,8 @@ TextQuery.queryText(new TextCriteria().matching("coffee").notMatching("cake"));
[source,java]
----
// search for phrase 'coffee cake'
TextQuery.queryText(new TextCriteria().matching("\"coffee cake\""));
TextQuery.queryText(new TextCriteria().phrase("coffee cake"));
TextQuery.searching(new TextCriteria().matching("\"coffee cake\""));
TextQuery.searching(new TextCriteria().phrase("coffee cake"));
----
You can set flags for `$caseSensitive` and `$diacriticSensitive` by using the corresponding methods on `TextCriteria`. Note that these two optional flags were introduced in MongoDB 3.2 and are not included in the query unless explicitly set.
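For example, a search that must match the exact casing and diacritics of the given term could be set up as follows (a sketch; both flags require MongoDB 3.2 or later):

[source,java]
----
Query query = TextQuery
  .searching(new TextCriteria()
    .matching("coffee")
    .caseSensitive(true)        // adds $caseSensitive : true
    .diacriticSensitive(true)); // adds $diacriticSensitive : true
----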
@@ -2247,7 +2221,6 @@ With the introduction of <<mongo.transactions>> this was no longer possible beca
So, in version 2.x, `MongoOperations.count()` used collection statistics if no transaction was in progress, and the aggregation variant otherwise.
As of Spring Data MongoDB 3.x, every `count` operation uses the aggregation-based count approach via MongoDB's `countDocuments`, regardless of the existence of filter criteria.
If the application is fine with the limitations of counting based on collection statistics, `MongoOperations.estimatedCount()` offers an alternative.
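The difference can be sketched as follows (`Person` is an assumed domain type):

[source,java]
----
// exact: mapped onto countDocuments, honors the filter and works inside transactions
long exact = template.count(query(where("lastname").is("Matthews")), Person.class);

// approximate: derived from collection statistics, fast but without filter support
long estimate = template.estimatedCount(Person.class);
----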
[NOTE]
====


@@ -97,10 +97,3 @@ Query query = new Query(new Criteria().alike(example));
List<Person> result = template.find(query, Person.class);
----
====
[NOTE]
====
`UntypedExampleMatcher` is likely the right choice for you if you are storing different entities within a single collection or opted out of writing <<mongo-template.type-mapping,type hints>>.
Also, keep in mind that using `@TypeAlias` requires eager initialization of the `MappingContext`. To do so, configure `initialEntitySet` to ensure proper alias resolution for read operations.
====
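A minimal sketch of combining a probe with the `UntypedExampleMatcher` (the `Person` probe is an assumption):

[source,java]
----
Person probe = new Person();
probe.setLastname("Matthews");

// UntypedExampleMatcher drops the type restriction usually derived from the probe type
Example<Person> example = Example.of(probe, UntypedExampleMatcher.matching());

Query query = new Query(new Criteria().alike(example));
List<Person> result = template.find(query, Person.class);
----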


@@ -1,38 +1,6 @@
Spring Data MongoDB Changelog
=============================
Changes in version 3.1.4 (2021-02-17)
-------------------------------------
* #3546 - org.bson.codecs.configuration.CodecConfigurationException: The uuidRepresentation has not been specified, so the UUID cannot be encoded.
* #3544 - alike Criteria can't add andOperator.
* #3540 - Allow access to mongoDatabaseFactory used in ReactiveMongoTemplate.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
* #3517 - GeoJson: Improper Deserialization of Document with a GeoJsonPolygon [DATAMONGO-2664].
* #3508 - Add ReactiveMongoOperations.aggregate(…) Kotlin extension [DATAMONGO-2655].
* #3474 - Search by alike() criteria is broken when type alias information is not available [DATAMONGO-2620].
* #3055 - Improve count() and countDocuments() mapping documentation and/or method availability [DATAMONGO-2192].
Changes in version 3.0.7.RELEASE (2021-02-17)
---------------------------------------------
* DATAMONGO-2671 - DateFromParts millisecondsOf returns "milliseconds" as $dateFromParts function but it should be millisecond.
* DATAMONGO-2665 - Update CI jobs with Docker Login.
* #3544 - alike Criteria can't add andOperator.
* #3534 - Update copyright year to 2021.
* #3529 - Update repository after GitHub issues migration.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
* #3517 - GeoJson: Improper Deserialization of Document with a GeoJsonPolygon [DATAMONGO-2664].
* #3474 - Search by alike() criteria is broken when type alias information is not available [DATAMONGO-2620].
Changes in version 2.2.13.RELEASE (2021-02-17)
----------------------------------------------
* #3544 - alike Criteria can't add andOperator.
* #3534 - Update copyright year to 2021.
* #3529 - Update repository after GitHub issues migration.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
Changes in version 3.2.0-M2 (2021-01-13)
----------------------------------------
* DATAMONGO-2671 - DateFromParts millisecondsOf returns "milliseconds" as $dateFromParts function but it should be millisecond.
@@ -3310,9 +3278,6 @@ Repository


@@ -1,4 +1,4 @@
Spring Data MongoDB 3.1.4 (2020.0.4)
Spring Data MongoDB 3.2 M2 (2021.0.0)
Copyright (c) [2010-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
@@ -22,5 +22,3 @@ conditions of the subcomponent's license, as noted in the LICENSE file.