Compare commits

..

30 Commits

Author SHA1 Message Date
Mark Paluch
c9251b1b29 DATAMONGO-2639 - Release version 3.1 GA (2020.0.0). 2020-10-28 15:46:54 +01:00
Mark Paluch
373f07e176 DATAMONGO-2639 - Prepare 3.1 GA (2020.0.0). 2020-10-28 15:46:31 +01:00
Mark Paluch
f5e2bdc7ef DATAMONGO-2639 - Updated changelog. 2020-10-28 15:46:17 +01:00
Mark Paluch
30e63fffe2 DATAMONGO-2625 - Updated changelog. 2020-10-28 15:03:01 +01:00
Mark Paluch
83136b4e60 DATAMONGO-2624 - Updated changelog. 2020-10-28 12:15:04 +01:00
Mark Paluch
56697545a3 DATAMONGO-2641 - Updated changelog. 2020-10-28 11:32:27 +01:00
Robin Dupret
76eecc443e DATAMONGO-2638 - Fix list item rendering in reference documentation.
Original Pull Request: #885
2020-10-27 13:31:45 +01:00
LiangYong
1f81806809 DATAMONGO-2638 - Fix aggregation input parameter syntax in reference documentation.
Original Pull Request: #881
2020-10-27 13:31:35 +01:00
Greg L. Turnquist
2d348be5b2 DATAMONGO-2629 - Use JDK 15 for next CI jobs. 2020-10-26 13:26:11 -05:00
Christoph Strobl
bbbe369093 DATAMONGO-2642 - Upgrade MongoDB drivers to 4.1.1. 2020-10-26 12:46:07 +01:00
Christoph Strobl
5aa29fc7b8 DATAMONGO-2626 - After release cleanups. 2020-10-14 14:48:47 +02:00
Christoph Strobl
05fc6546ff DATAMONGO-2626 - Prepare next development iteration. 2020-10-14 14:48:45 +02:00
Christoph Strobl
2c6e645a3d DATAMONGO-2626 - Release version 3.1 RC2 (2020.0.0). 2020-10-14 14:28:55 +02:00
Christoph Strobl
20f702512b DATAMONGO-2626 - Prepare 3.1 RC2 (2020.0.0). 2020-10-14 14:27:37 +02:00
Christoph Strobl
ad77f23364 DATAMONGO-2626 - Updated changelog. 2020-10-14 14:27:30 +02:00
Mark Paluch
9af8a73290 DATAMONGO-2616 - Polishing.
Reformat code. Merge if-statements.

Original pull request: #889.
2020-10-07 11:35:47 +02:00
Christoph Strobl
aaa4557887 DATAMONGO-2616 - Short circuit id value assignment in MongoConverter.
Original pull request: #889.
2020-10-07 11:35:40 +02:00
Mark Paluch
217be64a77 DATAMONGO-2623 - Polishing.
Avoid nullable method arguments and add assertions. Introduce build() method to AccumulatorFinalizeBuilder to build Accumulator without specifying a finalize function.

Original pull request: #887.
2020-10-07 09:51:08 +02:00
Christoph Strobl
0ef852a8fc DATAMONGO-2623 - Add support for $function and $accumulator aggregation operators.
Original pull request: #887.
2020-10-07 09:50:51 +02:00
Mark Paluch
26f0a1c7f9 DATAMONGO-2622 - Polishing.
Rename AggregationPipeline.requiresRelaxedChecking() to containsUnionWith() to avoid the concept of field validation leaking into AggregationPipeline.

Refactor AggregationOperation implementations to consistently check their concrete type and fall back to the operator check, allowing consistent checks when using custom AggregationOperations.

Original pull request: #886.
2020-10-06 12:09:18 +02:00
Christoph Strobl
230c32041a DATAMONGO-2622 - Add support for $unionWith aggregation stage.
We now support the $unionWith aggregation stage via the UnionWithOperation that performs a union of two collections by combining pipeline results, potentially containing duplicates, into a single result set that is handed over to the next stage.
In order to remove duplicates it is possible to append a GroupOperation right after UnionWithOperation.
If the UnionWithOperation uses a pipeline to process documents, field names within the pipeline will be treated as is. In order to map domain type property names to actual field names (considering potential org.springframework.data.mongodb.core.mapping.Field annotations) make sure the enclosing aggregation is a TypedAggregation and provide the target type for the $unionWith stage via mapFieldsTo(Class).

Original pull request: #886.
2020-10-06 12:09:12 +02:00
Mark Paluch
4548d07826 DATAMONGO-2596 - Polishing.
Refactor KPropertyPath.toString() into KProperty.asPath() extension function to allow rendering simple properties and property paths into a String property path.

Original pull request: #880.
2020-10-06 10:18:52 +02:00
Yoann de Martino
b879ec8c0f DATAMONGO-2596 - Introduce extension to render KProperty/KPropertyPath as property path.
Original pull request: #880.
2020-10-06 10:18:19 +02:00
Mark Paluch
c0581c4943 DATAMONGO-2294 - Polishing.
Reorganize imports after Delomboking. Use for-loop instead of Stream.forEach(…). Add Javadoc to methods. Add since tags.

Simplify tests.

Original pull request: #761.
2020-10-05 17:00:58 +02:00
owen-q
85022d24f3 DATAMONGO-2294 - Support query projections with collection types.
Query include/exclude now accepts a vararg array of fields to specify multiple fields at once.

Original pull request: #761.
2020-10-05 17:00:37 +02:00
Christoph Strobl
b2927ab419 DATAMONGO-2633 - Fix json parsing of nested arrays in ParameterBindingDocumentCodec.
Original pull request: #888.
2020-10-05 15:34:50 +02:00
Mark Paluch
91c39e2825 DATAMONGO-2630 - Add support for suspend repository query methods returning List<T>. 2020-09-22 15:01:09 +02:00
Greg L. Turnquist
965a34efd3 DATAMONGO-2629 - Only test other versions for local changes on main branch. 2020-09-18 11:08:38 -05:00
Mark Paluch
046cbb52a1 DATAMONGO-2608 - After release cleanups. 2020-09-16 14:05:28 +02:00
Mark Paluch
edfd07a3d0 DATAMONGO-2608 - Prepare next development iteration. 2020-09-16 14:05:24 +02:00
32 changed files with 1435 additions and 81 deletions

Jenkinsfile

@@ -46,16 +46,16 @@ pipeline {
}
}
}
stage('Publish JDK 14 + MongoDB 4.2') {
stage('Publish JDK 15 + MongoDB 4.2') {
when {
changeset "ci/openjdk14-mongodb-4.2/**"
changeset "ci/openjdk15-mongodb-4.2/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-openjdk14-with-mongodb-4.2.0", "ci/openjdk14-mongodb-4.2/")
def image = docker.build("springci/spring-data-openjdk15-with-mongodb-4.2.0", "ci/openjdk15-mongodb-4.2/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
@@ -93,7 +93,7 @@ pipeline {
stage("Test other configurations") {
when {
anyOf {
allOf {
branch 'master'
not { triggeredBy 'UpstreamCause' }
}
@@ -139,10 +139,10 @@ pipeline {
}
}
stage("test: baseline (jdk14)") {
stage("test: baseline (jdk15)") {
agent {
docker {
image 'springci/spring-data-openjdk14-with-mongodb-4.2.0:latest'
image 'springci/spring-data-openjdk15-with-mongodb-4.2.0:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}


@@ -1,4 +1,4 @@
FROM adoptopenjdk/openjdk14:latest
FROM adoptopenjdk/openjdk15:latest
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-RC1</version>
<version>3.1.0</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.4.0-RC1</version>
<version>2.4.0</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.4.0-RC1</springdata.commons>
<mongo>4.1.0</mongo>
<springdata.commons>2.4.0</springdata.commons>
<mongo>4.1.1</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -134,8 +134,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-RC1</version>
<version>3.1.0</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-RC1</version>
<version>3.1.0</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-RC1</version>
<version>3.1.0</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -28,6 +28,7 @@ import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.CountOperation;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
@@ -75,12 +76,17 @@ class AggregationUtil {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
if (!(aggregation instanceof TypedAggregation)) {
return Aggregation.DEFAULT_CONTEXT;
}
return Aggregation.DEFAULT_CONTEXT;
Class<?> inputType = ((TypedAggregation) aggregation).getInputType();
if (aggregation.getPipeline().containsUnionWith()) {
return new RelaxedTypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
/**
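The hunk above changes the context selection so that a typed aggregation whose pipeline contains `$unionWith` gets a relaxed context instead of the strict type-based one. A minimal standalone sketch of that branching, using local stand-in classes rather than the actual Spring Data types:

```java
// Standalone sketch (stand-in names, not the Spring Data classes) of the
// context-selection rule above: an untyped aggregation falls back to the
// default context, while a typed aggregation whose pipeline contains a
// $unionWith stage gets a relaxed context that skips strict field validation.
class ContextSelection {

    interface Context {}

    static final Context DEFAULT = new Context() {};

    static class Typed {
        final Class<?> inputType;
        final boolean containsUnionWith;

        Typed(Class<?> inputType, boolean containsUnionWith) {
            this.inputType = inputType;
            this.containsUnionWith = containsUnionWith;
        }
    }

    static class Strict implements Context {
        final Class<?> inputType;
        Strict(Class<?> inputType) { this.inputType = inputType; }
    }

    static class Relaxed implements Context {
        final Class<?> inputType;
        Relaxed(Class<?> inputType) { this.inputType = inputType; }
    }

    // Mirrors the new branching in AggregationUtil: guard clause first,
    // then pick the context based on the pipeline contents.
    static Context select(Object aggregation) {
        if (!(aggregation instanceof Typed)) {
            return DEFAULT; // untyped aggregations use the unmapped default context
        }
        Typed typed = (Typed) aggregation;
        return typed.containsUnionWith ? new Relaxed(typed.inputType) : new Strict(typed.inputType);
    }
}
```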


@@ -1977,9 +1977,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
AggregationOperationContext context = new TypeBasedAggregationOperationContext(aggregation.getInputType(),
mappingContext, queryMapper);
return aggregate(aggregation, inputCollectionName, outputType, context);
return aggregate(aggregation, inputCollectionName, outputType, null);
}
/* (non-Javadoc)


@@ -29,8 +29,11 @@ import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Support class for {@link AggregationExpression} implementations.
*
* @author Christoph Strobl
* @author Matt Morrissette
* @author Mark Paluch
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
@@ -49,7 +52,6 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return toDocument(this.value, context);
}
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
return new Document(getMongoMethod(), unpack(value, context));
}
@@ -101,17 +103,19 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return value;
}
@SuppressWarnings("unchecked")
protected List<Object> append(Object value, Expand expandList) {
if (this.value instanceof List) {
List<Object> clone = new ArrayList<Object>((List) this.value);
List<Object> clone = new ArrayList<>((List<Object>) this.value);
if (value instanceof Collection && Expand.EXPAND_VALUES.equals(expandList)) {
clone.addAll((Collection<?>) value);
} else {
clone.add(value);
}
return clone;
}
@@ -129,25 +133,72 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return append(value, Expand.EXPAND_VALUES);
}
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Map<String, Object> append(String key, Object value) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
java.util.Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.put(key, value);
return clone;
}
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Map<String, Object> remove(String key) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.remove(key);
return clone;
}
/**
* Append the given key at the position in the underlying {@link LinkedHashMap}.
*
* @param index
* @param key
* @param value
* @return
* @since 3.1
*/
@SuppressWarnings({ "unchecked" })
protected Map<String, Object> appendAt(int index, String key, Object value) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>();
int i = 0;
for (Map.Entry<String, Object> entry : ((Map<String, Object>) this.value).entrySet()) {
if (i == index) {
clone.put(key, value);
}
if (!entry.getKey().equals(key)) {
clone.put(entry.getKey(), entry.getValue());
}
i++;
}
if (i <= index) {
clone.put(key, value);
}
return clone;
}
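The `appendAt` method added above splices a key into a `LinkedHashMap` at a given position, replacing any existing entry with that key, and appends at the end when the index is past the last entry. A self-contained sketch of the same logic, using only `java.util`:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch of the appendAt(index, key, value) semantics above:
// copy entries in order, splice the new key in at the given position,
// skip any pre-existing entry with the same key, and fall back to
// appending at the end when the index is past the last entry.
class AppendAtDemo {

    static Map<String, Object> appendAt(Map<String, Object> source, int index, String key, Object value) {
        Map<String, Object> clone = new LinkedHashMap<>();
        int i = 0;
        for (Map.Entry<String, Object> entry : source.entrySet()) {
            if (i == index) {
                clone.put(key, value); // splice the new entry in at the requested position
            }
            if (!entry.getKey().equals(key)) {
                clone.put(entry.getKey(), entry.getValue());
            }
            i++;
        }
        if (i <= index) {
            clone.put(key, value); // index beyond the last entry: append at the end
        }
        return clone;
    }
}
```

This is what lets `Function.args(...)` and `Function.lang(...)` keep the emitted document in `body`, `args`, `lang` order regardless of the order the builder methods are called in.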
@SuppressWarnings({ "rawtypes" })
protected List<Object> values() {
if (value instanceof List) {
return new ArrayList<Object>((List) value);
}
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<>(Collections.singletonList(value));
}
@@ -177,7 +228,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return (T) ((java.util.Map<String, Object>) this.value).get(key);
return (T) ((Map<String, Object>) this.value).get(key);
}
/**
@@ -187,11 +238,11 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
* @return
*/
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> argumentMap() {
protected Map<String, Object> argumentMap() {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return Collections.unmodifiableMap((java.util.Map) value);
return Collections.unmodifiableMap((java.util.Map<String, Object>) value);
}
/**
@@ -208,7 +259,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return false;
}
return ((java.util.Map<String, Object>) this.value).containsKey(key);
return ((Map<String, Object>) this.value).containsKey(key);
}
protected abstract String getMongoMethod();


@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.Predicate;
import org.bson.Document;
import org.springframework.util.Assert;
@@ -26,6 +27,7 @@ import org.springframework.util.Assert;
* The {@link AggregationPipeline} holds the collection of {@link AggregationOperation aggregation stages}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.0.2
*/
public class AggregationPipeline {
@@ -45,6 +47,8 @@ public class AggregationPipeline {
* @param aggregationOperations must not be {@literal null}.
*/
public AggregationPipeline(List<AggregationOperation> aggregationOperations) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
pipeline = new ArrayList<>(aggregationOperations);
}
@@ -82,30 +86,77 @@ public class AggregationPipeline {
*/
public boolean isOutOrMerge() {
if (pipeline.isEmpty()) {
if (isEmpty()) {
return false;
}
String operator = pipeline.get(pipeline.size() - 1).getOperator();
return operator.equals("$out") || operator.equals("$merge");
AggregationOperation operation = pipeline.get(pipeline.size() - 1);
return isOut(operation) || isMerge(operation);
}
void verify() {
// check $out/$merge is the last operation if it exists
for (AggregationOperation aggregationOperation : pipeline) {
for (AggregationOperation operation : pipeline) {
if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation)) {
if (isOut(operation) && !isLast(operation)) {
throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
}
if (aggregationOperation instanceof MergeOperation && !isLast(aggregationOperation)) {
if (isMerge(operation) && !isLast(operation)) {
throw new IllegalArgumentException("The $merge operator must be the last stage in the pipeline.");
}
}
}
/**
* Return whether this aggregation pipeline defines a {@code $unionWith} stage that may contribute documents from
* other collections. Checking for presence of union stages is useful when attempting to determine the aggregation
* element type for mapping metadata computation.
*
* @return {@literal true} if the aggregation pipeline makes use of {@code $unionWith}.
* @since 3.1
*/
public boolean containsUnionWith() {
return containsOperation(AggregationPipeline::isUnionWith);
}
/**
* @return {@literal true} if the pipeline does not contain any stages.
* @since 3.1
*/
public boolean isEmpty() {
return pipeline.isEmpty();
}
private boolean containsOperation(Predicate<AggregationOperation> predicate) {
if (isEmpty()) {
return false;
}
for (AggregationOperation element : pipeline) {
if (predicate.test(element)) {
return true;
}
}
return false;
}
private boolean isLast(AggregationOperation aggregationOperation) {
return pipeline.indexOf(aggregationOperation) == pipeline.size() - 1;
}
private static boolean isUnionWith(AggregationOperation operator) {
return operator instanceof UnionWithOperation || operator.getOperator().equals("$unionWith");
}
private static boolean isMerge(AggregationOperation operator) {
return operator instanceof MergeOperation || operator.getOperator().equals("$merge");
}
private static boolean isOut(AggregationOperation operator) {
return operator instanceof OutOperation || operator.getOperator().equals("$out");
}
}
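The refactored `AggregationPipeline` above matches a stage either by concrete type (e.g. `UnionWithOperation`) or by its operator name, so custom `AggregationOperation` implementations are covered too, and it enforces that `$out`/`$merge` come last. A reduced standalone sketch of those checks, with a stage modeled as just its operator string:

```java
import java.util.List;
import java.util.function.Predicate;

// Standalone sketch of the pipeline checks introduced above; Operation is a
// local stand-in for AggregationOperation, reduced to its operator name.
class PipelineChecks {

    interface Operation {
        String operator();
    }

    // Mirrors AggregationPipeline.containsOperation(Predicate).
    static boolean contains(List<Operation> pipeline, Predicate<Operation> predicate) {
        for (Operation operation : pipeline) {
            if (predicate.test(operation)) {
                return true;
            }
        }
        return false;
    }

    // Mirrors AggregationPipeline.verify(): $out and $merge must be the last stage.
    static boolean isValid(List<Operation> pipeline) {
        for (int i = 0; i < pipeline.size(); i++) {
            String operator = pipeline.get(i).operator();
            boolean last = (i == pipeline.size() - 1);
            if (("$out".equals(operator) || "$merge".equals(operator)) && !last) {
                return false;
            }
        }
        return true;
    }
}
```

`contains` is the shape behind `containsUnionWith()`, which `AggregationUtil` consults to decide whether field-reference validation must be relaxed.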


@@ -149,4 +149,12 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
}
return null;
}
/**
* @return obtain the root context used to resolve references.
* @since 3.1
*/
AggregationOperationContext getRootContext() {
return rootContext;
}
}


@@ -0,0 +1,587 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator.AccumulatorBuilder;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator.AccumulatorInitBuilder;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
/**
* Gateway to {@literal $function} and {@literal $accumulator} aggregation operations.
* <p />
* Using {@link ScriptOperators} as part of the {@link Aggregation} requires MongoDB server to have
* <a href="https://docs.mongodb.com/master/core/server-side-javascript/">server-side JavaScript</a> execution
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.1
*/
public class ScriptOperators {
/**
* Create a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">$function</a> in JavaScript.
*
* @param body The function definition. Must not be {@literal null}.
* @return new instance of {@link Function}.
*/
public static Function function(String body) {
return Function.function(body);
}
/**
* Create a custom <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">$accumulator
* operator</a> in Javascript.
*
* @return new instance of {@link AccumulatorInitBuilder}.
*/
public static AccumulatorInitBuilder accumulatorBuilder() {
return new AccumulatorBuilder();
}
/**
* {@link Function} defines a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">$function</a> in JavaScript.
* <p />
* <code class="java">
* {
* $function: {
* body: ...,
* args: ...,
* lang: "js"
* }
* }
* </code>
* <p />
* {@link Function} cannot be used as part of {@link org.springframework.data.mongodb.core.schema.MongoJsonSchema
* schema} validation query expression. <br />
* <b>NOTE:</b> <a href="https://docs.mongodb.com/master/core/server-side-javascript/">Server-Side JavaScript</a>
* execution must be
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>
*
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">MongoDB Documentation:
* $function</a>
*/
public static class Function extends AbstractAggregationExpression {
private Function(Map<String, Object> values) {
super(values);
}
/**
* Create a new {@link Function} with the given function definition.
*
* @param body must not be {@literal null}.
* @return new instance of {@link Function}.
*/
public static Function function(String body) {
Assert.notNull(body, "Function body must not be null!");
Map<String, Object> function = new LinkedHashMap<>(2);
function.put(Fields.BODY.toString(), body);
function.put(Fields.ARGS.toString(), Collections.emptyList());
function.put(Fields.LANG.toString(), "js");
return new Function(function);
}
/**
* Set the arguments passed to the function body.
*
* @param args the arguments passed to the function body. Leave empty if the function does not take any arguments.
* @return new instance of {@link Function}.
*/
public Function args(Object... args) {
return args(Arrays.asList(args));
}
/**
* Set the arguments passed to the function body.
*
* @param args the arguments passed to the function body. Leave empty if the function does not take any arguments.
* @return new instance of {@link Function}.
*/
public Function args(List<Object> args) {
Assert.notNull(args, "Args must not be null! Use an empty list instead.");
return new Function(appendAt(1, Fields.ARGS.toString(), args));
}
/**
* The language used in the body.
*
* @param lang must not be {@literal null} nor empty.
* @return new instance of {@link Function}.
*/
public Function lang(String lang) {
Assert.hasText(lang, "Lang must not be null nor empty! The default would be 'js'.");
return new Function(appendAt(2, Fields.LANG.toString(), lang));
}
@Nullable
List<Object> getArgs() {
return get(Fields.ARGS.toString());
}
String getBody() {
return get(Fields.BODY.toString());
}
String getLang() {
return get(Fields.LANG.toString());
}
@Override
protected String getMongoMethod() {
return "$function";
}
enum Fields {
BODY, ARGS, LANG;
@Override
public String toString() {
return name().toLowerCase();
}
}
}
/**
* {@link Accumulator} defines a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">$accumulator operator</a>,
* one that maintains its state (e.g. totals, maximums, minimums, and related data) as documents progress through the
* pipeline, in JavaScript.
* <p />
* <code class="java">
* {
* $accumulator: {
* init: ...,
* intArgs: ...,
* accumulate: ...,
* accumulateArgs: ...,
* merge: ...,
* finalize: ...,
* lang: "js"
* }
* }
* </code>
* <p />
* {@link Accumulator} can be used as part of {@link GroupOperation $group}, {@link BucketOperation $bucket} and
* {@link BucketAutoOperation $bucketAuto} pipeline stages. <br />
* <b>NOTE:</b> <a href="https://docs.mongodb.com/master/core/server-side-javascript/">Server-Side JavaScript</a>
* execution must be
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>
*
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">MongoDB Documentation:
* $accumulator</a>
*/
public static class Accumulator extends AbstractAggregationExpression {
private Accumulator(Map<String, Object> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$accumulator";
}
enum Fields {
ACCUMULATE("accumulate"), //
ACCUMULATE_ARGS("accumulateArgs"), //
FINALIZE("finalize"), //
INIT("init"), //
INIT_ARGS("initArgs"), //
LANG("lang"), //
MERGE("merge"); //
private String field;
Fields(String field) {
this.field = field;
}
@Override
public String toString() {
return field;
}
}
public interface AccumulatorInitBuilder {
/**
* Define the {@code init} {@link Function} for the {@link Accumulator accumulators} initial state. The function
* receives its arguments from the {@link Function#args(Object...) initArgs} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
default AccumulatorAccumulateBuilder init(Function function) {
return init(function.getBody()).initArgs(function.getArgs());
}
/**
* Define the {@code init} function for the {@link Accumulator accumulators} initial state. The function receives
* its arguments from the {@link AccumulatorInitArgsBuilder#initArgs(Object...)} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorInitArgsBuilder init(String function);
/**
* The language used in the {@code $accumulator} code.
*
* @param lang must not be {@literal null}. Default is {@literal js}.
* @return this.
*/
AccumulatorInitBuilder lang(String lang);
}
public interface AccumulatorInitArgsBuilder extends AccumulatorAccumulateBuilder {
/**
* Define the optional {@code initArgs} for the {@link AccumulatorInitBuilder#init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
default AccumulatorAccumulateBuilder initArgs(Object... args) {
return initArgs(Arrays.asList(args));
}
/**
* Define the optional {@code initArgs} for the {@link AccumulatorInitBuilder#init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
AccumulatorAccumulateBuilder initArgs(List<Object> args);
}
public interface AccumulatorAccumulateBuilder {
/**
* Set the {@code accumulate} {@link Function} that updates the state for each document. The functions first
* argument is the current {@code state}, additional arguments can be defined via {@link Function#args(Object...)
* accumulateArgs}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
default AccumulatorMergeBuilder accumulate(Function function) {
return accumulate(function.getBody()).accumulateArgs(function.getArgs());
}
/**
* Set the {@code accumulate} function that updates the state for each document. The functions first argument is
* the current {@code state}, additional arguments can be defined via
* {@link AccumulatorAccumulateArgsBuilder#accumulateArgs(Object...)}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorAccumulateArgsBuilder accumulate(String function);
}
public interface AccumulatorAccumulateArgsBuilder extends AccumulatorMergeBuilder {
/**
* Define additional {@code accumulateArgs} for the {@link AccumulatorAccumulateBuilder#accumulate(String)}
* function.
*
* @param args must not be {@literal null}.
* @return this.
*/
default AccumulatorMergeBuilder accumulateArgs(Object... args) {
return accumulateArgs(Arrays.asList(args));
}
/**
* Define additional {@code accumulateArgs} for the {@link AccumulatorAccumulateBuilder#accumulate(String)}
* function.
*
* @param args must not be {@literal null}.
* @return this.
*/
AccumulatorMergeBuilder accumulateArgs(List<Object> args);
}
public interface AccumulatorMergeBuilder {
/**
* Set the {@code merge} function used to merge two internal states. <br />
* This might be required because the operation is run on a sharded cluster or when the operator exceeds its
* memory limit.
* <p />
* <code class="java">
* function(state1, state2) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorFinalizeBuilder merge(String function);
}
public interface AccumulatorFinalizeBuilder {
/**
* Set the {@code finalize} function used to update the result of the accumulation when all documents have been
* processed.
* <p />
* <code class="java">
* function(state) {
* ...
* return finalState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return new instance of {@link Accumulator}.
*/
Accumulator finalize(String function);
/**
* Build the {@link Accumulator} object without specifying a {@link #finalize(String) finalize function}.
*
* @return new instance of {@link Accumulator}.
*/
Accumulator build();
}
static class AccumulatorBuilder
implements AccumulatorInitBuilder, AccumulatorInitArgsBuilder, AccumulatorAccumulateBuilder,
AccumulatorAccumulateArgsBuilder, AccumulatorMergeBuilder, AccumulatorFinalizeBuilder {
private List<Object> initArgs;
private String initFunction;
private List<Object> accumulateArgs;
private String accumulateFunction;
private String mergeFunction;
private String finalizeFunction;
private String lang = "js";
/**
* Define the {@code init} function for the {@link Accumulator accumulators} initial state. The function receives
* its arguments from the {@link #initArgs(Object...)} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder init(String function) {
this.initFunction = function;
return this;
}
/**
* Define the optional {@code initArgs} for the {@link #init(String)} function.
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder initArgs(List<Object> args) {
Assert.notNull(args, "Args must not be null");
this.initArgs = new ArrayList<>(args);
return this;
}
/**
* Set the {@code accumulate} function that updates the state for each document. The functions first argument is
* the current {@code state}, additional arguments can be defined via {@link #accumulateArgs(Object...)}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder accumulate(String function) {
Assert.notNull(function, "Accumulate function must not be null");
this.accumulateFunction = function;
return this;
}
/**
* Define additional {@code accumulateArgs} for the {@link #accumulate(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder accumulateArgs(List<Object> args) {
Assert.notNull(args, "Args must not be null");
this.accumulateArgs = new ArrayList<>(args);
return this;
}
/**
* Set the {@code merge} function used to merge two internal states. <br />
* This might be required because the operation is run on a sharded cluster or when the operator exceeds its
* memory limit.
* <p />
* <code class="java">
* function(state1, state2) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder merge(String function) {
Assert.notNull(function, "Merge function must not be null");
this.mergeFunction = function;
return this;
}
/**
* The language used in the {@code $accumulator} code.
*
* @param lang must not be {@literal null} nor empty. Default is {@literal js}.
* @return this.
*/
public AccumulatorBuilder lang(String lang) {
Assert.hasText(lang, "Lang must not be null nor empty! The default would be 'js'.");
this.lang = lang;
return this;
}
/**
* Set the {@code finalize} function used to update the result of the accumulation when all documents have been
* processed.
* <p />
* <code class="java">
* function(state) {
* ...
* return finalState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return new instance of {@link Accumulator}.
*/
@Override
public Accumulator finalize(String function) {
Assert.notNull(function, "Finalize function must not be null");
this.finalizeFunction = function;
Map<String, Object> args = createArgumentMap();
args.put(Fields.FINALIZE.toString(), finalizeFunction);
return new Accumulator(args);
}
@Override
public Accumulator build() {
return new Accumulator(createArgumentMap());
}
private Map<String, Object> createArgumentMap() {
Map<String, Object> args = new LinkedHashMap<>();
args.put(Fields.INIT.toString(), initFunction);
if (!CollectionUtils.isEmpty(initArgs)) {
args.put(Fields.INIT_ARGS.toString(), initArgs);
}
args.put(Fields.ACCUMULATE.toString(), accumulateFunction);
if (!CollectionUtils.isEmpty(accumulateArgs)) {
args.put(Fields.ACCUMULATE_ARGS.toString(), accumulateArgs);
}
args.put(Fields.MERGE.toString(), mergeFunction);
args.put(Fields.LANG.toString(), lang);
return args;
}
}
}
}
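The builder above only assembles the `init`, `accumulate`, `merge`, and `finalize` parts into the `$accumulator` document; the server evaluates them per group. As a rough mental model (a plain-Java sketch of the count/sum/average example used in the unit tests below; `AccumulatorSketch` and its method names are illustrative, not part of the API):

```java
import java.util.List;

// Plain-Java sketch of how the server evaluates an $accumulator:
// init() creates the state, accumulate() folds each document value into it,
// merge() combines partial states (e.g. from shards), finalize() derives the result.
public class AccumulatorSketch {

    // state is { count, sum }, mirroring the reference example
    static long[] init() { return new long[] { 0L, 0L }; }

    static long[] accumulate(long[] state, long numCopies) {
        return new long[] { state[0] + 1, state[1] + numCopies };
    }

    static long[] merge(long[] s1, long[] s2) {
        return new long[] { s1[0] + s2[0], s1[1] + s2[1] };
    }

    static double finalizeState(long[] state) {
        return (double) state[1] / state[0];
    }

    // Runs the accumulator over two partitions, as a sharded cluster might.
    static double average(List<Long> partition1, List<Long> partition2) {
        long[] s1 = init();
        for (long copies : partition1) { s1 = accumulate(s1, copies); }
        long[] s2 = init();
        for (long copies : partition2) { s2 = accumulate(s2, copies); }
        return finalizeState(merge(s1, s2));
    }

    public static void main(String[] args) {
        System.out.println(average(List.of(1L, 2L), List.of(3L))); // prints 2.0
    }
}
```

The `merge` step is why the state must be combinable: partial states produced independently have to fold into the same result as a single sequential pass.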


@@ -133,6 +133,19 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
*/
@Override
public AggregationOperationContext continueOnMissingFieldReference() {
return continueOnMissingFieldReference(type);
}
/**
* This toggle allows the {@link AggregationOperationContext context} to use any given field name without checking for
* its existence. Typically, the {@link AggregationOperationContext} fails throughout the pipeline when referencing
* unknown fields, i.e. those that are not present in one of the previous stages or the input source.
*
* @param type The domain type to map fields to.
* @return a more relaxed {@link AggregationOperationContext}.
* @since 3.1
*/
public AggregationOperationContext continueOnMissingFieldReference(Class<?> type) {
return new RelaxedTypeBasedAggregationOperationContext(type, mappingContext, mapper);
}
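The difference between the strict and the relaxed context can be pictured as a field lookup that falls back to the raw name instead of throwing (a simplified stand-in, not the actual implementation):

```java
import java.util.Map;

// Sketch of strict vs. relaxed field reference resolution: the strict variant
// rejects unknown field names, the relaxed one passes the given name through as-is.
public class FieldResolutionSketch {

    static String resolveStrict(Map<String, String> knownFields, String name) {
        String mapped = knownFields.get(name);
        if (mapped == null) {
            throw new IllegalArgumentException("Unknown field: " + name);
        }
        return mapped;
    }

    static String resolveRelaxed(Map<String, String> knownFields, String name) {
        return knownFields.getOrDefault(name, name); // use the name as-is when unmapped
    }

    public static void main(String[] args) {
        Map<String, String> mapping = Map.of("location", "region");
        System.out.println(resolveRelaxed(mapping, "location")); // region
        System.out.println(resolveRelaxed(mapping, "unknown"));  // unknown
    }
}
```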


@@ -0,0 +1,168 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* The <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/">$unionWith</a> aggregation
* stage (available since MongoDB 4.4) performs a union of two collections by combining pipeline results, potentially
* containing duplicates, into a single result set that is handed over to the next stage. <br />
* In order to remove duplicates it is possible to append a {@link GroupOperation} right after
* {@link UnionWithOperation}.
* <p />
* If the {@link UnionWithOperation} uses a
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/#unionwith-pipeline">pipeline</a>
* to process documents, field names within the pipeline will be treated as is. In order to map domain type property
* names to actual field names (considering potential {@link org.springframework.data.mongodb.core.mapping.Field}
* annotations) make sure the enclosing aggregation is a {@link TypedAggregation} and provide the target type for the
* {@code $unionWith} stage via {@link #mapFieldsTo(Class)}.
*
* @author Christoph Strobl
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/">Aggregation Pipeline Stage:
* $unionWith</a>
* @since 3.1
*/
public class UnionWithOperation implements AggregationOperation {
private final String collection;
private final @Nullable AggregationPipeline pipeline;
private final @Nullable Class<?> domainType;
public UnionWithOperation(String collection, @Nullable AggregationPipeline pipeline, @Nullable Class<?> domainType) {
Assert.notNull(collection, "Collection must not be null!");
this.collection = collection;
this.pipeline = pipeline;
this.domainType = domainType;
}
/**
* Set the name of the collection from which pipeline results should be included in the result set.<br />
* The collection name is used to set the {@code coll} parameter of {@code $unionWith}.
*
* @param collection the MongoDB collection name. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public static UnionWithOperation unionWith(String collection) {
return new UnionWithOperation(collection, null, null);
}
/**
* Set the {@link AggregationPipeline} to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param pipeline the {@link AggregationPipeline} that computes the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(AggregationPipeline pipeline) {
return new UnionWithOperation(collection, pipeline, domainType);
}
/**
* Set the aggregation pipeline stages to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param aggregationStages the aggregation pipeline stages that compute the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(List<AggregationOperation> aggregationStages) {
return new UnionWithOperation(collection, new AggregationPipeline(aggregationStages), domainType);
}
/**
* Set the aggregation pipeline stages to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param aggregationStages the aggregation pipeline stages that compute the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(AggregationOperation... aggregationStages) {
return new UnionWithOperation(collection, new AggregationPipeline(Arrays.asList(aggregationStages)), domainType);
}
/**
* Set domain type used for field name mapping of property references used by the {@link AggregationPipeline}.
* Remember to also use a {@link TypedAggregation} in the outer pipeline.<br />
* If not set, field names used within {@link AggregationOperation pipeline operations} are taken as is.
*
* @param domainType the domain type to map field names used in pipeline operations to. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation mapFieldsTo(Class<?> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
return new UnionWithOperation(collection, pipeline, domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
Document $unionWith = new Document("coll", collection);
if (pipeline == null || pipeline.isEmpty()) {
return new Document(getOperator(), $unionWith);
}
$unionWith.append("pipeline", pipeline.toDocuments(computeContext(context)));
return new Document(getOperator(), $unionWith);
}
private AggregationOperationContext computeContext(AggregationOperationContext source) {
if (domainType == null) {
return Aggregation.DEFAULT_CONTEXT;
}
if (source instanceof TypeBasedAggregationOperationContext) {
return ((TypeBasedAggregationOperationContext) source).continueOnMissingFieldReference(domainType);
}
if (source instanceof ExposedFieldsAggregationOperationContext) {
return computeContext(((ExposedFieldsAggregationOperationContext) source).getRootContext());
}
return source;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$unionWith";
}
}
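Semantically, `$unionWith` appends the other collection's documents, each optionally run through the stage's pipeline first, onto the current result set, duplicates included. A plain-Java sketch of that contract (`UnionWithSketch` is illustrative, not the driver API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of the $unionWith contract: append the other collection's documents
// to the current result set, optionally transforming each via a pipeline,
// and keep duplicates (a $group stage afterwards would deduplicate).
public class UnionWithSketch {

    static List<String> unionWith(List<String> current, List<String> other,
            UnaryOperator<String> pipeline) {
        List<String> result = new ArrayList<>(current);
        for (String doc : other) {
            result.add(pipeline.apply(doc));
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> union = unionWith(List.of("a", "b"), List.of("b", "c"), String::toUpperCase);
        System.out.println(union); // [a, b, B, C]
    }
}
```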


@@ -127,8 +127,8 @@ public interface MongoConverter
@Nullable
default Object convertId(@Nullable Object id, Class<?> targetType) {
if (id == null) {
return null;
if (id == null || ClassUtils.isAssignableValue(targetType, id)) {
return id;
}
if (ClassUtils.isAssignable(ObjectId.class, targetType)) {


@@ -20,47 +20,129 @@ import java.util.Map;
import java.util.Map.Entry;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Field projection.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Patryk Wasik
* @author Christoph Strobl
* @author Mark Paluch
* @author Owen Q
*/
public class Field {
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
private final Map<String, Integer> criteria = new HashMap<>();
private final Map<String, Object> slices = new HashMap<>();
private final Map<String, Criteria> elemMatchs = new HashMap<>();
private @Nullable String positionKey;
private int positionValue;
public Field include(String key) {
criteria.put(key, Integer.valueOf(1));
/**
* Include a single {@code field} to be returned by the query operation.
*
* @param field the document field name to be included.
* @return {@code this} field projection instance.
*/
public Field include(String field) {
Assert.notNull(field, "Key must not be null!");
criteria.put(field, 1);
return this;
}
public Field exclude(String key) {
criteria.put(key, Integer.valueOf(0));
/**
* Include one or more {@code fields} to be returned by the query operation.
*
* @param fields the document field names to be included.
* @return {@code this} field projection instance.
* @since 3.1
*/
public Field include(String... fields) {
Assert.notNull(fields, "Keys must not be null!");
for (String key : fields) {
criteria.put(key, 1);
}
return this;
}
public Field slice(String key, int size) {
slices.put(key, Integer.valueOf(size));
/**
* Exclude a single {@code field} from being returned by the query operation.
*
* @param field the document field name to be excluded.
* @return {@code this} field projection instance.
*/
public Field exclude(String field) {
Assert.notNull(field, "Key must not be null!");
criteria.put(field, 0);
return this;
}
public Field slice(String key, int offset, int size) {
slices.put(key, new Integer[] { Integer.valueOf(offset), Integer.valueOf(size) });
/**
* Exclude one or more {@code fields} from being returned by the query operation.
*
* @param fields the document field names to be excluded.
* @return {@code this} field projection instance.
* @since 3.1
*/
public Field exclude(String... fields) {
Assert.notNull(fields, "Keys must not be null!");
for (String key : fields) {
criteria.put(key, 0);
}
return this;
}
public Field elemMatch(String key, Criteria elemMatchCriteria) {
elemMatchs.put(key, elemMatchCriteria);
/**
* Project a {@code $slice} of the array {@code field} using the first {@code size} elements.
*
* @param field the document field name to project, must be an array field.
* @param size the number of elements to include.
* @return {@code this} field projection instance.
*/
public Field slice(String field, int size) {
Assert.notNull(field, "Key must not be null!");
slices.put(field, size);
return this;
}
/**
* Project a {@code $slice} of the array {@code field} using the first {@code size} elements starting at
* {@code offset}.
*
* @param field the document field name to project, must be an array field.
* @param offset the offset to start at.
* @param size the number of elements to include.
* @return {@code this} field projection instance.
*/
public Field slice(String field, int offset, int size) {
Assert.notNull(field, "Key must not be null!");
slices.put(field, new Integer[] { offset, size });
return this;
}
public Field elemMatch(String field, Criteria elemMatchCriteria) {
elemMatchs.put(field, elemMatchCriteria);
return this;
}
@@ -70,7 +152,7 @@ public class Field {
*
* @param field query array field, must not be {@literal null} or empty.
* @param value
* @return
* @return {@code this} field projection instance.
*/
public Field position(String field, int value) {


@@ -94,8 +94,8 @@ public class ReactiveMongoQueryMethod extends MongoQueryMethod {
}
this.method = method;
this.isCollectionQuery = Lazy.of(() -> !(isPageQuery() || isSliceQuery())
&& ReactiveWrappers.isMultiValueType(metadata.getReturnType(method).getType()));
this.isCollectionQuery = Lazy.of(() -> (!(isPageQuery() || isSliceQuery())
&& ReactiveWrappers.isMultiValueType(metadata.getReturnType(method).getType()) || super.isCollectionQuery()));
}
/*


@@ -365,15 +365,7 @@ public class ParameterBindingDocumentCodec implements CollectibleCodec<Document>
reader.readStartArray();
List<Object> list = new ArrayList<>();
while (reader.readBsonType() != BsonType.END_OF_DOCUMENT) {
// Spring Data Customization START
Object listValue = readValue(reader, decoderContext);
if (listValue instanceof Collection) {
list.addAll((Collection) listValue);
break;
}
list.add(listValue);
// Spring Data Customization END
list.add(readValue(reader, decoderContext));
}
reader.readEndArray();
return list;


@@ -20,8 +20,10 @@ import kotlin.reflect.KProperty1
/**
* Abstraction of a property path consisting of [KProperty].
*
* @author Tjeu Kayim
* @author Mark Paluch
* @author Yoann de Martino
* @since 2.2
*/
class KPropertyPath<T, U>(
@@ -45,7 +47,7 @@ internal fun asString(property: KProperty<*>): String {
* Builds [KPropertyPath] from Property References.
* Refer to a field in an embedded/nested document.
*
* For example, referring to the field "book.author":
* For example, referring to the field "author.name":
* ```
* Book::author / Author::name isEqualTo "Herman Melville"
* ```


@@ -0,0 +1,26 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query
import kotlin.reflect.KProperty
/**
* Extension for [KProperty] providing a `toPath` function to render a [KProperty] as a property path.
*
* @author Mark Paluch
* @since 3.1
*/
fun KProperty<*>.toPath(): String = asString(this)


@@ -25,7 +25,7 @@ import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.Value;
import lombok.experimental.Wither;
import lombok.With;
import java.lang.reflect.InvocationTargetException;
import java.math.BigDecimal;
@@ -1839,7 +1839,7 @@ public class MongoTemplateTests {
assertThat(result.property3).isEqualTo(obj.property3);
}
@Test // DATAMONGO-702
@Test // DATAMONGO-702, DATAMONGO-2294
public void queryShouldSupportRealAndAliasedPropertyNamesForFieldExclusions() {
ObjectWith3AliasedFields obj = new ObjectWith3AliasedFields();
@@ -1852,8 +1852,7 @@ public class MongoTemplateTests {
Query query = new Query(Criteria.where("id").is(obj.id));
query.fields() //
.exclude("property2") // real property name
.exclude("prop3"); // aliased property name
.exclude("property2", "prop3"); // real property name, aliased property name
ObjectWith3AliasedFields result = template.findOne(query, ObjectWith3AliasedFields.class);
@@ -3669,6 +3668,27 @@ public class MongoTemplateTests {
assertThat(target.inner.id).isEqualTo(innerId);
}
@Test // DATAMONGO-2294
public void shouldProjectWithCollections() {
MyPerson person = new MyPerson("Walter");
person.address = new Address("TX", "Austin");
template.save(person);
Query queryByChainedInclude = query(where("name").is("Walter"));
queryByChainedInclude.fields().include("id").include("name");
Query queryByCollectionInclude = query(where("name").is("Walter"));
queryByCollectionInclude.fields().include("id", "name");
MyPerson first = template.findAndReplace(queryByChainedInclude, new MyPerson("Walter"));
MyPerson second = template.findAndReplace(queryByCollectionInclude, new MyPerson("Walter"));
assertThat(first).isEqualTo(second);
assertThat(first.address).isNull();
assertThat(second.address).isNull();
}
@Test // DATAMONGO-2451
public void sortOnIdFieldWithExplicitTypeShouldWork() {
@@ -4180,7 +4200,7 @@ public class MongoTemplateTests {
// DATAMONGO-1992
@AllArgsConstructor
@Wither
@With
static class ImmutableVersioned {
final @Id String id;
@@ -4193,7 +4213,7 @@ public class MongoTemplateTests {
}
@Value
@Wither
@With
static class ImmutableAudited {
@Id String id;
@LastModifiedDate Instant modified;


@@ -0,0 +1,96 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.aggregation.ScriptOperators.*;
import java.util.Collections;
import org.bson.Document;
import org.junit.jupiter.api.Test;
/**
* Unit tests for {@link ScriptOperators}.
*
* @author Christoph Strobl
*/
class ScriptOperatorsUnitTests {
private static final String FUNCTION_BODY = "function(name) { return hex_md5(name) == \"15b0a220baa16331e8d80e15367677ad\" }";
private static final Document EMPTY_ARGS_FUNCTION_DOCUMENT = new Document("body", FUNCTION_BODY)
.append("args", Collections.emptyList()).append("lang", "js");
private static final String INIT_FUNCTION = "function() { return { count: 0, sum: 0 } }";
private static final String ACC_FUNCTION = "function(state, numCopies) { return { count: state.count + 1, sum: state.sum + numCopies } }";
private static final String MERGE_FUNCTION = "function(state1, state2) { return { count: state1.count + state2.count, sum: state1.sum + state2.sum } }";
private static final String FINALIZE_FUNCTION = "function(state) { return (state.sum / state.count) }";
private static final Document $ACCUMULATOR = Document.parse("{" + //
" $accumulator:" + //
" {" + //
" init: '" + INIT_FUNCTION + "'," + //
" accumulate: '" + ACC_FUNCTION + "'," + //
" accumulateArgs: [\"$copies\"]," + //
" merge: '" + MERGE_FUNCTION + "'," + //
" finalize: '" + FINALIZE_FUNCTION + "'," + //
" lang: \"js\"" + //
"    }" + //
" }");
@Test // DATAMONGO-2623
void functionWithoutArgsShouldBeRenderedCorrectly() {
assertThat(function(FUNCTION_BODY).toDocument(Aggregation.DEFAULT_CONTEXT))
.isEqualTo($function(EMPTY_ARGS_FUNCTION_DOCUMENT));
}
@Test // DATAMONGO-2623
void functionWithArgsShouldBeRenderedCorrectly() {
assertThat(function(FUNCTION_BODY).args("$name").toDocument(Aggregation.DEFAULT_CONTEXT)).isEqualTo(
$function(new Document(EMPTY_ARGS_FUNCTION_DOCUMENT).append("args", Collections.singletonList("$name"))));
}
@Test // DATAMONGO-2623
void accumulatorWithStringInput() {
Accumulator accumulator = accumulatorBuilder() //
.init(INIT_FUNCTION) //
.accumulate(ACC_FUNCTION).accumulateArgs("$copies") //
.merge(MERGE_FUNCTION) //
.finalize(FINALIZE_FUNCTION);
assertThat(accumulator.toDocument(Aggregation.DEFAULT_CONTEXT)).isEqualTo($ACCUMULATOR);
}
@Test // DATAMONGO-2623
void accumulatorWithFunctionInput() {
Accumulator accumulator = accumulatorBuilder() //
.init(function(INIT_FUNCTION)) //
.accumulate(function(ACC_FUNCTION).args("$copies")) //
.merge(MERGE_FUNCTION) //
.finalize(FINALIZE_FUNCTION);
assertThat(accumulator.toDocument(Aggregation.DEFAULT_CONTEXT)).isEqualTo($ACCUMULATOR);
}
static Document $function(Document source) {
return new Document("$function", source);
}
}


@@ -0,0 +1,131 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.assertj.core.api.Assertions.*;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.junit.jupiter.api.Test;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.NoOpDbRefResolver;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.lang.Nullable;
/**
* Unit tests for {@link UnionWithOperation}.
*
* @author Christoph Strobl
*/
class UnionWithOperationUnitTests {
@Test // DATAMONGO-2622
void throwsErrorWhenNoCollectionPresent() {
assertThatExceptionOfType(IllegalArgumentException.class).isThrownBy(() -> UnionWithOperation.unionWith(null));
}
@Test // DATAMONGO-2622
void rendersJustCollectionCorrectly() {
assertThat(UnionWithOperation.unionWith("coll-1").toPipelineStages(contextFor(Warehouse.class)))
.containsExactly(new Document("$unionWith", new Document("coll", "coll-1")));
}
@Test // DATAMONGO-2622
void rendersPipelineCorrectly() {
assertThat(UnionWithOperation.unionWith("coll-1").mapFieldsTo(Warehouse.class)
.pipeline(Aggregation.project().and("location").as("region")).toPipelineStages(contextFor(Warehouse.class)))
.containsExactly(new Document("$unionWith", new Document("coll", "coll-1").append("pipeline",
Arrays.asList(new Document("$project", new Document("region", 1))))));
}
@Test // DATAMONGO-2622
void rendersPipelineCorrectlyForDifferentDomainType() {
assertThat(UnionWithOperation.unionWith("coll-1").pipeline(Aggregation.project().and("name").as("name"))
.mapFieldsTo(Supplier.class).toPipelineStages(contextFor(Warehouse.class)))
.containsExactly(new Document("$unionWith", new Document("coll", "coll-1").append("pipeline",
Arrays.asList(new Document("$project", new Document("name", "$supplier"))))));
}
@Test // DATAMONGO-2622
void rendersPipelineCorrectlyForUntypedContext() {
assertThat(UnionWithOperation.unionWith("coll-1").pipeline(Aggregation.project("region"))
.toPipelineStages(contextFor(null)))
.containsExactly(new Document("$unionWith", new Document("coll", "coll-1").append("pipeline",
Arrays.asList(new Document("$project", new Document("region", 1))))));
}
@Test // DATAMONGO-2622
void doesNotMapAgainstFieldsFromAPreviousStage() {
TypedAggregation<Supplier> agg = TypedAggregation.newAggregation(Supplier.class,
Aggregation.project().and("name").as("supplier"),
UnionWithOperation.unionWith("coll-1").pipeline(Aggregation.project().and("name").as("name")));
List<Document> pipeline = agg.toPipeline(contextFor(Supplier.class));
assertThat(pipeline).containsExactly(new Document("$project", new Document("supplier", 1)), //
new Document("$unionWith", new Document("coll", "coll-1").append("pipeline",
Arrays.asList(new Document("$project", new Document("name", 1))))));
}
@Test // DATAMONGO-2622
void mapAgainstUnionWithDomainTypeEvenWhenInsideTypedAggregation() {
TypedAggregation<Supplier> agg = TypedAggregation.newAggregation(Supplier.class,
Aggregation.project().and("name").as("supplier"), UnionWithOperation.unionWith("coll-1")
.mapFieldsTo(Warehouse.class).pipeline(Aggregation.project().and("location").as("location")));
List<Document> pipeline = agg.toPipeline(contextFor(Supplier.class));
assertThat(pipeline).containsExactly(new Document("$project", new Document("supplier", 1)), //
new Document("$unionWith", new Document("coll", "coll-1").append("pipeline",
Arrays.asList(new Document("$project", new Document("location", "$region"))))));
}
private static AggregationOperationContext contextFor(@Nullable Class<?> type) {
if (type == null) {
return Aggregation.DEFAULT_CONTEXT;
}
MappingMongoConverter mongoConverter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE,
new MongoMappingContext());
mongoConverter.afterPropertiesSet();
return new TypeBasedAggregationOperationContext(type, mongoConverter.getMappingContext(),
new QueryMapper(mongoConverter));
}
static class Warehouse {
String name;
@Field("region") String location;
String state;
}
static class Supplier {
@Field("supplier") String name;
String state;
}
}


@@ -15,29 +15,40 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.test.util.Assertions.*;
import org.junit.jupiter.api.Test;
/**
* Unit tests for {@link DocumentField}.
* Unit tests for {@link Field}.
*
* @author Oliver Gierke
* @author Owen Q
* @author Mark Paluch
*/
public class FieldUnitTests {
class FieldUnitTests {
@Test
public void sameObjectSetupCreatesEqualField() {
void sameObjectSetupCreatesEqualField() {
Field left = new Field().elemMatch("key", Criteria.where("foo").is("bar"));
Field right = new Field().elemMatch("key", Criteria.where("foo").is("bar"));
assertThat(left).isEqualTo(right);
assertThat(right).isEqualTo(left);
assertThat(left.getFieldsObject()).isEqualTo("{key: { $elemMatch: {foo:\"bar\"}}}");
}
@Test // DATAMONGO-2294
void rendersInclusionCorrectly() {
Field fields = new Field().include("foo", "bar").include("baz");
assertThat(fields.getFieldsObject()).isEqualTo("{foo:1, bar:1, baz:1}");
}
@Test
public void differentObjectSetupCreatesEqualField() {
void differentObjectSetupCreatesEqualField() {
Field left = new Field().elemMatch("key", Criteria.where("foo").is("bar"));
Field right = new Field().elemMatch("key", Criteria.where("foo").is("foo"));
@@ -45,4 +56,12 @@ public class FieldUnitTests {
assertThat(left).isNotEqualTo(right);
assertThat(right).isNotEqualTo(left);
}
@Test // DATAMONGO-2294
void rendersExclusionCorrectly() {
Field fields = new Field().exclude("foo", "bar").exclude("baz");
assertThat(fields.getFieldsObject()).isEqualTo("{foo:0, bar:0, baz:0}");
}
}
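The varargs `include`/`exclude` methods exercised above accumulate entries into a single projection document, mapping each included field to `1` and each excluded field to `0`. Sketched as a simplified plain-Java stand-in (not the actual `Field` class):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified stand-in for Query.fields(): include maps a field name to 1,
// exclude maps it to 0, and chained calls accumulate into one projection.
public class ProjectionSketch {

    private final Map<String, Integer> criteria = new LinkedHashMap<>();

    ProjectionSketch include(String... fields) {
        for (String field : fields) { criteria.put(field, 1); }
        return this;
    }

    ProjectionSketch exclude(String... fields) {
        for (String field : fields) { criteria.put(field, 0); }
        return this;
    }

    Map<String, Integer> toDocument() { return criteria; }

    public static void main(String[] args) {
        Map<String, Integer> projection =
                new ProjectionSketch().include("foo", "bar").include("baz").toDocument();
        System.out.println(projection); // {foo=1, bar=1, baz=1}
    }
}
```

This is why the chained form `.include("id").include("name")` and the varargs form `.include("id", "name")` render the same projection.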


@@ -374,6 +374,15 @@ class ParameterBindingJsonReaderUnitTests {
.isEqualTo(Document.parse("{ $and: [{'fieldA': {$in: [/ABC.*/, /CDE.*F/]}}, {'fieldB': {$ne: null}}]}"));
}
@Test // DATAMONGO-2633
void shouldParseNestedArrays() {
Document target = parse("{ 'stores.location' : { $geoWithin: { $centerSphere: [ [ ?0, 48.799029 ] , ?1 ] } } }",
1.948516D, 0.004D);
assertThat(target).isEqualTo(Document
.parse("{ 'stores.location' : { $geoWithin: { $centerSphere: [ [ 1.948516, 48.799029 ] , 0.004 ] } } }"));
}
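The positional `?0`, `?1` placeholders are bound by argument index. As a simplified, string-level sketch of that idea (the actual codec binds typed values while parsing BSON, which is what lets placeholders land inside nested arrays; `ParameterBindingSketch` is illustrative only):

```java
// Simplified sketch of positional parameter binding: replace each ?i token
// with the string form of args[i]. The real ParameterBindingJsonReader binds
// typed values during parsing rather than doing text substitution.
public class ParameterBindingSketch {

    static String bind(String json, Object... args) {
        String bound = json;
        for (int i = args.length - 1; i >= 0; i--) { // highest index first: ?10 before ?1
            bound = bound.replace("?" + i, String.valueOf(args[i]));
        }
        return bound;
    }

    public static void main(String[] args) {
        System.out.println(bind("{ $centerSphere: [ [ ?0, 48.799029 ], ?1 ] }", 1.948516D, 0.004D));
        // { $centerSphere: [ [ 1.948516, 48.799029 ], 0.004 ] }
    }
}
```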
private static Document parse(String json, Object... args) {
ParameterBindingJsonReader reader = new ParameterBindingJsonReader(json, args);


@@ -19,7 +19,11 @@ import org.assertj.core.api.Assertions.assertThat
import org.junit.Test
/**
* Unit tests for [KPropertyPath] and its extensions.
*
* @author Tjeu Kayim
* @author Yoann de Martino
* @author Mark Paluch
*/
class KPropertyPathTests {
@@ -60,6 +64,35 @@ class KPropertyPathTests {
assertThat(property).isEqualTo("entity.book.author.name")
}
@Test
fun `Convert simple KProperty to property path using toPath`() {
class AnotherEntity(val entity: String)
val property = (AnotherEntity::entity).toPath()
assertThat(property).isEqualTo("entity")
}
@Test
fun `Convert nested KProperty to field name using toPath()`() {
val property = (Book::author / Author::name).toPath()
assertThat(property).isEqualTo("author.name")
}
@Test
fun `Convert triple nested KProperty to property path using toPath`() {
class Entity(val book: Book)
class AnotherEntity(val entity: Entity)
val property = (AnotherEntity::entity / Entity::book / Book::author / Author::name).toPath()
assertThat(property).isEqualTo("entity.book.author.name")
}
class Book(val title: String, val author: Author)
class Author(val name: String)
}


@@ -39,6 +39,8 @@ class ReactiveMongoQueryMethodCoroutineUnitTests {
suspend fun findSuspendAllByName(): Flow<Person>
fun findAllByName(): Flow<Person>
suspend fun findSuspendByName(): List<Person>
}
@Test // DATAMONGO-2562
@@ -58,4 +60,13 @@ class ReactiveMongoQueryMethodCoroutineUnitTests {
assertThat(queryMethod.isCollectionQuery).isTrue()
}
@Test // DATAMONGO-2630
internal fun `should consider suspended methods returning List as collection queries`() {
val method = PersonRepository::class.java.getMethod("findSuspendByName", Continuation::class.java)
val queryMethod = ReactiveMongoQueryMethod(method, DefaultRepositoryMetadata(PersonRepository::class.java), projectionFactory, MongoMappingContext())
assertThat(queryMethod.isCollectionQuery).isTrue()
}
}


@@ -7,6 +7,8 @@
* <<mongo.auditing,Reactive auditing>> enabled through `@EnableReactiveMongoAuditing`. `@EnableMongoAuditing` no longer registers `ReactiveAuditingEntityCallback`.
* Reactive SpEL support in `@Query` and `@Aggregation` query methods.
* Aggregation hints via `AggregationOptions.builder().hint(bson).build()`.
* Extension Function `KProperty.toPath()` to render property references into a property path representation.
* Server-side JavaScript aggregation expressions `$function` and `$accumulator` via `ScriptOperators`.
[[new-features.3.0]]
== What's New in Spring Data MongoDB 3.0


@@ -17,10 +17,10 @@ public interface PersonRepository extends CrudReppsitory<Person, String> {
@Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $firstname } } }")
List<PersonAggregate> groupByLastnameAndFirstnames(Sort sort); <2>
@Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $?0 } } }")
@Aggregation("{ $group: { _id : $lastname, names : { $addToSet : ?0 } } }")
List<PersonAggregate> groupByLastnameAnd(String property); <3>
-	@Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $?0 } } }")
+	@Aggregation("{ $group: { _id : $lastname, names : { $addToSet : ?0 } } }")
List<PersonAggregate> groupByLastnameAnd(String property, Pageable page); <4>
@Aggregation("{ $group : { _id : null, total : { $sum : $age } } }")
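The corrected snippets use the positional placeholder `?0` rather than `$?0`; conceptually, the method argument is substituted into the pipeline string by index before the stage is parsed. A minimal illustrative sketch of that substitution (the `bind` helper is hypothetical, not Spring Data's actual binding code):

```java
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderDemo {

	// Hypothetical helper: replaces positional ?0, ?1, ... markers with the
	// corresponding method arguments, by index.
	static String bind(String stage, List<String> args) {
		Matcher m = Pattern.compile("\\?(\\d+)").matcher(stage);
		StringBuilder sb = new StringBuilder();
		while (m.find()) {
			m.appendReplacement(sb, Matcher.quoteReplacement(args.get(Integer.parseInt(m.group(1)))));
		}
		m.appendTail(sb);
		return sb.toString();
	}

	public static void main(String[] args) {
		String stage = "{ $group: { _id : $lastname, names : { $addToSet : ?0 } } }";
		// prints { $group: { _id : $lastname, names : { $addToSet : $firstname } } }
		System.out.println(bind(stage, List.of("$firstname")));
	}
}
```

Passing `"$firstname"` (with the `$` prefix) makes the bound value a field reference inside the `$group` stage, which is why the annotation itself no longer carries the `$` before `?0`.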

View File

@@ -27,7 +27,7 @@ First, you need to set up a running MongoDB server. Refer to the https://docs.mo
To create a Spring project in STS:
. Go to File -> New -> Spring Template Project -> Simple Spring Utility Project, and press Yes when prompted. Then enter a project and a package name, such as `org.spring.mongodb.example`.
-.Add the following to the pom.xml file's `dependencies` element:
+. Add the following to the pom.xml file's `dependencies` element:
+
[source,xml,subs="+attributes"]
----
@@ -2559,6 +2559,7 @@ The MongoDB Aggregation Framework provides the following types of aggregation op
* Lookup Aggregation Operators
* Convert Aggregation Operators
* Object Aggregation Operators
* Script Aggregation Operators
At the time of this writing, we provide support for the following Aggregation Operations in Spring Data MongoDB:
@@ -2606,6 +2607,9 @@ At the time of this writing, we provide support for the following Aggregation Op
| Object Aggregation Operators
| `objectToArray`, `mergeObjects`
| Script Aggregation Operators
| `function`, `accumulator`
|===
* The operation is mapped or added by Spring Data MongoDB.
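The newly listed Script Aggregation Operators wrap MongoDB's server-side JavaScript expressions `$function` and `$accumulator`. A rough sketch of the document shape a `$function` expression takes, hand-built as a plain `Map` — the field names (`body`, `args`, `lang`) come from the MongoDB operator itself, while this helper is only an illustration, not the `ScriptOperators` API:

```java
import java.util.List;
import java.util.Map;

public class ScriptOperatorShape {

	// Builds the raw document form of a $function aggregation expression.
	static Map<String, Object> function(String body, List<String> args) {
		return Map.of("$function", Map.of(
				"body", body,
				"args", args,
				"lang", "js")); // MongoDB currently accepts only "js" here
	}

	public static void main(String[] args) {
		System.out.println(function("function(name) { return name.length; }", List.of("$name")));
	}
}
```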

View File

@@ -1,6 +1,44 @@
Spring Data MongoDB Changelog
=============================
Changes in version 3.1.0 (2020-10-28)
-------------------------------------
* DATAMONGO-2642 - Upgrade to MongoDB Driver 4.1.1.
* DATAMONGO-2639 - Release 3.1 GA (2020.0.0).
* DATAMONGO-2638 - Fix documentation issues.
Changes in version 3.0.5.RELEASE (2020-10-28)
---------------------------------------------
* DATAMONGO-2643 - Adapt to AssertJ API changes.
* DATAMONGO-2638 - Fix documentation issues.
* DATAMONGO-2633 - @Query annotation does not support $centerSphere.
* DATAMONGO-2625 - Release 3.0.5 (Neumann SR5).
Changes in version 2.2.11.RELEASE (2020-10-28)
----------------------------------------------
* DATAMONGO-2638 - Fix documentation issues.
* DATAMONGO-2633 - @Query annotation does not support $centerSphere.
* DATAMONGO-2624 - Release 2.2.11 (Moore SR11).
Changes in version 2.1.21.RELEASE (2020-10-28)
----------------------------------------------
* DATAMONGO-2641 - Release 2.1.21 (Lovelace SR21).
Changes in version 3.1.0-RC2 (2020-10-14)
-----------------------------------------
* DATAMONGO-2633 - @Query annotation does not support $centerSphere.
* DATAMONGO-2630 - Add support for suspend repository query methods returning List<T>.
* DATAMONGO-2626 - Release 3.1 RC2 (2020.0.0).
* DATAMONGO-2623 - Add support for custom Aggregation expressions.
* DATAMONGO-2622 - Add support for $unionWith aggregation.
* DATAMONGO-2596 - Introduce extension to render KProperty/KPropertyPath as property path.
* DATAMONGO-2294 - Support multiple parameters for query field projections.
Changes in version 3.1.0-RC1 (2020-09-16)
-----------------------------------------
* DATAMONGO-2621 - Adapt to changed array assertions in AssertJ.
@@ -3169,6 +3207,11 @@ Repository

View File

@@ -1,4 +1,4 @@
-Spring Data MongoDB 3.1 RC1 (2020.0.0)
+Spring Data MongoDB 3.1 GA (2020.0.0)
Copyright (c) [2010-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
@@ -18,3 +18,5 @@ conditions of the subcomponent's license, as noted in the LICENSE file.