Compare commits


31 Commits
4.0.2 ... 4.0.5

Author SHA1 Message Date
Greg L. Turnquist
53c56f6e20 Release version 4.0.5 (2022.0.5).
See #4336
2023-04-14 10:19:14 -05:00
Greg L. Turnquist
6af9b562f6 Prepare 4.0.5 (2022.0.5).
See #4336
2023-04-14 10:18:32 -05:00
Mark Paluch
a5f73850af Polishing.
Reformat code. Remove unused fields, modifiers and documentation artifacts.

See #4088
Original pull request: #4341
2023-04-14 08:55:32 +02:00
Christoph Strobl
68b4a09273 Skip output for void methods using declarative Aggregations having $out stage.
We now set the skipOutput flag if an annotated Aggregation defines an $out stage and the method is declared to return no result (void, Mono<Void>, kotlin.Unit).

Closes: #4088
Original pull request: #4341
2023-04-14 08:55:17 +02:00
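
A minimal repository sketch of the behavior described above, loosely mirroring the test fixtures further down in this diff; Person, the method names, and the pipeline strings are illustrative stand-ins.

import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.repository.Repository;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class Person {
	String firstname, lastname;
}

interface AuthorAggregationRepository extends Repository<Person, String> {

	// A result type is declared, so results are not skipped.
	@Aggregation(pipeline = { "{ '$group': { '_id' : '$lastname' } }", "{ '$out' : 'authors' }" })
	Flux<Person> groupAndWriteToAuthors();

	// Declared to return no result: the $out stage triggers the skipOutput option,
	// so the pipeline still writes to 'authors' but nothing is read back.
	@Aggregation(pipeline = { "{ '$group': { '_id' : '$lastname' } }", "{ '$out' : 'authors' }" })
	Mono<Void> groupAndWriteToAuthorsSkippingOutput();
}
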
Christoph Strobl
7290d78053 Fix null value handling in ParameterBindingJsonReader#readStringFromExtendedJson.
This commit makes sure to return null for a null parameter value, avoiding a potential NPE while parsing data.
In doing so, object creation is performed with the intended value, which may or may not lead to a downstream error, e.g. when trying to create an ObjectId with a null hexString.

Closes: #4282
Original pull request: #4334
2023-04-13 11:31:28 +02:00
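
A hedged illustration of where such a null binding can reach the reader; the repository, domain type, and query are hypothetical and only show the call path.

import java.util.Optional;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

class Person {
	String id;
}

interface PersonRepository extends Repository<Person, String> {

	// If 'id' is passed as null, the reader now binds null instead of failing with an NPE
	// while parsing the extended-JSON placeholder; any failure (e.g. an ObjectId created
	// from a null hexString) then surfaces downstream with the intended value in place.
	@Query("{ '_id' : { '$oid' : ?0 } }")
	Optional<Person> findByRawObjectId(String id);
}
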
Mark Paluch
4acbb1282f Polishing.
Tweak naming.

Original pull request: #4352
See #4351
2023-04-13 11:12:17 +02:00
Christoph Strobl
b89c4cd231 Fix AOT processing for lazy-loading Jdk proxies.
This commit uses the ProxyFactory to retrieve the proxied interfaces, capturing the exact interface order required when the proxy is finally loaded at runtime.

Original pull request: #4352
Closes #4351
2023-04-13 11:12:16 +02:00
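
A sketch of the idea, mirroring the prepareFactory method introduced in the diff below; List.class stands in for the type of a lazily loaded @DBRef property.

import java.util.Arrays;
import java.util.List;

import org.springframework.aop.framework.ProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;

class ProxyInterfaceOrderSketch {

	static Class<?>[] proxiedInterfacesFor(Class<?> propertyType) {
		// Let the ProxyFactory compute the interface set (and order) it will use at
		// runtime instead of assembling the list by hand during AOT processing.
		ProxyFactory factory = new ProxyFactory();
		for (Class<?> iface : propertyType.getInterfaces()) {
			factory.addInterface(iface);
		}
		factory.addInterface(LazyLoadingProxy.class);
		factory.addInterface(propertyType);
		return factory.getProxiedInterfaces();
	}

	public static void main(String[] args) {
		// Registration order is preserved, e.g. [Collection, LazyLoadingProxy, List].
		System.out.println(Arrays.toString(proxiedInterfacesFor(List.class)));
	}
}
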
Greg L. Turnquist
8d650ca1d2 Test against Java 20 on CI.
See #4350.
2023-04-12 14:59:27 -05:00
Mark Paluch
b9a23baf16 Adapt to Mockito 5.1 changes.
See #4290
2023-04-11 08:34:37 +02:00
Mark Paluch
d1326a45fc Upgrade to Maven Wrapper 3.9.1.
See #4357
2023-04-06 16:17:30 +02:00
Greg L. Turnquist
ca24950014 Update CI properties.
See #4336
2023-03-28 14:01:00 -05:00
Christoph Strobl
f157560d1c Update visibility of ConversionContext.
The ConversionContext should not be package-private because it is used in protected method signatures.

Closes: #4345
2023-03-24 13:41:01 +01:00
Christoph Strobl
4aa12fd24b After release cleanups.
See #4314
2023-03-20 14:26:21 +01:00
Christoph Strobl
c4a99370d8 Prepare next development iteration.
See #4314
2023-03-20 14:26:19 +01:00
Christoph Strobl
77b2b28ab2 Release version 4.0.4 (2022.0.4).
See #4314
2023-03-20 14:22:48 +01:00
Christoph Strobl
040dfaea2c Prepare 4.0.4 (2022.0.4).
See #4314
2023-03-20 14:22:20 +01:00
Mark Paluch
dfa2bb9906 Polishing.
Remove duplicate logging in imperative FindOneCallback.

See #4253
Original pull request: #4259
2023-03-17 09:36:19 +01:00
Raghav2211
cf12c34c22 Remove duplicate log in reactive findOne operation.
Closes #4253
Original pull request: #4259
2023-03-17 09:36:08 +01:00
Mark Paluch
07a224d9ac Polishing.
Simplify field creation considering simplified projection expressions.

See #3917
Original pull request: #4328
2023-03-16 09:54:06 +01:00
Christoph Strobl
aecca2998b Fix field resolution for ExposedFieldsAggregationContext.
This commit fixes an issue where the context is not relaxed and errors on unknown fields when multiple levels of nested contexts are involved.

Closes #3917
Original pull request: #4328
2023-03-16 09:54:06 +01:00
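
A condensed version of the failing scenario, adapted from the unit test added further down; serial_number is deliberately not exposed by the earlier stages.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

class RelaxedContextSketch {

	Aggregation pipeline() {
		// $group and $project expose only label_name; sorting by serial_number previously
		// failed with "Invalid reference" because the nested exposed-fields context did not
		// fall back to the relaxed root context.
		return newAggregation( //
				group("_id", "label_name"), //
				project("label_name").andExclude("_id"), //
				sort(Sort.by(Sort.Direction.DESC, "serial_number")).and(Sort.Direction.DESC, "label_name"));
	}
}
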
Christoph Strobl
190516f297 Fix property value conversion for $in clauses.
This commit fixes an issue where a property value converter is not applied when the query uses an $in clause that compares the value against a collection of potential candidates.

Original pull request: #4324
Closes #4080
2023-03-15 11:26:33 +01:00
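
A sketch of the affected query shape, following the unit test added below, where the text property is backed by a property value converter (the test's ReversingValueConverter).

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Query;

class InClauseConverterSketch {

	Query byText() {
		// The converter is now applied to each $in candidate individually, so mapping this
		// query yields { 'text' : { $in : ['gnirps', 'atad'] } } for the values below.
		return query(where("text").in("spring", "data"));
	}
}
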
Mark Paluch
39e30cd2f0 Polishing.
Extract duplicates into peek method.

See #4312
Original pull request: #4323
2023-03-15 11:13:48 +01:00
Christoph Strobl
e611f2a868 Allow reading already resolved references.
This commit adds the ability to read already fully resolved references between documents (e.g. resulting from an aggregation $lookup).
No proxy will be created for lazy-loading references, and the additional server roundtrip to load the reference by its id is skipped as well.

Closes #4312
Original pull request: #4323
2023-03-15 11:13:36 +01:00
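
A sketch of a mapping this change enables, using the fixture types from the converter tests below; the document shape stands in for what an aggregation $lookup might produce.

import org.bson.Document;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.DBRef;

class ResolvedReferenceSketch {

	static class Sample {
		@Id String id;
		String value;
	}

	static class WithSingleValueDbRef {
		@Id String id;
		@DBRef Sample sample;
	}

	static Document alreadyResolvedSource() {
		// The referenced document arrives fully resolved (e.g. via $lookup); the converter
		// now maps it directly instead of creating a proxy or fetching it again by id.
		Document sample = new Document("_id", "sample-1").append("value", "one");
		return new Document("_id", "id-1").append("sample", sample);
	}
}
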
Mark Paluch
37dc81bde2 After release cleanups.
See #4293
2023-03-03 11:02:48 +01:00
Mark Paluch
f95da91866 Prepare next development iteration.
See #4293
2023-03-03 11:02:46 +01:00
Mark Paluch
8ebc632130 Release version 4.0.3 (2022.0.3).
See #4293
2023-03-03 10:59:33 +01:00
Mark Paluch
95c1483bb1 Prepare 4.0.3 (2022.0.3).
See #4293
2023-03-03 10:59:17 +01:00
Christoph Strobl
2486f162ed Fix regression in findAndReplace when using native MongoDB types as domain value.
This commit fixes a regression that prevented a native org.bson.Document from serving as the source for a findAndReplace operation.

Closes: #4300
Original Pull Request: #4310
2023-03-02 10:02:31 +01:00
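
A condensed sketch of the call shape the regression broke, mirroring the template unit test below; Person is a placeholder entity and 'coll-1' an arbitrary collection name.

import org.bson.Document;

import org.springframework.data.mongodb.core.FindAndReplaceOptions;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;

class FindAndReplaceWithRawDocument {

	static class Person {}

	Person replace(MongoTemplate template) {
		// A plain org.bson.Document serves as the replacement source type again;
		// Person is the result type the previous document gets mapped to.
		return template.findAndReplace(new Query(), new Document("spring", "data"),
				FindAndReplaceOptions.options().upsert(), Document.class, "coll-1", Person.class);
	}
}
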
Mark Paluch
95a05a24e1 Upgrade to Maven Wrapper 3.9.0.
See #4298
2023-02-20 11:59:26 +01:00
Mark Paluch
47f8e0599e After release cleanups.
See #4274
2023-02-17 11:02:27 +01:00
Mark Paluch
15ebf4c37c Prepare next development iteration.
See #4274
2023-02-17 11:02:25 +01:00
38 changed files with 529 additions and 288 deletions

View File

@@ -1,2 +1,2 @@
#Mon Jan 30 10:47:19 CET 2023
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.7/apache-maven-3.8.7-bin.zip
#Thu Apr 06 16:17:30 CEST 2023
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.1/apache-maven-3.9.1-bin.zip

Jenkinsfile vendored
View File

@@ -18,69 +18,7 @@ pipeline {
}
stages {
stage("Docker images") {
parallel {
stage('Publish JDK (Java 17) + MongoDB 4.4') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-4.4/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 5.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk17-mongodb-5.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 6.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-6.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.6.0.version']} ci/openjdk17-mongodb-6.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
}
}
stage("test: baseline (Java 17)") {
stage("test: baseline (main)") {
when {
beforeAgent(true)
anyOf {
@@ -119,7 +57,7 @@ pipeline {
}
parallel {
stage("test: MongoDB 5.0 (Java 17)") {
stage("test: MongoDB 5.0 (main)") {
agent {
label 'data'
}
@@ -141,7 +79,7 @@ pipeline {
}
}
stage("test: MongoDB 6.0 (Java 17)") {
stage("test: MongoDB 6.0 (main)") {
agent {
label 'data'
}
@@ -162,6 +100,28 @@ pipeline {
}
}
}
stage("test: MongoDB 6.0 (next)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-6.0:${p['java.next.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongosh --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
}
}

View File

@@ -1,22 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,24 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 5.0 release signing key
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv B00A0BD1E2C63C11 && \
# Needed when MongoDB creates a 5.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/5.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-5.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,24 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 6.0 release signing key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | apt-key add - && \
# Needed when MongoDB creates a 6.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-6.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,8 +1,10 @@
# Java versions
java.main.tag=17.0.6_10-jdk-focal
java.next.tag=20-jdk-jammy
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.next.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.next.tag}
# Supported versions of MongoDB
docker.mongodb.4.4.version=4.4.18

View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.2</version>
<version>4.0.5</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>3.0.2</version>
<version>3.0.5</version>
</parent>
<modules>
@@ -26,7 +26,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>3.0.2</springdata.commons>
<springdata.commons>3.0.5</springdata.commons>
<mongo>4.8.2</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.2</version>
<version>4.0.5</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.2</version>
<version>4.0.5</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.2</version>
<version>4.0.5</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.aot;
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
@@ -25,7 +26,6 @@ import java.util.Set;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.TypeReference;
import org.springframework.core.ResolvableType;
import org.springframework.core.annotation.AnnotatedElementUtils;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.data.annotation.Reference;
@@ -33,7 +33,6 @@ import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.DocumentReference;
import org.springframework.data.util.TypeUtils;
/**
* @author Christoph Strobl
@@ -66,9 +65,7 @@ public class LazyLoadingProxyAotProcessor {
if (field.getType().isInterface()) {
List<Class<?>> interfaces = new ArrayList<>(
TypeUtils.resolveTypesInSignature(ResolvableType.forField(field, type)));
interfaces.add(0, org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class);
Arrays.asList(LazyLoadingProxyFactory.prepareFactory(field.getType()).getProxiedInterfaces()));
interfaces.add(org.springframework.aop.SpringProxy.class);
interfaces.add(org.springframework.aop.framework.Advised.class);
interfaces.add(org.springframework.core.DecoratingProxy.class);
@@ -77,7 +74,7 @@ public class LazyLoadingProxyAotProcessor {
} else {
Class<?> proxyClass = LazyLoadingProxyFactory.resolveProxyType(field.getType(),
() -> LazyLoadingInterceptor.none());
LazyLoadingInterceptor::none);
// see: spring-projects/spring-framework/issues/29309
generationContext.getRuntimeHints().reflection().registerType(proxyClass,

View File

@@ -283,6 +283,10 @@ class EntityOperations {
* @see EntityProjectionIntrospector#introspect(Class, Class)
*/
public <M, D> EntityProjection<M, D> introspectProjection(Class<M> resultType, Class<D> entityType) {
if (!queryMapper.getMappingContext().hasPersistentEntityFor(entityType)) {
return (EntityProjection) EntityProjection.nonProjecting(resultType);
}
return introspector.introspect(resultType, entityType);
}

View File

@@ -30,7 +30,6 @@ import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -70,16 +69,7 @@ import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.JsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.convert.*;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
@@ -117,16 +107,7 @@ import com.mongodb.ClientSessionOptions;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.AggregateIterable;
import com.mongodb.client.ClientSession;
import com.mongodb.client.DistinctIterable;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MapReduceIterable;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.MongoIterable;
import com.mongodb.client.*;
import com.mongodb.client.model.*;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
@@ -2837,13 +2818,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
FindIterable<Document> iterable = cursorPreparer.initiateFind(collection, col -> col.find(query, Document.class));
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findOne using query: %s fields: %s in db.collection: %s",
serializeToJsonSafely(query), serializeToJsonSafely(fields.orElseGet(Document::new)),
collection.getNamespace() != null ? collection.getNamespace().getFullName() : "n/a"));
}
if (fields.isPresent()) {
iterable = iterable.projection(fields.get());
}

View File

@@ -2631,13 +2631,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public Publisher<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(
String.format("findOne using query: %s fields: %s in db.collection: %s", serializeToJsonSafely(query),
serializeToJsonSafely(fields.orElseGet(Document::new)), collection.getNamespace().getFullName()));
}
FindPublisher<Document> publisher = preparer.initiateFind(collection, col -> col.find(query, Document.class));
if (fields.isPresent()) {

View File

@@ -96,6 +96,15 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
return exposedField;
}
if (rootContext instanceof RelaxedTypeBasedAggregationOperationContext) {
if (field != null) {
return new DirectFieldReference(new ExposedField(field, true));
}
return new DirectFieldReference(new ExposedField(name, true));
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'", name));
}

View File

@@ -72,7 +72,7 @@ public final class LazyLoadingProxyFactory {
/**
* Predict the proxy target type. This will advice the infrastructure to resolve as many pieces as possible in a
* potential AOT scenario without necessarily resolving the entire object.
*
*
* @param propertyType the type to proxy
* @param interceptor the interceptor to be added.
* @return the proxy type.
@@ -90,16 +90,30 @@ public final class LazyLoadingProxyFactory {
.getProxyClass(LazyLoadingProxy.class.getClassLoader());
}
private ProxyFactory prepareProxyFactory(Class<?> propertyType, Supplier<LazyLoadingInterceptor> interceptor) {
/**
* Create the {@link ProxyFactory} for the given type, already adding required additional interfaces.
*
* @param targetType the type to proxy.
* @return the prepared {@link ProxyFactory}.
* @since 4.0.5
*/
public static ProxyFactory prepareFactory(Class<?> targetType) {
ProxyFactory proxyFactory = new ProxyFactory();
for (Class<?> type : propertyType.getInterfaces()) {
for (Class<?> type : targetType.getInterfaces()) {
proxyFactory.addInterface(type);
}
proxyFactory.addInterface(LazyLoadingProxy.class);
proxyFactory.addInterface(propertyType);
proxyFactory.addInterface(targetType);
return proxyFactory;
}
private ProxyFactory prepareProxyFactory(Class<?> propertyType, Supplier<LazyLoadingInterceptor> interceptor) {
ProxyFactory proxyFactory = prepareFactory(propertyType);
proxyFactory.addAdvice(interceptor.get());
return proxyFactory;

View File

@@ -668,9 +668,32 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
if (value instanceof DBRef dbref) {
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
return;
}
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
/*
* The value might be a pre resolved full document (eg. resulting from an aggregation $lookup).
* In this case we try to map that object to the target type without an additional step ($dbref resolution server roundtrip)
* in between.
*/
if (value instanceof Document document) {
if (property.isMap()) {
if (document.isEmpty() || peek(document.values()) instanceof DBRef) {
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, null, callback, handler));
} else {
accessor.setProperty(property, readMap(context, document, property.getTypeInformation()));
}
} else {
accessor.setProperty(property, read(property.getActualType(), document));
}
} else if (value instanceof Collection<?> collection && !collection.isEmpty()
&& peek(collection) instanceof Document) {
accessor.setProperty(property, readCollectionOrArray(context, collection, property.getTypeInformation()));
} else {
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, null, callback, handler));
}
}
@Nullable
@@ -699,8 +722,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
// DATAMONGO-913
if (object instanceof LazyLoadingProxy) {
return ((LazyLoadingProxy) object).toDBRef();
if (object instanceof LazyLoadingProxy proxy) {
return proxy.toDBRef();
}
return createDBRef(object, referringProperty);
@@ -999,11 +1022,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
.getPointer();
}).collect(Collectors.toList());
return writeCollectionInternal(targetCollection, TypeInformation.of(DocumentPointer.class), new ArrayList<>(targetCollection.size()));
return writeCollectionInternal(targetCollection, TypeInformation.of(DocumentPointer.class),
new ArrayList<>(targetCollection.size()));
}
if (property.hasExplicitWriteTarget()) {
return writeCollectionInternal(collection, new FieldTypeInformation<>(property), new ArrayList<>(collection.size()));
return writeCollectionInternal(collection, new FieldTypeInformation<>(property),
new ArrayList<>(collection.size()));
}
return writeCollectionInternal(collection, property.getTypeInformation(), new ArrayList<>(collection.size()));
@@ -1651,7 +1676,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
List<Object> result = bulkReadAndConvertDBRefs(context, Collections.singletonList(dbref), type);
return CollectionUtils.isEmpty(result) ? null : result.iterator().next();
return CollectionUtils.isEmpty(result) ? null : peek(result);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
@@ -1676,10 +1701,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return Collections.emptyList();
}
List<Document> referencedRawDocuments = dbrefs.size() == 1
? Collections.singletonList(readRef(dbrefs.iterator().next()))
List<Document> referencedRawDocuments = dbrefs.size() == 1 ? Collections.singletonList(readRef(peek(dbrefs)))
: bulkReadRefs(dbrefs);
String collectionName = dbrefs.iterator().next().getCollectionName();
String collectionName = peek(dbrefs).getCollectionName();
List<T> targetList = new ArrayList<>(dbrefs.size());
@@ -1824,6 +1848,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return true;
}
private static <T> T peek(Iterable<T> result) {
return result.iterator().next();
}
static Predicate<MongoPersistentProperty> isIdentifier(PersistentEntity<?, ?> entity) {
return entity::isIdProperty;
}
@@ -1902,7 +1930,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public MongoDbPropertyValueProvider withContext(ConversionContext context) {
return context == this.context ? this : new MongoDbPropertyValueProvider(context, accessor, evaluator);
return context == this.context ? this
: new MongoDbPropertyValueProvider(context, accessor, evaluator);
}
}
@@ -2118,7 +2147,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @since 3.4.3
*/
interface ConversionContext {
protected interface ConversionContext {
/**
* Converts a source object into {@link TypeInformation target}.

View File

@@ -30,6 +30,8 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Reference;
import org.springframework.data.convert.PropertyValueConverter;
import org.springframework.data.convert.ValueConversionContext;
import org.springframework.data.domain.Example;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.MappingException;
@@ -435,9 +437,17 @@ public class QueryMapper {
if (documentField.getProperty() != null
&& converter.getCustomConversions().hasValueConverter(documentField.getProperty())) {
return converter.getCustomConversions().getPropertyValueConversions()
.getValueConverter(documentField.getProperty())
.write(value, new MongoConversionContext(documentField.getProperty(), converter));
MongoConversionContext conversionContext = new MongoConversionContext(documentField.getProperty(), converter);
PropertyValueConverter<Object, Object, ValueConversionContext<MongoPersistentProperty>> valueConverter = converter
.getCustomConversions().getPropertyValueConversions().getValueConverter(documentField.getProperty());
/* might be an $in clause with multiple entries */
if (!documentField.getProperty().isCollectionLike() && sourceValue instanceof Collection<?> collection) {
return collection.stream().map(it -> valueConverter.write(it, conversionContext)).collect(Collectors.toList());
}
return valueConverter.write(value, conversionContext);
}
if (documentField.isIdField() && !documentField.isAssociation()) {

View File

@@ -127,6 +127,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* @param accessor for providing invocation arguments. Never {@literal null}.
* @param typeToRead the desired component target type. Can be {@literal null}.
*/
@Nullable
protected Object doExecute(MongoQueryMethod method, ResultProcessor processor, ConvertingParameterAccessor accessor,
@Nullable Class<?> typeToRead) {

View File

@@ -27,6 +27,7 @@ import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Meta;
@@ -109,7 +110,7 @@ abstract class AggregationUtils {
* @param accessor
* @param targetType
*/
static void appendSortIfPresent(List<AggregationOperation> aggregationPipeline, ConvertingParameterAccessor accessor,
static void appendSortIfPresent(AggregationPipeline aggregationPipeline, ConvertingParameterAccessor accessor,
Class<?> targetType) {
if (accessor.getSort().isUnsorted()) {
@@ -134,7 +135,7 @@ abstract class AggregationUtils {
* @param aggregationPipeline
* @param accessor
*/
static void appendLimitAndOffsetIfPresent(List<AggregationOperation> aggregationPipeline,
static void appendLimitAndOffsetIfPresent(AggregationPipeline aggregationPipeline,
ConvertingParameterAccessor accessor) {
appendLimitAndOffsetIfPresent(aggregationPipeline, accessor, LongUnaryOperator.identity(),
IntUnaryOperator.identity());
@@ -150,7 +151,7 @@ abstract class AggregationUtils {
* @param limitOperator
* @since 3.3
*/
static void appendLimitAndOffsetIfPresent(List<AggregationOperation> aggregationPipeline,
static void appendLimitAndOffsetIfPresent(AggregationPipeline aggregationPipeline,
ConvertingParameterAccessor accessor, LongUnaryOperator offsetOperator, IntUnaryOperator limitOperator) {
Pageable pageable = accessor.getPageable();

View File

@@ -38,6 +38,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.support.PageableExecutionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -55,6 +56,7 @@ import com.mongodb.client.result.DeleteResult;
@FunctionalInterface
interface MongoQueryExecution {
@Nullable
Object execute(Query query);
/**
@@ -291,7 +293,6 @@ interface MongoQueryExecution {
final class UpdateExecution implements MongoQueryExecution {
private final ExecutableUpdate<?> updateOps;
private final MongoQueryMethod method;
private Supplier<UpdateDefinition> updateDefinitionSupplier;
private final MongoParameterAccessor accessor;
@@ -299,7 +300,6 @@ interface MongoQueryExecution {
MongoParameterAccessor accessor) {
this.updateOps = updateOps;
this.method = method;
this.updateDefinitionSupplier = updateSupplier;
this.accessor = accessor;
}

View File

@@ -61,7 +61,7 @@ interface ReactiveMongoQueryExecution {
*
* @author Mark Paluch
*/
class GeoNearExecution implements ReactiveMongoQueryExecution {
final class GeoNearExecution implements ReactiveMongoQueryExecution {
private final ReactiveMongoOperations operations;
private final MongoParameterAccessor accessor;
@@ -83,7 +83,7 @@ interface ReactiveMongoQueryExecution {
}
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Flux<GeoResult<Object>> doExecuteQuery(@Nullable Query query, Class<?> type, String collection) {
private Flux<GeoResult<Object>> doExecuteQuery(@Nullable Query query, Class<?> type, String collection) {
Point nearLocation = accessor.getGeoNearLocation();
NearQuery nearQuery = NearQuery.near(nearLocation);
@@ -154,7 +154,6 @@ interface ReactiveMongoQueryExecution {
final class UpdateExecution implements ReactiveMongoQueryExecution {
private final ReactiveUpdate<?> updateOps;
private final MongoQueryMethod method;
private final MongoParameterAccessor accessor;
private Mono<UpdateDefinition> update;
@@ -162,7 +161,6 @@ interface ReactiveMongoQueryExecution {
Mono<UpdateDefinition> update) {
this.updateOps = updateOps;
this.method = method;
this.accessor = accessor;
this.update = update;
}

View File

@@ -26,12 +26,15 @@ import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.util.ReflectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.ExpressionParser;
import org.springframework.util.ClassUtils;
@@ -68,10 +71,6 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
this.evaluationContextProvider = evaluationContextProvider;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#doExecute(org.springframework.data.mongodb.repository.query.ReactiveMongoQueryMethod, org.springframework.data.repository.query.ResultProcessor, org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor, java.lang.Class)
*/
@Override
protected Publisher<Object> doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
ConvertingParameterAccessor accessor, Class<?> typeToRead) {
@@ -81,7 +80,7 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
Class<?> sourceType = method.getDomainClass();
Class<?> targetType = typeToRead;
List<AggregationOperation> pipeline = it;
AggregationPipeline pipeline = new AggregationPipeline(it);
AggregationUtils.appendSortIfPresent(pipeline, accessor, typeToRead);
AggregationUtils.appendLimitAndOffsetIfPresent(pipeline, accessor);
@@ -93,10 +92,13 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
targetType = Document.class;
}
AggregationOptions options = computeOptions(method, accessor);
TypedAggregation<?> aggregation = new TypedAggregation<>(sourceType, pipeline, options);
AggregationOptions options = computeOptions(method, accessor, pipeline);
TypedAggregation<?> aggregation = new TypedAggregation<>(sourceType, pipeline.getOperations(), options);
Flux<?> flux = reactiveMongoOperations.aggregate(aggregation, targetType);
if (ReflectionUtils.isVoid(typeToRead)) {
return flux.then();
}
if (isSimpleReturnType && !isRawReturnType) {
flux = flux.handle((item, sink) -> {
@@ -121,7 +123,8 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
return parseAggregationPipeline(getQueryMethod().getAnnotatedAggregation(), accessor);
}
private AggregationOptions computeOptions(MongoQueryMethod method, ConvertingParameterAccessor accessor) {
private AggregationOptions computeOptions(MongoQueryMethod method, ConvertingParameterAccessor accessor,
AggregationPipeline pipeline) {
AggregationOptions.Builder builder = Aggregation.newAggregationOptions();
@@ -129,49 +132,37 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
expressionParser, evaluationContextProvider);
AggregationUtils.applyMeta(builder, method);
TypeInformation<?> returnType = method.getReturnType();
if (returnType.getComponentType() != null) {
returnType = returnType.getRequiredComponentType();
}
if (ReflectionUtils.isVoid(returnType.getType()) && pipeline.isOutOrMerge()) {
builder.skipOutput();
}
return builder.build();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Mono<Query> createQuery(ConvertingParameterAccessor accessor) {
throw new UnsupportedOperationException("No query support for aggregation");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isExistsQuery()
*/
@Override
protected boolean isExistsQuery() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isDeleteQuery()
*/
@Override
protected boolean isDeleteQuery() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isLimiting()
*/
@Override
protected boolean isLimiting() {
return false;

View File

@@ -21,14 +21,13 @@ import java.util.function.LongUnaryOperator;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.SliceImpl;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -36,7 +35,9 @@ import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.util.ReflectionUtils;
import org.springframework.expression.ExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
@@ -60,8 +61,8 @@ public class StringBasedAggregation extends AbstractMongoQuery {
*
* @param method must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
* @param expressionParser
* @param evaluationContextProvider
* @param expressionParser must not be {@literal null}.
* @param evaluationContextProvider must not be {@literal null}.
*/
public StringBasedAggregation(MongoQueryMethod method, MongoOperations mongoOperations,
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
@@ -79,18 +80,15 @@ public class StringBasedAggregation extends AbstractMongoQuery {
this.evaluationContextProvider = evaluationContextProvider;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#doExecute(org.springframework.data.mongodb.repository.query.MongoQueryMethod, org.springframework.data.repository.query.ResultProcessor, org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor, java.lang.Class)
*/
@Override
@Nullable
protected Object doExecute(MongoQueryMethod method, ResultProcessor resultProcessor,
ConvertingParameterAccessor accessor, Class<?> typeToRead) {
Class<?> sourceType = method.getDomainClass();
Class<?> targetType = typeToRead;
List<AggregationOperation> pipeline = computePipeline(method, accessor);
AggregationPipeline pipeline = computePipeline(method, accessor);
AggregationUtils.appendSortIfPresent(pipeline, accessor, typeToRead);
if (method.isSliceQuery()) {
@@ -111,8 +109,8 @@ public class StringBasedAggregation extends AbstractMongoQuery {
targetType = method.getReturnType().getRequiredActualType().getRequiredComponentType().getType();
}
AggregationOptions options = computeOptions(method, accessor);
TypedAggregation<?> aggregation = new TypedAggregation<>(sourceType, pipeline, options);
AggregationOptions options = computeOptions(method, accessor, pipeline);
TypedAggregation<?> aggregation = new TypedAggregation<>(sourceType, pipeline.getOperations(), options);
if (method.isStreamQuery()) {
@@ -126,6 +124,9 @@ public class StringBasedAggregation extends AbstractMongoQuery {
}
AggregationResults<Object> result = (AggregationResults<Object>) mongoOperations.aggregate(aggregation, targetType);
if (ReflectionUtils.isVoid(typeToRead)) {
return null;
}
if (isRawAggregationResult) {
return result;
@@ -167,11 +168,12 @@ public class StringBasedAggregation extends AbstractMongoQuery {
return MongoSimpleTypes.HOLDER.isSimpleType(targetType);
}
List<AggregationOperation> computePipeline(MongoQueryMethod method, ConvertingParameterAccessor accessor) {
return parseAggregationPipeline(method.getAnnotatedAggregation(), accessor);
AggregationPipeline computePipeline(MongoQueryMethod method, ConvertingParameterAccessor accessor) {
return new AggregationPipeline(parseAggregationPipeline(method.getAnnotatedAggregation(), accessor));
}
private AggregationOptions computeOptions(MongoQueryMethod method, ConvertingParameterAccessor accessor) {
private AggregationOptions computeOptions(MongoQueryMethod method, ConvertingParameterAccessor accessor,
AggregationPipeline pipeline) {
AggregationOptions.Builder builder = Aggregation.newAggregationOptions();
@@ -179,49 +181,33 @@ public class StringBasedAggregation extends AbstractMongoQuery {
expressionParser, evaluationContextProvider);
AggregationUtils.applyMeta(builder, method);
if (ReflectionUtils.isVoid(method.getReturnType().getType()) && pipeline.isOutOrMerge()) {
builder.skipOutput();
}
return builder.build();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
throw new UnsupportedOperationException("No query support for aggregation");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
*/
@Override
protected boolean isCountQuery() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isExistsQuery()
*/
@Override
protected boolean isExistsQuery() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isDeleteQuery()
*/
@Override
protected boolean isDeleteQuery() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isLimiting()
*/
@Override
protected boolean isLimiting() {
return false;

View File

@@ -1425,7 +1425,8 @@ public class ParameterBindingJsonReader extends AbstractBsonReader {
// Spring Data Customization START
if (patternToken.getType() == JsonTokenType.STRING || patternToken.getType() == JsonTokenType.UNQUOTED_STRING) {
return bindableValueFor(patternToken).getValue().toString();
Object value = bindableValueFor(patternToken).getValue();
return value != null ? value.toString() : null;
}
throw new JsonParseException("JSON reader expected a string but found '%s'.", patternToken.getValue());

View File

@@ -0,0 +1,64 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import static org.assertj.core.api.Assertions.*;
import java.util.List;
import org.junit.jupiter.api.Test;
import org.springframework.aot.generate.ClassNameGenerator;
import org.springframework.aot.generate.DefaultGenerationContext;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.generate.InMemoryGeneratedFiles;
import org.springframework.aot.hint.predicate.RuntimeHintsPredicates;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.javapoet.ClassName;
/**
* Unit tests for {@link LazyLoadingProxyAotProcessor}.
*
* @author Christoph Strobl
*/
class LazyLoadingProxyAotProcessorUnitTests {
@Test // GH-4351
void registersProxyForLazyDbRefCorrectlyWhenTypeIsCollectionInterface() {
GenerationContext ctx = new DefaultGenerationContext(new ClassNameGenerator(ClassName.get(this.getClass())),
new InMemoryGeneratedFiles());
new LazyLoadingProxyAotProcessor().registerLazyLoadingProxyIfNeeded(A.class, ctx);
assertThat(ctx.getRuntimeHints())
.satisfies(RuntimeHintsPredicates.proxies().forInterfaces(java.util.Collection.class,
org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class, java.util.List.class,
org.springframework.aop.SpringProxy.class, org.springframework.aop.framework.Advised.class,
org.springframework.core.DecoratingProxy.class)::test);
}
static class A {
String id;
@DBRef(lazy = true) //
List<B> listRef;
}
static class B {
String id;
}
}

View File

@@ -2526,6 +2526,26 @@ public class MongoTemplateTests {
assertThat(projection.getName()).isEqualTo("Walter");
}
@Test // GH-4300
public void findAndReplaceShouldAllowNativeDomainTypesAndReturnAProjection() {
MyPerson person = new MyPerson("Walter");
person.address = new Address("TX", "Austin");
template.save(person);
MyPerson previous = template.findAndReplace(query(where("name").is("Walter")),
new org.bson.Document("name", "Heisenberg"), FindAndReplaceOptions.options(), org.bson.Document.class,
"myPerson", MyPerson.class);
assertThat(previous).isNotNull();
assertThat(previous.getAddress()).isEqualTo(person.address);
org.bson.Document loaded = template.execute(MyPerson.class, collection -> {
return collection.find(new org.bson.Document("name", "Heisenberg")).first();
});
assertThat(loaded.get("_id")).isEqualTo(new ObjectId(person.id));
}
@Test // DATAMONGO-407
public void updatesShouldRetainTypeInformationEvenForCollections() {

View File

@@ -2312,6 +2312,17 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
.isEqualTo(new com.mongodb.client.model.TimeSeriesOptions("time_stamp").toString());
}
@Test // GH-4300
void findAndReplaceAllowsDocumentSourceType() {
template.findAndReplace(new Query(), new Document("spring", "data"), FindAndReplaceOptions.options().upsert(),
Document.class, "coll-1", Person.class);
verify(db).getCollection(eq("coll-1"), eq(Document.class));
verify(collection).findOneAndReplace((Bson) any(Bson.class), eq(new Document("spring", "data")),
any(FindOneAndReplaceOptions.class));
}
class AutogenerateableId {
@Id BigInteger id;

View File

@@ -729,6 +729,32 @@ public class ReactiveMongoTemplateTests {
}).verifyComplete();
}
@Test // GH-4300
public void findAndReplaceShouldAllowNativeDomainTypesAndReturnAProjection() {
MongoTemplateTests.MyPerson person = new MongoTemplateTests.MyPerson("Walter");
person.address = new Address("TX", "Austin");
template.save(person) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
template
.findAndReplace(query(where("name").is("Walter")), new org.bson.Document("name", "Heisenberg"),
FindAndReplaceOptions.options(), org.bson.Document.class, "myPerson", MongoTemplateTests.MyPerson.class)
.as(StepVerifier::create) //
.consumeNextWith(actual -> {
assertThat(actual.getAddress()).isEqualTo(person.address);
}).verifyComplete();
template.execute(MongoTemplateTests.MyPerson.class, collection -> {
return collection.find(new org.bson.Document("name", "Heisenberg")).first();
}).as(StepVerifier::create) //
.consumeNextWith(loaded -> {
assertThat(loaded.get("_id")).isEqualTo(new ObjectId(person.id));
}).verifyComplete();
}
@Test // DATAMONGO-1827
void findAndReplaceShouldReplaceObjectReturingNew() {

View File

@@ -29,6 +29,7 @@ import java.util.Map;
import org.bson.Document;
import org.junit.jupiter.api.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperationUnitTests.BookWithFieldAnnotation;
@@ -600,7 +601,7 @@ public class AggregationUnitTests {
new RelaxedTypeBasedAggregationOperationContext(BookWithFieldAnnotation.class, mappingContext,
new QueryMapper(new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext))));
assertThat(extractPipelineElement(target, 1, "$project")).isEqualTo(Document.parse(" { \"_id\" : \"$_id\" }"));
assertThat(extractPipelineElement(target, 1, "$project")).isEqualTo(Document.parse(" { \"_id\" : 1 }"));
}
@Test // GH-3898
@@ -630,6 +631,28 @@ public class AggregationUnitTests {
assertThat(extractPipelineElement(target, 0, "$project")).containsKey("name");
}
@Test // GH-3917
void inheritedFieldsExposingContextShouldNotFailOnUnknownFieldReferenceForRelaxedRootContext() {
List<AggregationOperation> aggregationOperations = new ArrayList<>();
GroupOperation groupOperation = Aggregation.group("_id", "label_name");
aggregationOperations.add(groupOperation);
ProjectionOperation projectionOperation = Aggregation.project("label_name").andExclude("_id");
aggregationOperations.add(projectionOperation);
Sort sort = Sort.by(Sort.Direction.DESC, "serial_number");
SortOperation sortOperation = new SortOperation(sort).and(Sort.Direction.DESC, "label_name");
aggregationOperations.add(sortOperation);
MongoMappingContext mappingContext = new MongoMappingContext();
QueryMapper queryMapper = new QueryMapper(new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext));
List<Document> documents = newAggregation(City.class, aggregationOperations).toPipeline(new RelaxedTypeBasedAggregationOperationContext(City.class, mappingContext, queryMapper));
assertThat(documents.get(2)).isEqualTo("{ $sort : { 'serial_number' : -1, 'label_name' : -1 } }");
}
private Document extractPipelineElement(Document agg, int index, String operation) {
List<Document> pipeline = (List<Document>) agg.get("pipeline");

View File

@@ -33,6 +33,7 @@ import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import org.bson.Document;
import org.junit.jupiter.api.BeforeEach;
@@ -110,6 +111,98 @@ public class MappingMongoConverterTests {
verify(dbRefResolver).bulkFetch(any());
}
@Test // GH-4312
void conversionShouldAllowReadingAlreadyResolvedReferences() {
Document sampleSource = new Document("_id", "sample-1").append("value", "one");
Document source = new Document("_id", "id-1").append("sample", sampleSource);
WithSingleValueDbRef read = converter.read(WithSingleValueDbRef.class, source);
assertThat(read.sample).isEqualTo(converter.read(Sample.class, sampleSource));
verifyNoInteractions(dbRefResolver);
}
@Test // GH-4312
void conversionShouldAllowReadingAlreadyResolvedListOfReferences() {
Document sample1Source = new Document("_id", "sample-1").append("value", "one");
Document sample2Source = new Document("_id", "sample-2").append("value", "two");
Document source = new Document("_id", "id-1").append("lazyList", List.of(sample1Source, sample2Source));
WithLazyDBRef read = converter.read(WithLazyDBRef.class, source);
assertThat(read.lazyList).containsExactly(converter.read(Sample.class, sample1Source),
converter.read(Sample.class, sample2Source));
verifyNoInteractions(dbRefResolver);
}
@Test // GH-4312
void conversionShouldAllowReadingAlreadyResolvedMapOfReferences() {
Document sample1Source = new Document("_id", "sample-1").append("value", "one");
Document sample2Source = new Document("_id", "sample-2").append("value", "two");
Document source = new Document("_id", "id-1").append("sampleMap",
new Document("s1", sample1Source).append("s2", sample2Source));
WithMapValueDbRef read = converter.read(WithMapValueDbRef.class, source);
assertThat(read.sampleMap) //
.containsEntry("s1", converter.read(Sample.class, sample1Source)) //
.containsEntry("s2", converter.read(Sample.class, sample2Source));
verifyNoInteractions(dbRefResolver);
}
@Test // GH-4312
void conversionShouldAllowReadingAlreadyResolvedMapOfLazyReferences() {
Document sample1Source = new Document("_id", "sample-1").append("value", "one");
Document sample2Source = new Document("_id", "sample-2").append("value", "two");
Document source = new Document("_id", "id-1").append("sampleMapLazy",
new Document("s1", sample1Source).append("s2", sample2Source));
WithMapValueDbRef read = converter.read(WithMapValueDbRef.class, source);
assertThat(read.sampleMapLazy) //
.containsEntry("s1", converter.read(Sample.class, sample1Source)) //
.containsEntry("s2", converter.read(Sample.class, sample2Source));
verifyNoInteractions(dbRefResolver);
}
@Test // GH-4312
void resolvesLazyDBRefMapOnAccess() {
client.getDatabase(DATABASE).getCollection("samples")
.insertMany(Arrays.asList(new Document("_id", "sample-1").append("value", "one"),
new Document("_id", "sample-2").append("value", "two")));
Document source = new Document("_id", "id-1").append("sampleMapLazy",
new Document("s1", new com.mongodb.DBRef("samples", "sample-1")).append("s2",
new com.mongodb.DBRef("samples", "sample-2")));
WithMapValueDbRef target = converter.read(WithMapValueDbRef.class, source);
verify(dbRefResolver).resolveDbRef(any(), isNull(), any(), any());
assertThat(target.sampleMapLazy).isInstanceOf(LazyLoadingProxy.class);
assertThat(target.getSampleMapLazy()).containsEntry("s1", new Sample("sample-1", "one")).containsEntry("s2",
new Sample("sample-2", "two"));
verify(dbRefResolver).bulkFetch(any());
}
@Test // GH-4312
void conversionShouldAllowReadingAlreadyResolvedLazyReferences() {
Document sampleSource = new Document("_id", "sample-1").append("value", "one");
Document source = new Document("_id", "id-1").append("sampleLazy", sampleSource);
WithSingleValueDbRef read = converter.read(WithSingleValueDbRef.class, source);
assertThat(read.sampleLazy).isEqualTo(converter.read(Sample.class, sampleSource));
verifyNoInteractions(dbRefResolver);
}
@Test // DATAMONGO-2004
void resolvesLazyDBRefConstructorArgOnAccess() {
@@ -164,6 +257,31 @@ public class MappingMongoConverterTests {
}
}
@Data
public static class WithSingleValueDbRef {
@Id //
String id;
@DBRef //
Sample sample;
@DBRef(lazy = true) //
Sample sampleLazy;
}
@Data
public static class WithMapValueDbRef {
@Id String id;
@DBRef //
Map<String, Sample> sampleMap;
@DBRef(lazy = true) //
Map<String, Sample> sampleMapLazy;
}
public static class WithLazyDBRefAsConstructorArg {
@Id String id;

View File

@@ -1453,7 +1453,7 @@ public class QueryMapperUnitTests {
assertThat(mappedQuery.get("_id"))
.isEqualTo(org.bson.Document.parse("{ $in: [ {$oid: \"5b8bedceb1e0bfc07b008828\" } ]}"));
}
@Test // GH-3596
void considersValueConverterWhenPresent() {
@@ -1461,6 +1461,15 @@ public class QueryMapperUnitTests {
assertThat(mappedObject).isEqualTo(new org.bson.Document("text", "eulav"));
}
@Test // GH-4080
void convertsListOfValuesForPropertyThatHasValueConverterButIsNotCollectionLikeOneByOne() {
org.bson.Document mappedObject = mapper.getMappedObject(query(where("text").in("spring", "data")).getQueryObject(),
context.getPersistentEntity(WithPropertyValueConverter.class));
assertThat(mappedObject).isEqualTo("{ 'text' : { $in : ['gnirps', 'atad'] } }");
}
class WithDeepArrayNesting {
List<WithNestedArray> level0;
@@ -1739,9 +1748,9 @@ public class QueryMapperUnitTests {
static class MyAddress {
private String street;
}
static class WithPropertyValueConverter {
@ValueConverter(ReversingValueConverter.class)
String text;
}

View File

@@ -78,6 +78,7 @@ public class ReactiveStringBasedAggregationUnitTests {
private static final String RAW_SORT_STRING = "{ '$sort' : { 'lastname' : -1 } }";
private static final String RAW_GROUP_BY_LASTNAME_STRING = "{ '$group': { '_id' : '$lastname', 'names' : { '$addToSet' : '$firstname' } } }";
private static final String RAW_OUT = "{ '$out' : 'authors' }";
private static final String GROUP_BY_LASTNAME_STRING_WITH_PARAMETER_PLACEHOLDER = "{ '$group': { '_id' : '$lastname', names : { '$addToSet' : '$?0' } } }";
private static final String GROUP_BY_LASTNAME_STRING_WITH_SPEL_PARAMETER_PLACEHOLDER = "{ '$group': { '_id' : '$lastname', 'names' : { '$addToSet' : '$?#{[0]}' } } }";
@@ -188,6 +189,22 @@ public class ReactiveStringBasedAggregationUnitTests {
return new AggregationInvocation(aggregationCaptor.getValue(), targetTypeCaptor.getValue(), result);
}
@Test // GH-4088
void aggregateWithVoidReturnTypeSkipsResultOnOutStage() {
AggregationInvocation invocation = executeAggregation("outSkipResult");
assertThat(skipResultsOf(invocation)).isTrue();
}
@Test // GH-4088
void aggregateWithOutStageDoesNotSkipResults() {
AggregationInvocation invocation = executeAggregation("outDoNotSkipResult");
assertThat(skipResultsOf(invocation)).isFalse();
}
private ReactiveStringBasedAggregation createAggregationForMethod(String name, Class<?>... parameters) {
Method method = ClassUtils.getMethod(SampleRepository.class, name, parameters);
@@ -216,6 +233,11 @@ public class ReactiveStringBasedAggregationUnitTests {
: null;
}
private Boolean skipResultsOf(AggregationInvocation invocation) {
return invocation.aggregation.getOptions() != null ? invocation.aggregation.getOptions().isSkipResults()
: false;
}
private Class<?> targetTypeOf(AggregationInvocation invocation) {
return invocation.getTargetType();
}
@@ -243,6 +265,12 @@ public class ReactiveStringBasedAggregationUnitTests {
@Aggregation(pipeline = RAW_GROUP_BY_LASTNAME_STRING, collation = "de_AT")
Mono<PersonAggregate> aggregateWithCollation(Collation collation);
@Aggregation(pipeline = { RAW_GROUP_BY_LASTNAME_STRING, RAW_OUT })
Flux<Person> outDoNotSkipResult();
@Aggregation(pipeline = { RAW_GROUP_BY_LASTNAME_STRING, RAW_OUT })
Mono<Void> outSkipResult();
}
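As an aside, a hedged sketch of how the GH-4088 behavior surfaces in a reactive repository; the repository and domain type names below are illustrative, not taken from this diff. Declaring the method to return `Mono<Void>` is what flags the `$out` result for skipping.
import java.util.List;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
// Assumed domain type for the sketch.
class Author {
	@Id String id;
	List<String> names;
}
// Hypothetical repository illustrating the behavior the tests above verify.
interface AuthorRollupRepository extends ReactiveCrudRepository<Author, String> {
	// Mono<Void> sets the skip-results option: the pipeline runs for its $out
	// side effect and no result mapping is attempted.
	@Aggregation(pipeline = { "{ '$group' : { '_id' : '$lastname', 'names' : { '$addToSet' : '$firstname' } } }",
			"{ '$out' : 'authors' }" })
	Mono<Void> rollupIntoAuthors();
	// A Flux return type keeps the default behavior; results, if any, are mapped and emitted.
	@Aggregation(pipeline = { "{ '$group' : { '_id' : '$lastname', 'names' : { '$addToSet' : '$firstname' } } }",
			"{ '$out' : 'authors' }" })
	Flux<Author> rollupAndReturnAuthors();
}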
static class PersonAggregate {

View File

@@ -91,6 +91,7 @@ public class StringBasedAggregationUnitTests {
private static final String RAW_SORT_STRING = "{ '$sort' : { 'lastname' : -1 } }";
private static final String RAW_GROUP_BY_LASTNAME_STRING = "{ '$group': { '_id' : '$lastname', 'names' : { '$addToSet' : '$firstname' } } }";
private static final String RAW_OUT = "{ '$out' : 'authors' }";
private static final String GROUP_BY_LASTNAME_STRING_WITH_PARAMETER_PLACEHOLDER = "{ '$group': { '_id' : '$lastname', names : { '$addToSet' : '$?0' } } }";
private static final String GROUP_BY_LASTNAME_STRING_WITH_SPEL_PARAMETER_PLACEHOLDER = "{ '$group': { '_id' : '$lastname', 'names' : { '$addToSet' : '$?#{[0]}' } } }";
@@ -260,6 +261,22 @@ public class StringBasedAggregationUnitTests {
.withMessageContaining("Page");
}
@Test // GH-4088
void aggregateWithVoidReturnTypeSkipsResultOnOutStage() {
AggregationInvocation invocation = executeAggregation("outSkipResult");
assertThat(skipResultsOf(invocation)).isTrue();
}
@Test // GH-4088
void aggregateWithOutStageDoesNotSkipResults() {
AggregationInvocation invocation = executeAggregation("outDoNotSkipResult");
assertThat(skipResultsOf(invocation)).isFalse();
}
private AggregationInvocation executeAggregation(String name, Object... args) {
Class<?>[] argTypes = Arrays.stream(args).map(Object::getClass).toArray(Class[]::new);
@@ -302,6 +319,11 @@ public class StringBasedAggregationUnitTests {
: null;
}
private Boolean skipResultsOf(AggregationInvocation invocation) {
return invocation.aggregation.getOptions() != null ? invocation.aggregation.getOptions().isSkipResults()
: false;
}
private Class<?> targetTypeOf(AggregationInvocation invocation) {
return invocation.getTargetType();
}
@@ -350,6 +372,12 @@ public class StringBasedAggregationUnitTests {
@Aggregation(RAW_GROUP_BY_LASTNAME_STRING)
String simpleReturnType();
@Aggregation(pipeline = { RAW_GROUP_BY_LASTNAME_STRING, RAW_OUT })
List<Person> outDoNotSkipResult();
@Aggregation(pipeline = { RAW_GROUP_BY_LASTNAME_STRING, RAW_OUT })
void outSkipResult();
}
private interface UnsupportedRepository extends Repository<Person, Long> {

View File

@@ -28,7 +28,6 @@ import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
@@ -43,8 +42,6 @@ import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
class SimpleReactiveMongoRepositoryUnitTests {
private SimpleReactiveMongoRepository<Object, String> repository;
@Mock Mono mono;
@Mock Flux flux;
@Mock ReactiveMongoOperations mongoOperations;
@Mock MongoEntityInformation<Object, String> entityInformation;
@@ -56,7 +53,7 @@ class SimpleReactiveMongoRepositoryUnitTests {
@Test // DATAMONGO-1854
void shouldAddDefaultCollationToCountForExampleIfPresent() {
when(mongoOperations.count(any(), any(), any())).thenReturn(mono);
when(mongoOperations.count(any(), any(), any())).thenReturn(Mono.just(0L));
Collation collation = Collation.of("en_US");
@@ -72,7 +69,7 @@ class SimpleReactiveMongoRepositoryUnitTests {
@Test // DATAMONGO-1854
void shouldAddDefaultCollationToExistsForExampleIfPresent() {
when(mongoOperations.exists(any(), any(), any())).thenReturn(mono);
when(mongoOperations.exists(any(), any(), any())).thenReturn(Mono.just(false));
Collation collation = Collation.of("en_US");
@@ -88,7 +85,7 @@ class SimpleReactiveMongoRepositoryUnitTests {
@Test // DATAMONGO-1854
void shouldAddDefaultCollationToFindForExampleIfPresent() {
when(mongoOperations.find(any(), any(), any())).thenReturn(flux);
when(mongoOperations.find(any(), any(), any())).thenReturn(Flux.empty());
Collation collation = Collation.of("en_US");
@@ -104,7 +101,7 @@ class SimpleReactiveMongoRepositoryUnitTests {
@Test // DATAMONGO-1854
void shouldAddDefaultCollationToFindWithSortForExampleIfPresent() {
when(mongoOperations.find(any(), any(), any())).thenReturn(flux);
when(mongoOperations.find(any(), any(), any())).thenReturn(Flux.empty());
Collation collation = Collation.of("en_US");
@@ -120,7 +117,7 @@ class SimpleReactiveMongoRepositoryUnitTests {
@Test // DATAMONGO-1854
void shouldAddDefaultCollationToFindOneForExampleIfPresent() {
when(mongoOperations.find(any(), any(), any())).thenReturn(flux);
when(mongoOperations.find(any(), any(), any())).thenReturn(Flux.empty());
Collation collation = Collation.of("en_US");

View File

@@ -209,6 +209,13 @@ class ParameterBindingJsonReaderUnitTests {
assertThat(target).isEqualTo(Document.parse("{ 'end_date' : { $gte : { $date : " + time + " } } } "));
}
@Test // GH-4282
void shouldReturnNullAsSuch() {
String json = "{ 'value' : ObjectId(?0) }";
assertThatExceptionOfType(IllegalArgumentException.class).isThrownBy(() -> parse(json, new Object[] { null }));
}
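For illustration, a sketch of the binding behavior the GH-4282 test exercises; the `parse` helper here is an assumption about how such a test utility is typically written, not code taken from this diff.
import org.bson.Document;
import org.bson.codecs.DecoderContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.mongodb.util.json.ParameterBindingJsonReader;
class NullParameterBindingSketch {
	// Assumed helper: bind positional parameters into a JSON query and decode it.
	static Document parse(String json, Object... args) {
		ParameterBindingJsonReader reader = new ParameterBindingJsonReader(json, args);
		return new ParameterBindingDocumentCodec().decode(reader, DecoderContext.builder().build());
	}
	public static void main(String[] args) {
		try {
			// The reader no longer NPEs on the null parameter; the failure now surfaces
			// downstream, as an IllegalArgumentException from ObjectId rejecting a null hexString.
			parse("{ 'value' : ObjectId(?0) }", new Object[] { null });
		} catch (IllegalArgumentException expected) {
			System.out.println("rejected downstream: " + expected.getMessage());
		}
	}
}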
@Test // DATAMONGO-2418
void shouldNotAccessSpElEvaluationContextWhenNoSpElPresentInBindableTarget() {

View File

@@ -37,6 +37,12 @@ public interface PersonRepository extends CrudRepository<Person, String> {
@Aggregation("{ '$project': { '_id' : '$lastname' } }")
List<String> findAllLastnames(); <9>
@Aggregation(pipeline = {
"{ $group : { _id : '$author', books: { $push: '$title' } } }",
"{ $out : 'authors' }"
})
void groupAndOutSkippingOutput(); <10>
}
----
[source,java]
@@ -75,6 +81,7 @@ Therefore, the `Sort` properties are mapped against the methods return type `Per
To gain more control, you might consider `AggregationResults` as the method return type, as shown in <7>.
<8> Obtain the raw `AggregationResults` mapped to the generic target wrapper type `SumValue` or `org.bson.Document`.
<9> Like in <6>, a single value can be directly obtained from multiple result ``Document``s.
<10> Skips the output of the `$out` stage when the method's return type is `void`.
====
In some scenarios, aggregations might require additional options, such as a maximum run time, additional log comments, or the permission to temporarily write data to disk.
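For illustration only (not part of this change), such options can be supplied programmatically through `AggregationOptions`; the pipeline, collection name, and option values below are assumptions made for the sake of the example.

[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import java.time.Duration;

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

class AggregationOptionsSketch {

	// 'template' is assumed to be a configured MongoTemplate.
	static AggregationResults<Document> rollup(MongoTemplate template) {

		AggregationOptions options = AggregationOptions.builder() //
				.allowDiskUse(true)              // permit temporary writes to disk
				.maxTime(Duration.ofSeconds(30)) // cap server-side execution time
				.comment("author rollup")        // attach a log comment
				.build();

		return template.aggregate(newAggregation(group("author").push("title").as("books")).withOptions(options),
				"books", Document.class);
	}
}
----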

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 4.0.2 (2022.0.2)
Spring Data MongoDB 4.0.5 (2022.0.5)
Copyright (c) [2010-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
@@ -40,6 +40,9 @@ conditions of the subcomponent's license, as noted in the LICENSE file.