Compare commits

...

95 Commits

Author SHA1 Message Date
Greg L. Turnquist
6b85edfa84 Release version 4.1 RC1 (2023.0.0).
See #4337
2023-04-14 11:53:59 -05:00
Greg L. Turnquist
f4ec21792f Prepare 4.1 RC1 (2023.0.0).
See #4337
2023-04-14 11:53:21 -05:00
Mark Paluch
67bd722cfd Polishing.
Extract common code into BulkOperationsSupport. Reorder methods. Add missing verifyComplete to tests.

See #2821
Original pull request: #4342
2023-04-14 14:51:20 +02:00
Christoph Strobl
86dd81f770 Add support for reactive bulk operations.
Closes #2821
Original pull request: #4342
2023-04-14 14:51:19 +02:00
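A minimal usage sketch (not taken from the commit itself): it assumes the new ReactiveBulkOperations mirrors the blocking BulkOperations API shown in the diff further down, with execute() returning a Mono-wrapped BulkWriteResult; entity and field names are illustrative.

```
// Hedged sketch: assumes ReactiveBulkOperations mirrors the blocking API
// (bulkOps, insert, updateOne, execute); Person and the field names are illustrative.
Mono<BulkWriteResult> result = reactiveMongoOperations
        .bulkOps(BulkMode.UNORDERED, Person.class)
        .insert(newPerson)
        .updateOne(query(where("firstname").is("Joe")), Update.update("lastname", "Doe"))
        .execute();
```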
Mark Paluch
2f146dd142 Polishing.
Refine updateOne/updateMulti signatures to accept UpdateDefinition in the generic signature. Use pattern variables and records where applicable. Resolve code duplicates.

See #3872
Original pull request: #4344
2023-04-14 09:34:54 +02:00
Christoph Strobl
0ba857aa22 Add support for AggregationUpdate to BulkOperations.
We now accept `UpdateDefinition` in `BulkOperations` to support custom update definitions and aggregation updates.

Closes #3872
Original pull request: #4344
2023-04-14 09:34:54 +02:00
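A short sketch of what the broadened signature enables, using the updateMulti(Query, UpdateDefinition) method visible in the BulkOperations diff below; the AggregationUpdate pipeline and field names are illustrative.

```
// UpdateDefinition is the common contract, so an aggregation pipeline update
// can now be handed to bulk operations just like a plain Update.
mongoOperations.bulkOps(BulkMode.ORDERED, Person.class)
        .updateMulti(query(where("lastname").is("Doe")),
                AggregationUpdate.update().set("status").toValue("ACTIVE"))
        .execute();
```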
Mark Paluch
a94ea17e0e Polishing.
Reformat code. Remove unused fields, modifiers and documentation artifacts.

See #4088
Original pull request: #4341
2023-04-14 08:58:42 +02:00
Christoph Strobl
3b99fa0fb4 Skip output for void methods using declarative Aggregations having $out stage.
We now set the skipOutput flag if an annotated Aggregation defines an $out stage and the method is declared to return no result (void, Mono&lt;Void&gt;, kotlin.Unit).

Closes: #4088
Original pull request: #4341
2023-04-14 08:58:42 +02:00
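A sketch of the kind of repository method this affects; the pipeline and collection names are illustrative, the behavior (skipping the cursor for a void return) is what the commit describes.

```
// Declarative aggregation ending in $out; declaring the method void (or Mono<Void>)
// now sets the skipOutput flag so no result is materialized.
@Aggregation({ "{ $match : { active : true } }", "{ $out : 'active-people' }" })
void archiveActivePeople();
```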
Christoph Strobl
4b0c0274e8 Enable index modification via IndexOperations.
Introduce IndexOptions that can be used to alter an existing index via IndexOperations.

See: #4348
2023-04-14 08:21:08 +02:00
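A hypothetical sketch of what altering an index could look like; the alterIndex method and the IndexOptions factory used here are assumptions derived from the commit description, not confirmed API.

```
// Hypothetical: the exact method and factory names are assumptions.
IndexOperations indexOps = mongoOperations.indexOps(Person.class);
indexOps.alterIndex("lastname-idx", IndexOptions.hidden());
```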
Christoph Strobl
83217f3413 Polishing.
Add since tags.
Make sure IndexInfo holds hidden information.
Move and add tests.

Original Pull Request: #4349
2023-04-14 08:20:17 +02:00
张行帅
aaa1450b2a Add hidden support for index.
Closes: #4348
Original Pull Request: #4349
2023-04-14 08:18:40 +02:00
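A sketch of defining a hidden index programmatically; the hidden() mutator is inferred from the commit title, the other names are illustrative.

```
// Marks the index as hidden so the query planner ignores it while it is still maintained.
mongoOperations.indexOps(Person.class)
        .ensureIndex(new Index("lastname", Sort.Direction.ASC).named("lastname-idx").hidden());
```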
Mark Paluch
89c6099a3c Polishing.
Reformat code. Make getAnnotatedHint non-nullable.

See #3230
Original pull request: #4339
2023-04-13 11:51:16 +02:00
Christoph Strobl
7b44f78133 Add Hint annotation.
This commit introduces the new `@Hint` annotation that allows overriding MongoDB's default index selection for repository query, update, and aggregate operations.

```
@Hint("lastname-idx")
List<Person> findByLastname(String lastname);

@Query(value = "{ 'firstname' : ?0 }", hint="firstname-idx")
List<Person> findByFirstname(String firstname);
```

Closes: #3230
Original pull request: #4339
2023-04-13 11:51:12 +02:00
Christoph Strobl
af2076d4a5 Fix null value handling in ParameterBindingJsonReader#readStringFromExtendedJson.
This commit makes sure to return null for a null parameter value avoiding a potential NPE when parsing data.
In doing so, we ensure object creation is done with the intended value, which may or may not lead to a downstream error, e.g. when trying to create an ObjectId with a null hexString.

Closes: #4282
Original pull request: #4334
2023-04-13 11:32:19 +02:00
Christoph Strobl
79c6427cc9 Update date time parser.
See: #3750
Original pull request: #4334
2023-04-13 11:32:19 +02:00
Christoph Strobl
d54f2e47b0 Follow changes in uuid processing.
Closes: #3750
Original pull request: #4334
2023-04-13 11:32:18 +02:00
Mark Paluch
cc3b33e885 Polishing.
Tweak naming.

Original pull request: #4352
See #4351
2023-04-13 11:12:29 +02:00
Christoph Strobl
b081089df4 Fix AOT processing for lazy-loading Jdk proxies.
This commit makes sure to use the ProxyFactory for retrieving the proxied interfaces, capturing the exact interface order required when finally loading the proxy at runtime.

Original pull request: #4352
Closes #4351
2023-04-13 11:12:28 +02:00
Mark Paluch
414cf51f98 Upgrade to MongoDB 4.9.1.
Closes #4362
2023-04-11 15:58:58 +02:00
Mark Paluch
8bbc64317d Upgrade to Maven Wrapper 3.9.1.
See #4356
2023-04-06 16:16:28 +02:00
Oliver Drotbohm
5f48ee5644 Prefer implementing PersistentPropertyAccessor over PersistentPropertyPathAccessor.
In preparation for spring-projects/spring-data-commons#2813 we're moving off the implementation of PersistentPropertyPathAccessor and rather only implement PersistentPropertyAccessor.

Fixes #4354.
2023-04-04 11:10:27 +02:00
Greg L. Turnquist
5733a00f54 Test against Java 20 on CI.
See #4350.
2023-03-29 16:28:08 -05:00
Greg L. Turnquist
894acbb0fc Update CI properties.
See #4337
2023-03-28 13:58:15 -05:00
Christoph Strobl
d70a9ed7c6 Update visibility of ConversionContext.
The ConversionContext should not be package private due to its usage in protected method signatures.

Closes: #4345
2023-03-24 13:40:48 +01:00
Mark Paluch
27de3680b4 Fix reverse keyset scrolling.
We now use gt/lt instead of gte/lte to avoid duplicates during reverse keyset scrolling.

Closes #4343
2023-03-24 11:14:03 +01:00
Christoph Strobl
244b949b6d After release cleanups.
See #4295
2023-03-20 15:05:35 +01:00
Christoph Strobl
29b9123f5b Prepare next development iteration.
See #4295
2023-03-20 15:05:33 +01:00
Christoph Strobl
a2296534c7 Release version 4.1 M3 (2023.0.0).
See #4295
2023-03-20 15:01:47 +01:00
Christoph Strobl
dc8e9d2e0f Prepare 4.1 M3 (2023.0.0).
See #4295
2023-03-20 15:01:19 +01:00
Mark Paluch
25f610cc8a Fix keyset backwards scrolling.
We now correctly scroll backwards by reversing the sort order to apply the correct limit, then reversing the results again to restore the actual sort order.

Closes #4332
2023-03-20 08:54:50 +01:00
Christoph Strobl
d8c04f0ec9 Use projecting read callback to allow interface projections.
Along the way, fix entity operations proxy handling by reading the underlying map instead of inspecting the proxy interface.
Also make sure to map potential raw fields back to the corresponding property.

See: #4308
Original Pull Request: #4317
2023-03-17 13:07:44 +01:00
Mark Paluch
85826e1fe0 Document limitations around nullable properties.
See: #4308
Original Pull Request: #4317
2023-03-17 13:07:05 +01:00
Mark Paluch
14a722f2fd Add support for reverse scrolling.
Closes #4325
Related to: #4308
Original Pull Request: #4317
2023-03-17 13:06:27 +01:00
Christoph Strobl
9d0afc975a Prevent key extraction if a keyset value is null.
Follow the changes in Spring Data Commons that renamed scroll to window.
Also raise an error when a scroll position cannot be turned into a query because of null values.

See: #4308
Original Pull Request: #4317
2023-03-17 13:05:20 +01:00
Mark Paluch
eaa6393798 Add support for keyset extraction of nested property paths.
Closes #4326
Original Pull Request: #4317
2023-03-17 13:04:41 +01:00
Mark Paluch
7d485d732a Add support for scrolling using offset- and keyset-based strategies.
We now support scrolling through large query results using ScrollPosition and Windows of data.

See: #4308
Original Pull Request: #4317
2023-03-17 13:03:53 +01:00
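A small sketch of the Scroll API this enables, following the general Spring Data Window/ScrollPosition contract; the repository method name is illustrative.

```
// Scroll through all matching Person documents in stable chunks of 10.
Window<Person> window = repository.findFirst10ByLastnameOrderByFirstname("Doe", ScrollPosition.offset());

while (window.hasNext()) {
    window = repository.findFirst10ByLastnameOrderByFirstname("Doe", window.positionAt(window.size() - 1));
}
```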
Mark Paluch
aff4e4fd02 Polishing.
Remove duplicate logging in imperative FindOneCallback.

See #4253
Original pull request: #4259
2023-03-17 09:36:55 +01:00
Raghav2211
18413586fb Remove duplicate log in reactive findOne operation.
Closes #4253
Original pull request: #4259
2023-03-17 09:36:53 +01:00
Mark Paluch
aeea743921 Polishing.
Simplify field creation considering simplified projection expressions.

See #3917
Original pull request: #4328
2023-03-16 09:53:50 +01:00
Christoph Strobl
e5bba39c62 Fix field resolution for ExposedFieldsAggregationContext.
This commit fixes an issue where the context is not relaxed and errors on unknown fields when multiple stages create nested contexts.

Closes #3917
Original pull request: #4328
2023-03-16 09:53:27 +01:00
Christoph Strobl
c2f708a37a Fix property value conversion for $in clauses.
This commit fixes an issue where a property value converter is not applied if the query is using an $in clause that compares the value against a collection of potential candidates.

Closes #4080
Original pull request: #4324
2023-03-15 11:32:06 +01:00
Mark Paluch
7f50fe1cb7 Polishing.
Extract duplicates into peek method.

See #4312
Original pull request: #4323
2023-03-15 10:58:43 +01:00
Christoph Strobl
a2127a4da9 Allow reading already resolved references.
This commit adds the ability to read references between documents that have already been fully resolved (e.g. by an aggregation $lookup).
No proxy will be created for lazy-loading references, and we also skip the additional server round trip to load the reference by its id.

Closes #4312
Original pull request: #4323
2023-03-15 10:58:37 +01:00
Mark Paluch
67215f1209 Polishing.
Remove caching variant of MongoClientEncryption. Rename types for consistent key alt name scheme. Rename annotation to ExplicitEncrypted.

Add package-info. Improve documentation wording. Reduce visibility of KeyId and KeyAltName to package-private.

Original pull request: #4302
See: #4284
2023-03-15 10:27:37 +01:00
Christoph Strobl
3b33f90e5c Add support for explicit field encryption.
We now support explicit field encryption using mapped entities through the `@ExplicitEncrypted` annotation.

class Person {
  ObjectId id;

  @ExplicitEncrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic", altKeyName = "my-secret-key")
  String socialSecurityNumber;
}

Encryption is applied transparently to all mapped entities leveraging the existing converter infrastructure.

Original pull request: #4302
Closes: #4284
2023-03-15 10:27:37 +01:00
Mark Paluch
3b7b1ace8b Polishing.
Introduce an isEmpty method on HintFunction for easier invocation, avoiding negations at the call site.

See #3218
Original pull request: #4311
2023-03-06 14:43:07 +01:00
Christoph Strobl
cd63501680 Support hints on Update.
This commit makes sure to read query hints and apply them to the MongoDB UpdateOptions when running an update via Reactive-/MongoTemplate.

Original pull request: #4311
Closes: #3218
2023-03-06 14:37:17 +01:00
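A sketch of the effect: a hint placed on the Query now travels into the MongoDB UpdateOptions; index and field names are illustrative.

```
// The hint set on the query is now propagated to the update operation.
Query query = query(where("firstname").is("Joe")).withHint("firstname-idx");
mongoOperations.updateFirst(query, Update.update("lastname", "Doe"), Person.class);
```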
Mark Paluch
0020499d4e Polishing.
Reformat code.

See #2750
Original pull request: #4316
2023-03-06 14:31:19 +01:00
Christoph Strobl
f4b85242d4 Polishing.
Favor Base64 over deprecated Base64Utils.

See: #2750
Original pull request: #4316.
2023-03-06 14:28:45 +01:00
Christoph Strobl
a416441427 Support $expr via criteria query.
This commit introduces AggregationExpressionCriteria to be used along with Query to run an $expr operator within the find query.

query(whereExpr(valueOf("spent").greaterThan("budget")))

Closes: #2750
Original pull request: #4316.
2023-03-06 14:28:19 +01:00
Christoph Strobl
e3ef84a56c Fix regression in findAndReplace when using native MongoDB types as domain value.
This commit fixes a regression that prevented a native org.bson.Document from serving as the source of a findAndReplace operation.

Closes: #4300
Original Pull Request: #4310
2023-03-02 09:55:42 +01:00
Mark Paluch
3ab78fc1ed Upgrade to Maven Wrapper 3.9.0.
See #4297
2023-02-20 11:58:01 +01:00
Christoph Strobl
fa0f026410 After release cleanups.
See #4294
2023-02-17 14:25:48 +01:00
Christoph Strobl
9c96a2b2c3 Prepare next development iteration.
See #4294
2023-02-17 14:25:46 +01:00
Christoph Strobl
0986210221 Release version 4.1 M2 (2023.0.0).
See #4294
2023-02-17 14:22:30 +01:00
Christoph Strobl
7d5372f049 Prepare 4.1 M2 (2023.0.0).
See #4294
2023-02-17 14:22:15 +01:00
Christoph Strobl
a5022e9bc4 After release cleanups.
See #4235
2023-02-17 13:31:54 +01:00
Christoph Strobl
aff8fbd62a Prepare next development iteration.
See #4235
2023-02-17 13:31:52 +01:00
Christoph Strobl
633fbceb5a Release version 4.1 M1 (2023.0.0).
See #4235
2023-02-17 13:27:49 +01:00
Christoph Strobl
fb9a0d8482 Prepare 4.1 M1 (2023.0.0).
See #4235
2023-02-17 13:27:08 +01:00
Christoph Strobl
d73807df1b Support ReadConcern and ReadPreference via NearQuery.
Implement ReadConcernAware and ReadPreferenceAware for NearQuery and make sure those get applied when working with the template API.

Original Pull Request: #4288
2023-02-16 14:28:11 +01:00
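A sketch of the addition, assuming withReadPreference-style mutators on NearQuery matching the *Aware contracts named above; point and distance values are illustrative.

```
// Assumed mutator name (withReadPreference), based on the ReadPreferenceAware contract above.
NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
        .maxDistance(new Distance(2, Metrics.KILOMETERS))
        .withReadPreference(ReadPreference.secondaryPreferred());

GeoResults<Person> results = mongoOperations.geoNear(nearQuery, Person.class);
```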
Mark Paluch
e56f6ce87f Polishing.
Documentation, refine parameter ordering.

Original Pull Request: #4288
2023-02-16 14:28:11 +01:00
Mark Paluch
c5c6fc107c Support ReadConcern & ReadPreference via the Query and Aggregation API.
Add support for setting the ReadConcern and ReadPreference via the Query and Aggregation API.

Closes: #4277, #4286
Original Pull Request: #4288
2023-02-16 14:28:10 +01:00
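A sketch of per-operation read settings on a Query; the withReadConcern/withReadPreference mutators follow the commit description, everything else is illustrative.

```
// Read settings travel with the query instead of the whole template.
Query query = query(where("lastname").is("Doe"))
        .withReadConcern(ReadConcern.MAJORITY)
        .withReadPreference(ReadPreference.secondary());

List<Person> people = mongoOperations.find(query, Person.class);
```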
Christoph Strobl
368c644922 Guard tests for $lookup with let & pipeline
Add a guard to skip tests on server versions prior to 5.0.

Related to: #3322
2023-02-16 09:33:37 +01:00
Christoph Strobl
4d050f5021 Polishing.
Reuse Let from VariableOperators.
Limit API exposure and favor builders.
Update nullability constraints and assertions.
Update integration tests.
Add unit tests.

Original Pull Request: #4272
2023-02-16 08:33:07 +01:00
sangyongchoi
83923e0e2a Add support for 'let' and 'pipeline' in $lookup
This commit introduces let and pipeline to the Lookup aggregation stage.

Closes: #3322
Original Pull Request: #4272
2023-02-16 08:30:21 +01:00
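A hedged sketch of a $lookup with let and a correlated pipeline; the builder-step names (let, pipeline) follow the commit title and polishing note, but the exact chaining, variable factory, and domain names are assumptions.

```
// Hypothetical chaining; let/pipeline steps and the variable factory are inferred, not confirmed API.
LookupOperation lookup = LookupOperation.newLookup()
        .from("warehouses")
        .let(VariableOperators.Let.just(ExpressionVariable.newVariable("order_item").forField("item")))
        .pipeline(Aggregation.match(Criteria.where("stock_item").is("widget")))
        .as("stockdata");
```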
Mark Paluch
25588850dd Disable flaky test.
See #4290
2023-02-14 11:25:06 +01:00
Mark Paluch
55c81f4f54 Adapt to Mockito 5.1 changes.
Closes #4290
2023-02-14 10:50:30 +01:00
Christoph Strobl
ac7551e47f Upgrade to MongoDB driver 4.9.0
Closes: #4289
2023-02-14 07:51:11 +01:00
Mark Paluch
6d3043de9a Update CI properties.
See #4235
2023-01-30 10:49:50 +01:00
Mark Paluch
1a94b6e4ee Upgrade to Maven Wrapper 3.8.7.
See #4281
2023-01-30 10:48:12 +01:00
Mark Paluch
33902b5061 Polishing.
Move QuerydslPredicateExecutor hints to RepositoryRuntimeHints.

See #4244
Original pull request: #4245
2023-01-23 14:08:43 +01:00
Christoph Strobl
d00db4bd40 Add missing hints for Querydsl integration.
This commit adds missing reflection configuration for the Querydsl integration. We now also make sure to call the queryMixin getter instead of reading the field via reflection.

Closes #4244
Original pull request: #4245
2023-01-23 14:08:43 +01:00
Christoph Strobl
a5dcbf043a Update links in reference documentation.
We now use the springDocsUrl attribute provided via spring-projects/spring-data-build#1895 to resolve links to framework documentation.

Original Pull Request: #4267
2023-01-18 14:23:22 +01:00
robeatoz
c31203582f Fix parameter and method name in reference documentation.
Closes: #4247
2023-01-16 11:22:29 +01:00
Emre Uygun
f146afecdc Fix typo in reference documentation.
Closes: #4250
2023-01-16 11:19:32 +01:00
Christoph Strobl
324a541a64 Polishing.
Original Pull Request: #4255
2023-01-16 11:17:49 +01:00
Michael Krog
6b71d773d7 Fixes return in Javadoc.
Closes: #4255
2023-01-16 11:15:14 +01:00
Patouche
10447afe0c Fix typo in reference documentation.
Closes: #4268
2023-01-16 10:49:42 +01:00
soumyaPrakashB
c9dfd60f0f Add missing Nullable annotation.
The Nullable annotation was missing for the cursor argument of one of the AggregationOptions constructors.

Closes: #4256
2023-01-16 10:47:28 +01:00
Mark Paluch
26a8fafd03 Upgrade to MongoDB driver 4.8.2.
Closes #4270
2023-01-13 10:30:01 +01:00
Mark Paluch
00f652a094 Polishing.
Add missing package-info.

See #4248
Original pull request: #4249
2023-01-12 08:47:16 +01:00
Christoph Strobl
d050ae5732 Exclude mongodb and data.mongodb namespaces from reflection contribution.
In some cases the user's domain model may hold references to Spring Data or MongoDB-specific types, which should not be included in the reflection configuration as they are already part of the static runtime hints configuration.

Closes #4248
Original pull request: #4249
2023-01-12 08:47:16 +01:00
Christoph Strobl
8bcab93588 Avoid multiple mapping iterations.
A 2nd pass is no longer needed as the context already does all the work.

Closes: #4043
Original pull request: #4240
2023-01-11 16:04:36 +01:00
Mark Paluch
1839f55055 Polishing.
Introduce HintFunction to encapsulate how hints are applied and to remove code duplications.

See #4238
Original pull request: #4243
2023-01-11 16:02:14 +01:00
Christoph Strobl
4220df5bf8 Accept index names as hint for aggregations.
Closes #4238
Original pull request: #4243
2023-01-11 16:02:05 +01:00
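A sketch assuming the AggregationOptions builder now accepts an index name for its hint; the index name and pipeline are illustrative.

```
// hint("lastname-idx") by index name is what the commit adds (previously a Document-based hint).
AggregationOptions options = AggregationOptions.builder().hint("lastname-idx").build();

Aggregation aggregation = Aggregation.newAggregation(
        Aggregation.match(Criteria.where("lastname").is("Doe"))).withOptions(options);
```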
Christoph Strobl
95c6d1531f Fix invalid format specifier in debug statement.
Closes #4241
Original pull request: #4246
2023-01-11 15:29:22 +01:00
Christoph Strobl
b7ed099e06 Update broken links in reference documentation.
Original Pull Request: #4267
2023-01-11 13:46:50 +01:00
Maksymilian Babarowski
7e2e546e55 Update links to Spring Framework reference docs.
Closes: #4267
2023-01-11 13:46:38 +01:00
yangwenjie008
7ce2ebe26e Fix class loader issue with LazyLoadingProxyInterceptor.
Restore original behaviour that was unintentionally changed by modifications related to #4148.

Closes: #4260
Original Pull Request: #4261
2023-01-11 08:49:38 +01:00
Mark Paluch
fbf4d1baa8 Extend license header copyright years to 2023.
See #4264
2023-01-02 09:53:33 +01:00
Christoph Strobl
187f260fe4 Upgrade to MongoDB driver 4.8.1
Closes: #4251
2022-12-12 13:45:20 +01:00
Mark Paluch
04411075b4 Update CI properties.
See #4235
2022-11-18 15:31:10 +01:00
Mark Paluch
459a9c191b After release cleanups.
See #4209
2022-11-18 14:30:20 +01:00
Mark Paluch
137cba8bbb Prepare next development iteration.
See #4209
2022-11-18 14:30:19 +01:00
1094 changed files with 10625 additions and 2310 deletions

View File

@@ -1,2 +1,2 @@
#Fri Jun 03 09:32:40 CEST 2022
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.5/apache-maven-3.8.5-bin.zip
#Thu Apr 06 16:16:28 CEST 2023
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.1/apache-maven-3.9.1-bin.zip

47
Jenkinsfile vendored
View File

@@ -77,10 +77,29 @@ pipeline {
}
}
}
stage('Publish JDK (Java 20) + MongoDB 6.0') {
when {
anyOf {
changeset "ci/openjdk20-mongodb-6.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-6.0:${p['java.next.tag']}", "--build-arg BASE=${p['docker.java.next.image']} --build-arg MONGODB=${p['docker.mongodb.6.0.version']} ci/openjdk20-mongodb-6.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
}
}
stage("test: baseline (Java 17)") {
stage("test: baseline (main)") {
when {
beforeAgent(true)
anyOf {
@@ -119,7 +138,7 @@ pipeline {
}
parallel {
stage("test: MongoDB 5.0 (Java 17)") {
stage("test: MongoDB 5.0 (main)") {
agent {
label 'data'
}
@@ -141,7 +160,7 @@ pipeline {
}
}
stage("test: MongoDB 6.0 (Java 17)") {
stage("test: MongoDB 6.0 (main)") {
agent {
label 'data'
}
@@ -162,6 +181,28 @@ pipeline {
}
}
}
stage("test: MongoDB 6.0 (next)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-6.0:${p['java.next.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongosh --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
}
}

View File

@@ -0,0 +1,24 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 6.0 release signing key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | apt-key add - && \
# Needed when MongoDB creates a 6.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu jammy/mongodb-org/6.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-6.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,16 +1,18 @@
# Java versions
java.main.tag=17.0.4.1_1-jdk-focal
java.main.tag=17.0.6_10-jdk-focal
java.next.tag=20-jdk-jammy
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.next.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.next.tag}
# Supported versions of MongoDB
docker.mongodb.4.4.version=4.4.17
docker.mongodb.5.0.version=5.0.13
docker.mongodb.6.0.version=6.0.2
docker.mongodb.4.4.version=4.4.18
docker.mongodb.5.0.version=5.0.14
docker.mongodb.6.0.version=6.0.4
# Supported versions of Redis
docker.redis.6.version=6.2.6
docker.redis.6.version=6.2.10
# Supported versions of Cassandra
docker.cassandra.3.version=3.11.14

12
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0</version>
<version>4.1.0-RC1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>3.0.0</version>
<version>3.1.0-RC1</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>3.0.0</springdata.commons>
<mongo>4.8.0</mongo>
<springdata.commons>3.1.0-RC1</springdata.commons>
<mongo>4.9.1</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -145,8 +145,8 @@
<repositories>
<repository>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<snapshots>
<enabled>true</enabled>
</snapshots>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0</version>
<version>4.1.0-RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0</version>
<version>4.1.0-RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0</version>
<version>4.1.0-RC1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -112,6 +112,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-crypt</artifactId>
<version>1.6.1</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2021-2022 the original author or authors.
* Copyright 2021-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2022 the original author or authors.
* Copyright 2010-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2022 the original author or authors.
* Copyright 2013-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2021-2022 the original author or authors.
* Copyright 2021-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2022 the original author or authors.
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -353,7 +353,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %d, ", session.getServerSession().isClosed());
debugString += String.format("closed = %b, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a";

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2016-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2022 the original author or authors.
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2022 the original author or authors.
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2022 the original author or authors.
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -361,7 +361,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %d, ", session.getServerSession().isClosed());
debugString += String.format("closed = %b, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a";

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2020-2022 the original author or authors.
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2022 the original author or authors.
* Copyright 2010-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,12 +16,13 @@
package org.springframework.data.mongodb;
import org.springframework.dao.UncategorizedDataAccessException;
import org.springframework.lang.Nullable;
public class UncategorizedMongoDbException extends UncategorizedDataAccessException {
private static final long serialVersionUID = -2336595514062364929L;
public UncategorizedMongoDbException(String msg, Throwable cause) {
public UncategorizedMongoDbException(String msg, @Nullable Throwable cause) {
super(msg, cause);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2022 the original author or authors.
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.aot;
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
@@ -25,7 +26,6 @@ import java.util.Set;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.TypeReference;
import org.springframework.core.ResolvableType;
import org.springframework.core.annotation.AnnotatedElementUtils;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.data.annotation.Reference;
@@ -33,7 +33,6 @@ import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.DocumentReference;
import org.springframework.data.util.TypeUtils;
/**
* @author Christoph Strobl
@@ -66,9 +65,7 @@ public class LazyLoadingProxyAotProcessor {
if (field.getType().isInterface()) {
List<Class<?>> interfaces = new ArrayList<>(
TypeUtils.resolveTypesInSignature(ResolvableType.forField(field, type)));
interfaces.add(0, org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class);
Arrays.asList(LazyLoadingProxyFactory.prepareFactory(field.getType()).getProxiedInterfaces()));
interfaces.add(org.springframework.aop.SpringProxy.class);
interfaces.add(org.springframework.aop.framework.Advised.class);
interfaces.add(org.springframework.core.DecoratingProxy.class);
@@ -77,7 +74,7 @@ public class LazyLoadingProxyAotProcessor {
} else {
Class<?> proxyClass = LazyLoadingProxyFactory.resolveProxyType(field.getType(),
() -> LazyLoadingInterceptor.none());
LazyLoadingInterceptor::none);
// see: spring-projects/spring-framework/issues/29309
generationContext.getRuntimeHints().reflection().registerType(proxyClass,

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2022 the original author or authors.
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2022 the original author or authors.
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2022 the original author or authors.
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -62,13 +62,14 @@ class MongoRuntimeHints implements RuntimeHintsRegistrar {
TypeReference.of(ReactiveAfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
}
}
private static void registerTransactionProxyHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
if (MongoAotPredicates.isSyncClientPresent(classLoader) && ClassUtils.isPresent("org.springframework.aop.SpringProxy", classLoader)) {
if (MongoAotPredicates.isSyncClientPresent(classLoader)
&& ClassUtils.isPresent("org.springframework.aop.SpringProxy", classLoader)) {
hints.proxies().registerJdkProxy(TypeReference.of("com.mongodb.client.MongoDatabase"),
TypeReference.of("org.springframework.aop.SpringProxy"),
@@ -78,4 +79,5 @@ class MongoRuntimeHints implements RuntimeHintsRegistrar {
TypeReference.of("org.springframework.core.DecoratingProxy"));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2016-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2022 the original author or authors.
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2022 the original author or authors.
* Copyright 2013-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2020-2022 the original author or authors.
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2022 the original author or authors.
* Copyright 2013-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2022 the original author or authors.
* Copyright 2012-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2022 the original author or authors.
* Copyright 2013-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2016-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2020-2022 the original author or authors.
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2020-2022 the original author or authors.
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2022 the original author or authors.
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2022 the original author or authors.
* Copyright 2012-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2020-2022 the original author or authors.
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,7 +32,6 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
@@ -96,12 +95,7 @@ class AggregationUtil {
* @return
*/
List<Document> createPipeline(Aggregation aggregation, AggregationOperationContext context) {
if (ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toPipeline(context);
}
return mapAggregationPipeline(aggregation.toPipeline(context));
return aggregation.toPipeline(context);
}
/**
@@ -112,16 +106,7 @@ class AggregationUtil {
* @return
*/
Document createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
Document command = aggregation.toDocument(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
return aggregation.toDocument(collection, context);
}
private List<Document> mapAggregationPipeline(List<Document> pipeline) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.Pair;
import com.mongodb.bulk.BulkWriteResult;
@@ -28,6 +29,15 @@ import com.mongodb.bulk.BulkWriteResult;
* make use of low level bulk commands on the protocol level. This interface defines a fluent API to add multiple single
* operations or list of similar operations in sequence which can then eventually be executed by calling
* {@link #execute()}.
*
* <pre class="code">
* MongoOperations ops = …;
*
* ops.bulkOps(BulkMode.UNORDERED, Person.class)
* .insert(newPerson)
* .updateOne(where("firstname").is("Joe"), Update.update("lastname", "Doe"))
* .execute();
* </pre>
* <p>
* Bulk operations are issued as one batch that pulls together all insert, update, and delete operations. Operations
* that require individual operation results such as optimistic locking (using {@code @Version}) are not supported and
@@ -75,7 +85,19 @@ public interface BulkOperations {
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
default BulkOperations updateOne(Query query, Update update) {
return updateOne(query, (UpdateDefinition) update);
}
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
* @since 4.1
*/
BulkOperations updateOne(Query query, UpdateDefinition update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
@@ -83,7 +105,7 @@ public interface BulkOperations {
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
BulkOperations updateOne(List<Pair<Query, UpdateDefinition>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
@@ -92,7 +114,19 @@ public interface BulkOperations {
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
default BulkOperations updateMulti(Query query, Update update) {
return updateMulti(query, (UpdateDefinition) update);
}
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
* @since 4.1
*/
BulkOperations updateMulti(Query query, UpdateDefinition update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
@@ -100,7 +134,7 @@ public interface BulkOperations {
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
BulkOperations updateMulti(List<Pair<Query, UpdateDefinition>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
@@ -110,7 +144,20 @@ public interface BulkOperations {
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
default BulkOperations upsert(Query query, Update update) {
return upsert(query, (UpdateDefinition) update);
}
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
* @since 4.1
*/
BulkOperations upsert(Query query, UpdateDefinition update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
@@ -142,7 +189,7 @@ public interface BulkOperations {
*
* @param query Update criteria.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the replace added, will never be {@literal null}.
* @return the current {@link BulkOperations} instance with the replacement added, will never be {@literal null}.
* @since 2.2
*/
default BulkOperations replaceOne(Query query, Object replacement) {
@@ -155,7 +202,7 @@ public interface BulkOperations {
* @param query Update criteria.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the replace added, will never be {@literal null}.
* @return the current {@link BulkOperations} instance with the replacement added, will never be {@literal null}.
* @since 2.2
*/
BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options);

View File

@@ -0,0 +1,221 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEvent;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationUpdate;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.util.Assert;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.DeleteOneModel;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.ReplaceOneModel;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;
/**
* Support class for bulk operations.
*
* @author Mark Paluch
* @since 4.1
*/
abstract class BulkOperationsSupport {
private final String collectionName;
BulkOperationsSupport(String collectionName) {
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
this.collectionName = collectionName;
}
/**
* Emit a {@link BeforeSaveEvent}.
*
* @param holder
*/
void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder holder) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(holder.source(), target, collectionName));
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
maybeEmitEvent(new BeforeSaveEvent<>(holder.source(), target, collectionName));
}
}
/**
* Emit a {@link AfterSaveEvent}.
*
* @param holder
*/
void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder holder) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
maybeEmitEvent(new AfterSaveEvent<>(holder.source(), target, collectionName));
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
maybeEmitEvent(new AfterSaveEvent<>(holder.source(), target, collectionName));
}
}
WriteModel<Document> mapWriteModel(Object source, WriteModel<Document> writeModel) {
if (writeModel instanceof UpdateOneModel<Document> model) {
if (source instanceof AggregationUpdate aggregationUpdate) {
List<Document> pipeline = mapUpdatePipeline(aggregationUpdate);
return new UpdateOneModel<>(getMappedQuery(model.getFilter()), pipeline, model.getOptions());
}
return new UpdateOneModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
model.getOptions());
}
if (writeModel instanceof UpdateManyModel<Document> model) {
if (source instanceof AggregationUpdate aggregationUpdate) {
List<Document> pipeline = mapUpdatePipeline(aggregationUpdate);
return new UpdateManyModel<>(getMappedQuery(model.getFilter()), pipeline, model.getOptions());
}
return new UpdateManyModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
model.getOptions());
}
if (writeModel instanceof DeleteOneModel<Document> model) {
return new DeleteOneModel<>(getMappedQuery(model.getFilter()), model.getOptions());
}
if (writeModel instanceof DeleteManyModel<Document> model) {
return new DeleteManyModel<>(getMappedQuery(model.getFilter()), model.getOptions());
}
return writeModel;
}
private List<Document> mapUpdatePipeline(AggregationUpdate source) {
Class<?> type = entity().isPresent() ? entity().map(PersistentEntity::getType).get() : Object.class;
AggregationOperationContext context = new RelaxedTypeBasedAggregationOperationContext(type,
updateMapper().getMappingContext(), queryMapper());
return new AggregationUtil(queryMapper(), queryMapper().getMappingContext()).createPipeline(source, context);
}
/**
* Emit an {@link ApplicationEvent} if event multicasting is enabled.
*
* @param event the event to publish.
*/
protected abstract void maybeEmitEvent(ApplicationEvent event);
/**
* @return the {@link UpdateMapper} to use.
*/
protected abstract UpdateMapper updateMapper();
/**
* @return the {@link QueryMapper} to use.
*/
protected abstract QueryMapper queryMapper();
/**
* @return the associated {@link PersistentEntity}. Can be {@link Optional#empty()}.
*/
protected abstract Optional<? extends MongoPersistentEntity<?>> entity();
protected Bson getMappedUpdate(Bson update) {
return updateMapper().getMappedObject(update, entity());
}
protected Bson getMappedQuery(Bson query) {
return queryMapper().getMappedObject(query, entity());
}
protected static BulkWriteOptions getBulkWriteOptions(BulkMode bulkMode) {
BulkWriteOptions options = new BulkWriteOptions();
return switch (bulkMode) {
case ORDERED -> options.ordered(true);
case UNORDERED -> options.ordered(false);
};
}
/**
* @param filterQuery The {@link Query} to read a potential {@link Collation} from. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply.
* @param upsert flag to indicate if document should be upserted.
* @return new instance of {@link UpdateOptions}.
*/
protected static UpdateOptions computeUpdateOptions(Query filterQuery, UpdateDefinition update, boolean upsert) {
UpdateOptions options = new UpdateOptions();
options.upsert(upsert);
if (update.hasArrayFilters()) {
List<Document> list = new ArrayList<>(update.getArrayFilters().size());
for (ArrayFilter arrayFilter : update.getArrayFilters()) {
list.add(arrayFilter.asDocument());
}
options.arrayFilters(list);
}
filterQuery.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
return options;
}
/**
* Value object chaining together an actual source with its {@link WriteModel} representation.
*
* @author Christoph Strobl
*/
record SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
}
}
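For orientation, here is a minimal sketch (not part of the change set) of how a concrete implementation can plug into the abstract hooks above; the `ExampleBulkOperations` name and its constructor parameters are illustrative assumptions only.

```java
// Illustrative only: a hypothetical subclass wiring its own mappers and event publisher
// into the template methods declared by BulkOperationsSupport.
class ExampleBulkOperations extends BulkOperationsSupport {

	private final ApplicationEventPublisher publisher;
	private final QueryMapper queryMapper;
	private final UpdateMapper updateMapper;

	ExampleBulkOperations(String collectionName, ApplicationEventPublisher publisher,
			QueryMapper queryMapper, UpdateMapper updateMapper) {
		super(collectionName);
		this.publisher = publisher;
		this.queryMapper = queryMapper;
		this.updateMapper = updateMapper;
	}

	@Override
	protected void maybeEmitEvent(ApplicationEvent event) {
		publisher.publishEvent(event); // delegate straight to the application context
	}

	@Override
	protected UpdateMapper updateMapper() {
		return updateMapper;
	}

	@Override
	protected QueryMapper queryMapper() {
		return queryMapper;
	}

	@Override
	protected Optional<? extends MongoPersistentEntity<?>> entity() {
		return Optional.empty(); // untyped usage, no persistent entity available
	}
}
```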

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2022 the original author or authors.
* Copyright 2010-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2022 the original author or authors.
* Copyright 2010-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.client.MongoCollection;
/**
* Interface for functional preparation of a {@link MongoCollection}.
*
* @author Mark Paluch
* @since 4.1
*/
public interface CollectionPreparer<T> {
/**
* Returns a preparer that always returns its input collection.
*
* @return a preparer that always returns its input collection.
*/
static <T> CollectionPreparer<T> identity() {
return it -> it;
}
/**
* Prepare the {@code collection}.
*
* @param collection the collection to prepare.
* @return the prepared collection.
*/
T prepare(T collection);
/**
* Returns a composed {@code CollectionPreparer} that first applies this preparer to the collection, and then applies
* the {@code after} preparer to the result. If evaluation of either function throws an exception, it is relayed to
* the caller of the composed function.
*
* @param after the collection preparer to apply after this function is applied.
* @return a composed {@code CollectionPreparer} that first applies this preparer and then applies the {@code after}
* preparer.
*/
default CollectionPreparer<T> andThen(CollectionPreparer<T> after) {
Assert.notNull(after, "After CollectionPreparer must not be null");
return c -> after.prepare(prepare(c));
}
}
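A short usage sketch: composing two preparers via andThen(), starting from identity(). The `collection` reference and the read settings below are assumptions for illustration.

```java
// Sketch: `collection` is assumed to be a MongoCollection<Document> obtained from the driver.
CollectionPreparer<MongoCollection<Document>> withMajorityRead =
		col -> col.withReadConcern(ReadConcern.MAJORITY);
CollectionPreparer<MongoCollection<Document>> preferSecondary =
		col -> col.withReadPreference(ReadPreference.secondaryPreferred());

// identity() is a no-op starting point; andThen applies the preparers left to right.
CollectionPreparer<MongoCollection<Document>> preparer =
		CollectionPreparer.<MongoCollection<Document>>identity()
				.andThen(withMajorityRead)
				.andThen(preferSecondary);

MongoCollection<Document> prepared = preparer.prepare(collection);
```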

View File

@@ -0,0 +1,182 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import java.util.function.BiFunction;
import java.util.function.Function;
import org.bson.Document;
import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
import com.mongodb.client.MongoCollection;
/**
* Support class for delegate implementations to apply {@link ReadConcern} and {@link ReadPreference} settings upon
* {@link CollectionPreparer preparing a collection}.
*
* @author Mark Paluch
* @since 4.1
*/
class CollectionPreparerSupport implements ReadConcernAware, ReadPreferenceAware {
private final List<Object> sources;
private CollectionPreparerSupport(List<Object> sources) {
this.sources = sources;
}
<T> T doPrepare(T collection, Function<T, ReadConcern> concernAccessor, BiFunction<T, ReadConcern, T> concernFunction,
Function<T, ReadPreference> preferenceAccessor, BiFunction<T, ReadPreference, T> preferenceFunction) {
T collectionToUse = collection;
for (Object source : sources) {
if (source instanceof ReadConcernAware rca && rca.hasReadConcern()) {
ReadConcern concern = rca.getReadConcern();
if (concernAccessor.apply(collectionToUse) != concern) {
collectionToUse = concernFunction.apply(collectionToUse, concern);
}
break;
}
}
for (Object source : sources) {
if (source instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
ReadPreference preference = rpa.getReadPreference();
if (preferenceAccessor.apply(collectionToUse) != preference) {
collectionToUse = preferenceFunction.apply(collectionToUse, preference);
}
break;
}
}
return collectionToUse;
}
@Override
public boolean hasReadConcern() {
for (Object aware : sources) {
if (aware instanceof ReadConcernAware rca && rca.hasReadConcern()) {
return true;
}
}
return false;
}
@Override
public ReadConcern getReadConcern() {
for (Object aware : sources) {
if (aware instanceof ReadConcernAware rca && rca.hasReadConcern()) {
return rca.getReadConcern();
}
}
return null;
}
@Override
public boolean hasReadPreference() {
for (Object aware : sources) {
if (aware instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
return true;
}
}
return false;
}
@Override
public ReadPreference getReadPreference() {
for (Object aware : sources) {
if (aware instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
return rpa.getReadPreference();
}
}
return null;
}
static class CollectionPreparerDelegate extends CollectionPreparerSupport
implements CollectionPreparer<MongoCollection<Document>> {
private CollectionPreparerDelegate(List<Object> sources) {
super(sources);
}
public static CollectionPreparerDelegate of(ReadPreferenceAware... awares) {
return of((Object[]) awares);
}
public static CollectionPreparerDelegate of(Object... mixedAwares) {
if (mixedAwares.length == 1 && mixedAwares[0] instanceof CollectionPreparerDelegate) {
return (CollectionPreparerDelegate) mixedAwares[0];
}
return new CollectionPreparerDelegate(Arrays.asList(mixedAwares));
}
@Override
public MongoCollection<Document> prepare(MongoCollection<Document> collection) {
return doPrepare(collection, MongoCollection::getReadConcern, MongoCollection::withReadConcern,
MongoCollection::getReadPreference, MongoCollection::withReadPreference);
}
}
static class ReactiveCollectionPreparerDelegate extends CollectionPreparerSupport
implements CollectionPreparer<com.mongodb.reactivestreams.client.MongoCollection<Document>> {
private ReactiveCollectionPreparerDelegate(List<Object> sources) {
super(sources);
}
public static ReactiveCollectionPreparerDelegate of(ReadPreferenceAware... awares) {
return of((Object[]) awares);
}
public static ReactiveCollectionPreparerDelegate of(Object... mixedAwares) {
if (mixedAwares.length == 1 && mixedAwares[0] instanceof ReactiveCollectionPreparerDelegate) {
return (ReactiveCollectionPreparerDelegate) mixedAwares[0];
}
return new ReactiveCollectionPreparerDelegate(Arrays.asList(mixedAwares));
}
@Override
public com.mongodb.reactivestreams.client.MongoCollection<Document> prepare(
com.mongodb.reactivestreams.client.MongoCollection<Document> collection) {
return doPrepare(collection, //
com.mongodb.reactivestreams.client.MongoCollection::getReadConcern,
com.mongodb.reactivestreams.client.MongoCollection::withReadConcern,
com.mongodb.reactivestreams.client.MongoCollection::getReadPreference,
com.mongodb.reactivestreams.client.MongoCollection::withReadPreference);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2022 the original author or authors.
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2022 the original author or authors.
* Copyright 2002-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2022 the original author or authors.
* Copyright 2010-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,42 +16,47 @@
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mapping.callback.EntityCallback;
import org.springframework.data.mapping.callback.EntityCallbacks;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.*;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.ReplaceOneModel;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;
/**
* Default implementation for {@link BulkOperations}.
@@ -67,7 +72,7 @@ import com.mongodb.client.model.*;
* @author Jacob Botuck
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
class DefaultBulkOperations extends BulkOperationsSupport implements BulkOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -75,7 +80,6 @@ class DefaultBulkOperations implements BulkOperations {
private final List<SourceAwareWriteModelHolder> models = new ArrayList<>();
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
/**
@@ -90,6 +94,7 @@ class DefaultBulkOperations implements BulkOperations {
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
super(collectionName);
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null");
@@ -97,7 +102,7 @@ class DefaultBulkOperations implements BulkOperations {
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
/**
@@ -132,21 +137,20 @@ class DefaultBulkOperations implements BulkOperations {
}
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
public BulkOperations updateOne(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
return updateOne(Collections.singletonList(Pair.of(query, update)));
return update(query, update, false, false);
}
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
public BulkOperations updateOne(List<Pair<Query, UpdateDefinition>> updates) {
Assert.notNull(updates, "Updates must not be null");
for (Pair<Query, Update> update : updates) {
for (Pair<Query, UpdateDefinition> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
@@ -154,21 +158,22 @@ class DefaultBulkOperations implements BulkOperations {
}
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
public BulkOperations updateMulti(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
return updateMulti(Collections.singletonList(Pair.of(query, update)));
update(query, update, false, true);
return this;
}
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
public BulkOperations updateMulti(List<Pair<Query, UpdateDefinition>> updates) {
Assert.notNull(updates, "Updates must not be null");
for (Pair<Query, Update> update : updates) {
for (Pair<Query, UpdateDefinition> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
@@ -176,7 +181,7 @@ class DefaultBulkOperations implements BulkOperations {
}
@Override
public BulkOperations upsert(Query query, Update update) {
public BulkOperations upsert(Query query, UpdateDefinition update) {
return update(query, update, true, true);
}
@@ -248,7 +253,7 @@ class DefaultBulkOperations implements BulkOperations {
return result;
} finally {
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
}
@@ -267,9 +272,8 @@ class DefaultBulkOperations implements BulkOperations {
bulkOptions);
} catch (RuntimeException ex) {
if (ex instanceof MongoBulkWriteException) {
if (ex instanceof MongoBulkWriteException mongoBulkWriteException) {
MongoBulkWriteException mongoBulkWriteException = (MongoBulkWriteException) ex;
if (mongoBulkWriteException.getWriteConcernError() != null) {
throw new DataIntegrityViolationException(ex.getMessage(), ex);
}
@@ -284,17 +288,17 @@ class DefaultBulkOperations implements BulkOperations {
maybeEmitBeforeSaveEvent(it);
if (it.getModel() instanceof InsertOneModel) {
if (it.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
maybeInvokeBeforeSaveCallback(it.getSource(), target);
} else if (it.getModel() instanceof ReplaceOneModel) {
Document target = ((InsertOneModel<Document>) it.model()).getDocument();
maybeInvokeBeforeSaveCallback(it.source(), target);
} else if (it.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
maybeInvokeBeforeSaveCallback(it.getSource(), target);
Document target = ((ReplaceOneModel<Document>) it.model()).getReplacement();
maybeInvokeBeforeSaveCallback(it.source(), target);
}
return mapWriteModel(it.getModel());
return mapWriteModel(it.source(), it.model());
}
/**
@@ -306,7 +310,7 @@ class DefaultBulkOperations implements BulkOperations {
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
private BulkOperations update(Query query, UpdateDefinition update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
@@ -322,47 +326,24 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
private WriteModel<Document> mapWriteModel(WriteModel<Document> writeModel) {
if (writeModel instanceof UpdateOneModel) {
UpdateOneModel<Document> model = (UpdateOneModel<Document>) writeModel;
return new UpdateOneModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
model.getOptions());
}
if (writeModel instanceof UpdateManyModel) {
UpdateManyModel<Document> model = (UpdateManyModel<Document>) writeModel;
return new UpdateManyModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
model.getOptions());
}
if (writeModel instanceof DeleteOneModel) {
DeleteOneModel<Document> model = (DeleteOneModel<Document>) writeModel;
return new DeleteOneModel<>(getMappedQuery(model.getFilter()), model.getOptions());
}
if (writeModel instanceof DeleteManyModel) {
DeleteManyModel<Document> model = (DeleteManyModel<Document>) writeModel;
return new DeleteManyModel<>(getMappedQuery(model.getFilter()), model.getOptions());
}
return writeModel;
@Override
protected void maybeEmitEvent(ApplicationEvent event) {
bulkOperationContext.publishEvent(event);
}
private Bson getMappedUpdate(Bson update) {
return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
@Override
protected UpdateMapper updateMapper() {
return bulkOperationContext.updateMapper();
}
private Bson getMappedQuery(Bson query) {
return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
@Override
protected QueryMapper queryMapper() {
return bulkOperationContext.queryMapper();
}
@Override
protected Optional<? extends MongoPersistentEntity<?>> entity() {
return bulkOperationContext.entity();
}
private Document getMappedObject(Object source) {
@@ -381,268 +362,83 @@ class DefaultBulkOperations implements BulkOperations {
models.add(new SourceAwareWriteModelHolder(source, model));
}
private void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder holder) {
if (holder.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(holder.getSource(), target, collectionName));
} else if (holder.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
maybeEmitEvent(new BeforeSaveEvent<>(holder.getSource(), target, collectionName));
}
}
private void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder holder) {
if (holder.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
maybeEmitEvent(new AfterSaveEvent<>(holder.getSource(), target, collectionName));
} else if (holder.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
maybeEmitEvent(new AfterSaveEvent<>(holder.getSource(), target, collectionName));
}
}
private void maybeInvokeAfterSaveCallback(SourceAwareWriteModelHolder holder) {
if (holder.getModel() instanceof InsertOneModel) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
maybeInvokeAfterSaveCallback(holder.getSource(), target);
} else if (holder.getModel() instanceof ReplaceOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
maybeInvokeAfterSaveCallback(holder.source(), target);
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
maybeInvokeAfterSaveCallback(holder.getSource(), target);
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
maybeInvokeAfterSaveCallback(holder.source(), target);
}
}
private <E extends MongoMappingEvent<T>, T> E maybeEmitEvent(E event) {
if (bulkOperationContext.getEventPublisher() == null) {
return event;
}
bulkOperationContext.getEventPublisher().publishEvent(event);
return event;
private void publishEvent(MongoMappingEvent<?> event) {
bulkOperationContext.publishEvent(event);
}
private Object maybeInvokeBeforeConvertCallback(Object value) {
if (bulkOperationContext.getEntityCallbacks() == null) {
return value;
}
return bulkOperationContext.getEntityCallbacks().callback(BeforeConvertCallback.class, value, collectionName);
return bulkOperationContext.callback(BeforeConvertCallback.class, value, collectionName);
}
private Object maybeInvokeBeforeSaveCallback(Object value, Document mappedDocument) {
if (bulkOperationContext.getEntityCallbacks() == null) {
return value;
}
return bulkOperationContext.getEntityCallbacks().callback(BeforeSaveCallback.class, value, mappedDocument,
collectionName);
return bulkOperationContext.callback(BeforeSaveCallback.class, value, mappedDocument, collectionName);
}
private Object maybeInvokeAfterSaveCallback(Object value, Document mappedDocument) {
if (bulkOperationContext.getEntityCallbacks() == null) {
return value;
}
return bulkOperationContext.getEntityCallbacks().callback(AfterSaveCallback.class, value, mappedDocument,
collectionName);
}
private static BulkWriteOptions getBulkWriteOptions(BulkMode bulkMode) {
BulkWriteOptions options = new BulkWriteOptions();
switch (bulkMode) {
case ORDERED:
return options.ordered(true);
case UNORDERED:
return options.ordered(false);
}
throw new IllegalStateException("BulkMode was null");
return bulkOperationContext.callback(AfterSaveCallback.class, value, mappedDocument, collectionName);
}
/**
* @param filterQuery The {@link Query} to read a potential {@link Collation} from. Must not be {@literal null}.
* @param update The {@link Update} to apply
* @param upsert flag to indicate if document should be upserted.
* @return new instance of {@link UpdateOptions}.
*/
private static UpdateOptions computeUpdateOptions(Query filterQuery, UpdateDefinition update, boolean upsert) {
UpdateOptions options = new UpdateOptions();
options.upsert(upsert);
if (update.hasArrayFilters()) {
List<Document> list = new ArrayList<>(update.getArrayFilters().size());
for (ArrayFilter arrayFilter : update.getArrayFilters()) {
list.add(arrayFilter.asDocument());
}
options.arrayFilters(list);
}
filterQuery.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
return options;
}
/**
* {@link BulkOperationContext} holds information about
* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode} the entity in use as well as references to
* {@link BulkOperationContext} holds information about the {@link BulkMode} and the entity in use as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
static final class BulkOperationContext {
record BulkOperationContext(BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, @Nullable ApplicationEventPublisher eventPublisher,
@Nullable EntityCallbacks entityCallbacks) {
private final BulkMode bulkMode;
private final Optional<? extends MongoPersistentEntity<?>> entity;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private final ApplicationEventPublisher eventPublisher;
private final EntityCallbacks entityCallbacks;
BulkOperationContext(BulkOperations.BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, ApplicationEventPublisher eventPublisher,
EntityCallbacks entityCallbacks) {
this.bulkMode = bulkMode;
this.entity = entity;
this.queryMapper = queryMapper;
this.updateMapper = updateMapper;
this.eventPublisher = eventPublisher;
this.entityCallbacks = entityCallbacks;
public boolean skipEntityCallbacks() {
return entityCallbacks == null;
}
public BulkMode getBulkMode() {
return this.bulkMode;
public boolean skipEventPublishing() {
return eventPublisher == null;
}
public Optional<? extends MongoPersistentEntity<?>> getEntity() {
return this.entity;
}
@SuppressWarnings("rawtypes")
public <T> T callback(Class<? extends EntityCallback> callbackType, T entity, String collectionName) {
public QueryMapper getQueryMapper() {
return this.queryMapper;
}
public UpdateMapper getUpdateMapper() {
return this.updateMapper;
}
public ApplicationEventPublisher getEventPublisher() {
return this.eventPublisher;
}
public EntityCallbacks getEntityCallbacks() {
return this.entityCallbacks;
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
BulkOperationContext that = (BulkOperationContext) o;
if (bulkMode != that.bulkMode)
return false;
if (!ObjectUtils.nullSafeEquals(this.entity, that.entity)) {
return false;
if (skipEntityCallbacks()) {
return entity;
}
if (!ObjectUtils.nullSafeEquals(this.queryMapper, that.queryMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.updateMapper, that.updateMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.eventPublisher, that.eventPublisher)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.entityCallbacks, that.entityCallbacks);
return entityCallbacks.callback(callbackType, entity, collectionName);
}
@Override
public int hashCode() {
int result = bulkMode != null ? bulkMode.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(entity);
result = 31 * result + ObjectUtils.nullSafeHashCode(queryMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(updateMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(eventPublisher);
result = 31 * result + ObjectUtils.nullSafeHashCode(entityCallbacks);
return result;
@SuppressWarnings("rawtypes")
public <T> T callback(Class<? extends EntityCallback> callbackType, T entity, Document document,
String collectionName) {
if (skipEntityCallbacks()) {
return entity;
}
return entityCallbacks.callback(callbackType, entity, document, collectionName);
}
public String toString() {
return "DefaultBulkOperations.BulkOperationContext(bulkMode=" + this.getBulkMode() + ", entity="
+ this.getEntity() + ", queryMapper=" + this.getQueryMapper() + ", updateMapper=" + this.getUpdateMapper()
+ ", eventPublisher=" + this.getEventPublisher() + ", entityCallbacks=" + this.getEntityCallbacks() + ")";
public void publishEvent(ApplicationEvent event) {
if (skipEventPublishing()) {
return;
}
eventPublisher.publishEvent(event);
}
}
/**
* Value object chaining together an actual source with its {@link WriteModel} representation.
*
* @since 2.2
* @author Christoph Strobl
*/
private static final class SourceAwareWriteModelHolder {
private final Object source;
private final WriteModel<Document> model;
SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
this.source = source;
this.model = model;
}
public Object getSource() {
return this.source;
}
public WriteModel<Document> getModel() {
return this.model;
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
SourceAwareWriteModelHolder that = (SourceAwareWriteModelHolder) o;
if (!ObjectUtils.nullSafeEquals(this.source, that.source)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.model, that.model);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(model);
result = 31 * result + ObjectUtils.nullSafeHashCode(source);
return result;
}
public String toString() {
return "DefaultBulkOperations.SourceAwareWriteModelHolder(source=" + this.getSource() + ", model="
+ this.getModel() + ")";
}
}
}
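To illustrate the widened UpdateDefinition signatures above, a hedged usage sketch; the Person class, the field names and the mongoTemplate bean are assumptions, not part of this change.

```java
// Sketch: mixing a plain Update with an aggregation pipeline update in one bulk.
BulkOperations bulkOps = mongoTemplate.bulkOps(BulkMode.ORDERED, Person.class);

bulkOps.updateOne(Query.query(Criteria.where("status").is("NEW")),
		new Update().inc("visits", 1));

// AggregationUpdate is an UpdateDefinition as well, so it can be registered the same way.
bulkOps.updateMulti(Query.query(Criteria.where("status").is("NEW")),
		AggregationUpdate.update().set("status").toValue("ACTIVE"));

BulkWriteResult result = bulkOps.execute();
```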

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2022 the original author or authors.
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import java.util.List;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
@@ -155,6 +157,20 @@ public class DefaultIndexOperations implements IndexOperations {
}
@Override
public void alterIndex(String name, org.springframework.data.mongodb.core.index.IndexOptions options) {
Document indexOptions = new Document("name", name);
indexOptions.putAll(options.toDocument());
Document result = mongoOperations
.execute(db -> db.runCommand(new Document("collMod", collectionName).append("index", indexOptions)));
if (NumberUtils.convertNumberToTargetClass(result.get("ok", (Number) 0), Integer.class) != 1) {
throw new UncategorizedMongoDbException("Index '%s' could not be modified. Response was %s".formatted(name, result.toJson()), null);
}
}
public void dropAllIndexes() {
dropIndex("*");
}
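For reference, the collMod command assembled by alterIndex can also be issued directly; the collection name, index name and the hidden flag below are example values only.

```java
// Sketch of the command alterIndex builds; "people" and "lastname-idx" are made-up examples.
Document indexOptions = new Document("name", "lastname-idx").append("hidden", true);
Document collMod = new Document("collMod", "people").append("index", indexOptions);

Document result = mongoTemplate.execute(db -> db.runCommand(collMod));
```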

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2016-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -0,0 +1,390 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.data.mapping.callback.EntityCallback;
import org.springframework.data.mapping.callback.ReactiveEntityCallbacks;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeSaveCallback;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.ReplaceOneModel;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.reactivestreams.client.MongoCollection;
/**
* Default implementation for {@link ReactiveBulkOperations}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 4.1
*/
class DefaultReactiveBulkOperations extends BulkOperationsSupport implements ReactiveBulkOperations {
private final ReactiveMongoOperations mongoOperations;
private final String collectionName;
private final ReactiveBulkOperationContext bulkOperationContext;
private final List<Mono<SourceAwareWriteModelHolder>> models = new ArrayList<>();
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
/**
* Creates a new {@link DefaultReactiveBulkOperations} for the given {@link ReactiveMongoOperations}, collection name and
* {@link ReactiveBulkOperationContext}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param bulkOperationContext must not be {@literal null}.
*/
DefaultReactiveBulkOperations(ReactiveMongoOperations mongoOperations, String collectionName,
ReactiveBulkOperationContext bulkOperationContext) {
super(collectionName);
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
void setDefaultWriteConcern(@Nullable WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
@Override
public ReactiveBulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null");
this.models.add(Mono.just(document).flatMap(it -> {
maybeEmitEvent(new BeforeConvertEvent<>(it, collectionName));
return maybeInvokeBeforeConvertCallback(it);
}).map(it -> new SourceAwareWriteModelHolder(it, new InsertOneModel<>(getMappedObject(it)))));
return this;
}
@Override
public ReactiveBulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null");
documents.forEach(this::insert);
return this;
}
@Override
public ReactiveBulkOperations updateOne(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
update(query, update, false, false);
return this;
}
@Override
public ReactiveBulkOperations updateMulti(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
update(query, update, false, true);
return this;
}
@Override
public ReactiveBulkOperations upsert(Query query, UpdateDefinition update) {
return update(query, update, true, true);
}
@Override
public ReactiveBulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null");
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
this.models.add(Mono.just(query)
.map(it -> new SourceAwareWriteModelHolder(it, new DeleteManyModel<>(it.getQueryObject(), deleteOptions))));
return this;
}
@Override
public ReactiveBulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null");
for (Query query : removes) {
remove(query);
}
return this;
}
@Override
public ReactiveBulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(replacement, "Replacement must not be null");
Assert.notNull(options, "Options must not be null");
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(options.isUpsert());
query.getCollation().map(Collation::toMongoCollation).ifPresent(replaceOptions::collation);
this.models.add(Mono.just(replacement).flatMap(it -> {
maybeEmitEvent(new BeforeConvertEvent<>(it, collectionName));
return maybeInvokeBeforeConvertCallback(it);
}).map(it -> new SourceAwareWriteModelHolder(it,
new ReplaceOneModel<>(getMappedQuery(query.getQueryObject()), getMappedObject(it), replaceOptions))));
return this;
}
@Override
public Mono<BulkWriteResult> execute() {
try {
return mongoOperations.execute(collectionName, this::bulkWriteTo).next();
} finally {
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
}
private Mono<BulkWriteResult> bulkWriteTo(MongoCollection<Document> collection) {
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
Flux<SourceAwareWriteModelHolder> concat = Flux.concat(models).flatMap(it -> {
if (it.model() instanceof InsertOneModel<Document> iom) {
Document target = iom.getDocument();
maybeEmitBeforeSaveEvent(it);
return maybeInvokeBeforeSaveCallback(it.source(), target)
.map(afterCallback -> new SourceAwareWriteModelHolder(afterCallback, mapWriteModel(afterCallback, iom)));
} else if (it.model() instanceof ReplaceOneModel<Document> rom) {
Document target = rom.getReplacement();
maybeEmitBeforeSaveEvent(it);
return maybeInvokeBeforeSaveCallback(it.source(), target)
.map(afterCallback -> new SourceAwareWriteModelHolder(afterCallback, mapWriteModel(afterCallback, rom)));
}
return Mono.just(new SourceAwareWriteModelHolder(it.source(), mapWriteModel(it.source(), it.model())));
});
MongoCollection<Document> theCollection = collection;
return concat.collectList().flatMap(it -> {
return Mono
.from(theCollection
.bulkWrite(it.stream().map(SourceAwareWriteModelHolder::model).collect(Collectors.toList()), bulkOptions))
.doOnSuccess(state -> {
it.forEach(this::maybeEmitAfterSaveEvent);
}).flatMap(state -> {
List<Mono<Object>> monos = it.stream().map(this::maybeInvokeAfterSaveCallback).collect(Collectors.toList());
return Flux.concat(monos).then(Mono.just(state));
});
});
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link UpdateDefinition} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link ReactiveBulkOperations} with the update registered.
*/
private ReactiveBulkOperations update(Query query, UpdateDefinition update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
UpdateOptions options = computeUpdateOptions(query, update, upsert);
this.models.add(Mono.just(update).map(it -> {
if (multi) {
return new SourceAwareWriteModelHolder(update,
new UpdateManyModel<>(query.getQueryObject(), it.getUpdateObject(), options));
}
return new SourceAwareWriteModelHolder(update,
new UpdateOneModel<>(query.getQueryObject(), it.getUpdateObject(), options));
}));
return this;
}
@Override
protected void maybeEmitEvent(ApplicationEvent event) {
bulkOperationContext.publishEvent(event);
}
@Override
protected UpdateMapper updateMapper() {
return bulkOperationContext.updateMapper();
}
@Override
protected QueryMapper queryMapper() {
return bulkOperationContext.queryMapper();
}
@Override
protected Optional<? extends MongoPersistentEntity<?>> entity() {
return bulkOperationContext.entity();
}
private Document getMappedObject(Object source) {
if (source instanceof Document) {
return (Document) source;
}
Document sink = new Document();
mongoOperations.getConverter().write(source, sink);
return sink;
}
private Mono<Object> maybeInvokeAfterSaveCallback(SourceAwareWriteModelHolder holder) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
return maybeInvokeAfterSaveCallback(holder.source(), target);
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
return maybeInvokeAfterSaveCallback(holder.source(), target);
}
return Mono.just(holder.source());
}
private Mono<Object> maybeInvokeBeforeConvertCallback(Object value) {
return bulkOperationContext.callback(ReactiveBeforeConvertCallback.class, value, collectionName);
}
private Mono<Object> maybeInvokeBeforeSaveCallback(Object value, Document mappedDocument) {
return bulkOperationContext.callback(ReactiveBeforeSaveCallback.class, value, mappedDocument, collectionName);
}
private Mono<Object> maybeInvokeAfterSaveCallback(Object value, Document mappedDocument) {
return bulkOperationContext.callback(ReactiveAfterSaveCallback.class, value, mappedDocument, collectionName);
}
/**
* {@link ReactiveBulkOperationContext} holds information about the {@link BulkMode} and the entity in use as well as
* references to the {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
record ReactiveBulkOperationContext(BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, @Nullable ApplicationEventPublisher eventPublisher,
@Nullable ReactiveEntityCallbacks entityCallbacks) {
public boolean skipEntityCallbacks() {
return entityCallbacks == null;
}
public boolean skipEventPublishing() {
return eventPublisher == null;
}
@SuppressWarnings("rawtypes")
public <T> Mono<T> callback(Class<? extends EntityCallback> callbackType, T entity, String collectionName) {
if (skipEntityCallbacks()) {
return Mono.just(entity);
}
return entityCallbacks.callback(callbackType, entity, collectionName);
}
@SuppressWarnings("rawtypes")
public <T> Mono<T> callback(Class<? extends EntityCallback> callbackType, T entity, Document document,
String collectionName) {
if (skipEntityCallbacks()) {
return Mono.just(entity);
}
return entityCallbacks.callback(callbackType, entity, document, collectionName);
}
public void publishEvent(ApplicationEvent event) {
if (skipEventPublishing()) {
return;
}
eventPublisher.publishEvent(event);
}
}
}
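Analogous to the blocking variant, a hedged sketch of the new reactive bulk API; the bulkOps factory on the reactive template, the Person class and the field names are assumptions based on this change set.

```java
// Sketch: reactive bulk update; nothing is sent until execute() is subscribed to.
Mono<BulkWriteResult> result = reactiveMongoTemplate
		.bulkOps(BulkMode.UNORDERED, Person.class)
		.updateMulti(Query.query(Criteria.where("status").is("NEW")),
				new Update().set("status", "ACTIVE"))
		.execute();

result.subscribe(bulkResult ->
		System.out.println("Modified " + bulkResult.getModifiedCount() + " documents"));
```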

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2016-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import java.util.Collection;
import java.util.Optional;
import org.bson.Document;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import com.mongodb.client.model.IndexOptions;
@@ -104,6 +106,22 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
}).next();
}
@Override
public Mono<Void> alterIndex(String name, org.springframework.data.mongodb.core.index.IndexOptions options) {
return mongoOperations.execute(db -> {
Document indexOptions = new Document("name", name);
indexOptions.putAll(options.toDocument());
return Flux.from(db.runCommand(new Document("collMod", collectionName).append("index", indexOptions)))
.doOnNext(result -> {
if (NumberUtils.convertNumberToTargetClass(result.get("ok", (Number) 0), Integer.class) != 1) {
throw new UncategorizedMongoDbException("Index '%s' could not be modified. Response was %s".formatted(name, result.toJson()), null);
}
});
}).then();
}
@Nullable
private MongoPersistentEntity<?> lookupPersistentEntity(String collection) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2022 the original author or authors.
* Copyright 2014-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2022 the original author or authors.
* Copyright 2015-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2022 the original author or authors.
* Copyright 2010-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2021-2022 the original author or authors.
* Copyright 2021-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2022 the original author or authors.
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,9 +17,11 @@ package org.springframework.data.mongodb.core;
import java.util.Collection;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Optional;
import org.bson.BsonNull;
import org.bson.Document;
import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException;
@@ -28,6 +30,8 @@ import org.springframework.data.mapping.IdentifierAccessor;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PersistentPropertyPath;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.core.CollectionOptions.TimeSeriesOptions;
@@ -44,9 +48,11 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.projection.EntityProjectionIntrospector;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.TargetAware;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -114,15 +120,19 @@ class EntityOperations {
Assert.notNull(entity, "Bean must not be null");
if (entity instanceof TargetAware targetAware) {
return new SimpleMappedEntity((Map<String, Object>) targetAware.getTarget(), this);
}
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
return new UnmappedEntity(parse(entity.toString()), this);
}
if (entity instanceof Map) {
return new SimpleMappedEntity((Map<String, Object>) entity);
return new SimpleMappedEntity((Map<String, Object>) entity, this);
}
return MappedEntity.of(entity, context);
return MappedEntity.of(entity, context, this);
}
/**
@@ -139,14 +149,14 @@ class EntityOperations {
Assert.notNull(conversionService, "ConversionService must not be null");
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
return new UnmappedEntity(parse(entity.toString()), this);
}
if (entity instanceof Map) {
return new SimpleMappedEntity((Map<String, Object>) entity);
return new SimpleMappedEntity((Map<String, Object>) entity, this);
}
return AdaptibleMappedEntity.of(entity, context, conversionService);
return AdaptibleMappedEntity.of(entity, context, conversionService, this);
}
/**
@@ -283,6 +293,11 @@ class EntityOperations {
* @see EntityProjectionIntrospector#introspect(Class, Class)
*/
public <M, D> EntityProjection<M, D> introspectProjection(Class<M> resultType, Class<D> entityType) {
MongoPersistentEntity<?> persistentEntity = queryMapper.getMappingContext().getPersistentEntity(entityType);
if (persistentEntity == null && !resultType.isInterface() || ClassUtils.isAssignable(Document.class, resultType)) {
return (EntityProjection) EntityProjection.nonProjecting(resultType);
}
return introspector.introspect(resultType, entityType);
}
@@ -362,6 +377,7 @@ class EntityOperations {
* A representation of information about an entity.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 2.1
*/
interface Entity<T> {
@@ -380,6 +396,16 @@ class EntityOperations {
*/
Object getId();
/**
* Returns the property value for {@code key}.
*
* @param key
* @return
* @since 4.1
*/
@Nullable
Object getPropertyValue(String key);
/**
* Returns the {@link Query} to find the entity by its identifier.
*
@@ -450,6 +476,15 @@ class EntityOperations {
* @since 2.1.2
*/
boolean isNew();
/**
* @param sortObject
* @return
* @since 4.1
* @throws IllegalStateException if a sort key yields {@literal null}.
*/
Map<String, Object> extractKeys(Document sortObject, Class<?> sourceType);
}
/**
@@ -471,7 +506,7 @@ class EntityOperations {
T populateIdIfNecessary(@Nullable Object id);
/**
* Initializes the version property of the of the current entity if available.
* Initializes the version property of the current entity if available.
*
* @return the entity with the version property updated if available.
*/
@@ -497,9 +532,11 @@ class EntityOperations {
private static class UnmappedEntity<T extends Map<String, Object>> implements AdaptibleEntity<T> {
private final T map;
private final EntityOperations entityOperations;
protected UnmappedEntity(T map) {
protected UnmappedEntity(T map, EntityOperations entityOperations) {
this.map = map;
this.entityOperations = entityOperations;
}
@Override
@@ -509,7 +546,12 @@ class EntityOperations {
@Override
public Object getId() {
return map.get(ID_FIELD);
return getPropertyValue(ID_FIELD);
}
@Override
public Object getPropertyValue(String key) {
return map.get(key);
}
@Override
@@ -563,12 +605,50 @@ class EntityOperations {
public boolean isNew() {
return map.get(ID_FIELD) != null;
}
@Override
public Map<String, Object> extractKeys(Document sortObject, Class<?> sourceType) {
Map<String, Object> keyset = new LinkedHashMap<>();
MongoPersistentEntity<?> sourceEntity = entityOperations.context.getPersistentEntity(sourceType);
if (sourceEntity != null && sourceEntity.hasIdProperty()) {
keyset.put(sourceEntity.getRequiredIdProperty().getName(), getId());
} else {
keyset.put(ID_FIELD, getId());
}
for (String key : sortObject.keySet()) {
Object value = resolveValue(key, sourceEntity);
if (value == null) {
throw new IllegalStateException(
String.format("Cannot extract value for key %s because its value is null", key));
}
keyset.put(key, value);
}
return keyset;
}
@Nullable
private Object resolveValue(String key, @Nullable MongoPersistentEntity<?> sourceEntity) {
if (sourceEntity == null) {
return BsonUtils.resolveValue(map, key);
}
PropertyPath from = PropertyPath.from(key, sourceEntity.getTypeInformation());
PersistentPropertyPath<MongoPersistentProperty> persistentPropertyPath = entityOperations.context
.getPersistentPropertyPath(from);
return BsonUtils.resolveValue(map, persistentPropertyPath.toDotPath(p -> p.getFieldName()));
}
}
private static class SimpleMappedEntity<T extends Map<String, Object>> extends UnmappedEntity<T> {
protected SimpleMappedEntity(T map) {
super(map);
protected SimpleMappedEntity(T map, EntityOperations entityOperations) {
super(map, entityOperations);
}
@Override
@@ -591,23 +671,26 @@ class EntityOperations {
private final MongoPersistentEntity<?> entity;
private final IdentifierAccessor idAccessor;
private final PersistentPropertyAccessor<T> propertyAccessor;
private final EntityOperations entityOperations;
protected MappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor idAccessor,
PersistentPropertyAccessor<T> propertyAccessor) {
PersistentPropertyAccessor<T> propertyAccessor, EntityOperations entityOperations) {
this.entity = entity;
this.idAccessor = idAccessor;
this.propertyAccessor = propertyAccessor;
this.entityOperations = entityOperations;
}
private static <T> MappedEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
EntityOperations entityOperations) {
MongoPersistentEntity<?> entity = context.getRequiredPersistentEntity(bean.getClass());
IdentifierAccessor identifierAccessor = entity.getIdentifierAccessor(bean);
PersistentPropertyAccessor<T> propertyAccessor = entity.getPropertyAccessor(bean);
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor, entityOperations);
}
@Override
@@ -620,6 +703,11 @@ class EntityOperations {
return idAccessor.getRequiredIdentifier();
}
@Override
public Object getPropertyValue(String key) {
return propertyAccessor.getProperty(entity.getRequiredPersistentProperty(key));
}
@Override
public Query getByIdQuery() {
@@ -697,6 +785,60 @@ class EntityOperations {
public boolean isNew() {
return entity.isNew(propertyAccessor.getBean());
}
@Override
public Map<String, Object> extractKeys(Document sortObject, Class<?> sourceType) {
Map<String, Object> keyset = new LinkedHashMap<>();
MongoPersistentEntity<?> sourceEntity = entityOperations.context.getPersistentEntity(sourceType);
if (sourceEntity != null && sourceEntity.hasIdProperty()) {
keyset.put(sourceEntity.getRequiredIdProperty().getName(), getId());
} else {
keyset.put(entity.getRequiredIdProperty().getName(), getId());
}
for (String key : sortObject.keySet()) {
Object value;
if (key.indexOf('.') != -1) {
// follow the path across nested levels.
// TODO: We should have a MongoDB-specific property path abstraction to allow diving into Document.
value = getNestedPropertyValue(key);
} else {
value = getPropertyValue(key);
}
if (value == null) {
throw new IllegalStateException(
String.format("Cannot extract value for key %s because its value is null", key));
}
keyset.put(key, value);
}
return keyset;
}
@Nullable
private Object getNestedPropertyValue(String key) {
String[] segments = key.split("\\.");
Entity<?> currentEntity = this;
Object currentValue = BsonNull.VALUE;
for (int i = 0; i < segments.length; i++) {
String segment = segments[i];
currentValue = currentEntity.getPropertyValue(segment);
if (i < segments.length - 1) {
currentEntity = entityOperations.forEntity(currentValue);
}
}
return currentValue != null ? currentValue : BsonNull.VALUE;
}
}
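The nested lookup above walks one mapped entity at a time, so `address.city` reads `address` from the root bean and then `city` from the wrapped nested value; a hypothetical sketch (domain types invented for the example, `entityOperations` assumed to be the surrounding instance):

```
// Hypothetical domain model.
class Address { String city; }
class Person { @Id String id; String lastname; Address address; }

Document sortObject = new Document("lastname", 1).append("address.city", 1);

// Given a populated Person instance 'person', the expected shape is
// {id=<id>, lastname=<lastname>, address.city=<city>}.
Map<String, Object> keys = entityOperations.forEntity(person)
		.extractKeys(sortObject, Person.class);
```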
private static class AdaptibleMappedEntity<T> extends MappedEntity<T> implements AdaptibleEntity<T> {
@@ -706,9 +848,9 @@ class EntityOperations {
private final IdentifierAccessor identifierAccessor;
private AdaptibleMappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor identifierAccessor,
ConvertingPropertyAccessor<T> propertyAccessor) {
ConvertingPropertyAccessor<T> propertyAccessor, EntityOperations entityOperations) {
super(entity, identifierAccessor, propertyAccessor);
super(entity, identifierAccessor, propertyAccessor, entityOperations);
this.entity = entity;
this.propertyAccessor = propertyAccessor;
@@ -717,14 +859,14 @@ class EntityOperations {
private static <T> AdaptibleEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
ConversionService conversionService) {
ConversionService conversionService, EntityOperations entityOperations) {
MongoPersistentEntity<?> entity = context.getRequiredPersistentEntity(bean.getClass());
IdentifierAccessor identifierAccessor = entity.getIdentifierAccessor(bean);
PersistentPropertyAccessor<T> propertyAccessor = entity.getPropertyAccessor(bean);
return new AdaptibleMappedEntity<>(entity, identifierAccessor,
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService), entityOperations);
}
@Nullable
@@ -825,6 +967,14 @@ class EntityOperations {
* @since 3.3
*/
TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions options);
/**
* @return the name of the id field.
* @since 4.1
*/
default String getIdKeyName() {
return ID_FIELD;
}
}
/**
@@ -947,6 +1097,11 @@ class EntityOperations {
MongoPersistentProperty persistentProperty = entity.getPersistentProperty(name);
return persistentProperty != null ? persistentProperty.getFieldName() : name;
}
@Override
public String getIdKeyName() {
return entity.getIdProperty().getName();
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,9 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.springframework.dao.DataAccessException;
import org.springframework.data.domain.KeysetScrollPosition;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -124,12 +127,28 @@ public interface ExecutableFindOperation {
Stream<T> stream();
/**
* Get the number of matching elements.
* <br />
* This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation
* execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard,
* session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link MongoOperations#estimatedCount(String)} for empty queries instead.
* Return a window of elements either starting or resuming at
* {@link org.springframework.data.domain.ScrollPosition}.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable
* {@link org.springframework.data.domain.Sort sort properties} as MongoDB does not support criteria to reconstruct
* a query result from absent document fields or {@code null} values through {@code $gt/$lt} operators.
*
* @param scrollPosition the scroll position.
* @return a window of the resulting elements.
* @since 4.1
* @see org.springframework.data.domain.OffsetScrollPosition
* @see org.springframework.data.domain.KeysetScrollPosition
*/
Window<T> scroll(ScrollPosition scrollPosition);
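A hedged usage sketch for the new terminating `scroll` method; the template call chain and the `Person` type are assumptions for the example, the position factory names follow the Spring Data Commons 3.1 API, and the sort deliberately ends in the id property so the keyset stays total and non-nullable:

```
Query query = new Query(Criteria.where("lastname").is("Matthews"))
		.with(Sort.by("firstname", "id"))
		.limit(10);

// First window, starting at the initial keyset position.
Window<Person> window = template.query(Person.class)
		.matching(query)
		.scroll(ScrollPosition.keyset());

// Resume behind the last element of the previous window.
if (window.hasNext()) {
	window = template.query(Person.class)
			.matching(query)
			.scroll(window.positionAt(window.size() - 1));
}
```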
/**
* Get the number of matching elements. <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but
 * guarantees shard, session and transaction compliance. In case an inaccurate count satisfies the application's
 * needs, use {@link MongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return total number of matching elements.
*/
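Put differently: the terminating count here is exact and transaction-safe but may be costly for unfiltered queries, while the estimated variant reads collection metadata and only fits the no-filter case. A minimal sketch (the collection name and template reference are assumptions):

```
// Exact count via countDocuments; honours sessions, transactions and sharding.
long exact = template.query(Person.class)
		.matching(new Query())
		.count();

// Fast, possibly stale estimate for the whole collection.
long estimate = template.estimatedCount("person");
```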

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,8 +20,9 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.domain.Window;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
@@ -72,8 +73,8 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
private final @Nullable String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
@Nullable String collection, Query query) {
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType, @Nullable String collection,
Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
@@ -139,6 +140,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return doStream();
}
@Override
public Window<T> scroll(ScrollPosition scrollPosition) {
return template.doScroll(query.with(scrollPosition), domainType, returnType, getCollectionName());
}
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
@@ -168,8 +174,8 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
Document queryObject = query.getQueryObject();
Document fieldsObject = query.getFieldsObject();
return template.doFind(getCollectionName(), queryObject, fieldsObject, domainType, returnType,
getCursorPreparer(query, preparer));
return template.doFind(template.createDelegate(query), getCollectionName(), queryObject, fieldsObject, domainType,
returnType, getCursorPreparer(query, preparer));
}
private List<T> doFindDistinct(String field) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2022 the original author or authors.
* Copyright 2017-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2022 the original author or authors.
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

Some files were not shown because too many files have changed in this diff.