Compare commits

...

156 Commits

Author SHA1 Message Date
Christoph Strobl
da7a62c2d0 Reduce method signatures in Reactive-/MongoOperations and add fluent reactive api variant 2023-08-31 14:08:25 +02:00
Mark Paluch
e14e1f57db Guard against potential NPE. 2023-08-31 11:58:26 +02:00
Mark Paluch
86d289c862 Polishing.
Consistently use Document instead of record. Reformat code. Tweak documentation wording.
2023-08-31 11:58:26 +02:00
Christoph Strobl
8d56a63016 Aggregation update operators cannot be used here - remove the leftover
should we fail right away if we encounter an aggregation update?
2023-08-31 11:58:26 +02:00
Christoph Strobl
e206d12f5c apply write concern 2023-08-31 11:58:26 +02:00
Christoph Strobl
a4edb6cacc Tests for reactive API 2023-08-31 11:58:26 +02:00
Christoph Strobl
d2a0e739e8 some more updates 2023-08-31 11:58:25 +02:00
Christoph Strobl
53669b9e50 update javadoc and make sure to have a fluent variant as well 2023-08-31 11:58:25 +02:00
Christoph Strobl
f21ca57dcd move stuff into place 2023-08-31 11:58:25 +02:00
Christoph Strobl
9c4072359a moar hacking 2023-08-31 11:58:25 +02:00
Christoph Strobl
0eccd2794d let's have some tests 2023-08-31 11:58:25 +02:00
Christoph Strobl
1762491714 Update a bit of java doc 2023-08-31 11:58:24 +02:00
Jakub
f60c529334 Add support for replaceOne operation 2023-08-31 11:58:24 +02:00
Christoph Strobl
ce140e3deb Prepare issue branch. 2023-08-31 11:58:24 +02:00
Christoph Strobl
492f09fbdf Add isReadable method to UnwrappedMongoPersistentProperty.
Closes: #4489
2023-08-31 09:43:58 +02:00
Julia
e3e73f5351 Fix #self @DocumentReference resolution when used in constructor.
This commit enables document reference lookup to use `DocumentReferenceSource` to properly instantiate an entity containing a @DocumentReference `#self` property.

Closes #4484
Original Pull Request: #4486
2023-08-31 07:57:44 +02:00
Mark Paluch
e5aff2645b Polishing.
Refactor duplicate code into callback.

See #4481
2023-08-24 14:01:28 +02:00
Mark Paluch
b3c0fbb02d Guard command completion listener against unsupported observation context.
We now no longer attempt to complete the Observation if the context is not a MongoDB one. For commands that target the admin database and run within a parent observation, we still might have an Observation but that one points to the parent invocation and not the MongoDB one as we do not record commands for the admin database.

Closes #4481
2023-08-24 14:01:28 +02:00
Julia Lee
51de522e88 After release cleanups.
See #4450
2023-08-18 08:59:39 -04:00
Julia Lee
d2842b246f Prepare next development iteration.
See #4450
2023-08-18 08:59:35 -04:00
Julia Lee
1f954f45b4 Release version 4.2 M2 (2023.1.0).
See #4450
2023-08-18 08:50:31 -04:00
Julia Lee
c14d7bf616 Prepare 4.2 M2 (2023.1.0).
See #4450
2023-08-18 08:49:38 -04:00
Mark Paluch
ec7cfa3b8e Polishing.
Update since tags. Add missing Override annotation.

See #4070
Original pull request: #4242
2023-08-17 14:18:11 +02:00
Christoph Strobl
f1cff3cdaa Introduce AggregationVariable type.
This commit introduces a new AggregationVariable type that is intended to better identify variables within a pipeline to avoid mapping failures caused by invalid field names.

Closes #4070
Original pull request: #4242
2023-08-17 14:18:11 +02:00
Christoph Strobl
0fd1273ed9 Update documentation regarding java.time type conversion.
Closes #3482
Original pull request: #4460
2023-08-17 10:45:04 +02:00
Julia Lee
16798cb3c2 Update CI properties.
See #4450
2023-08-14 12:15:21 -04:00
Julia Lee
963072ec0a Upgrade to Maven Wrapper 3.9.4.
See #4470
2023-08-14 08:53:22 -04:00
Julia
7f64c020b4 Polishing for formatting
Original Pull Request: #4455
2023-08-08 08:55:04 -04:00
Julia
a93854fb09 Add integration test to ensure schema validation fails when domain type property values are not encrypted as expected.
Closes #4454
Original Pull Request: #4455
2023-08-08 08:55:04 -04:00
Christoph Strobl
b6cd129c93 Fix schema generation for encrypted fields that are considered domain entities.
This commit makes sure to consider the encrypted annotation on fields that are considered domain type property values, encrypting the entire object if necessary.
2023-08-08 08:55:04 -04:00
Oliver Christen
c532ec343a Correct misspellings in documentation
Closes: #4461
2023-07-31 14:11:04 -04:00
Christoph Strobl
a44e240402 Polishing.
Use the previous context instead of the root for mapping objects within an inheriting context. This avoids accidental mapping of fields against the root entity after e.g. a projection stage.
Add missing tests for AggregationOperationRenderer to ensure intended context propagation.

Original Pull Request: #4459
2023-07-28 07:20:13 +02:00
Julia Lee
9b82ede047 Fix mapping custom field names in downstream stages in TypedAggregation pipelines.
Use the root AggregationOperationContext in nested ExposedFieldsAggregationOperationContext to properly apply mapping for domain properties that use @Field.

Closes #4443
Original Pull Request: #4459
2023-07-28 07:18:43 +02:00
Julia Lee
e1986373fd Polishing.
Remove duplicate test configuration.

Original Pull Request: #4447
2023-07-17 10:20:59 +02:00
Julia Lee
5407456973 Fix test setup so that temporal conversions use symmetric timezone setting.
Closes: #4446
Original Pull Request: #4447
2023-07-17 10:12:02 +02:00
Mark Paluch
31f0aa348d After release cleanups.
See #4387
2023-07-14 14:57:12 +02:00
Mark Paluch
28abf1c15b Prepare next development iteration.
See #4387
2023-07-14 14:57:10 +02:00
Mark Paluch
2deede7513 Release version 4.2 M1 (2023.1.0).
See #4387
2023-07-14 14:53:18 +02:00
Mark Paluch
5a48825439 Prepare 4.2 M1 (2023.1.0).
See #4387
2023-07-14 14:52:12 +02:00
Mark Paluch
f4a3e293e8 Upgrade to MongoDB driver 4.10.2.
Closes #4445
2023-07-11 15:46:37 +02:00
Mark Paluch
f0697db32b Polishing.
Reformat code, replace known unsupported constructor with UnsupportedOperationException.

See #4432
Original pull request: #4439
2023-07-10 10:48:29 +02:00
Christoph Strobl
2cc5e427bc Delegate Bson conversion to MongoDB Codec.
Instead of reimplementing conversion we now try to delegate to the native MongoDB codec infrastructure using a custom writer that will only capture values without actually pushing values to an output stream.

See #4432
Original pull request: #4439
2023-07-10 10:48:29 +02:00
Christoph Strobl
a8f08bab86 Fix encryption of java.time types.
This commit makes sure to convert java.time types into their BsonValue representation before encrypting.

See #4432
Original pull request: #4439
2023-07-10 10:48:29 +02:00
Christoph Strobl
19211a0f8e Fix decryption when client is using AutoEncryptionSettings#isBypassAutoEncryption().
This commit makes sure to convert already decrypted entries returned by the driver in case the client is configured with encryption settings.

Closes #4432
Original pull request: #4439
2023-07-10 10:48:28 +02:00
Christoph Strobl
9e0c24435c Upgrade mongodb-crypt to 1.8.0
Closes: #4440
2023-07-07 11:40:06 +02:00
Christoph Strobl
19b1e713b2 Upgrade to MongoDB driver 4.10.1
Closes: #4441
2023-07-07 11:39:42 +02:00
Mark Paluch
af26bb6b31 Polishing.
Introduce limit(Limit) method to limit query results applying the Limit domain type.

See #4397
Original pull request: #4398
2023-07-05 12:02:04 +02:00
Christoph Strobl
d78f47f035 Add tests to verify Limit is supported.
Closes #4397
Original pull request: #4398
2023-07-05 12:01:44 +02:00
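Used together, the two commits above let a query carry a `Limit` from Spring Data Commons instead of a bare int. A rough sketch (the `Person` type and field names are made up; verify the exact `Query.limit(Limit)` signature against your release):

```java
import java.util.List;

import org.springframework.data.domain.Limit;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

import static org.springframework.data.mongodb.core.query.Criteria.where;

class LimitExample {

    // Sketch: cap query results via the Limit domain type instead of a bare int.
    List<Person> firstTen(MongoOperations operations) {
        Query query = new Query(where("lastname").is("Strobl")).limit(Limit.of(10));
        return operations.find(query, Person.class);
    }

    record Person(String id, String lastname) {}
}
```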
Mark Paluch
8cd956e90a Update CI properties.
See #4387
2023-07-03 09:50:16 +02:00
Mark Paluch
49cc6a708d Upgrade to Maven Wrapper 3.9.3.
See #4436
2023-07-03 09:49:43 +02:00
Christoph Strobl
0bf472a29b Polishing.
Update tests to make use of ValueSource.
Replace regex based path inspection with segment by segment analysis.

Original Pull Request: #4427
2023-06-28 13:25:08 +02:00
lijixue
2de00cdb2f Fix QueryMapper property path resolution for nested paths containing numeric values.
Prior to this fix a path that contains numeric values used as position parameters would have been stripped in a way that left out the last digit. This could lead to wrong path resolution if the incorrectly constructed property name accidentally matched an existing one.

Closes: #4426
Original Pull Request: #4427
2023-06-28 13:25:08 +02:00
Mark Paluch
05c38b819f Retain scroll direction across keyset scroll requests.
Closes #4413
2023-06-15 15:09:44 +02:00
Christoph Strobl
c5674d9264 Delombok test source.
Closes: #4411
2023-06-14 15:20:07 +02:00
Christoph Strobl
048af85be0 Accept expression as input for filter aggregation operator.
Closes #4394
Original pull request: #4395
2023-06-14 14:19:30 +02:00
Christoph Strobl
110877eb04 Fix converter registration when using driver native time codec.
This commit prevents converters from being registered as writing converters, which caused asymmetric write/read operations.

Closes #4390
Original pull request: #4392
2023-06-14 11:04:10 +02:00
Mark Paluch
5d571005bb Polishing.
Use extended switch syntax.

See #4404
Original pull request: #4412
2023-06-14 10:00:15 +02:00
Christoph Strobl
57688d8642 Polishing.
Mark method potentially returning null as such and remove unused imports.

See #4404
Original pull request: #4412
2023-06-14 10:00:15 +02:00
Christoph Strobl
d6227e52f9 Use exact matching for IN clause with ignore case.
Prior to this change the generated pattern would have matched more entries than it should have. The behavior is now aligned to its counterpart not using the IgnoreCase flag.

Closes #4404
Original pull request: #4412
2023-06-14 10:00:15 +02:00
Mark Paluch
7e6e029352 Upgrade to Maven Wrapper 3.9.2.
See #4410
2023-06-13 08:54:58 +02:00
Mark Paluch
12c4bf6361 Use snapshot and milestone repositories instead of libs-snapshot and libs-milestone.
Closes #4401
2023-06-06 09:47:04 +02:00
Christoph Strobl
98795cb33e Convert BsonUndefined to null value.
Register a reading converter that returns null when attempting to read a value of type BsonUndefined.
Prior to this change users faced a ConverterNotFoundException when source documents contained BsonUndefined.

Resolves: #2350
2023-06-01 09:24:29 +02:00
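The commit above registers such a converter out of the box. If you had to write one yourself, it could look roughly like this (a hypothetical sketch using the Spring `Converter` contract; the type name is made up):

```java
import org.bson.BsonUndefined;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;

// Hypothetical sketch: a reading converter that maps BsonUndefined to null,
// avoiding a ConverterNotFoundException for documents containing undefined values.
@ReadingConverter
enum BsonUndefinedToNullConverter implements Converter<BsonUndefined, Object> {

    INSTANCE;

    @Override
    public Object convert(BsonUndefined source) {
        return null; // treat an undefined value as absent
    }
}
```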
Christoph Strobl
fa63efcb24 Add tests using $slice on dbref field.
Closes: #2191
2023-05-31 16:11:40 +02:00
Christoph Strobl
5ffaa79f4e Fix code snippet in change streams reference documentation.
Closes: #4376
2023-05-30 13:01:57 +02:00
Christoph Strobl
f775d485c6 Update docker image build instructions.
Closes: #4372
2023-05-30 13:01:57 +02:00
Mark Paluch
370b4145d2 Polishing.
Add assertions and missing Override annotations. Avoid recursive self-call on getClassLoader. Extend documentation.

See #1627
Original pull request: #4389
2023-05-26 14:48:08 +02:00
Christoph Strobl
4b78ef6523 Extend GridFsTemplate and its reactive variant to accept a provided GridFSBucket instance.
Allow passing in a GridFSBucket from the outside to avoid recreating instances on every method call.

Closes #1627
Original pull request: #4389
2023-05-26 14:48:08 +02:00
Christoph Strobl
5163e544ae After release cleanups.
See #4369
2023-05-12 14:18:53 +02:00
Christoph Strobl
431512a66c Prepare next development iteration.
See #4369
2023-05-12 14:18:51 +02:00
Christoph Strobl
532b460067 Release version 4.1 GA (2023.0.0).
See #4369
2023-05-12 14:14:38 +02:00
Christoph Strobl
af846a962a Prepare 4.1 GA (2023.0.0).
See #4369
2023-05-12 14:14:05 +02:00
Mark Paluch
776dadeac8 Polishing.
Introduce has…() and getRequired…() methods for comment and max time limit to remove code duplications.

See #4374
Original pull request: #4378
2023-05-11 10:13:56 +02:00
Christoph Strobl
629dfc187e Fix missing query options when calling MongoOperations#count.
This commit makes sure to forward maxTimeMsec and comment options from the query to the CountOptions.

Closes: #4374
Original pull request: #4378
2023-05-10 15:19:17 +02:00
Christoph Strobl
289438b1e4 Fix regression in value to String mapping.
Previous versions allowed arbitrary values to be mapped to a String property by calling the ObjectToString converter. This behaviour got lost and is now reestablished.

Closes #4371
Original pull request #4373
2023-05-10 14:52:53 +02:00
Oliver Drotbohm
83958ba316 Adapt to ScrollPosition API changes in Spring Data Commons.
Fixes #4377.
Related ticket: #2824.
2023-04-27 19:42:27 +02:00
Christoph Strobl
3a99d4c29a Fix broken links in observability section of reference documentation.
Add back micrometer-docs-generator plugin in version 1.0.1.

Fixes: #4236
2023-04-24 13:04:38 +02:00
Tomasz Forys
561c3d4f39 Instanceof casting simplification.
Closes: #4265
2023-04-24 09:57:40 +02:00
Christoph Ahlers
7f74794a11 Fix inconsistent strong tag usage in javadoc.
Closes: #4177
2023-04-20 15:04:25 +02:00
Thom
238d8c5ed0 Fix link to custom conversion section in reference documentation.
Closes: #4287
2023-04-20 15:04:15 +02:00
Christoph Strobl
c096caac7d Provide context configuration hints for Spring Boot.
Closes: #3381
2023-04-20 15:04:04 +02:00
Christoph Strobl
c794acaf61 Cover missing Aggregation Stages in reference documentation.
Closes: #3938
2023-04-20 15:01:49 +02:00
Christoph Strobl
339db9d1b8 Add 2023.0 release to compatibility matrix.
Closes: #3940
2023-04-20 15:01:39 +02:00
Christoph Strobl
c04fe744cb Update AggregationExpression javadoc.
See: #4370
2023-04-20 15:01:32 +02:00
Greg L. Turnquist
a16558e4a3 After release cleanups.
See #4337
2023-04-14 12:00:02 -05:00
Greg L. Turnquist
0675b052c6 Prepare next development iteration.
See #4337
2023-04-14 11:59:56 -05:00
Greg L. Turnquist
6b85edfa84 Release version 4.1 RC1 (2023.0.0).
See #4337
2023-04-14 11:53:59 -05:00
Greg L. Turnquist
f4ec21792f Prepare 4.1 RC1 (2023.0.0).
See #4337
2023-04-14 11:53:21 -05:00
Mark Paluch
67bd722cfd Polishing.
Extract common code into BulkOperationsSupport. Reorder methods. Add missing verifyComplete to tests.

See #2821
Original pull request: #4342
2023-04-14 14:51:20 +02:00
Christoph Strobl
86dd81f770 Add support for reactive bulk operations.
Closes #2821
Original pull request: #4342
2023-04-14 14:51:19 +02:00
Mark Paluch
2f146dd142 Polishing.
Refine updateOne/updateMulti signatures to accept UpdateDefinition in the generic signature. Use pattern variables and records where applicable. Resolve code duplicates.

See #3872
Original pull request: #4344
2023-04-14 09:34:54 +02:00
Christoph Strobl
0ba857aa22 Add support for AggregationUpdate to BulkOperations.
We now accept `UpdateDefinition` in `BulkOperations` to support custom update definitions and aggregation updates.

Closes #3872
Original pull request: #4344
2023-04-14 09:34:54 +02:00
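Since an aggregation pipeline update is an `UpdateDefinition`, the change above means it can be handed to `BulkOperations` like a plain `Update`. A sketch (the `Person` type and field names are invented for illustration):

```java
import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.AggregationUpdate;
import org.springframework.data.mongodb.core.query.Query;

import static org.springframework.data.mongodb.core.query.Criteria.where;

class BulkAggregationUpdateExample {

    void activatePending(MongoOperations operations) {
        // An aggregation pipeline update ($set stage) expressed as an UpdateDefinition.
        AggregationUpdate update = AggregationUpdate.update().set("status").toValue("active");

        operations.bulkOps(BulkOperations.BulkMode.ORDERED, Person.class)
                .updateMulti(new Query(where("status").is("pending")), update)
                .execute();
    }

    record Person(String id, String status) {}
}
```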
Mark Paluch
a94ea17e0e Polishing.
Reformat code. Remove unused fields, modifiers and documentation artifacts.

See #4088
Original pull request: #4341
2023-04-14 08:58:42 +02:00
Christoph Strobl
3b99fa0fb4 Skip output for void methods using declarative Aggregations having $out stage.
We now set the skipOutput flag if an annotated Aggregation defines an $out stage and the method is declared to return no result (void, Mono&lt;Void&gt;, kotlin.Unit).

Closes: #4088
Original pull request: #4341
2023-04-14 08:58:42 +02:00
Christoph Strobl
4b0c0274e8 Enable index modification via IndexOperations.
Introduce IndexOptions that can be used to alter an existing index via IndexOperations.

See: #4348
2023-04-14 08:21:08 +02:00
Christoph Strobl
83217f3413 Polishing.
Add since tags.
Make sure IndexInfo holds hidden information.
Move and add tests.

Original Pull Request: #4349
2023-04-14 08:20:17 +02:00
张行帅
aaa1450b2a Add hidden support for index.
Closes: #4348
Original Pull Request: #4349
2023-04-14 08:18:40 +02:00
Mark Paluch
89c6099a3c Polishing.
Reformat code. Make getAnnotatedHint non-nullable.

See #3230
Original pull request: #4339
2023-04-13 11:51:16 +02:00
Christoph Strobl
7b44f78133 Add Hint annotation.
This commit introduces the new `@Hint` annotation that allows overriding MongoDB's default index selection for repository query, update and aggregate operations.

```
@Hint("lastname-idx")
List<Person> findByLastname(String lastname);

@Query(value = "{ 'firstname' : ?0 }", hint="firstname-idx")
List<Person> findByFirstname(String firstname);
```

Closes: #3230
Original pull request: #4339
2023-04-13 11:51:12 +02:00
Christoph Strobl
af2076d4a5 Fix null value handling in ParameterBindingJsonReader#readStringFromExtendedJson.
This commit makes sure to return null for a null parameter value avoiding a potential NPE when parsing data.
In doing so we can ensure object creation is done with the intended value, which may or may not lead to a downstream error, e.g. when trying to create an ObjectId with a null hexString.

Closes: #4282
Original pull request: #4334
2023-04-13 11:32:19 +02:00
Christoph Strobl
79c6427cc9 Update date time parser.
See: #3750
Original pull request: #4334
2023-04-13 11:32:19 +02:00
Christoph Strobl
d54f2e47b0 Follow changes in uuid processing.
Closes: #3750
Original pull request: #4334
2023-04-13 11:32:18 +02:00
Mark Paluch
cc3b33e885 Polishing.
Tweak naming.

Original pull request: #4352
See #4351
2023-04-13 11:12:29 +02:00
Christoph Strobl
b081089df4 Fix AOT processing for lazy-loading Jdk proxies.
This commit makes sure to use the ProxyFactory for retrieving the proxied interfaces, capturing the exact interface order required when finally loading the proxy at runtime.

Original pull request: #4352
Closes #4351
2023-04-13 11:12:28 +02:00
Mark Paluch
414cf51f98 Upgrade to MongoDB 4.9.1.
Closes #4362
2023-04-11 15:58:58 +02:00
Mark Paluch
8bbc64317d Upgrade to Maven Wrapper 3.9.1.
See #4356
2023-04-06 16:16:28 +02:00
Oliver Drotbohm
5f48ee5644 Prefer implementing PersistentPropertyAccessor over PersistentPropertyPathAccessor.
In preparation of spring-projects/spring-data-commons#2813 we're moving off the implementation of PersistentPropertyPathAccessor and rather only implement PersistentPropertyAccessor.

Fixes #4354.
2023-04-04 11:10:27 +02:00
Greg L. Turnquist
5733a00f54 Test against Java 20 on CI.
See #4350.
2023-03-29 16:28:08 -05:00
Greg L. Turnquist
894acbb0fc Update CI properties.
See #4337
2023-03-28 13:58:15 -05:00
Christoph Strobl
d70a9ed7c6 Update visibility of ConversionContext.
The ConversionContext should not be package private due to its usage in protected method signatures.

Closes: #4345
2023-03-24 13:40:48 +01:00
Mark Paluch
27de3680b4 Fix reverse keyset scrolling.
We now use gt/lt instead of gte/lte to avoid duplicates during reverse keyset scrolling.

Closes #4343
2023-03-24 11:14:03 +01:00
Christoph Strobl
244b949b6d After release cleanups.
See #4295
2023-03-20 15:05:35 +01:00
Christoph Strobl
29b9123f5b Prepare next development iteration.
See #4295
2023-03-20 15:05:33 +01:00
Christoph Strobl
a2296534c7 Release version 4.1 M3 (2023.0.0).
See #4295
2023-03-20 15:01:47 +01:00
Christoph Strobl
dc8e9d2e0f Prepare 4.1 M3 (2023.0.0).
See #4295
2023-03-20 15:01:19 +01:00
Mark Paluch
25f610cc8a Fix keyset backwards scrolling.
We now correctly scroll backwards by reversing the sort order to apply the correct limit, and reversing the results again to restore the actual sort order.

Closes #4332
2023-03-20 08:54:50 +01:00
Christoph Strobl
d8c04f0ec9 Use projecting read callback to allow interface projections.
Along the way, fix entity operations proxy handling by reading the underlying map instead of inspecting the proxy interface.
Also make sure to map potential raw fields back to the corresponding property.

See: #4308
Original Pull Request: #4317
2023-03-17 13:07:44 +01:00
Mark Paluch
85826e1fe0 Document limitations around nullable properties.
See: #4308
Original Pull Request: #4317
2023-03-17 13:07:05 +01:00
Mark Paluch
14a722f2fd Add support for reverse scrolling.
Closes #4325
Related to: #4308
Original Pull Request: #4317
2023-03-17 13:06:27 +01:00
Christoph Strobl
9d0afc975a Prevent key extraction if a keyset value is null.
Follow the changes in data commons that renamed scroll to window.
Also error when a certain scroll position does not allow creating a query out of it because of null values.

See: #4308
Original Pull Request: #4317
2023-03-17 13:05:20 +01:00
Mark Paluch
eaa6393798 Add support for keyset extraction of nested property paths.
Closes #4326
Original Pull Request: #4317
2023-03-17 13:04:41 +01:00
Mark Paluch
7d485d732a Add support for scrolling using offset- and keyset-based strategies.
We now support scrolling through large query results using ScrollPosition and Window's of data.

See: #4308
Original Pull Request: #4317
2023-03-17 13:03:53 +01:00
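The scroll support described above can be sketched roughly as follows, assuming the `scroll` template method together with the `Window`/`ScrollPosition` types from Spring Data Commons (the `Person` type and sort fields are illustrative, not from the source):

```java
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Window;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class ScrollExample {

    void scrollAll(MongoOperations operations) {
        // Keyset scrolling needs a stable sort; include a tie-breaker such as id.
        Query query = new Query().with(Sort.by("lastname", "id")).limit(100);

        Window<Person> window = operations.scroll(query.with(ScrollPosition.keyset()), Person.class);
        while (!window.isEmpty()) {
            window.forEach(this::process);
            if (window.isLast()) {
                break;
            }
            // Continue from the position of the last element in the current window.
            window = operations.scroll(query.with(window.positionAt(window.size() - 1)), Person.class);
        }
    }

    void process(Person person) {}

    record Person(String id, String lastname) {}
}
```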
Mark Paluch
aff4e4fd02 Polishing.
Remove duplicate logging in imperative FindOneCallback.

See #4253
Original pull request: #4259
2023-03-17 09:36:55 +01:00
Raghav2211
18413586fb Remove duplicate log in reactive findOne operation.
Closes #4253
Original pull request: #4259
2023-03-17 09:36:53 +01:00
Mark Paluch
aeea743921 Polishing.
Simplify field creation considering simplified projection expressions.

See #3917
Original pull request: #4328
2023-03-16 09:53:50 +01:00
Christoph Strobl
e5bba39c62 Fix field resolution for ExposedFieldsAggregationContext.
This commit fixes an issue where the context is not relaxed and errors on unknown fields when multiple levels of nested contexts are involved.

Closes #3917
Original pull request: #4328
2023-03-16 09:53:27 +01:00
Christoph Strobl
c2f708a37a Fix property value conversion for $in clauses.
This commit fixes an issue where a property value converter is not applied if the query is using an $in clause that compares the value against a collection of potential candidates.

Closes #4080
Original pull request: #4324
2023-03-15 11:32:06 +01:00
Mark Paluch
7f50fe1cb7 Polishing.
Extract duplicates into peek method.

See #4312
Original pull request: #4323
2023-03-15 10:58:43 +01:00
Christoph Strobl
a2127a4da9 Allow reading already resolved references.
This commit adds the ability to read (eg. by an aggregation $lookup) already fully resolved references between documents.
No proxy will be created for lazy loading references and we'll also skip the additional server roundtrip to load the reference by its id.

Closes #4312
Original pull request: #4323
2023-03-15 10:58:37 +01:00
Mark Paluch
67215f1209 Polishing.
Remove caching variant of MongoClientEncryption. Rename types for consistent key alt name scheme. Rename annotation to ExplicitEncrypted.

Add package-info. Improve documentation wording. Reduce visibility of KeyId and KeyAltName to package-private.

Original pull request: #4302
See: #4284
2023-03-15 10:27:37 +01:00
Christoph Strobl
3b33f90e5c Add support for explicit field encryption.
We now support explicit field encryption using mapped entities through the `@ExplicitEncrypted` annotation.

```
class Person {

  ObjectId id;

  @ExplicitEncrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic", altKeyName = "my-secret-key")
  String socialSecurityNumber;
}
```

Encryption is applied transparently to all mapped entities leveraging the existing converter infrastructure.

Original pull request: #4302
Closes: #4284
2023-03-15 10:27:37 +01:00
Mark Paluch
3b7b1ace8b Polishing.
Introduce isEmpty method for HintFunction for easier invocation avoiding negations on the call site.

See #3218
Original pull request: #4311
2023-03-06 14:43:07 +01:00
Christoph Strobl
cd63501680 Support hints on Update.
This commit makes sure to read query hints and apply them to the MongoDB UpdateOptions when running an update via Reactive-/MongoTemplate.

Original pull request: #4311
Closes: #3218
2023-03-06 14:37:17 +01:00
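A hint carried on the `Query` applied to an update, along the lines of the commit above (a sketch; the index name and `Person` type are made up):

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import static org.springframework.data.mongodb.core.query.Criteria.where;

class HintedUpdateExample {

    void deactivate(MongoOperations operations) {
        // The hint set on the Query is forwarded to the driver's UpdateOptions.
        Query query = new Query(where("lastname").is("Strobl")).withHint("lastname-idx");
        operations.updateMulti(query, new Update().set("active", false), Person.class);
    }

    record Person(String id, String lastname, boolean active) {}
}
```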
Mark Paluch
0020499d4e Polishing.
Reformat code.

See #2750
Original pull request: #4316
2023-03-06 14:31:19 +01:00
Christoph Strobl
f4b85242d4 Polishing.
Favor Base64 over deprecated Base64Utils.

See: #2750
Original pull request: #4316.
2023-03-06 14:28:45 +01:00
Christoph Strobl
a416441427 Support $expr via criteria query.
This commit introduces AggregationExpressionCriteria to be used along with Query to run an $expr operator within the find query.

query(whereExpr(valueOf("spent").greaterThan("budget")))

Closes: #2750
Original pull request: #4316.
2023-03-06 14:28:19 +01:00
Christoph Strobl
e3ef84a56c Fix regression in findAndReplace when using native MongoDB types as domain value.
This commit fixes a regression that prevented a native org.bson.Document from serving as the source for a findAndReplace operation.

Closes: #4300
Original Pull Request: #4310
2023-03-02 09:55:42 +01:00
Mark Paluch
3ab78fc1ed Upgrade to Maven Wrapper 3.9.0.
See #4297
2023-02-20 11:58:01 +01:00
Christoph Strobl
fa0f026410 After release cleanups.
See #4294
2023-02-17 14:25:48 +01:00
Christoph Strobl
9c96a2b2c3 Prepare next development iteration.
See #4294
2023-02-17 14:25:46 +01:00
Christoph Strobl
0986210221 Release version 4.1 M2 (2023.0.0).
See #4294
2023-02-17 14:22:30 +01:00
Christoph Strobl
7d5372f049 Prepare 4.1 M2 (2023.0.0).
See #4294
2023-02-17 14:22:15 +01:00
Christoph Strobl
a5022e9bc4 After release cleanups.
See #4235
2023-02-17 13:31:54 +01:00
Christoph Strobl
aff8fbd62a Prepare next development iteration.
See #4235
2023-02-17 13:31:52 +01:00
Christoph Strobl
633fbceb5a Release version 4.1 M1 (2023.0.0).
See #4235
2023-02-17 13:27:49 +01:00
Christoph Strobl
fb9a0d8482 Prepare 4.1 M1 (2023.0.0).
See #4235
2023-02-17 13:27:08 +01:00
Christoph Strobl
d73807df1b Support ReadConcern and ReadPreference via NearQuery.
Implement ReadConcernAware and ReadPreferenceAware for NearQuery and make sure those get applied when working with the template API.

Original Pull Request: #4288
2023-02-16 14:28:11 +01:00
Mark Paluch
e56f6ce87f Polishing.
Documentation, refine parameter ordering.

Original Pull Request: #4288
2023-02-16 14:28:11 +01:00
Mark Paluch
c5c6fc107c Support ReadConcern & ReadPreference via the Query and Aggregation API.
Add support for setting the ReadConcern and ReadPreference via the Query and Aggregation API.

Closes: #4277, #4286
Original Pull Request: #4288
2023-02-16 14:28:10 +01:00
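Per the commit above, read settings can be set per query rather than on the client or template. A sketch (method names as described in the commit; `Person` is illustrative):

```java
import java.util.List;

import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

import static org.springframework.data.mongodb.core.query.Criteria.where;

class ReadSettingsExample {

    List<Person> findOnSecondary(MongoOperations operations) {
        // Per-query read settings instead of client-wide configuration.
        Query query = new Query(where("lastname").is("Paluch"))
                .withReadConcern(ReadConcern.MAJORITY)
                .withReadPreference(ReadPreference.secondaryPreferred());
        return operations.find(query, Person.class);
    }

    record Person(String id, String lastname) {}
}
```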
Christoph Strobl
368c644922 Guard tests for $lookup with let & pipeline
Add guard to skip tests prior to 5.0 server version.

Related to: #3322
2023-02-16 09:33:37 +01:00
Christoph Strobl
4d050f5021 Polishing.
Reuse Let from VariableOperators.
Limit API exposure and favor builders.
Update nullability constraints and assertions.
Update integration tests.
Add unit tests.

Original Pull Request: #4272
2023-02-16 08:33:07 +01:00
sangyongchoi
83923e0e2a Add support for 'let' and 'pipeline' in $lookup
This commit introduces let and pipeline to the Lookup aggregation stage.

Closes: #3322
Original Pull Request: #4272
2023-02-16 08:30:21 +01:00
Mark Paluch
25588850dd Disable flaky test.
See #4290
2023-02-14 11:25:06 +01:00
Mark Paluch
55c81f4f54 Adapt to Mockito 5.1 changes.
Closes #4290
2023-02-14 10:50:30 +01:00
Christoph Strobl
ac7551e47f Upgrade to MongoDB driver 4.9.0
Closes: #4289
2023-02-14 07:51:11 +01:00
Mark Paluch
6d3043de9a Update CI properties.
See #4235
2023-01-30 10:49:50 +01:00
Mark Paluch
1a94b6e4ee Upgrade to Maven Wrapper 3.8.7.
See #4281
2023-01-30 10:48:12 +01:00
352 changed files with 20249 additions and 2999 deletions


@@ -1,2 +1,2 @@
-#Fri Jun 03 09:32:40 CEST 2022
-distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.5/apache-maven-3.8.5-bin.zip
+#Mon Aug 14 08:53:22 EDT 2023
+distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.4/apache-maven-3.9.4-bin.zip


@@ -16,7 +16,7 @@ All of these use cases are great reasons to essentially run what the CI server d
IMPORTANT: To do this you must have Docker installed on your machine.
-1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk17-with-mongodb-5.0.3:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+

Jenkinsfile (vendored): 47 lines changed

@@ -77,10 +77,29 @@ pipeline {
}
}
}
stage('Publish JDK (Java 20) + MongoDB 6.0') {
when {
anyOf {
changeset "ci/openjdk20-mongodb-6.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-6.0:${p['java.next.tag']}", "--build-arg BASE=${p['docker.java.next.image']} --build-arg MONGODB=${p['docker.mongodb.6.0.version']} ci/openjdk20-mongodb-6.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
}
}
-stage("test: baseline (Java 17)") {
+stage("test: baseline (main)") {
when {
beforeAgent(true)
anyOf {
@@ -119,7 +138,7 @@ pipeline {
}
parallel {
-stage("test: MongoDB 5.0 (Java 17)") {
+stage("test: MongoDB 5.0 (main)") {
agent {
label 'data'
}
@@ -141,7 +160,7 @@ pipeline {
}
}
-stage("test: MongoDB 6.0 (Java 17)") {
+stage("test: MongoDB 6.0 (main)") {
agent {
label 'data'
}
@@ -162,6 +181,28 @@ pipeline {
}
}
}
stage("test: MongoDB 6.0 (next)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-6.0:${p['java.next.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongosh --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
}
}


@@ -10,7 +10,7 @@ All of these use cases are great reasons to essentially run what Concourse does
IMPORTANT: To do this you must have Docker installed on your machine.
-1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-8-jdk-with-mongodb /bin/bash`
+1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk17-with-mongodb-5.0.3 /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
@@ -23,7 +23,7 @@ Since the container is binding to your source, you can make edits from your IDE
If you need to test the `build.sh` script, do this:
1. `mkdir /tmp/spring-data-mongodb-artifactory`
-2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-8-jdk-with-mongodb /bin/bash`
+2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-openjdk17-with-mongodb-5.0.3 /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github` and the temporary
artifactory output directory at `spring-data-mongodb-artifactory`.
@@ -36,4 +36,4 @@ IMPORTANT: `build.sh` doesn't actually push to Artifactory so don't worry about
It just deploys to a local folder. That way, the `artifactory-resource` later in the pipeline can pick up these artifacts
and deliver them to artifactory.
-NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.
+NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.


@@ -0,0 +1,24 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 6.0 release signing key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | apt-key add - && \
# Needed when MongoDB creates a 6.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu jammy/mongodb-org/6.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-6.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,19 +1,21 @@
# Java versions
java.main.tag=17.0.5_8-jdk-focal
java.main.tag=17.0.8_7-jdk-focal
java.next.tag=20-jdk-jammy
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.next.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.next.tag}
# Supported versions of MongoDB
docker.mongodb.4.4.version=4.4.17
docker.mongodb.5.0.version=5.0.13
docker.mongodb.6.0.version=6.0.2
docker.mongodb.4.4.version=4.4.23
docker.mongodb.5.0.version=5.0.19
docker.mongodb.6.0.version=6.0.8
# Supported versions of Redis
docker.redis.6.version=6.2.6
docker.redis.6.version=6.2.13
# Supported versions of Cassandra
docker.cassandra.3.version=3.11.14
docker.cassandra.3.version=3.11.15
# Docker environment settings
docker.java.inside.basic=-v $HOME:/tmp/jenkins-home

34
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-SNAPSHOT</version>
<version>4.2.x-4462-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>3.1.0-SNAPSHOT</version>
<version>3.2.0-SNAPSHOT</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>3.1.0-SNAPSHOT</springdata.commons>
<mongo>4.8.2</mongo>
<springdata.commons>3.2.0-SNAPSHOT</springdata.commons>
<mongo>4.10.2</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -145,33 +145,19 @@
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
<id>spring-snapshot</id>
<url>https://repo.spring.io/snapshot</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>spring-milestone</id>
<url>https://repo.spring.io/milestone</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</pluginRepository>
</pluginRepositories>
</project>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-SNAPSHOT</version>
<version>4.2.x-4462-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -96,15 +96,14 @@ class MongoResultsWriter implements ResultsWriter {
for (Object key : doc.keySet()) {
Object value = doc.get(key);
if (value instanceof Document) {
value = fixDocumentKeys((Document) value);
} else if (value instanceof BasicDBObject) {
value = fixDocumentKeys(new Document((BasicDBObject) value));
if (value instanceof Document document) {
value = fixDocumentKeys(document);
} else if (value instanceof BasicDBObject basicDBObject) {
value = fixDocumentKeys(new Document(basicDBObject));
}
if (key instanceof String) {
if (key instanceof String newKey) {
String newKey = (String) key;
if (newKey.contains(".")) {
newKey = newKey.replace('.', ',');
}

View File

@@ -15,13 +15,18 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-SNAPSHOT</version>
<version>4.2.x-4462-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
<!-- Observability -->
<micrometer-docs-generator.inputPath>${maven.multiModuleProjectDirectory}/spring-data-mongodb/</micrometer-docs-generator.inputPath>
<micrometer-docs-generator.inclusionPattern>.*</micrometer-docs-generator.inclusionPattern>
<micrometer-docs-generator.outputPath>${maven.multiModuleProjectDirectory}/target/</micrometer-docs-generator.outputPath>
</properties>
<build>
@@ -30,6 +35,36 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<execution>
<id>generate-docs</id>
<phase>generate-resources</phase>
<goals>
<goal>java</goal>
</goals>
<configuration>
<mainClass>io.micrometer.docs.DocsGeneratorCommand</mainClass>
<includePluginDependencies>true</includePluginDependencies>
<arguments>
<argument>${micrometer-docs-generator.inputPath}</argument>
<argument>${micrometer-docs-generator.inclusionPattern}</argument>
<argument>${micrometer-docs-generator.outputPath}</argument>
</arguments>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-docs-generator</artifactId>
<version>1.0.1</version>
<type>jar</type>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
@@ -45,15 +80,4 @@
</build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-plugins-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</pluginRepository>
</pluginRepositories>
</project>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-SNAPSHOT</version>
<version>4.2.x-4462-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -112,6 +112,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-crypt</artifactId>
<version>1.8.0</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>

View File

@@ -16,12 +16,13 @@
package org.springframework.data.mongodb;
import org.springframework.dao.UncategorizedDataAccessException;
import org.springframework.lang.Nullable;
public class UncategorizedMongoDbException extends UncategorizedDataAccessException {
private static final long serialVersionUID = -2336595514062364929L;
public UncategorizedMongoDbException(String msg, Throwable cause) {
public UncategorizedMongoDbException(String msg, @Nullable Throwable cause) {
super(msg, cause);
}
}

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.aot;
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
@@ -25,7 +26,6 @@ import java.util.Set;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.TypeReference;
import org.springframework.core.ResolvableType;
import org.springframework.core.annotation.AnnotatedElementUtils;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.data.annotation.Reference;
@@ -33,7 +33,6 @@ import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.DocumentReference;
import org.springframework.data.util.TypeUtils;
/**
* @author Christoph Strobl
@@ -66,9 +65,7 @@ public class LazyLoadingProxyAotProcessor {
if (field.getType().isInterface()) {
List<Class<?>> interfaces = new ArrayList<>(
TypeUtils.resolveTypesInSignature(ResolvableType.forField(field, type)));
interfaces.add(0, org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class);
Arrays.asList(LazyLoadingProxyFactory.prepareFactory(field.getType()).getProxiedInterfaces()));
interfaces.add(org.springframework.aop.SpringProxy.class);
interfaces.add(org.springframework.aop.framework.Advised.class);
interfaces.add(org.springframework.core.DecoratingProxy.class);
@@ -77,7 +74,7 @@ public class LazyLoadingProxyAotProcessor {
} else {
Class<?> proxyClass = LazyLoadingProxyFactory.resolveProxyType(field.getType(),
() -> LazyLoadingInterceptor.none());
LazyLoadingInterceptor::none);
// see: spring-projects/spring-framework/issues/29309
generationContext.getRuntimeHints().reflection().registerType(proxyClass,

View File

@@ -206,7 +206,7 @@ public abstract class MongoConfigurationSupport {
* {@link org.springframework.data.mongodb.core.index.IndexDefinition} from the entity or not.
*
* @return {@literal false} by default. <br />
* <strong>INFO</strong>: As of 3.x the default is set to {@literal false}; In 2.x it was {@literal true}.
* <strong>INFO:</strong> As of 3.x the default is set to {@literal false}; In 2.x it was {@literal true}.
* @since 2.2
*/
protected boolean autoIndexCreation() {

View File

@@ -19,6 +19,7 @@ import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.Pair;
import com.mongodb.bulk.BulkWriteResult;
@@ -28,6 +29,15 @@ import com.mongodb.bulk.BulkWriteResult;
make use of low-level bulk commands on the protocol level. This interface defines a fluent API to add multiple single
operations or lists of similar operations in sequence, which can then eventually be executed by calling
{@link #execute()}.
*
* <pre class="code">
* MongoOperations ops = …;
*
* ops.bulkOps(BulkMode.UNORDERED, Person.class)
* .insert(newPerson)
* .updateOne(where("firstname").is("Joe"), Update.update("lastname", "Doe"))
* .execute();
* </pre>
* <p>
* Bulk operations are issued as one batch that pulls together all insert, update, and delete operations. Operations
* that require individual operation results such as optimistic locking (using {@code @Version}) are not supported and
@@ -75,7 +85,19 @@ public interface BulkOperations {
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
default BulkOperations updateOne(Query query, Update update) {
return updateOne(query, (UpdateDefinition) update);
}
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
* @since 4.1
*/
BulkOperations updateOne(Query query, UpdateDefinition update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
@@ -83,7 +105,7 @@ public interface BulkOperations {
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
BulkOperations updateOne(List<Pair<Query, UpdateDefinition>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
@@ -92,7 +114,19 @@ public interface BulkOperations {
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
default BulkOperations updateMulti(Query query, Update update) {
return updateMulti(query, (UpdateDefinition) update);
}
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
* @since 4.1
*/
BulkOperations updateMulti(Query query, UpdateDefinition update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
@@ -100,7 +134,7 @@ public interface BulkOperations {
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
BulkOperations updateMulti(List<Pair<Query, UpdateDefinition>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
@@ -110,7 +144,20 @@ public interface BulkOperations {
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
default BulkOperations upsert(Query query, Update update) {
return upsert(query, (UpdateDefinition) update);
}
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
* @since 4.1
*/
BulkOperations upsert(Query query, UpdateDefinition update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
@@ -142,7 +189,7 @@ public interface BulkOperations {
*
* @param query Update criteria.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the replace added, will never be {@literal null}.
* @return the current {@link BulkOperations} instance with the replacement added, will never be {@literal null}.
* @since 2.2
*/
default BulkOperations replaceOne(Query query, Object replacement) {
@@ -155,7 +202,7 @@ public interface BulkOperations {
* @param query Update criteria.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the replace added, will never be {@literal null}.
* @return the current {@link BulkOperations} instance with the replacement added, will never be {@literal null}.
* @since 2.2
*/
BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options);
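The new `UpdateDefinition`-typed methods above keep the original `Update`-typed signatures as `default` methods that simply delegate, so existing callers compile unchanged while implementations only need the more general overload. A minimal sketch of this widening-delegation pattern, using hypothetical stand-in types (`Def`, `Update`, `Ops`) rather than the real Spring Data API:

```java
// Stand-ins for UpdateDefinition and Update.
interface Def {}

class Update implements Def {}

interface Ops {

	// New, more general entry point implementations must provide.
	String updateOne(String query, Def update);

	// Old signature kept source-compatible via delegation; the cast to Def
	// steers overload resolution to the abstract method above.
	default String updateOne(String query, Update update) {
		return updateOne(query, (Def) update);
	}
}

public class DelegationSketch implements Ops {

	@Override
	public String updateOne(String query, Def update) {
		return "updateOne(" + query + ")";
	}

	public static void main(String[] args) {
		Ops ops = new DelegationSketch();
		// Both the old and the new overload end up in the single Def-based implementation.
		if (!ops.updateOne("q", new Update()).equals("updateOne(q)")) {
			throw new AssertionError();
		}
		System.out.println("ok");
	}
}
```

The upside of this shape is that callers holding a plain `Update` never notice the interface grew a more general method.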

View File

@@ -0,0 +1,221 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEvent;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationUpdate;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.util.Assert;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.DeleteOneModel;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.ReplaceOneModel;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;
/**
* Support class for bulk operations.
*
* @author Mark Paluch
* @since 4.1
*/
abstract class BulkOperationsSupport {
private final String collectionName;
BulkOperationsSupport(String collectionName) {
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
this.collectionName = collectionName;
}
/**
* Emit a {@link BeforeSaveEvent}.
*
* @param holder
*/
void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder holder) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(holder.source(), target, collectionName));
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
maybeEmitEvent(new BeforeSaveEvent<>(holder.source(), target, collectionName));
}
}
/**
Emit an {@link AfterSaveEvent}.
*
* @param holder
*/
void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder holder) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
maybeEmitEvent(new AfterSaveEvent<>(holder.source(), target, collectionName));
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
maybeEmitEvent(new AfterSaveEvent<>(holder.source(), target, collectionName));
}
}
WriteModel<Document> mapWriteModel(Object source, WriteModel<Document> writeModel) {
if (writeModel instanceof UpdateOneModel<Document> model) {
if (source instanceof AggregationUpdate aggregationUpdate) {
List<Document> pipeline = mapUpdatePipeline(aggregationUpdate);
return new UpdateOneModel<>(getMappedQuery(model.getFilter()), pipeline, model.getOptions());
}
return new UpdateOneModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
model.getOptions());
}
if (writeModel instanceof UpdateManyModel<Document> model) {
if (source instanceof AggregationUpdate aggregationUpdate) {
List<Document> pipeline = mapUpdatePipeline(aggregationUpdate);
return new UpdateManyModel<>(getMappedQuery(model.getFilter()), pipeline, model.getOptions());
}
return new UpdateManyModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
model.getOptions());
}
if (writeModel instanceof DeleteOneModel<Document> model) {
return new DeleteOneModel<>(getMappedQuery(model.getFilter()), model.getOptions());
}
if (writeModel instanceof DeleteManyModel<Document> model) {
return new DeleteManyModel<>(getMappedQuery(model.getFilter()), model.getOptions());
}
return writeModel;
}
private List<Document> mapUpdatePipeline(AggregationUpdate source) {
Class<?> type = entity().isPresent() ? entity().map(PersistentEntity::getType).get() : Object.class;
AggregationOperationContext context = new RelaxedTypeBasedAggregationOperationContext(type,
updateMapper().getMappingContext(), queryMapper());
return new AggregationUtil(queryMapper(), queryMapper().getMappingContext()).createPipeline(source, context);
}
/**
Emit an {@link ApplicationEvent} if event multicasting is enabled.
*
* @param event
*/
protected abstract void maybeEmitEvent(ApplicationEvent event);
/**
* @return the {@link UpdateMapper} to use.
*/
protected abstract UpdateMapper updateMapper();
/**
* @return the {@link QueryMapper} to use.
*/
protected abstract QueryMapper queryMapper();
/**
* @return the associated {@link PersistentEntity}. Can be {@link Optional#empty()}.
*/
protected abstract Optional<? extends MongoPersistentEntity<?>> entity();
protected Bson getMappedUpdate(Bson update) {
return updateMapper().getMappedObject(update, entity());
}
protected Bson getMappedQuery(Bson query) {
return queryMapper().getMappedObject(query, entity());
}
protected static BulkWriteOptions getBulkWriteOptions(BulkMode bulkMode) {
BulkWriteOptions options = new BulkWriteOptions();
return switch (bulkMode) {
case ORDERED -> options.ordered(true);
case UNORDERED -> options.ordered(false);
};
}
/**
* @param filterQuery The {@link Query} to read a potential {@link Collation} from. Must not be {@literal null}.
* @param update The {@link Update} to apply
* @param upsert flag to indicate if document should be upserted.
* @return new instance of {@link UpdateOptions}.
*/
protected static UpdateOptions computeUpdateOptions(Query filterQuery, UpdateDefinition update, boolean upsert) {
UpdateOptions options = new UpdateOptions();
options.upsert(upsert);
if (update.hasArrayFilters()) {
List<Document> list = new ArrayList<>(update.getArrayFilters().size());
for (ArrayFilter arrayFilter : update.getArrayFilters()) {
list.add(arrayFilter.asDocument());
}
options.arrayFilters(list);
}
filterQuery.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
return options;
}
/**
* Value object chaining together an actual source with its {@link WriteModel} representation.
*
* @author Christoph Strobl
*/
record SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
}
}
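`getBulkWriteOptions` above maps `BulkMode` onto driver options with an exhaustive `switch` expression, so adding a new mode constant becomes a compile error until it is handled. A small runnable sketch of the same pattern, using a stand-in `Options` record instead of the Mongo driver's `BulkWriteOptions`:

```java
public class BulkModeSketch {

	enum BulkMode { ORDERED, UNORDERED }

	// Hypothetical stand-in for com.mongodb.client.model.BulkWriteOptions.
	record Options(boolean ordered) {}

	static Options optionsFor(BulkMode mode) {
		// Exhaustive switch expression: no default branch needed, and a new
		// BulkMode constant fails compilation here until it gets a case.
		return switch (mode) {
			case ORDERED -> new Options(true);
			case UNORDERED -> new Options(false);
		};
	}

	public static void main(String[] args) {
		if (!optionsFor(BulkMode.ORDERED).ordered()) throw new AssertionError();
		if (optionsFor(BulkMode.UNORDERED).ordered()) throw new AssertionError();
		System.out.println("ok");
	}
}
```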

View File

@@ -150,12 +150,12 @@ public class ChangeStreamOptions {
return timestamp;
}
if (timestamp instanceof Instant) {
return new BsonTimestamp((int) ((Instant) timestamp).getEpochSecond(), 0);
if (timestamp instanceof Instant instant) {
return new BsonTimestamp((int) instant.getEpochSecond(), 0);
}
if (timestamp instanceof BsonTimestamp) {
return Instant.ofEpochSecond(((BsonTimestamp) timestamp).getTime());
if (timestamp instanceof BsonTimestamp bsonTimestamp) {
return Instant.ofEpochSecond(bsonTimestamp.getTime());
}
throw new IllegalArgumentException(

View File
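The pattern-matching rewrite above converts between `Instant` and `BsonTimestamp` via epoch seconds, dropping sub-second precision in the `Instant`-to-timestamp direction. A driver-free sketch of the same two-way conversion, with a hypothetical `EpochTimestamp` record standing in for `BsonTimestamp`:

```java
import java.time.Instant;

public class TimestampSketch {

	// Stand-in for org.bson.BsonTimestamp: whole seconds since the epoch.
	record EpochTimestamp(int time) {}

	static EpochTimestamp toTimestamp(Instant instant) {
		// Sub-second precision is dropped, as in the original conversion.
		return new EpochTimestamp((int) instant.getEpochSecond());
	}

	static Instant toInstant(EpochTimestamp timestamp) {
		return Instant.ofEpochSecond(timestamp.time());
	}

	public static void main(String[] args) {
		Instant instant = Instant.ofEpochSecond(1_700_000_000L, 123_000_000L);
		EpochTimestamp ts = toTimestamp(instant);
		if (ts.time() != 1_700_000_000) throw new AssertionError();
		// The round trip loses only the nanosecond part.
		if (!toInstant(ts).equals(Instant.ofEpochSecond(1_700_000_000L))) throw new AssertionError();
		System.out.println("ok");
	}
}
```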

@@ -114,7 +114,7 @@ public class CollectionOptions {
/**
* Create new {@link CollectionOptions} with already given settings and capped set to {@literal true}. <br />
* <strong>NOTE</strong> Using capped collections requires defining {@link #size(long)}.
* <strong>NOTE:</strong> Using capped collections requires defining {@link #size(long)}.
*
* @return new {@link CollectionOptions}.
* @since 2.0

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.client.MongoCollection;
/**
* Interface for functional preparation of a {@link MongoCollection}.
*
* @author Mark Paluch
* @since 4.1
*/
public interface CollectionPreparer<T> {
/**
* Returns a preparer that always returns its input collection.
*
* @return a preparer that always returns its input collection.
*/
static <T> CollectionPreparer<T> identity() {
return it -> it;
}
/**
* Prepare the {@code collection}.
*
* @param collection the collection to prepare.
* @return the prepared collection.
*/
T prepare(T collection);
/**
* Returns a composed {@code CollectionPreparer} that first applies this preparer to the collection, and then applies
* the {@code after} preparer to the result. If evaluation of either function throws an exception, it is relayed to
* the caller of the composed function.
*
* @param after the collection preparer to apply after this function is applied.
* @return a composed {@code CollectionPreparer} that first applies this preparer and then applies the {@code after}
* preparer.
*/
default CollectionPreparer<T> andThen(CollectionPreparer<T> after) {
Assert.notNull(after, "After CollectionPreparer must not be null");
return c -> after.prepare(prepare(c));
}
}
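`andThen` composes preparers the same way `java.util.function.Function#andThen` composes functions: this preparer runs first, then `after` runs on the result. A runnable sketch of those composition semantics, using `String` in place of a `MongoCollection` (the interface body is copied from the contract above):

```java
public class PreparerSketch {

	// Minimal copy of the CollectionPreparer contract shown above.
	interface CollectionPreparer<T> {

		static <T> CollectionPreparer<T> identity() {
			return it -> it;
		}

		T prepare(T collection);

		default CollectionPreparer<T> andThen(CollectionPreparer<T> after) {
			return c -> after.prepare(prepare(c));
		}
	}

	public static void main(String[] args) {
		CollectionPreparer<String> readConcern = c -> c + "+concern";
		CollectionPreparer<String> readPreference = c -> c + "+preference";

		// identity() leaves its input untouched.
		if (!CollectionPreparer.<String> identity().prepare("coll").equals("coll")) {
			throw new AssertionError();
		}
		// andThen applies this preparer first, then the 'after' preparer.
		String prepared = readConcern.andThen(readPreference).prepare("coll");
		if (!prepared.equals("coll+concern+preference")) throw new AssertionError();
		System.out.println("ok");
	}
}
```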

View File

@@ -0,0 +1,182 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import java.util.function.BiFunction;
import java.util.function.Function;
import org.bson.Document;
import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
import com.mongodb.client.MongoCollection;
/**
* Support class for delegate implementations to apply {@link ReadConcern} and {@link ReadPreference} settings upon
* {@link CollectionPreparer preparing a collection}.
*
* @author Mark Paluch
* @since 4.1
*/
class CollectionPreparerSupport implements ReadConcernAware, ReadPreferenceAware {
private final List<Object> sources;
private CollectionPreparerSupport(List<Object> sources) {
this.sources = sources;
}
<T> T doPrepare(T collection, Function<T, ReadConcern> concernAccessor, BiFunction<T, ReadConcern, T> concernFunction,
Function<T, ReadPreference> preferenceAccessor, BiFunction<T, ReadPreference, T> preferenceFunction) {
T collectionToUse = collection;
for (Object source : sources) {
if (source instanceof ReadConcernAware rca && rca.hasReadConcern()) {
ReadConcern concern = rca.getReadConcern();
if (concernAccessor.apply(collectionToUse) != concern) {
collectionToUse = concernFunction.apply(collectionToUse, concern);
}
break;
}
}
for (Object source : sources) {
if (source instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
ReadPreference preference = rpa.getReadPreference();
if (preferenceAccessor.apply(collectionToUse) != preference) {
collectionToUse = preferenceFunction.apply(collectionToUse, preference);
}
break;
}
}
return collectionToUse;
}
@Override
public boolean hasReadConcern() {
for (Object aware : sources) {
if (aware instanceof ReadConcernAware rca && rca.hasReadConcern()) {
return true;
}
}
return false;
}
@Override
public ReadConcern getReadConcern() {
for (Object aware : sources) {
if (aware instanceof ReadConcernAware rca && rca.hasReadConcern()) {
return rca.getReadConcern();
}
}
return null;
}
@Override
public boolean hasReadPreference() {
for (Object aware : sources) {
if (aware instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
return true;
}
}
return false;
}
@Override
public ReadPreference getReadPreference() {
for (Object aware : sources) {
if (aware instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
return rpa.getReadPreference();
}
}
return null;
}
static class CollectionPreparerDelegate extends CollectionPreparerSupport
implements CollectionPreparer<MongoCollection<Document>> {
private CollectionPreparerDelegate(List<Object> sources) {
super(sources);
}
public static CollectionPreparerDelegate of(ReadPreferenceAware... awares) {
return of((Object[]) awares);
}
public static CollectionPreparerDelegate of(Object... mixedAwares) {
if (mixedAwares.length == 1 && mixedAwares[0] instanceof CollectionPreparerDelegate) {
return (CollectionPreparerDelegate) mixedAwares[0];
}
return new CollectionPreparerDelegate(Arrays.asList(mixedAwares));
}
@Override
public MongoCollection<Document> prepare(MongoCollection<Document> collection) {
return doPrepare(collection, MongoCollection::getReadConcern, MongoCollection::withReadConcern,
MongoCollection::getReadPreference, MongoCollection::withReadPreference);
}
}
static class ReactiveCollectionPreparerDelegate extends CollectionPreparerSupport
implements CollectionPreparer<com.mongodb.reactivestreams.client.MongoCollection<Document>> {
private ReactiveCollectionPreparerDelegate(List<Object> sources) {
super(sources);
}
public static ReactiveCollectionPreparerDelegate of(ReadPreferenceAware... awares) {
return of((Object[]) awares);
}
public static ReactiveCollectionPreparerDelegate of(Object... mixedAwares) {
if (mixedAwares.length == 1 && mixedAwares[0] instanceof CollectionPreparerDelegate) {
return (ReactiveCollectionPreparerDelegate) mixedAwares[0];
}
return new ReactiveCollectionPreparerDelegate(Arrays.asList(mixedAwares));
}
@Override
public com.mongodb.reactivestreams.client.MongoCollection<Document> prepare(
com.mongodb.reactivestreams.client.MongoCollection<Document> collection) {
return doPrepare(collection, //
com.mongodb.reactivestreams.client.MongoCollection::getReadConcern,
com.mongodb.reactivestreams.client.MongoCollection::withReadConcern,
com.mongodb.reactivestreams.client.MongoCollection::getReadPreference,
com.mongodb.reactivestreams.client.MongoCollection::withReadPreference);
}
}
}
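`doPrepare` above walks the configured sources in order and the first source that carries a setting wins (note the `break` after the first hit); later sources cannot override it. A sketch of that first-match resolution using a hypothetical `Source` record in place of `ReadConcernAware`:

```java
import java.util.List;

public class FirstMatchSketch {

	// Stand-in for ReadConcernAware: a source may or may not carry a value.
	record Source(String readConcern) {

		boolean hasReadConcern() {
			return readConcern != null;
		}
	}

	// The first source carrying a value wins, mirroring doPrepare(...) above.
	static String resolve(List<Source> sources) {
		for (Source source : sources) {
			if (source.hasReadConcern()) {
				return source.readConcern();
			}
		}
		return null;
	}

	public static void main(String[] args) {
		List<Source> sources = List.of(new Source(null), new Source("majority"), new Source("local"));
		// "local" is never consulted because "majority" appears first.
		if (!"majority".equals(resolve(sources))) throw new AssertionError();
		System.out.println("ok");
	}
}
```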

View File

@@ -64,18 +64,15 @@ class CountQuery {
for (Map.Entry<String, Object> entry : source.entrySet()) {
if (entry.getValue() instanceof Document && requiresRewrite(entry.getValue())) {
if (entry.getValue() instanceof Document document && requiresRewrite(entry.getValue())) {
Document theValue = (Document) entry.getValue();
target.putAll(createGeoWithin(entry.getKey(), theValue, source.get("$and")));
target.putAll(createGeoWithin(entry.getKey(), document, source.get("$and")));
continue;
}
-if (entry.getValue() instanceof Collection && requiresRewrite(entry.getValue())) {
-Collection<?> source = (Collection<?>) entry.getValue();
-target.put(entry.getKey(), rewriteCollection(source));
+if (entry.getValue() instanceof Collection<?> collection && requiresRewrite(entry.getValue())) {
+target.put(entry.getKey(), rewriteCollection(collection));
continue;
}
@@ -96,12 +93,12 @@ class CountQuery {
*/
private boolean requiresRewrite(Object valueToInspect) {
-if (valueToInspect instanceof Document) {
-return requiresRewrite((Document) valueToInspect);
+if (valueToInspect instanceof Document document) {
+return requiresRewrite(document);
}
-if (valueToInspect instanceof Collection) {
-return requiresRewrite((Collection<?>) valueToInspect);
+if (valueToInspect instanceof Collection<?> collection) {
+return requiresRewrite(collection);
}
return false;
@@ -110,7 +107,7 @@ class CountQuery {
private boolean requiresRewrite(Collection<?> collection) {
for (Object o : collection) {
-if (o instanceof Document && requiresRewrite((Document) o)) {
+if (o instanceof Document document && requiresRewrite(document)) {
return true;
}
}
@@ -139,8 +136,8 @@ class CountQuery {
Collection<Object> rewrittenCollection = new ArrayList<>(source.size());
for (Object item : source) {
-if (item instanceof Document && requiresRewrite(item)) {
-rewrittenCollection.add(CountQuery.of((Document) item).toQueryDocument());
+if (item instanceof Document document && requiresRewrite(item)) {
+rewrittenCollection.add(CountQuery.of(document).toQueryDocument());
} else {
rewrittenCollection.add(item);
}
@@ -242,8 +239,8 @@ class CountQuery {
return value;
}
-if (value instanceof Point) {
-return Arrays.asList(((Point) value).getX(), ((Point) value).getY());
+if (value instanceof Point point) {
+return Arrays.asList(point.getX(), point.getY());
}
if (value instanceof Document document) {

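The `CountQuery` changes above replace explicit `instanceof` checks followed by casts with Java 16+ pattern matching for `instanceof`, which binds a pattern variable in the condition itself. A minimal, self-contained sketch of the idiom (the types and method names here are hypothetical, not the `CountQuery` internals):

```java
import java.util.List;

public class PatternMatchDemo {

	// Old style: separate instanceof check and cast.
	public static String describeOld(Object value) {
		if (value instanceof List) {
			List<?> list = (List<?>) value;
			return "list of size " + list.size();
		}
		return "other";
	}

	// New style: the pattern variable 'list' is bound in one step and
	// is only in scope where the test has succeeded.
	public static String describeNew(Object value) {
		if (value instanceof List<?> list) {
			return "list of size " + list.size();
		}
		return "other";
	}

	public static void main(String[] args) {
		System.out.println(describeNew(List.of(1, 2, 3))); // list of size 3
		System.out.println(describeNew("x"));              // other
	}
}
```

Beyond brevity, the pattern variable removes the chance of the check and the cast drifting apart, which is exactly the class of bug the cast-heavy originals invited.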

@@ -16,42 +16,47 @@
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mapping.callback.EntityCallback;
import org.springframework.data.mapping.callback.EntityCallbacks;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
-import com.mongodb.client.model.*;
+import com.mongodb.client.model.BulkWriteOptions;
+import com.mongodb.client.model.DeleteManyModel;
+import com.mongodb.client.model.DeleteOptions;
+import com.mongodb.client.model.InsertOneModel;
+import com.mongodb.client.model.ReplaceOneModel;
+import com.mongodb.client.model.ReplaceOptions;
+import com.mongodb.client.model.UpdateManyModel;
+import com.mongodb.client.model.UpdateOneModel;
+import com.mongodb.client.model.UpdateOptions;
+import com.mongodb.client.model.WriteModel;
/**
* Default implementation for {@link BulkOperations}.
@@ -67,7 +72,7 @@ import com.mongodb.client.model.*;
* @author Jacob Botuck
* @since 1.9
*/
-class DefaultBulkOperations implements BulkOperations {
+class DefaultBulkOperations extends BulkOperationsSupport implements BulkOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -75,7 +80,6 @@ class DefaultBulkOperations implements BulkOperations {
private final List<SourceAwareWriteModelHolder> models = new ArrayList<>();
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
/**
@@ -90,6 +94,7 @@ class DefaultBulkOperations implements BulkOperations {
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
+super(collectionName);
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null");
@@ -97,7 +102,7 @@ class DefaultBulkOperations implements BulkOperations {
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
-this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
+this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
/**
@@ -132,21 +137,20 @@ class DefaultBulkOperations implements BulkOperations {
}
@Override
@SuppressWarnings("unchecked")
-public BulkOperations updateOne(Query query, Update update) {
+public BulkOperations updateOne(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
-return updateOne(Collections.singletonList(Pair.of(query, update)));
+return update(query, update, false, false);
}
@Override
-public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
+public BulkOperations updateOne(List<Pair<Query, UpdateDefinition>> updates) {
Assert.notNull(updates, "Updates must not be null");
-for (Pair<Query, Update> update : updates) {
+for (Pair<Query, UpdateDefinition> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
@@ -154,21 +158,22 @@ class DefaultBulkOperations implements BulkOperations {
}
@Override
@SuppressWarnings("unchecked")
-public BulkOperations updateMulti(Query query, Update update) {
+public BulkOperations updateMulti(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
-return updateMulti(Collections.singletonList(Pair.of(query, update)));
+update(query, update, false, true);
+return this;
}
@Override
-public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
+public BulkOperations updateMulti(List<Pair<Query, UpdateDefinition>> updates) {
Assert.notNull(updates, "Updates must not be null");
-for (Pair<Query, Update> update : updates) {
+for (Pair<Query, UpdateDefinition> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
@@ -176,7 +181,7 @@ class DefaultBulkOperations implements BulkOperations {
}
@Override
-public BulkOperations upsert(Query query, Update update) {
+public BulkOperations upsert(Query query, UpdateDefinition update) {
return update(query, update, true, true);
}
@@ -248,7 +253,7 @@ class DefaultBulkOperations implements BulkOperations {
return result;
} finally {
-this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
+this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
}
@@ -267,9 +272,8 @@ class DefaultBulkOperations implements BulkOperations {
bulkOptions);
} catch (RuntimeException ex) {
-if (ex instanceof MongoBulkWriteException) {
-MongoBulkWriteException mongoBulkWriteException = (MongoBulkWriteException) ex;
+if (ex instanceof MongoBulkWriteException mongoBulkWriteException) {
if (mongoBulkWriteException.getWriteConcernError() != null) {
throw new DataIntegrityViolationException(ex.getMessage(), ex);
}
@@ -284,17 +288,17 @@ class DefaultBulkOperations implements BulkOperations {
maybeEmitBeforeSaveEvent(it);
-if (it.getModel() instanceof InsertOneModel) {
-Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
-maybeInvokeBeforeSaveCallback(it.getSource(), target);
-} else if (it.getModel() instanceof ReplaceOneModel) {
-Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
-maybeInvokeBeforeSaveCallback(it.getSource(), target);
+if (it.model() instanceof InsertOneModel<Document> model) {
+Document target = model.getDocument();
+maybeInvokeBeforeSaveCallback(it.source(), target);
+} else if (it.model() instanceof ReplaceOneModel<Document> model) {
+Document target = model.getReplacement();
+maybeInvokeBeforeSaveCallback(it.source(), target);
}
-return mapWriteModel(it.getModel());
+return mapWriteModel(it.source(), it.model());
}
/**
@@ -306,7 +310,7 @@ class DefaultBulkOperations implements BulkOperations {
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
-private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
+private BulkOperations update(Query query, UpdateDefinition update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
@@ -322,53 +326,30 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
-private WriteModel<Document> mapWriteModel(WriteModel<Document> writeModel) {
-if (writeModel instanceof UpdateOneModel) {
-UpdateOneModel<Document> model = (UpdateOneModel<Document>) writeModel;
-return new UpdateOneModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
-model.getOptions());
-}
-if (writeModel instanceof UpdateManyModel) {
-UpdateManyModel<Document> model = (UpdateManyModel<Document>) writeModel;
-return new UpdateManyModel<>(getMappedQuery(model.getFilter()), getMappedUpdate(model.getUpdate()),
-model.getOptions());
-}
-if (writeModel instanceof DeleteOneModel) {
-DeleteOneModel<Document> model = (DeleteOneModel<Document>) writeModel;
-return new DeleteOneModel<>(getMappedQuery(model.getFilter()), model.getOptions());
-}
-if (writeModel instanceof DeleteManyModel) {
-DeleteManyModel<Document> model = (DeleteManyModel<Document>) writeModel;
-return new DeleteManyModel<>(getMappedQuery(model.getFilter()), model.getOptions());
-}
-return writeModel;
-}
-private Bson getMappedUpdate(Bson update) {
-return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
-}
-private Bson getMappedQuery(Bson query) {
-return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
-}
+@Override
+protected void maybeEmitEvent(ApplicationEvent event) {
+bulkOperationContext.publishEvent(event);
+}
+@Override
+protected UpdateMapper updateMapper() {
+return bulkOperationContext.updateMapper();
+}
+@Override
+protected QueryMapper queryMapper() {
+return bulkOperationContext.queryMapper();
+}
+@Override
+protected Optional<? extends MongoPersistentEntity<?>> entity() {
+return bulkOperationContext.entity();
+}
private Document getMappedObject(Object source) {
-if (source instanceof Document) {
-return (Document) source;
+if (source instanceof Document document) {
+return document;
}
Document sink = new Document();
@@ -381,268 +362,83 @@ class DefaultBulkOperations implements BulkOperations {
models.add(new SourceAwareWriteModelHolder(source, model));
}
-private void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder holder) {
-if (holder.getModel() instanceof InsertOneModel) {
-Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
-maybeEmitEvent(new BeforeSaveEvent<>(holder.getSource(), target, collectionName));
-} else if (holder.getModel() instanceof ReplaceOneModel) {
-Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
-maybeEmitEvent(new BeforeSaveEvent<>(holder.getSource(), target, collectionName));
-}
-}
-private void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder holder) {
-if (holder.getModel() instanceof InsertOneModel) {
-Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
-maybeEmitEvent(new AfterSaveEvent<>(holder.getSource(), target, collectionName));
-} else if (holder.getModel() instanceof ReplaceOneModel) {
-Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
-maybeEmitEvent(new AfterSaveEvent<>(holder.getSource(), target, collectionName));
-}
-}
private void maybeInvokeAfterSaveCallback(SourceAwareWriteModelHolder holder) {
-if (holder.getModel() instanceof InsertOneModel) {
-Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
-maybeInvokeAfterSaveCallback(holder.getSource(), target);
-} else if (holder.getModel() instanceof ReplaceOneModel) {
-Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
-maybeInvokeAfterSaveCallback(holder.getSource(), target);
+if (holder.model() instanceof InsertOneModel<Document> model) {
+Document target = model.getDocument();
+maybeInvokeAfterSaveCallback(holder.source(), target);
+} else if (holder.model() instanceof ReplaceOneModel<Document> model) {
+Document target = model.getReplacement();
+maybeInvokeAfterSaveCallback(holder.source(), target);
}
}
-private <E extends MongoMappingEvent<T>, T> E maybeEmitEvent(E event) {
-if (bulkOperationContext.getEventPublisher() == null) {
-return event;
-}
-bulkOperationContext.getEventPublisher().publishEvent(event);
-return event;
-}
+private void publishEvent(MongoMappingEvent<?> event) {
+bulkOperationContext.publishEvent(event);
+}
private Object maybeInvokeBeforeConvertCallback(Object value) {
-if (bulkOperationContext.getEntityCallbacks() == null) {
-return value;
-}
-return bulkOperationContext.getEntityCallbacks().callback(BeforeConvertCallback.class, value, collectionName);
+return bulkOperationContext.callback(BeforeConvertCallback.class, value, collectionName);
}
private Object maybeInvokeBeforeSaveCallback(Object value, Document mappedDocument) {
-if (bulkOperationContext.getEntityCallbacks() == null) {
-return value;
-}
-return bulkOperationContext.getEntityCallbacks().callback(BeforeSaveCallback.class, value, mappedDocument,
-collectionName);
+return bulkOperationContext.callback(BeforeSaveCallback.class, value, mappedDocument, collectionName);
}
private Object maybeInvokeAfterSaveCallback(Object value, Document mappedDocument) {
-if (bulkOperationContext.getEntityCallbacks() == null) {
-return value;
-}
-return bulkOperationContext.getEntityCallbacks().callback(AfterSaveCallback.class, value, mappedDocument,
-collectionName);
+return bulkOperationContext.callback(AfterSaveCallback.class, value, mappedDocument, collectionName);
}
-private static BulkWriteOptions getBulkWriteOptions(BulkMode bulkMode) {
-BulkWriteOptions options = new BulkWriteOptions();
-switch (bulkMode) {
-case ORDERED:
-return options.ordered(true);
-case UNORDERED:
-return options.ordered(false);
-}
-throw new IllegalStateException("BulkMode was null");
-}
/**
* @param filterQuery The {@link Query} to read a potential {@link Collation} from. Must not be {@literal null}.
* @param update The {@link Update} to apply
* @param upsert flag to indicate if document should be upserted.
* @return new instance of {@link UpdateOptions}.
*/
private static UpdateOptions computeUpdateOptions(Query filterQuery, UpdateDefinition update, boolean upsert) {
UpdateOptions options = new UpdateOptions();
options.upsert(upsert);
if (update.hasArrayFilters()) {
List<Document> list = new ArrayList<>(update.getArrayFilters().size());
for (ArrayFilter arrayFilter : update.getArrayFilters()) {
list.add(arrayFilter.asDocument());
}
options.arrayFilters(list);
}
filterQuery.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
return options;
}
/**
-* {@link BulkOperationContext} holds information about
-* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode} the entity in use as well as references to
+* {@link BulkOperationContext} holds information about {@link BulkMode} the entity in use as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
-static final class BulkOperationContext {
+record BulkOperationContext(BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
+QueryMapper queryMapper, UpdateMapper updateMapper, @Nullable ApplicationEventPublisher eventPublisher,
+@Nullable EntityCallbacks entityCallbacks) {
-private final BulkMode bulkMode;
-private final Optional<? extends MongoPersistentEntity<?>> entity;
-private final QueryMapper queryMapper;
-private final UpdateMapper updateMapper;
-private final ApplicationEventPublisher eventPublisher;
-private final EntityCallbacks entityCallbacks;
-BulkOperationContext(BulkOperations.BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
-QueryMapper queryMapper, UpdateMapper updateMapper, ApplicationEventPublisher eventPublisher,
-EntityCallbacks entityCallbacks) {
-this.bulkMode = bulkMode;
-this.entity = entity;
-this.queryMapper = queryMapper;
-this.updateMapper = updateMapper;
-this.eventPublisher = eventPublisher;
-this.entityCallbacks = entityCallbacks;
-}
-public BulkMode getBulkMode() {
-return this.bulkMode;
-}
-public Optional<? extends MongoPersistentEntity<?>> getEntity() {
-return this.entity;
-}
-public QueryMapper getQueryMapper() {
-return this.queryMapper;
-}
-public UpdateMapper getUpdateMapper() {
-return this.updateMapper;
-}
-public ApplicationEventPublisher getEventPublisher() {
-return this.eventPublisher;
-}
-public EntityCallbacks getEntityCallbacks() {
-return this.entityCallbacks;
-}
-@Override
-public boolean equals(@Nullable Object o) {
-if (this == o)
-return true;
-if (o == null || getClass() != o.getClass())
-return false;
-BulkOperationContext that = (BulkOperationContext) o;
-if (bulkMode != that.bulkMode)
-return false;
-if (!ObjectUtils.nullSafeEquals(this.entity, that.entity)) {
-return false;
-}
-if (!ObjectUtils.nullSafeEquals(this.queryMapper, that.queryMapper)) {
-return false;
-}
-if (!ObjectUtils.nullSafeEquals(this.updateMapper, that.updateMapper)) {
-return false;
-}
-if (!ObjectUtils.nullSafeEquals(this.eventPublisher, that.eventPublisher)) {
-return false;
-}
-return ObjectUtils.nullSafeEquals(this.entityCallbacks, that.entityCallbacks);
-}
-@Override
-public int hashCode() {
-int result = bulkMode != null ? bulkMode.hashCode() : 0;
-result = 31 * result + ObjectUtils.nullSafeHashCode(entity);
-result = 31 * result + ObjectUtils.nullSafeHashCode(queryMapper);
-result = 31 * result + ObjectUtils.nullSafeHashCode(updateMapper);
-result = 31 * result + ObjectUtils.nullSafeHashCode(eventPublisher);
-result = 31 * result + ObjectUtils.nullSafeHashCode(entityCallbacks);
-return result;
-}
-public String toString() {
-return "DefaultBulkOperations.BulkOperationContext(bulkMode=" + this.getBulkMode() + ", entity="
-+ this.getEntity() + ", queryMapper=" + this.getQueryMapper() + ", updateMapper=" + this.getUpdateMapper()
-+ ", eventPublisher=" + this.getEventPublisher() + ", entityCallbacks=" + this.getEntityCallbacks() + ")";
-}
+public boolean skipEntityCallbacks() {
+return entityCallbacks == null;
+}
+public boolean skipEventPublishing() {
+return eventPublisher == null;
+}
+@SuppressWarnings("rawtypes")
+public <T> T callback(Class<? extends EntityCallback> callbackType, T entity, String collectionName) {
+if (skipEntityCallbacks()) {
+return entity;
+}
+return entityCallbacks.callback(callbackType, entity, collectionName);
+}
+@SuppressWarnings("rawtypes")
+public <T> T callback(Class<? extends EntityCallback> callbackType, T entity, Document document,
+String collectionName) {
+if (skipEntityCallbacks()) {
+return entity;
+}
+return entityCallbacks.callback(callbackType, entity, document, collectionName);
+}
+public void publishEvent(ApplicationEvent event) {
+if (skipEventPublishing()) {
+return;
+}
+eventPublisher.publishEvent(event);
+}
}
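The `BulkOperationContext` above was converted from a hand-written value class (fields, constructor, getters, `equals`/`hashCode`/`toString`) to a Java record, keeping only the null-guarding helper methods. A minimal sketch of that pattern with hypothetical names (not the Spring Data types):

```java
public class RecordDemo {

	// A record replaces the field/constructor/accessor/equals/hashCode
	// boilerplate the old class spelled out by hand; behavior methods
	// that guard a nullable component survive the conversion unchanged.
	record Context(String collection, Runnable publisher) {

		boolean skipPublishing() {
			return publisher == null;
		}

		void publish() {
			if (skipPublishing()) {
				return; // no publisher configured, silently skip
			}
			publisher.run();
		}
	}

	public static void main(String[] args) {
		Context silent = new Context("people", null);
		silent.publish(); // no-op: publisher is null

		StringBuilder log = new StringBuilder();
		Context active = new Context("people", () -> log.append("event"));
		active.publish();
		System.out.println(log); // event
	}
}
```

The record's generated accessors (`collection()`, `publisher()`) explain why the diff also renames call sites from `getBulkMode()`-style getters to `bulkMode()`-style component accessors.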
/**
* Value object chaining together an actual source with its {@link WriteModel} representation.
*
* @since 2.2
* @author Christoph Strobl
*/
-private static final class SourceAwareWriteModelHolder {
-private final Object source;
-private final WriteModel<Document> model;
-SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
-this.source = source;
-this.model = model;
-}
-public Object getSource() {
-return this.source;
-}
-public WriteModel<Document> getModel() {
-return this.model;
-}
-@Override
-public boolean equals(@Nullable Object o) {
-if (this == o)
-return true;
-if (o == null || getClass() != o.getClass())
-return false;
-SourceAwareWriteModelHolder that = (SourceAwareWriteModelHolder) o;
-if (!ObjectUtils.nullSafeEquals(this.source, that.source)) {
-return false;
-}
-return ObjectUtils.nullSafeEquals(this.model, that.model);
-}
-@Override
-public int hashCode() {
-int result = ObjectUtils.nullSafeHashCode(model);
-result = 31 * result + ObjectUtils.nullSafeHashCode(source);
-return result;
-}
-public String toString() {
-return "DefaultBulkOperations.SourceAwareWriteModelHolder(source=" + this.getSource() + ", model="
-+ this.getModel() + ")";
-}
-}
}


@@ -22,6 +22,7 @@ import java.util.List;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
@@ -155,6 +157,20 @@ public class DefaultIndexOperations implements IndexOperations {
}
@Override
public void alterIndex(String name, org.springframework.data.mongodb.core.index.IndexOptions options) {
Document indexOptions = new Document("name", name);
indexOptions.putAll(options.toDocument());
Document result = mongoOperations
.execute(db -> db.runCommand(new Document("collMod", collectionName).append("index", indexOptions)));
if (NumberUtils.convertNumberToTargetClass(result.get("ok", (Number) 0), Integer.class) != 1) {
throw new UncategorizedMongoDbException(
"Index '%s' could not be modified. Response was %s".formatted(name, result.toJson()), null);
}
}
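The new `alterIndex` method runs a raw `collMod` command and inspects the `ok` field of the reply, which MongoDB sets to `1.0` on success. The same check can be sketched without the driver, using a plain `Map` in place of the driver's `Document` type (names here are illustrative, not the Spring Data API):

```java
import java.util.Map;

public class CommandOkCheck {

	// Mirrors the alterIndex result handling: treat the reply as successful
	// only when its "ok" field is a number equal to 1 (MongoDB reports 1.0).
	// Missing "ok" defaults to 0, i.e. failure.
	public static boolean succeeded(Map<String, Object> reply) {
		Object ok = reply.getOrDefault("ok", 0);
		return ok instanceof Number number && number.intValue() == 1;
	}

	public static void main(String[] args) {
		System.out.println(succeeded(Map.of("ok", 1.0)));                          // true
		System.out.println(succeeded(Map.of("ok", 0.0, "errmsg", "ns not found"))); // false
	}
}
```

Comparing via `intValue()` sidesteps the fact that the server returns the flag as a double while a literal `1` is an int, which is also why the production code normalizes through `NumberUtils.convertNumberToTargetClass`.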
public void dropAllIndexes() {
dropIndex("*");
}


@@ -0,0 +1,390 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.data.mapping.callback.EntityCallback;
import org.springframework.data.mapping.callback.ReactiveEntityCallbacks;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeSaveCallback;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.ReplaceOneModel;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.reactivestreams.client.MongoCollection;
/**
* Default implementation for {@link ReactiveBulkOperations}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 4.1
*/
class DefaultReactiveBulkOperations extends BulkOperationsSupport implements ReactiveBulkOperations {
private final ReactiveMongoOperations mongoOperations;
private final String collectionName;
private final ReactiveBulkOperationContext bulkOperationContext;
private final List<Mono<SourceAwareWriteModelHolder>> models = new ArrayList<>();
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
/**
* Creates a new {@link DefaultReactiveBulkOperations} for the given {@link MongoOperations}, collection name and
* {@link ReactiveBulkOperationContext}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param bulkOperationContext must not be {@literal null}.
*/
DefaultReactiveBulkOperations(ReactiveMongoOperations mongoOperations, String collectionName,
ReactiveBulkOperationContext bulkOperationContext) {
super(collectionName);
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
void setDefaultWriteConcern(@Nullable WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
@Override
public ReactiveBulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null");
this.models.add(Mono.just(document).flatMap(it -> {
maybeEmitEvent(new BeforeConvertEvent<>(it, collectionName));
return maybeInvokeBeforeConvertCallback(it);
}).map(it -> new SourceAwareWriteModelHolder(it, new InsertOneModel<>(getMappedObject(it)))));
return this;
}
@Override
public ReactiveBulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null");
documents.forEach(this::insert);
return this;
}
@Override
public ReactiveBulkOperations updateOne(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
update(query, update, false, false);
return this;
}
@Override
public ReactiveBulkOperations updateMulti(Query query, UpdateDefinition update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
update(query, update, false, true);
return this;
}
@Override
public ReactiveBulkOperations upsert(Query query, UpdateDefinition update) {
return update(query, update, true, true);
}
@Override
public ReactiveBulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null");
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
this.models.add(Mono.just(query)
.map(it -> new SourceAwareWriteModelHolder(it, new DeleteManyModel<>(it.getQueryObject(), deleteOptions))));
return this;
}
@Override
public ReactiveBulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null");
for (Query query : removes) {
remove(query);
}
return this;
}
@Override
public ReactiveBulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(replacement, "Replacement must not be null");
Assert.notNull(options, "Options must not be null");
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(options.isUpsert());
query.getCollation().map(Collation::toMongoCollation).ifPresent(replaceOptions::collation);
this.models.add(Mono.just(replacement).flatMap(it -> {
maybeEmitEvent(new BeforeConvertEvent<>(it, collectionName));
return maybeInvokeBeforeConvertCallback(it);
}).map(it -> new SourceAwareWriteModelHolder(it,
new ReplaceOneModel<>(getMappedQuery(query.getQueryObject()), getMappedObject(it), replaceOptions))));
return this;
}
@Override
public Mono<BulkWriteResult> execute() {
try {
return mongoOperations.execute(collectionName, this::bulkWriteTo).next();
} finally {
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.bulkMode());
}
}
private Mono<BulkWriteResult> bulkWriteTo(MongoCollection<Document> collection) {
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
Flux<SourceAwareWriteModelHolder> concat = Flux.concat(models).flatMap(it -> {
if (it.model() instanceof InsertOneModel<Document> iom) {
Document target = iom.getDocument();
maybeEmitBeforeSaveEvent(it);
return maybeInvokeBeforeSaveCallback(it.source(), target)
.map(afterCallback -> new SourceAwareWriteModelHolder(afterCallback, mapWriteModel(afterCallback, iom)));
} else if (it.model() instanceof ReplaceOneModel<Document> rom) {
Document target = rom.getReplacement();
maybeEmitBeforeSaveEvent(it);
return maybeInvokeBeforeSaveCallback(it.source(), target)
.map(afterCallback -> new SourceAwareWriteModelHolder(afterCallback, mapWriteModel(afterCallback, rom)));
}
return Mono.just(new SourceAwareWriteModelHolder(it.source(), mapWriteModel(it.source(), it.model())));
});
MongoCollection<Document> theCollection = collection;
return concat.collectList().flatMap(it -> {
return Mono
.from(theCollection
.bulkWrite(it.stream().map(SourceAwareWriteModelHolder::model).collect(Collectors.toList()), bulkOptions))
.doOnSuccess(state -> {
it.forEach(this::maybeEmitAfterSaveEvent);
}).flatMap(state -> {
List<Mono<Object>> monos = it.stream().map(this::maybeInvokeAfterSaveCallback).collect(Collectors.toList());
return Flux.concat(monos).then(Mono.just(state));
});
});
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private ReactiveBulkOperations update(Query query, UpdateDefinition update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
UpdateOptions options = computeUpdateOptions(query, update, upsert);
this.models.add(Mono.just(update).map(it -> {
if (multi) {
return new SourceAwareWriteModelHolder(update,
new UpdateManyModel<>(query.getQueryObject(), it.getUpdateObject(), options));
}
return new SourceAwareWriteModelHolder(update,
new UpdateOneModel<>(query.getQueryObject(), it.getUpdateObject(), options));
}));
return this;
}
@Override
protected void maybeEmitEvent(ApplicationEvent event) {
bulkOperationContext.publishEvent(event);
}
@Override
protected UpdateMapper updateMapper() {
return bulkOperationContext.updateMapper();
}
@Override
protected QueryMapper queryMapper() {
return bulkOperationContext.queryMapper();
}
@Override
protected Optional<? extends MongoPersistentEntity<?>> entity() {
return bulkOperationContext.entity();
}
private Document getMappedObject(Object source) {
if (source instanceof Document) {
return (Document) source;
}
Document sink = new Document();
mongoOperations.getConverter().write(source, sink);
return sink;
}
private Mono<Object> maybeInvokeAfterSaveCallback(SourceAwareWriteModelHolder holder) {
if (holder.model() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.model()).getDocument();
return maybeInvokeAfterSaveCallback(holder.source(), target);
} else if (holder.model() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.model()).getReplacement();
return maybeInvokeAfterSaveCallback(holder.source(), target);
}
return Mono.just(holder.source());
}
private Mono<Object> maybeInvokeBeforeConvertCallback(Object value) {
return bulkOperationContext.callback(ReactiveBeforeConvertCallback.class, value, collectionName);
}
private Mono<Object> maybeInvokeBeforeSaveCallback(Object value, Document mappedDocument) {
return bulkOperationContext.callback(ReactiveBeforeSaveCallback.class, value, mappedDocument, collectionName);
}
private Mono<Object> maybeInvokeAfterSaveCallback(Object value, Document mappedDocument) {
return bulkOperationContext.callback(ReactiveAfterSaveCallback.class, value, mappedDocument, collectionName);
}
/**
 * {@link ReactiveBulkOperationContext} holds information about the {@link BulkMode}, the entity in use, as well as
 * references to the {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
record ReactiveBulkOperationContext(BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, @Nullable ApplicationEventPublisher eventPublisher,
@Nullable ReactiveEntityCallbacks entityCallbacks) {
public boolean skipEntityCallbacks() {
return entityCallbacks == null;
}
public boolean skipEventPublishing() {
return eventPublisher == null;
}
@SuppressWarnings("rawtypes")
public <T> Mono<T> callback(Class<? extends EntityCallback> callbackType, T entity, String collectionName) {
if (skipEntityCallbacks()) {
return Mono.just(entity);
}
return entityCallbacks.callback(callbackType, entity, collectionName);
}
@SuppressWarnings("rawtypes")
public <T> Mono<T> callback(Class<? extends EntityCallback> callbackType, T entity, Document document,
String collectionName) {
if (skipEntityCallbacks()) {
return Mono.just(entity);
}
return entityCallbacks.callback(callbackType, entity, document, collectionName);
}
public void publishEvent(ApplicationEvent event) {
if (skipEventPublishing()) {
return;
}
eventPublisher.publishEvent(event);
}
}
}
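Outside the reactive types, the per-model dispatch in `bulkWriteTo` (before-save hooks apply only to insert and replace models) can be sketched with plain collections. This is an illustrative stand-in: `Model` and the `beforeSave` hook are hypothetical, not the actual `WriteModel`/`ReactiveBeforeSaveCallback` types.

```java
import java.util.List;
import java.util.function.UnaryOperator;

public class BulkCallbackSketch {

	// Illustrative stand-in for a WriteModel: insert/replace carry a target document.
	record Model(String type, String document) {}

	// Mirrors the branching in bulkWriteTo: only insert and replace models
	// expose their target document to the before-save hook; every other
	// model type (updates, deletes) passes through unchanged.
	static List<Model> applyBeforeSave(List<Model> models, UnaryOperator<String> beforeSave) {
		return models.stream()
				.map(m -> switch (m.type()) {
					case "insert", "replace" -> new Model(m.type(), beforeSave.apply(m.document()));
					default -> m;
				})
				.toList();
	}
}
```

The reactive implementation does the same per element of a `Flux`, deferring the hook until subscription.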

View File

@@ -22,6 +22,7 @@ import java.util.Collection;
import java.util.Optional;
import org.bson.Document;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import com.mongodb.client.model.IndexOptions;
@@ -104,6 +106,22 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
}).next();
}
@Override
public Mono<Void> alterIndex(String name, org.springframework.data.mongodb.core.index.IndexOptions options) {
return mongoOperations.execute(db -> {
Document indexOptions = new Document("name", name);
indexOptions.putAll(options.toDocument());
return Flux.from(db.runCommand(new Document("collMod", collectionName).append("index", indexOptions)))
.doOnNext(result -> {
if (NumberUtils.convertNumberToTargetClass(result.get("ok", (Number) 0), Integer.class) != 1) {
throw new UncategorizedMongoDbException("Index '%s' could not be modified. Response was %s".formatted(name, result.toJson()), null);
}
});
}).then();
}
@Nullable
private MongoPersistentEntity<?> lookupPersistentEntity(String collection) {
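The `ok`-field check in `alterIndex` above converts the value to an integer before comparing because the server reports `ok` as a floating-point `1.0`. A stdlib-only sketch of the same normalization (a hypothetical helper, not part of the actual class):

```java
import java.util.Map;

public class CommandResultSketch {

	// A collMod (or any admin command) reply carries ok as a Number,
	// typically the double 1.0; normalize to int before comparing.
	static boolean isOk(Map<String, Object> commandResult) {
		Object ok = commandResult.getOrDefault("ok", 0);
		return ok instanceof Number number && number.intValue() == 1;
	}
}
```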

View File

@@ -150,7 +150,7 @@ class DefaultScriptOperations implements ScriptOperations {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
List<Object> convertedValues = new ArrayList<>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String && quote ? String.format("'%s'", arg)

View File

@@ -17,9 +17,11 @@ package org.springframework.data.mongodb.core;
import java.util.Collection;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Optional;
import org.bson.BsonNull;
import org.bson.Document;
import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException;
@@ -28,6 +30,8 @@ import org.springframework.data.mapping.IdentifierAccessor;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PersistentPropertyPath;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.core.CollectionOptions.TimeSeriesOptions;
@@ -44,9 +48,11 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.projection.EntityProjectionIntrospector;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.TargetAware;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -114,15 +120,19 @@ class EntityOperations {
Assert.notNull(entity, "Bean must not be null");
if (entity instanceof TargetAware targetAware) {
return new SimpleMappedEntity((Map<String, Object>) targetAware.getTarget(), this);
}
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
return new UnmappedEntity(parse(entity.toString()), this);
}
if (entity instanceof Map) {
return new SimpleMappedEntity((Map<String, Object>) entity);
return new SimpleMappedEntity((Map<String, Object>) entity, this);
}
return MappedEntity.of(entity, context);
return MappedEntity.of(entity, context, this);
}
/**
@@ -139,14 +149,14 @@ class EntityOperations {
Assert.notNull(conversionService, "ConversionService must not be null");
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
return new UnmappedEntity(parse(entity.toString()), this);
}
if (entity instanceof Map) {
return new SimpleMappedEntity((Map<String, Object>) entity);
return new SimpleMappedEntity((Map<String, Object>) entity, this);
}
return AdaptibleMappedEntity.of(entity, context, conversionService);
return AdaptibleMappedEntity.of(entity, context, conversionService, this);
}
/**
@@ -283,6 +293,11 @@ class EntityOperations {
* @see EntityProjectionIntrospector#introspect(Class, Class)
*/
public <M, D> EntityProjection<M, D> introspectProjection(Class<M> resultType, Class<D> entityType) {
MongoPersistentEntity<?> persistentEntity = queryMapper.getMappingContext().getPersistentEntity(entityType);
if (persistentEntity == null && !resultType.isInterface() || ClassUtils.isAssignable(Document.class, resultType)) {
return (EntityProjection) EntityProjection.nonProjecting(resultType);
}
return introspector.introspect(resultType, entityType);
}
@@ -362,6 +377,7 @@ class EntityOperations {
* A representation of information about an entity.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @since 2.1
*/
interface Entity<T> {
@@ -380,6 +396,16 @@ class EntityOperations {
*/
Object getId();
/**
* Returns the property value for {@code key}.
*
* @param key the property name, must not be {@literal null}.
* @return the value for the given key; can be {@literal null}.
* @since 4.1
*/
@Nullable
Object getPropertyValue(String key);
/**
* Returns the {@link Query} to find the entity by its identifier.
*
@@ -450,6 +476,15 @@ class EntityOperations {
* @since 2.1.2
*/
boolean isNew();
/**
* @param sortObject the sort specification whose keys are to be extracted, must not be {@literal null}.
* @return an ordered map of sort keys (including the identifier) to their values.
* @since 4.1
* @throws IllegalStateException if a sort key yields {@literal null}.
*/
Map<String, Object> extractKeys(Document sortObject, Class<?> sourceType);
}
/**
@@ -471,7 +506,7 @@ class EntityOperations {
T populateIdIfNecessary(@Nullable Object id);
/**
* Initializes the version property of the of the current entity if available.
* Initializes the version property of the current entity if available.
*
* @return the entity with the version property updated if available.
*/
@@ -497,9 +532,11 @@ class EntityOperations {
private static class UnmappedEntity<T extends Map<String, Object>> implements AdaptibleEntity<T> {
private final T map;
private final EntityOperations entityOperations;
protected UnmappedEntity(T map) {
protected UnmappedEntity(T map, EntityOperations entityOperations) {
this.map = map;
this.entityOperations = entityOperations;
}
@Override
@@ -509,7 +546,12 @@ class EntityOperations {
@Override
public Object getId() {
return map.get(ID_FIELD);
return getPropertyValue(ID_FIELD);
}
@Override
public Object getPropertyValue(String key) {
return map.get(key);
}
@Override
@@ -533,8 +575,8 @@ class EntityOperations {
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
return MappedDocument.of(map instanceof Document //
? (Document) map //
return MappedDocument.of(map instanceof Document document //
? document //
: new Document(map));
}
@@ -563,12 +605,50 @@ class EntityOperations {
public boolean isNew() {
return map.get(ID_FIELD) != null;
}
@Override
public Map<String, Object> extractKeys(Document sortObject, Class<?> sourceType) {
Map<String, Object> keyset = new LinkedHashMap<>();
MongoPersistentEntity<?> sourceEntity = entityOperations.context.getPersistentEntity(sourceType);
if (sourceEntity != null && sourceEntity.hasIdProperty()) {
keyset.put(sourceEntity.getRequiredIdProperty().getName(), getId());
} else {
keyset.put(ID_FIELD, getId());
}
for (String key : sortObject.keySet()) {
Object value = resolveValue(key, sourceEntity);
if (value == null) {
throw new IllegalStateException(
String.format("Cannot extract value for key %s because its value is null", key));
}
keyset.put(key, value);
}
return keyset;
}
@Nullable
private Object resolveValue(String key, @Nullable MongoPersistentEntity<?> sourceEntity) {
if (sourceEntity == null) {
return BsonUtils.resolveValue(map, key);
}
PropertyPath from = PropertyPath.from(key, sourceEntity.getTypeInformation());
PersistentPropertyPath<MongoPersistentProperty> persistentPropertyPath = entityOperations.context
.getPersistentPropertyPath(from);
return BsonUtils.resolveValue(map, persistentPropertyPath.toDotPath(p -> p.getFieldName()));
}
}
private static class SimpleMappedEntity<T extends Map<String, Object>> extends UnmappedEntity<T> {
protected SimpleMappedEntity(T map) {
super(map);
protected SimpleMappedEntity(T map, EntityOperations entityOperations) {
super(map, entityOperations);
}
@Override
@@ -576,8 +656,8 @@ class EntityOperations {
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
T bean = getBean();
bean = (T) (bean instanceof Document //
? (Document) bean //
bean = (T) (bean instanceof Document document //
? document //
: new Document(bean));
Document document = new Document();
writer.write(bean, document);
@@ -591,23 +671,26 @@ class EntityOperations {
private final MongoPersistentEntity<?> entity;
private final IdentifierAccessor idAccessor;
private final PersistentPropertyAccessor<T> propertyAccessor;
private final EntityOperations entityOperations;
protected MappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor idAccessor,
PersistentPropertyAccessor<T> propertyAccessor) {
PersistentPropertyAccessor<T> propertyAccessor, EntityOperations entityOperations) {
this.entity = entity;
this.idAccessor = idAccessor;
this.propertyAccessor = propertyAccessor;
this.entityOperations = entityOperations;
}
private static <T> MappedEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
EntityOperations entityOperations) {
MongoPersistentEntity<?> entity = context.getRequiredPersistentEntity(bean.getClass());
IdentifierAccessor identifierAccessor = entity.getIdentifierAccessor(bean);
PersistentPropertyAccessor<T> propertyAccessor = entity.getPropertyAccessor(bean);
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor, entityOperations);
}
@Override
@@ -620,6 +703,11 @@ class EntityOperations {
return idAccessor.getRequiredIdentifier();
}
@Override
public Object getPropertyValue(String key) {
return propertyAccessor.getProperty(entity.getRequiredPersistentProperty(key));
}
@Override
public Query getByIdQuery() {
@@ -697,6 +785,60 @@ class EntityOperations {
public boolean isNew() {
return entity.isNew(propertyAccessor.getBean());
}
@Override
public Map<String, Object> extractKeys(Document sortObject, Class<?> sourceType) {
Map<String, Object> keyset = new LinkedHashMap<>();
MongoPersistentEntity<?> sourceEntity = entityOperations.context.getPersistentEntity(sourceType);
if (sourceEntity != null && sourceEntity.hasIdProperty()) {
keyset.put(sourceEntity.getRequiredIdProperty().getName(), getId());
} else {
keyset.put(entity.getRequiredIdProperty().getName(), getId());
}
for (String key : sortObject.keySet()) {
Object value;
if (key.indexOf('.') != -1) {
// follow the path across nested levels.
// TODO: We should have a MongoDB-specific property path abstraction to allow diving into Document.
value = getNestedPropertyValue(key);
} else {
value = getPropertyValue(key);
}
if (value == null) {
throw new IllegalStateException(
String.format("Cannot extract value for key %s because its value is null", key));
}
keyset.put(key, value);
}
return keyset;
}
@Nullable
private Object getNestedPropertyValue(String key) {
String[] segments = key.split("\\.");
Entity<?> currentEntity = this;
Object currentValue = BsonNull.VALUE;
for (int i = 0; i < segments.length; i++) {
String segment = segments[i];
currentValue = currentEntity.getPropertyValue(segment);
if (i < segments.length - 1) {
currentEntity = entityOperations.forEntity(currentValue);
}
}
return currentValue != null ? currentValue : BsonNull.VALUE;
}
}
private static class AdaptibleMappedEntity<T> extends MappedEntity<T> implements AdaptibleEntity<T> {
@@ -706,9 +848,9 @@ class EntityOperations {
private final IdentifierAccessor identifierAccessor;
private AdaptibleMappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor identifierAccessor,
ConvertingPropertyAccessor<T> propertyAccessor) {
ConvertingPropertyAccessor<T> propertyAccessor, EntityOperations entityOperations) {
super(entity, identifierAccessor, propertyAccessor);
super(entity, identifierAccessor, propertyAccessor, entityOperations);
this.entity = entity;
this.propertyAccessor = propertyAccessor;
@@ -717,14 +859,14 @@ class EntityOperations {
private static <T> AdaptibleEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
ConversionService conversionService) {
ConversionService conversionService, EntityOperations entityOperations) {
MongoPersistentEntity<?> entity = context.getRequiredPersistentEntity(bean.getClass());
IdentifierAccessor identifierAccessor = entity.getIdentifierAccessor(bean);
PersistentPropertyAccessor<T> propertyAccessor = entity.getPropertyAccessor(bean);
return new AdaptibleMappedEntity<>(entity, identifierAccessor,
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService), entityOperations);
}
@Nullable
@@ -825,6 +967,14 @@ class EntityOperations {
* @since 3.3
*/
TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions options);
/**
* @return the name of the id field.
* @since 4.1
*/
default String getIdKeyName() {
return ID_FIELD;
}
}
/**
@@ -947,6 +1097,11 @@ class EntityOperations {
MongoPersistentProperty persistentProperty = entity.getPersistentProperty(name);
return persistentProperty != null ? persistentProperty.getFieldName() : name;
}
@Override
public String getIdKeyName() {
return entity.getIdProperty().getName();
}
}
}
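The dotted-path walk in `getNestedPropertyValue` above (split on `.`, descend one segment at a time) can be sketched over plain `Map`s, independent of the `Entity` abstraction. Names here are illustrative:

```java
import java.util.Map;

public class NestedValueSketch {

	// Walks a dotted path such as "address.city" across nested maps,
	// one segment at a time, mirroring getNestedPropertyValue.
	@SuppressWarnings("unchecked")
	static Object resolve(Map<String, Object> source, String path) {
		Object current = source;
		for (String segment : path.split("\\.")) {
			if (!(current instanceof Map)) {
				return null; // the path runs past a leaf value
			}
			current = ((Map<String, Object>) current).get(segment);
		}
		return current;
	}
}
```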

View File

@@ -98,9 +98,7 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
return collection;
}
if (aggregation instanceof TypedAggregation) {
TypedAggregation<?> typedAggregation = (TypedAggregation<?>) aggregation;
if (aggregation instanceof TypedAggregation typedAggregation) {
if (typedAggregation.getInputType() != null) {
return template.getCollectionName(typedAggregation.getInputType());

View File

@@ -20,6 +20,9 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.springframework.dao.DataAccessException;
import org.springframework.data.domain.KeysetScrollPosition;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -124,12 +127,28 @@ public interface ExecutableFindOperation {
Stream<T> stream();
/**
* Get the number of matching elements.
* <br />
* This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation
* execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard,
* session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link MongoOperations#estimatedCount(String)} for empty queries instead.
* Return a window of elements either starting or resuming at
* {@link org.springframework.data.domain.ScrollPosition}.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable
* {@link org.springframework.data.domain.Sort sort properties} as MongoDB does not support criteria to reconstruct
* a query result from absent document fields or {@code null} values through {@code $gt/$lt} operators.
*
* @param scrollPosition the scroll position.
* @return a window of the resulting elements.
* @since 4.1
* @see org.springframework.data.domain.OffsetScrollPosition
* @see org.springframework.data.domain.KeysetScrollPosition
*/
Window<T> scroll(ScrollPosition scrollPosition);
/**
* Get the number of matching elements. <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but
* guarantees shard, session and transaction compliance. In case an inaccurate count satisfies the application's
* needs, use {@link MongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return total number of matching elements.
*/
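The `scroll` javadoc's warning about non-nullable sort properties follows from how a keyset position is turned back into a filter. For an all-ascending sort, resuming after the keys `{a:5, b:7}` requires the disjunction `a > 5 OR (a = 5 AND b > 7)`; a `null` key value would leave those `$gt` comparisons unable to locate the position. A stdlib sketch of the general construction (illustrative only, not the exact mapping Spring Data performs):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class KeysetFilterSketch {

	// Builds the $or branches that resume after the given keyset for an
	// all-ascending sort: {a:5, b:7} -> [{a:{$gt:5}}, {a:5, b:{$gt:7}}]
	static List<Map<String, Object>> resumeAfter(LinkedHashMap<String, Object> keyset) {
		List<Map.Entry<String, Object>> entries = List.copyOf(keyset.entrySet());
		List<Map<String, Object>> or = new ArrayList<>();
		for (int i = 0; i < entries.size(); i++) {
			Map<String, Object> branch = new LinkedHashMap<>();
			for (int j = 0; j < i; j++) { // equality on all leading keys
				branch.put(entries.get(j).getKey(), entries.get(j).getValue());
			}
			branch.put(entries.get(i).getKey(), Map.of("$gt", entries.get(i).getValue()));
			or.add(branch);
		}
		return or;
	}
}
```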

View File

@@ -20,8 +20,9 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.domain.Window;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
@@ -72,8 +73,8 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
private final @Nullable String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
@Nullable String collection, Query query) {
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType, @Nullable String collection,
Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
@@ -139,6 +140,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return doStream();
}
@Override
public Window<T> scroll(ScrollPosition scrollPosition) {
return template.doScroll(query.with(scrollPosition), domainType, returnType, getCollectionName());
}
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
@@ -168,8 +174,8 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
Document queryObject = query.getQueryObject();
Document fieldsObject = query.getFieldsObject();
return template.doFind(getCollectionName(), queryObject, fieldsObject, domainType, returnType,
getCursorPreparer(query, preparer));
return template.doFind(template.createDelegate(query), getCollectionName(), queryObject, fieldsObject, domainType,
returnType, getCursorPreparer(query, preparer));
}
private List<T> doFindDistinct(String field) {

View File

@@ -76,7 +76,7 @@ public interface ExecutableRemoveOperation {
/**
* Remove and return all matching documents. <br/>
* <strong>NOTE</strong> The entire list of documents will be fetched before sending the actual delete commands.
* <strong>NOTE:</strong> The entire list of documents will be fetched before sending the actual delete commands.
* Also, {@link org.springframework.context.ApplicationEvent}s will be published for each and every delete
* operation.
*

View File

@@ -87,6 +87,23 @@ public interface ExecutableUpdateOperation {
T findAndModifyValue();
}
/**
* Trigger <a href="https://docs.mongodb.com/manual/reference/method/db.collection.replaceOne/">replaceOne</a>
* execution by calling one of the terminating methods.
*
* @author Christoph Strobl
* @since 4.2
*/
interface TerminatingReplace {
/**
* Find first and replace/upsert.
*
* @return never {@literal null}.
*/
UpdateResult replaceFirst();
}
/**
* Trigger
* <a href="https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndReplace/">findOneAndReplace</a>
@@ -95,7 +112,7 @@ public interface ExecutableUpdateOperation {
* @author Mark Paluch
* @since 2.1
*/
interface TerminatingFindAndReplace<T> {
interface TerminatingFindAndReplace<T> extends TerminatingReplace {
/**
* Find, replace and return the first matching document.
@@ -243,6 +260,22 @@ public interface ExecutableUpdateOperation {
TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options);
}
/**
* @author Christoph Strobl
* @since 4.2
*/
interface ReplaceWithOptions extends TerminatingReplace {
/**
* Explicitly define {@link ReplaceOptions}.
*
* @param options must not be {@literal null}.
* @return new instance of {@link TerminatingReplace}.
* @throws IllegalArgumentException if options is {@literal null}.
*/
TerminatingReplace withOptions(ReplaceOptions options);
}
/**
* Define {@link FindAndReplaceOptions}.
*
@@ -250,7 +283,7 @@ public interface ExecutableUpdateOperation {
* @author Christoph Strobl
* @since 2.1
*/
interface FindAndReplaceWithOptions<T> extends TerminatingFindAndReplace<T> {
interface FindAndReplaceWithOptions<T> extends TerminatingFindAndReplace<T>, ReplaceWithOptions {
/**
* Explicitly define {@link FindAndReplaceOptions} for the {@link Update}.

View File

@@ -126,6 +126,17 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
options, replacement, targetType);
}
@Override
public TerminatingReplace withOptions(ReplaceOptions options) {
FindAndReplaceOptions target = new FindAndReplaceOptions();
if (options.isUpsert()) {
target.upsert();
}
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
target, replacement, targetType);
}
@Override
public UpdateWithUpdate<T> matching(Query query) {
@@ -175,6 +186,18 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
getCollectionName(), targetType);
}
@Override
public UpdateResult replaceFirst() {
if (replacement != null) {
return template.replace(query, domainType, replacement,
findAndReplaceOptions != null ? findAndReplaceOptions : ReplaceOptions.none(), getCollectionName());
}
return template.replace(query, domainType, update,
findAndReplaceOptions != null ? findAndReplaceOptions : ReplaceOptions.none(), getCollectionName());
}
private UpdateResult doUpdate(boolean multi, boolean upsert) {
return template.doUpdate(getCollectionName(), query, update, domainType, upsert, multi);
}

View File

@@ -31,10 +31,9 @@ package org.springframework.data.mongodb.core;
* @author Christoph Strobl
* @since 2.1
*/
public class FindAndReplaceOptions {
public class FindAndReplaceOptions extends ReplaceOptions {
private boolean returnNew;
private boolean upsert;
private static final FindAndReplaceOptions NONE = new FindAndReplaceOptions() {
@@ -109,7 +108,7 @@ public class FindAndReplaceOptions {
*/
public FindAndReplaceOptions upsert() {
this.upsert = true;
super.upsert();
return this;
}
@@ -122,13 +121,4 @@ public class FindAndReplaceOptions {
return returnNew;
}
/**
* Get the bit indicating if to create a new document if not exists.
*
* @return {@literal true} if set.
*/
public boolean isUpsert() {
return upsert;
}
}
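Pulling `upsert` up into `ReplaceOptions` while keeping the fluent API intact is the pattern the diff above applies: the subclass overrides `upsert()` only to narrow the return type and delegate to `super`, so chained calls keep the more specific options type. A minimal sketch of that shape (simplified, not the actual classes):

```java
public class OptionsSketch {

	static class ReplaceOptions {

		private boolean upsert;

		ReplaceOptions upsert() {
			this.upsert = true;
			return this;
		}

		boolean isUpsert() {
			return upsert;
		}
	}

	// Covariant override: the flag lives in the parent, but fluent chains
	// starting from FindAndReplaceOptions retain the subtype.
	static class FindAndReplaceOptions extends ReplaceOptions {

		private boolean returnNew;

		@Override
		FindAndReplaceOptions upsert() {
			super.upsert();
			return this;
		}

		FindAndReplaceOptions returnNew() {
			this.returnNew = true;
			return this;
		}

		boolean isReturnNew() {
			return returnNew;
		}
	}
}
```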

View File

@@ -27,6 +27,7 @@ import org.springframework.util.StringUtils;
* Function object to apply a query hint. Can be an index name or a BSON document.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 4.1
*/
class HintFunction {
@@ -67,6 +68,32 @@ class HintFunction {
return (hint instanceof String hintString && StringUtils.hasText(hintString)) || hint instanceof Bson;
}
/**
* If a hint is not present, returns {@code true}, otherwise {@code false}.
*
* @return {@code true} if a hint is not present, otherwise {@code false}.
*/
public boolean isEmpty() {
return !isPresent();
}
/**
* Apply the hint to consumers depending on the hint format if {@link #isPresent() present}.
*
* @param registryProvider codec registry provider used when rendering the hint; can be {@literal null}.
* @param stringConsumer function invoked if the hint is an index name.
* @param bsonConsumer function invoked if the hint is a {@link Bson} document.
* @param <R> result type of the applied function.
*/
public <R> void ifPresent(@Nullable CodecRegistryProvider registryProvider, Function<String, R> stringConsumer,
Function<Bson, R> bsonConsumer) {
if (isEmpty()) {
return;
}
apply(registryProvider, stringConsumer, bsonConsumer);
}
/**
* Apply the hint to consumers depending on the hint format.
*
@@ -79,7 +106,7 @@ class HintFunction {
public <R> R apply(@Nullable CodecRegistryProvider registryProvider, Function<String, R> stringConsumer,
Function<Bson, R> bsonConsumer) {
if (!isPresent()) {
if (isEmpty()) {
throw new IllegalStateException("No hint present");
}

View File

@@ -119,6 +119,10 @@ abstract class IndexConverters {
ops.wildcardProjection(indexOptions.get("wildcardProjection", Document.class));
}
if (indexOptions.containsKey("hidden")) {
ops = ops.hidden((Boolean) indexOptions.get("hidden"));
}
return ops;
};
}

View File

@@ -203,8 +203,9 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required));
}
}
return targetProperties.size() == 1 ? targetProperties.iterator().next()
JsonSchemaProperty schemaProperty = targetProperties.size() == 1 ? targetProperties.iterator().next()
: JsonSchemaProperty.merged(targetProperties);
return applyEncryptionDataIfNecessary(property, schemaProperty);
}
}
@@ -322,7 +323,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private TypedJsonSchemaObject createSchemaObject(Object type, Collection<?> possibleValues) {
TypedJsonSchemaObject schemaObject = type instanceof Type ? JsonSchemaObject.of(Type.class.cast(type))
TypedJsonSchemaObject schemaObject = type instanceof Type typeObject ? JsonSchemaObject.of(typeObject)
: JsonSchemaObject.of(Class.class.cast(type));
if (!CollectionUtils.isEmpty(possibleValues)) {
@@ -331,23 +332,22 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
return schemaObject;
}
private String computePropertyFieldName(PersistentProperty property) {
private String computePropertyFieldName(PersistentProperty<?> property) {
return property instanceof MongoPersistentProperty ? ((MongoPersistentProperty) property).getFieldName()
: property.getName();
return property instanceof MongoPersistentProperty mongoPersistentProperty ?
mongoPersistentProperty.getFieldName() : property.getName();
}
private boolean isRequiredProperty(PersistentProperty property) {
private boolean isRequiredProperty(PersistentProperty<?> property) {
return property.getType().isPrimitive();
}
private Class<?> computeTargetType(PersistentProperty<?> property) {
if (!(property instanceof MongoPersistentProperty)) {
if (!(property instanceof MongoPersistentProperty mongoProperty)) {
return property.getType();
}
MongoPersistentProperty mongoProperty = (MongoPersistentProperty) property;
if (!mongoProperty.isIdProperty()) {
return mongoProperty.getFieldType();
}

View File

@@ -21,9 +21,10 @@ package org.springframework.data.mongodb.core;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
* @see MongoAction
*/
public enum MongoActionOperation {
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK;
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK, REPLACE;
}

View File

@@ -68,6 +68,8 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
private static final Set<String> SECURITY_EXCEPTIONS = Set.of("MongoCryptException");
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
@@ -97,12 +99,12 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
if (DATA_INTEGRITY_EXCEPTIONS.contains(exception)) {
if (ex instanceof MongoServerException) {
if (((MongoServerException) ex).getCode() == 11000) {
if (ex instanceof MongoServerException mse) {
if (mse.getCode() == 11000) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof MongoBulkWriteException) {
for (BulkWriteError x : ((MongoBulkWriteException) ex).getWriteErrors()) {
if (ex instanceof MongoBulkWriteException bulkException) {
for (BulkWriteError x : bulkException.getWriteErrors()) {
if (x.getCode() == 11000) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
@@ -114,9 +116,9 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
}
// All other MongoExceptions
-if (ex instanceof MongoException) {
+if (ex instanceof MongoException mongoException) {
-int code = ((MongoException) ex).getCode();
+int code = mongoException.getCode();
if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
return new DuplicateKeyException(ex.getMessage(), ex);
@@ -131,6 +133,8 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new ClientSessionException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isTransactionFailureCode(code)) {
return new MongoTransactionException(ex.getMessage(), ex);
} else if (ex.getCause() != null && SECURITY_EXCEPTIONS.contains(ClassUtils.getShortName(ex.getCause().getClass()))) {
return new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
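The hunks above consistently replace check-then-cast code with Java 16+ pattern matching for `instanceof`, which binds a typed variable directly in the condition. A minimal self-contained sketch of that idiom, using hypothetical stand-in exception types (not the actual driver classes or the real `MongoExceptionTranslator`):

```java
// Illustrative only: mirrors the refactoring pattern in this diff, where
// `if (ex instanceof X) { ((X) ex).foo(); }` becomes `if (ex instanceof X x) { x.foo(); }`.
public class TranslatorSketch {

	// Hypothetical stand-in for a driver server exception carrying an error code.
	static class ServerException extends RuntimeException {
		final int code;

		ServerException(int code) {
			this.code = code;
		}

		int getCode() {
			return code;
		}
	}

	static String translate(RuntimeException ex) {
		// The binding variable `se` is in scope wherever the instanceof test is true,
		// so the condition can also use it directly (e.g. the 11000 duplicate-key code).
		if (ex instanceof ServerException se && se.getCode() == 11000) {
			return "DuplicateKeyException";
		}
		return "UncategorizedException";
	}

	public static void main(String[] args) {
		System.out.println(translate(new ServerException(11000)));
	}
}
```

Besides removing the explicit cast, the bound variable avoids repeating the cast expression when the same value is used several times, as in the `MongoBulkWriteException` loop above.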


@@ -23,6 +23,8 @@ import java.util.function.Supplier;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.data.domain.KeysetScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -38,6 +40,7 @@ import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -116,7 +119,7 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param collectionName name of the collection to retrieve the objects from.
* @param dch the handler that will extract results, one document at a time.
@@ -222,7 +225,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <p>
* Returns a {@link String} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param <T> element return type
@@ -238,7 +241,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
@@ -319,7 +322,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
-MongoCollection<Document> createView(String name, Class<?> source, AggregationPipeline pipeline, @Nullable ViewOptions options);
+MongoCollection<Document> createView(String name, Class<?> source, AggregationPipeline pipeline,
+		@Nullable ViewOptions options);
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
@@ -331,7 +335,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
-MongoCollection<Document> createView(String name, String source, AggregationPipeline pipeline, @Nullable ViewOptions options);
+MongoCollection<Document> createView(String name, String source, AggregationPipeline pipeline,
+		@Nullable ViewOptions options);
/**
* A set of collection names.
@@ -718,7 +723,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned list.
* @return the converted object.
@@ -734,7 +739,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from.
@@ -748,7 +753,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for query/field mapping, etc. is not available due to the lack of
* domain type information. Use {@link #exists(Query, Class, String)} to get full type specific support.
*
-* @param query the {@link Query} class that specifies the criteria used to find a record.
+* @param query the {@link Query} class that specifies the criteria used to find a document.
* @param collectionName name of the collection to check for objects.
* @return {@literal true} if the query yields a result.
*/
@@ -757,7 +762,7 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Determine result of given {@link Query} contains at least one element.
*
-* @param query the {@link Query} class that specifies the criteria used to find a record.
+* @param query the {@link Query} class that specifies the criteria used to find a document.
* @param entityClass the parametrized type.
* @return {@literal true} if the query yields a result.
*/
@@ -766,7 +771,7 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Determine result of given {@link Query} contains at least one element.
*
-* @param query the {@link Query} class that specifies the criteria used to find a record.
+* @param query the {@link Query} class that specifies the criteria used to find a document.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName name of the collection to check for objects.
* @return {@literal true} if the query yields a result.
@@ -780,7 +785,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityClass the parametrized type of the returned list. Must not be {@literal null}.
* @return the List of converted objects.
@@ -794,7 +799,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityClass the parametrized type of the returned list. Must not be {@literal null}.
* @param collectionName name of the collection to retrieve the objects from. Must not be {@literal null}.
@@ -802,6 +807,57 @@ public interface MongoOperations extends FluentMongoOperations {
*/
<T> List<T> find(Query query, Class<T> entityClass, String collectionName);
/**
* Query for a window of objects of type T from the specified collection. <br />
* Make sure to either set {@link Query#skip(long)} or {@link Query#with(KeysetScrollPosition)} along with
* {@link Query#limit(int)} to limit large query results for efficient scrolling. <br />
* Result objects are converted from the MongoDB native representation using an instance of {@link MongoConverter}.
* Unless configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable {@link org.springframework.data.domain.Sort
* sort properties} as MongoDB does not support criteria to reconstruct a query result from absent document fields or
* {@code null} values through {@code $gt/$lt} operators.
*
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType the parametrized type of the returned window.
* @return the converted window.
* @throws IllegalStateException if a potential {@link Query#getKeyset() KeysetScrollPosition} contains an invalid
* position.
* @since 4.1
* @see Query#with(org.springframework.data.domain.OffsetScrollPosition)
* @see Query#with(org.springframework.data.domain.KeysetScrollPosition)
*/
<T> Window<T> scroll(Query query, Class<T> entityType);
/**
* Query for a window of objects of type T from the specified collection. <br />
* Make sure to either set {@link Query#skip(long)} or {@link Query#with(KeysetScrollPosition)} along with
* {@link Query#limit(int)} to limit large query results for efficient scrolling. <br />
* Result objects are converted from the MongoDB native representation using an instance of {@link MongoConverter}.
* Unless configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable {@link org.springframework.data.domain.Sort
* sort properties} as MongoDB does not support criteria to reconstruct a query result from absent document fields or
* {@code null} values through {@code $gt/$lt} operators.
*
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType the parametrized type of the returned window.
* @param collectionName name of the collection to retrieve the objects from.
* @return the converted window.
* @throws IllegalStateException if a potential {@link Query#getKeyset() KeysetScrollPosition} contains an invalid
* position.
* @since 4.1
* @see Query#with(org.springframework.data.domain.OffsetScrollPosition)
* @see Query#with(org.springframework.data.domain.KeysetScrollPosition)
*/
<T> Window<T> scroll(Query query, Class<T> entityType, String collectionName);
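The Javadoc above explains why keyset scrolling needs non-nullable sort properties: each window remembers the sort-key values of its last element, and the next query filters with `$gt`/`$lt` on those values instead of `skip`. A self-contained conceptual sketch of that mechanism over an in-memory sorted list (the `Window`/`scroll` names below are simplified stand-ins, not the Spring Data API):

```java
import java.util.List;
import java.util.stream.Collectors;

// Conceptual sketch of keyset scrolling over a collection sorted by a unique key:
// the caller passes the last key of the previous window, and the next window is
// produced by a range predicate rather than an offset.
public class KeysetScrollSketch {

	// Simplified stand-in for a result window: content, the keyset cursor, and a next flag.
	record Window(List<Integer> content, Integer nextKey, boolean hasNext) {}

	static Window scroll(List<Integer> sortedKeys, Integer afterKey, int limit) {
		List<Integer> page = sortedKeys.stream()
				.filter(k -> afterKey == null || k > afterKey) // the $gt keyset predicate
				.limit(limit)
				.collect(Collectors.toList());
		Integer next = page.isEmpty() ? null : page.get(page.size() - 1);
		boolean hasNext = next != null && sortedKeys.stream().anyMatch(k -> k > next);
		return new Window(page, next, hasNext);
	}

	public static void main(String[] args) {
		List<Integer> keys = List.of(1, 2, 3, 4, 5);
		Window first = scroll(keys, null, 2);
		Window second = scroll(keys, first.nextKey(), 2);
		System.out.println(first.content() + " then " + second.content());
	}
}
```

Because the predicate compares actual key values, a `null` in a sort property has no position in the `$gt`/`$lt` range and the result could not be reconstructed, which is exactly the restriction the Javadoc calls out.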
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
@@ -887,7 +943,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify </a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply on matching documents. Must not be {@literal null}.
* @param entityClass the parametrized type. Must not be {@literal null}.
@@ -903,7 +959,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify </a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply on matching documents. Must not be {@literal null}.
* @param entityClass the parametrized type. Must not be {@literal null}.
@@ -921,7 +977,7 @@ public interface MongoOperations extends FluentMongoOperations {
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification.
* @param update the {@link UpdateDefinition} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
@@ -941,7 +997,7 @@ public interface MongoOperations extends FluentMongoOperations {
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply on matching documents. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -967,7 +1023,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Options are defaulted to {@link FindAndReplaceOptions#empty()}. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the converted object that was updated or {@literal null}, if not found.
@@ -988,7 +1044,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Options are defaulted to {@link FindAndReplaceOptions#empty()}. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param collectionName the collection to query. Must not be {@literal null}.
@@ -1007,7 +1063,7 @@ public interface MongoOperations extends FluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account.<br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -1030,7 +1086,7 @@ public interface MongoOperations extends FluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account.<br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -1053,7 +1109,7 @@ public interface MongoOperations extends FluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account.<br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -1078,7 +1134,7 @@ public interface MongoOperations extends FluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account.<br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -1108,7 +1164,7 @@ public interface MongoOperations extends FluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account.<br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
-* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
+* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -1133,7 +1189,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned list.
* @return the converted object
@@ -1150,7 +1206,7 @@ public interface MongoOperations extends FluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
-* @param query the query class that specifies the criteria used to find a record and also an optional fields
+* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from.
@@ -1175,7 +1231,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
{@link #getCollectionName(Class) derived} from the given type.
* @see #exactCount(Query, Class)
* @see #estimatedCount(Class)
*/
@@ -1435,7 +1491,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, UpdateDefinition, FindAndModifyOptions, Class, String)} instead.
*
-* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
+* @param query the query document that specifies the criteria used to select a document to be upserted. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing object. Must not be {@literal null}.
@@ -1458,7 +1514,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, UpdateDefinition, FindAndModifyOptions, Class, String)} instead.
*
-* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
+* @param query the query document that specifies the criteria used to select a document to be upserted. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing object. Must not be {@literal null}.
@@ -1474,7 +1530,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
-* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
+* @param query the query document that specifies the criteria used to select a document to be upserted. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing object. Must not be {@literal null}.
@@ -1491,7 +1547,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
*
-* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
+* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1514,7 +1570,7 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, UpdateDefinition, Class, String)} instead.
*
-* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
+* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1530,7 +1586,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br />
*
-* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
+* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1547,7 +1603,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
-* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
+* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1568,7 +1624,7 @@ public interface MongoOperations extends FluentMongoOperations {
* domain type information. Use {@link #updateMulti(Query, UpdateDefinition, Class, String)} to get full type specific
* support.
*
-* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
+* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1584,7 +1640,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
-* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
+* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1617,7 +1673,7 @@ public interface MongoOperations extends FluentMongoOperations {
* acknowledged} remove operation was successful or not.
*
* @param object must not be {@literal null}.
-* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
+* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/
DeleteResult remove(Object object, String collectionName);
@@ -1626,7 +1682,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
-* @param query the query document that specifies the criteria used to remove a record.
+* @param query the query document that specifies the criteria used to remove a document.
* @param entityClass class that determines the collection to use.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal entityClass} is {@literal null}.
@@ -1639,9 +1695,9 @@ public interface MongoOperations extends FluentMongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
-* @param query the query document that specifies the criteria used to remove a record.
+* @param query the query document that specifies the criteria used to remove a document.
* @param entityClass class of the pojo to be operated on. Can be {@literal null}.
-* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
+* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query}, {@literal entityClass} or {@literal collectionName} is
* {@literal null}.
@@ -1654,8 +1710,8 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support.
*
-* @param query the query document that specifies the criteria used to remove a record.
-* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
+* @param query the query document that specifies the criteria used to remove a document.
+* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal collectionName} is {@literal null}.
*/
@@ -1667,7 +1723,7 @@ public interface MongoOperations extends FluentMongoOperations {
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to find and remove documents.
-* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
+* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link List} converted objects deleted by this operation.
* @since 1.5
*/
@@ -1692,12 +1748,79 @@ public interface MongoOperations extends FluentMongoOperations {
*
* @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass class of the pojo to be operated on.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link List} of converted objects deleted by this operation.
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document. <br />
* The collection name is derived from the {@literal replacement} type. <br />
* Options are defaulted to {@link ReplaceOptions#none()}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 4.2
*/
default <T> UpdateResult replace(Query query, T replacement) {
return replace(query, replacement, ReplaceOptions.none());
}
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document. Options are defaulted to {@link ReplaceOptions#none()}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param collectionName the collection to query. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @since 4.2
*/
default <T> UpdateResult replace(Query query, T replacement, String collectionName) {
return replace(query, replacement, ReplaceOptions.none(), collectionName);
}
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document taking {@link ReplaceOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link ReplaceOptions} holding additional information. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 4.2
*/
default <T> UpdateResult replace(Query query, T replacement, ReplaceOptions options) {
return replace(query, replacement, options, getCollectionName(ClassUtils.getUserClass(replacement)));
}
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document taking {@link ReplaceOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link ReplaceOptions} holding additional information. Must not be {@literal null}.
* @param collectionName the collection to query. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @since 4.2
*/
<T> UpdateResult replace(Query query, T replacement, ReplaceOptions options, String collectionName);
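The `replace(…)` overloads above can be combined as in the following sketch. Note this is an illustrative usage example only; the `Person` class and its fields are hypothetical, and running it requires a configured `MongoOperations` backed by a live MongoDB instance.

```java
// Assumes a hypothetical Person document type; the collection name is
// derived from the replacement type via getCollectionName(Person.class).
Query query = new Query(Criteria.where("firstname").is("Joe"));
Person replacement = new Person("Joe", "Doe");

// Replace the first matching document, inserting it if no match exists.
UpdateResult result = mongoOperations.replace(query, replacement,
		ReplaceOptions.replaceOptions().upsert());

// result.getMatchedCount() / result.getUpsertedId() expose the outcome.
```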
/**
* Returns the underlying {@link MongoConverter}.
*

View File

@@ -21,6 +21,7 @@ import java.util.Map.Entry;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.stream.Collectors;
@@ -28,6 +29,7 @@ import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
@@ -52,6 +54,7 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.ShardKey;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
@@ -190,6 +193,15 @@ class QueryOperations {
return new UpdateContext(replacement, upsert);
}
/**
* @param query the {@link Query} used to select the document to be replaced.
* @param replacement the {@link MappedDocument mapped replacement} document.
* @param upsert use {@literal true} to insert the replacement when no existing document is found.
* @return new instance of {@link UpdateContext}.
*/
UpdateContext replaceSingleContext(Query query, MappedDocument replacement, boolean upsert) {
return new UpdateContext(query, replacement, upsert);
}
/**
* Create a new {@link DeleteContext} instance removing all matching documents.
*
@@ -387,12 +399,12 @@ class QueryOperations {
for (Entry<String, Object> entry : fields.entrySet()) {
if (entry.getValue() instanceof MongoExpression) {
if (entry.getValue() instanceof MongoExpression mongoExpression) {
AggregationOperationContext ctx = entity == null ? Aggregation.DEFAULT_CONTEXT
: new RelaxedTypeBasedAggregationOperationContext(entity.getType(), mappingContext, queryMapper);
evaluated.put(entry.getKey(), AggregationExpression.from((MongoExpression) entry.getValue()).toDocument(ctx));
evaluated.put(entry.getKey(), AggregationExpression.from(mongoExpression).toDocument(ctx));
} else {
evaluated.put(entry.getKey(), entry.getValue());
}
@@ -436,6 +448,25 @@ class QueryOperations {
return entityOperations.forType(domainType).getCollation(query) //
.map(Collation::toMongoCollation);
}
/**
* Get the {@link HintFunction} reading the actual hint from the {@link Query}.
*
* @return new instance of {@link HintFunction}.
* @since 4.2
*/
HintFunction getHintFunction() {
return HintFunction.from(query.getHint());
}
/**
* Read and apply the hint from the {@link Query}.
*
* @since 4.2
*/
<R> void applyHint(Function<String, R> stringConsumer, Function<Bson, R> bsonConsumer) {
getHintFunction().ifPresent(codecRegistryProvider, stringConsumer, bsonConsumer);
}
}
/**
@@ -455,7 +486,7 @@ class QueryOperations {
*/
private DistinctQueryContext(@Nullable Object query, String fieldName) {
super(query instanceof Document ? new BasicQuery((Document) query) : (Query) query);
super(query instanceof Document document ? new BasicQuery(document) : (Query) query);
this.fieldName = fieldName;
}
@@ -563,10 +594,23 @@ class QueryOperations {
if (query.getLimit() > 0) {
options.limit(query.getLimit());
}
if (query.getSkip() > 0) {
options.skip((int) query.getSkip());
}
Meta meta = query.getMeta();
if (meta.hasValues()) {
if (meta.hasMaxTime()) {
options.maxTime(meta.getRequiredMaxTimeMsec(), TimeUnit.MILLISECONDS);
}
if (meta.hasComment()) {
options.comment(meta.getComment());
}
}
HintFunction hintFunction = HintFunction.from(query.getHint());
if (hintFunction.isPresent()) {
@@ -680,8 +724,12 @@ class QueryOperations {
}
UpdateContext(MappedDocument update, boolean upsert) {
super(new BasicQuery(BsonUtils.asDocument(update.getIdFilter())));
this(new BasicQuery(BsonUtils.asDocument(update.getIdFilter())), update, upsert);
}
UpdateContext(Query query, MappedDocument update, boolean upsert) {
super(query);
this.multi = false;
this.upsert = upsert;
this.mappedDocument = update;
@@ -715,6 +763,7 @@ class QueryOperations {
.arrayFilters(update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()));
}
HintFunction.from(getQuery().getHint()).ifPresent(codecRegistryProvider, options::hintString, options::hint);
applyCollation(domainType, options::collation);
if (callback != null) {
@@ -748,6 +797,7 @@ class QueryOperations {
ReplaceOptions options = new ReplaceOptions();
options.collation(updateOptions.getCollation());
options.upsert(updateOptions.isUpsert());
applyHint(options::hintString, options::hint);
if (callback != null) {
callback.accept(options);
@@ -761,7 +811,7 @@ class QueryOperations {
Document mappedQuery = super.getMappedQuery(domainType);
if (multi && update.isIsolated() && !mappedQuery.containsKey("$isolated")) {
if (multi && update != null && update.isIsolated() && !mappedQuery.containsKey("$isolated")) {
mappedQuery.put("$isolated", 1);
}
@@ -775,7 +825,7 @@ class QueryOperations {
Document filterWithShardKey = new Document(filter);
getMappedShardKeyFields(domainType)
.forEach(key -> filterWithShardKey.putIfAbsent(key, BsonUtils.resolveValue(shardKeySource, key)));
.forEach(key -> filterWithShardKey.putIfAbsent(key, BsonUtils.resolveValue((Bson) shardKeySource, key)));
return filterWithShardKey;
}
@@ -857,7 +907,7 @@ class QueryOperations {
if (persistentEntity != null && persistentEntity.hasVersionProperty()) {
String versionFieldName = persistentEntity.getRequiredVersionProperty().getFieldName();
if (!update.modifies(versionFieldName)) {
if (update != null && !update.modifies(versionFieldName)) {
update.inc(versionFieldName);
}
}
@@ -905,10 +955,10 @@ class QueryOperations {
this.aggregation = aggregation;
if (aggregation instanceof TypedAggregation) {
this.inputType = ((TypedAggregation<?>) aggregation).getInputType();
} else if (aggregationOperationContext instanceof TypeBasedAggregationOperationContext) {
this.inputType = ((TypeBasedAggregationOperationContext) aggregationOperationContext).getType();
if (aggregation instanceof TypedAggregation typedAggregation) {
this.inputType = typedAggregation.getInputType();
} else if (aggregationOperationContext instanceof TypeBasedAggregationOperationContext typeBasedAggregationOperationContext) {
this.inputType = typeBasedAggregationOperationContext.getType();
} else {
this.inputType = null;
}
@@ -933,8 +983,8 @@ class QueryOperations {
this.aggregation = aggregation;
if (aggregation instanceof TypedAggregation) {
this.inputType = ((TypedAggregation<?>) aggregation).getInputType();
if (aggregation instanceof TypedAggregation typedAggregation) {
this.inputType = typedAggregation.getInputType();
} else {
this.inputType = inputType;
}

View File

@@ -98,9 +98,7 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return collection;
}
if (aggregation instanceof TypedAggregation) {
TypedAggregation<?> typedAggregation = (TypedAggregation<?>) aggregation;
if (aggregation instanceof TypedAggregation typedAggregation) {
if (typedAggregation.getInputType() != null) {
return template.getCollectionName(typedAggregation.getInputType());

View File

@@ -0,0 +1,138 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import reactor.core.publisher.Mono;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import com.mongodb.bulk.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. Bulk operations are available since MongoDB 2.6 and
* make use of low level bulk commands on the protocol level. This interface defines a fluent API to add multiple single
* operations or list of similar operations in sequence which can then eventually be executed by calling
* {@link #execute()}.
*
* <pre class="code">
* ReactiveMongoOperations ops = …;
*
* ops.bulkOps(BulkMode.UNORDERED, Person.class)
* .insert(newPerson)
* .updateOne(where("firstname").is("Joe"), Update.update("lastname", "Doe"))
* .execute();
* </pre>
* <p>
* Bulk operations are issued as one batch that pulls together all insert, update, and delete operations. Operations
* that require individual operation results such as optimistic locking (using {@code @Version}) are not supported, and
* the version field is not populated.
*
* @author Christoph Strobl
* @since 4.1
*/
public interface ReactiveBulkOperations {
/**
* Add a single insert to the bulk operation.
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the insert added, will never be {@literal null}.
*/
ReactiveBulkOperations insert(Object documents);
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the insert added, will never be {@literal null}.
*/
ReactiveBulkOperations insert(List<? extends Object> documents);
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link UpdateDefinition} operation to perform, must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the update added, will never be {@literal null}.
*/
ReactiveBulkOperations updateOne(Query query, UpdateDefinition update);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link ReactiveBulkOperations} instance with the update added, will never be {@literal null}.
*/
ReactiveBulkOperations updateMulti(Query query, UpdateDefinition update);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link ReactiveBulkOperations} instance with the update added, will never be {@literal null}.
*/
ReactiveBulkOperations upsert(Query query, UpdateDefinition update);
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the removal added, will never be {@literal null}.
*/
ReactiveBulkOperations remove(Query remove);
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the removal added, will never be {@literal null}.
*/
ReactiveBulkOperations remove(List<Query> removes);
/**
* Add a single replace operation to the bulk operation.
*
* @param query Update criteria.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the replace added, will never be {@literal null}.
*/
default ReactiveBulkOperations replaceOne(Query query, Object replacement) {
return replaceOne(query, replacement, FindAndReplaceOptions.empty());
}
/**
* Add a single replace operation to the bulk operation.
*
* @param query Update criteria.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndReplaceOptions} holding additional information. Must not be {@literal null}.
* @return the current {@link ReactiveBulkOperations} instance with the replace added, will never be {@literal null}.
*/
ReactiveBulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options);
/**
* Execute all bulk operations using the default write concern.
*
* @return a {@link Mono} emitting the result of the bulk operation providing counters for inserts/updates etc.
*/
Mono<BulkWriteResult> execute();
}
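The new `replaceOne` operation added in this branch slots into the same fluent chain as the other bulk operations. A hedged sketch, assuming a hypothetical `Person` type and a configured `ReactiveMongoOperations` instance:

```java
// Statically imported: Criteria.where(...), Query.query(...).
Mono<BulkWriteResult> result = reactiveMongoOperations
		.bulkOps(BulkMode.ORDERED, Person.class)
		.insert(new Person("Joe", "Doe"))
		.updateOne(query(where("firstname").is("Jack")), Update.update("lastname", "Jones"))
		.replaceOne(query(where("firstname").is("Jane")), new Person("Jane", "Doe"))
		.remove(query(where("lastname").is("Smith")))
		.execute(); // nothing is sent to the server until subscription
```

As with all reactive APIs, the batch is only issued once the returned `Mono` is subscribed to.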

View File

@@ -93,10 +93,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> {
if (token instanceof Instant) {
builder.resumeAt((Instant) token);
} else if (token instanceof BsonTimestamp) {
builder.resumeAt((BsonTimestamp) token);
if (token instanceof Instant instant) {
builder.resumeAt(instant);
} else if (token instanceof BsonTimestamp bsonTimestamp) {
builder.resumeAt(bsonTimestamp);
}
});
}
@@ -161,8 +161,8 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
}
options.getFilter().ifPresent(it -> {
if (it instanceof Aggregation) {
builder.filter((Aggregation) it);
if (it instanceof Aggregation aggregation) {
builder.filter(aggregation);
} else {
builder.filter(((List<Document>) it).toArray(new Document[0]));
}

View File

@@ -18,6 +18,9 @@ package org.springframework.data.mongodb.core;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.springframework.data.domain.KeysetScrollPosition;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -87,14 +90,27 @@ public interface ReactiveFindOperation {
*/
Flux<T> all();
/**
* Return a scroll of elements either starting or resuming at {@link ScrollPosition}.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable
* {@link org.springframework.data.domain.Sort sort properties} as MongoDB does not support criteria to reconstruct
* a query result from absent document fields or {@code null} values through {@code $gt/$lt} operators.
*
* @param scrollPosition the scroll position.
* @return a scroll of the resulting elements.
* @since 4.1
* @see org.springframework.data.domain.OffsetScrollPosition
* @see org.springframework.data.domain.KeysetScrollPosition
*/
Mono<Window<T>> scroll(ScrollPosition scrollPosition);
/**
* Get all matching elements using a {@link com.mongodb.CursorType#TailableAwait tailable cursor}. The stream will
* not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link org.reactivestreams.Subscription#cancel() canceled}.
* <br />
* {@link org.reactivestreams.Subscription#cancel() canceled}. <br />
* However, the stream may become dead, or invalid, if either the query returns no match or the cursor returns the
* document at the "end" of the collection and then the application deletes that document.
* <br />
* document at the "end" of the collection and then the application deletes that document. <br />
* A stream that is no longer in use must be {@link reactor.core.Disposable#dispose() disposed}, otherwise the
* stream will linger and exhaust resources. <br/>
* <strong>NOTE:</strong> Requires a capped collection.
@@ -105,8 +121,7 @@ public interface ReactiveFindOperation {
Flux<T> tail();
/**
* Get the number of matching elements.
* <br />
* Get the number of matching elements. <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but

View File

@@ -20,6 +20,9 @@ import reactor.core.publisher.Mono;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.domain.Window;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.mongodb.core.CollectionPreparerSupport.ReactiveCollectionPreparerDelegate;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
@@ -67,8 +70,8 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
private final String collection;
private final Query query;
ReactiveFindSupport(ReactiveMongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
ReactiveFindSupport(ReactiveMongoTemplate template, Class<?> domainType, Class<T> returnType, String collection,
Query query) {
this.template = template;
this.domainType = domainType;
@@ -136,6 +139,11 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return doFind(null);
}
@Override
public Mono<Window<T>> scroll(ScrollPosition scrollPosition) {
return template.doScroll(query.with(scrollPosition), domainType, returnType, getCollectionName());
}
@Override
public Flux<T> tail() {
return doFind(template.new TailingQueryFindPublisherPreparer(query, domainType));
@@ -169,8 +177,8 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
Document queryObject = query.getQueryObject();
Document fieldsObject = query.getFieldsObject();
return template.doFind(getCollectionName(), queryObject, fieldsObject, domainType, returnType,
preparer != null ? preparer : getCursorPreparer(query));
return template.doFind(getCollectionName(), ReactiveCollectionPreparerDelegate.of(query), queryObject,
fieldsObject, domainType, returnType, preparer != null ? preparer : getCursorPreparer(query));
}
@SuppressWarnings("unchecked")

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.core.query.Collation;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -25,9 +26,11 @@ import java.util.function.Supplier;
import org.bson.Document;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.data.domain.KeysetScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
@@ -279,7 +282,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
Mono<MongoCollection<Document>> createView(String name, Class<?> source, AggregationPipeline pipeline, @Nullable ViewOptions options);
Mono<MongoCollection<Document>> createView(String name, Class<?> source, AggregationPipeline pipeline,
@Nullable ViewOptions options);
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
@@ -291,7 +295,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
Mono<MongoCollection<Document>> createView(String name, String source, AggregationPipeline pipeline, @Nullable ViewOptions options);
Mono<MongoCollection<Document>> createView(String name, String source, AggregationPipeline pipeline,
@Nullable ViewOptions options);
/**
* A set of collection names.
@@ -346,6 +351,40 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
Mono<Void> dropCollection(String collectionName);
/**
* Returns a new {@link ReactiveBulkOperations} for the given collection. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, etc. is not available for {@literal update} or
* {@literal remove} operations in bulk mode due to the lack of domain type information. Use
* {@link #bulkOps(BulkMode, Class, String)} to get full type specific support.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link ReactiveBulkOperations} on the named collection
* @since 4.1
*/
ReactiveBulkOperations bulkOps(BulkMode mode, String collectionName);
/**
* Returns a new {@link ReactiveBulkOperations} for the given entity type.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityClass the entity class, must not be {@literal null}.
* @return {@link ReactiveBulkOperations} on the named collection associated with the given entity class.
* @since 4.1
*/
ReactiveBulkOperations bulkOps(BulkMode mode, Class<?> entityClass);
/**
* Returns a new {@link ReactiveBulkOperations} for the given entity type and collection name.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the entity class. Can be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link ReactiveBulkOperations} on the named collection associated with the given entity class.
* @since 4.1
*/
ReactiveBulkOperations bulkOps(BulkMode mode, @Nullable Class<?> entityType, String collectionName);
/**
* Query for a {@link Flux} of objects of type T from the collection used by the entity class. <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
@@ -379,7 +418,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned {@link Mono}.
* @return the converted object.
@@ -394,7 +433,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned {@link Mono}.
* @param collectionName name of the collection to retrieve the objects from.
@@ -407,7 +446,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for query/field mapping, etc. is not available due to the lack of
* domain type information. Use {@link #exists(Query, Class, String)} to get full type specific support.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param query the {@link Query} class that specifies the criteria used to find a document.
* @param collectionName name of the collection to check for objects.
* @return {@literal true} if the query yields a result.
*/
@@ -416,7 +455,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param query the {@link Query} class that specifies the criteria used to find a document.
* @param entityClass the parametrized type.
* @return {@literal true} if the query yields a result.
*/
@@ -425,7 +464,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param query the {@link Query} class that specifies the criteria used to find a document.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName name of the collection to check for objects.
* @return {@literal true} if the query yields a result.
@@ -440,7 +479,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityClass the parametrized type of the returned {@link Flux}. Must not be {@literal null}.
* @return the {@link Flux} of converted objects.
@@ -454,7 +493,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
* @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from. Must not be {@literal null}.
@@ -462,6 +501,57 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
<T> Flux<T> find(Query query, Class<T> entityClass, String collectionName);
/**
* Query for a scroll of objects of type T from the specified collection. <br />
* Make sure to either set {@link Query#skip(long)} or {@link Query#with(KeysetScrollPosition)} along with
* {@link Query#limit(int)} to limit large query results for efficient scrolling. <br />
* Result objects are converted from the MongoDB native representation using an instance of {@link MongoConverter}.
* Unless configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable {@link org.springframework.data.domain.Sort
* sort properties} as MongoDB does not support criteria to reconstruct a query result from absent document fields or
* {@code null} values through {@code $gt/$lt} operators.
*
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
 * @param entityType the parametrized type of the returned window.
* @return {@link Mono} emitting the converted window.
* @throws IllegalStateException if a potential {@link Query#getKeyset() KeysetScrollPosition} contains an invalid
* position.
* @since 4.1
* @see Query#with(org.springframework.data.domain.OffsetScrollPosition)
* @see Query#with(org.springframework.data.domain.KeysetScrollPosition)
*/
<T> Mono<Window<T>> scroll(Query query, Class<T> entityType);
/**
* Query for a window of objects of type T from the specified collection. <br />
* Make sure to either set {@link Query#skip(long)} or {@link Query#with(KeysetScrollPosition)} along with
* {@link Query#limit(int)} to limit large query results for efficient scrolling. <br />
 * Result objects are converted from the MongoDB native representation using an instance of {@link MongoConverter}.
* Unless configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
* <p>
* When using {@link KeysetScrollPosition}, make sure to use non-nullable {@link org.springframework.data.domain.Sort
* sort properties} as MongoDB does not support criteria to reconstruct a query result from absent document fields or
* {@code null} values through {@code $gt/$lt} operators.
*
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification. Must not be {@literal null}.
 * @param entityType the parametrized type of the returned window.
* @param collectionName name of the collection to retrieve the objects from.
* @return {@link Mono} emitting the converted window.
* @throws IllegalStateException if a potential {@link Query#getKeyset() KeysetScrollPosition} contains an invalid
* position.
* @since 4.1
* @see Query#with(org.springframework.data.domain.OffsetScrollPosition)
* @see Query#with(org.springframework.data.domain.KeysetScrollPosition)
*/
<T> Mono<Window<T>> scroll(Query query, Class<T> entityType, String collectionName);
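The keyset-scrolling contract described above can be sketched as follows. This is illustrative only and assumes a configured `ReactiveMongoTemplate` and a hypothetical `Person` class: a limit, a sort over a non-nullable property, and continuation from the position of the last element of the previous window.

```java
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Window;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class ScrollSketch {

	record Person(String id, String lastName) {}

	Mono<Window<Person>> firstTwoWindows(ReactiveMongoTemplate template) {

		Query query = new Query(Criteria.where("lastName").is("Smith"))
				.with(Sort.by("id")) // non-nullable property, required for keyset scrolling
				.limit(10);

		// Start at the initial keyset position, then continue from the last element.
		return template.scroll(query.with(ScrollPosition.keyset()), Person.class)
				.flatMap(window -> window.hasNext()
						? template.scroll(query.with(window.positionAt(window.size() - 1)), Person.class)
						: Mono.just(window));
	}
}
```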
/**
 * Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
@@ -669,7 +759,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply on matching documents. Must not be {@literal null}.
* @param entityClass the parametrized type. Must not be {@literal null}.
@@ -684,7 +774,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply on matching documents. Must not be {@literal null}.
* @param entityClass the parametrized type. Must not be {@literal null}.
@@ -701,7 +791,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification.
* @param update the {@link UpdateDefinition} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
@@ -719,7 +809,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param update the {@link UpdateDefinition} to apply on matching documents. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -742,7 +832,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Options are defaulted to {@link FindAndReplaceOptions#empty()}. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the converted object that was updated or {@link Mono#empty()}, if not found.
@@ -762,7 +852,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Options are defaulted to {@link FindAndReplaceOptions#empty()}. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param collectionName the collection to query. Must not be {@literal null}.
@@ -780,7 +870,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -802,7 +892,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -824,7 +914,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -848,7 +938,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -877,7 +967,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* taking {@link FindAndReplaceOptions} into account. <br />
* <strong>NOTE:</strong> The replacement entity must not hold an {@literal id}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document and also an optional
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
@@ -902,7 +992,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned {@link Mono}.
* @return the converted object
@@ -918,7 +1008,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned {@link Mono}.
* @param collectionName name of the collection to retrieve the objects from.
@@ -1280,7 +1370,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, UpdateDefinition, Class)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* @param query the query document that specifies the criteria used to select a document to be upserted. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing object. Must not be {@literal null}.
@@ -1301,7 +1391,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* domain type information. Use {@link #upsert(Query, UpdateDefinition, Class, String)} to get full type specific
* support.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* @param query the query document that specifies the criteria used to select a document to be upserted. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing object. Must not be {@literal null}.
@@ -1317,7 +1407,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* @param query the query document that specifies the criteria used to select a document to be upserted. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing object. Must not be {@literal null}.
@@ -1336,7 +1426,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, UpdateDefinition, Class)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1359,7 +1449,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, UpdateDefinition, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1375,7 +1465,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br />
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1392,7 +1482,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1413,7 +1503,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* domain type information. Use {@link #updateMulti(Query, UpdateDefinition, Class, String)} to get full type specific
* support.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1429,7 +1519,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param query the query document that specifies the criteria used to select a document to be updated. Must not be
* {@literal null}.
* @param update the {@link UpdateDefinition} that contains the updated object or {@code $} operators to manipulate
* the existing. Must not be {@literal null}.
@@ -1456,7 +1546,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Removes the given object from the given collection.
*
* @param object must not be {@literal null}.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/
Mono<DeleteResult> remove(Object object, String collectionName);
@@ -1475,7 +1565,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Removes the given object from the given collection.
*
* @param objectToRemove must not be {@literal null}.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collectionName);
@@ -1484,7 +1574,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query the query document that specifies the criteria used to remove a record.
* @param query the query document that specifies the criteria used to remove a document.
* @param entityClass class that determines the collection to use.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
@@ -1496,9 +1586,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query the query document that specifies the criteria used to remove a record.
* @param query the query document that specifies the criteria used to remove a document.
* @param entityClass class of the pojo to be operated on. Can be {@literal null}.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/
Mono<DeleteResult> remove(Query query, @Nullable Class<?> entityClass, String collectionName);
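A hedged sketch of the query-based `remove` overload above, assuming a configured `ReactiveMongoTemplate` and a hypothetical `Person` class. It removes all matching documents from the given collection and surfaces the deleted count from the `DeleteResult`:

```java
import com.mongodb.client.result.DeleteResult;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class RemoveSketch {

	record Person(String id, String lastName) {}

	Mono<Long> removeSmiths(ReactiveMongoTemplate template) {
		// The entity class enables field and id conversion for the query.
		Query query = new Query(Criteria.where("lastName").is("Smith"));
		return template.remove(query, Person.class, "people")
				.map(DeleteResult::getDeletedCount);
	}
}
```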
@@ -1509,8 +1599,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to remove a record.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param query the query document that specifies the criteria used to remove a document.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/
Mono<DeleteResult> remove(Query query, String collectionName);
@@ -1521,7 +1611,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to find and remove documents.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
 * @return the {@link Flux} of converted objects deleted by this operation.
*/
<T> Flux<T> findAllAndRemove(Query query, String collectionName);
@@ -1544,11 +1634,80 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*
* @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass class of the pojo to be operated on.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @param collectionName name of the collection where the documents will be removed from, must not be {@literal null} or empty.
 * @return the {@link Flux} of converted objects deleted by this operation.
*/
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document. <br />
* The collection name is derived from the {@literal replacement} type. <br />
* Options are defaulted to {@link ReplaceOptions#none()}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 4.2
*/
default <T> Mono<UpdateResult> replace(Query query, T replacement) {
return replace(query, replacement, ReplaceOptions.none());
}
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document. Options are defaulted to {@link ReplaceOptions#none()}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param collectionName the collection to query. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @since 4.2
*/
default <T> Mono<UpdateResult> replace(Query query, T replacement, String collectionName) {
return replace(query, replacement, ReplaceOptions.none(), collectionName);
}
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document taking {@link ReplaceOptions} into account.
*
 * @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link ReplaceOptions} holding additional information. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 4.2
*/
default <T> Mono<UpdateResult> replace(Query query, T replacement, ReplaceOptions options) {
return replace(query, replacement, options, getCollectionName(ClassUtils.getUserClass(replacement)));
}
/**
* Replace a single document matching the {@link Criteria} of given {@link Query} with the {@code replacement}
* document taking {@link ReplaceOptions} into account.
*
 * @param query the {@link Query} class that specifies the {@link Criteria} used to find a document. The query may
* contain an index {@link Query#withHint(String) hint} or the {@link Query#collation(Collation) collation}
* to use. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link ReplaceOptions} holding additional information. Must not be {@literal null}.
 * @param collectionName the collection to query. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous replacement.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 4.2
*/
<T> Mono<UpdateResult> replace(Query query, T replacement, ReplaceOptions options, String collectionName);
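As an illustrative sketch of the new `replace` API introduced in this change, assuming a configured `ReactiveMongoTemplate` and a hypothetical `Person` class: the first document matching the query is replaced with the given replacement, here using the default `ReplaceOptions` and an explicit collection name.

```java
import com.mongodb.client.result.UpdateResult;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.ReplaceOptions;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class ReplaceSketch {

	record Person(String id, String lastName) {}

	Mono<UpdateResult> replaceSmith(ReactiveMongoTemplate template, Person replacement) {
		// Replaces at most one matching document; the replacement must not
		// carry a conflicting id (see the findAndReplace notes above).
		Query query = new Query(Criteria.where("lastName").is("Smith"));
		return template.replace(query, replacement, ReplaceOptions.none(), "people");
	}
}
```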
/**
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
@@ -1559,7 +1718,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned {@link Flux}.
* @return the {@link Flux} of converted objects.
@@ -1578,7 +1737,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* @param query the query class that specifies the criteria used to find a document and also an optional fields
* specification.
* @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from.
@@ -1693,6 +1852,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, String inputCollectionName, Class<T> resultType,
String mapFunction, String reduceFunction, MapReduceOptions options);
/**
* Returns the underlying {@link MongoConverter}.
*


@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import org.springframework.data.mongodb.core.CollectionPreparerSupport.CollectionPreparerDelegate;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;
@@ -58,6 +59,8 @@ import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.domain.OffsetScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
@@ -70,6 +73,9 @@ import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoDatabaseUtils;
import org.springframework.data.mongodb.SessionSynchronization;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.CollectionPreparerSupport.ReactiveCollectionPreparerDelegate;
import org.springframework.data.mongodb.core.DefaultReactiveBulkOperations.ReactiveBulkOperationContext;
import org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity;
import org.springframework.data.mongodb.core.QueryOperations.AggregationDefinition;
import org.springframework.data.mongodb.core.QueryOperations.CountContext;
@@ -77,9 +83,11 @@ import org.springframework.data.mongodb.core.QueryOperations.DeleteContext;
import org.springframework.data.mongodb.core.QueryOperations.DistinctQueryContext;
import org.springframework.data.mongodb.core.QueryOperations.QueryContext;
import org.springframework.data.mongodb.core.QueryOperations.UpdateContext;
import org.springframework.data.mongodb.core.ScrollUtils.KeysetScrollQuery;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions.Builder;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.aggregation.PrefixingDelegatingAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
@@ -105,7 +113,6 @@ import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Meta.CursorOption;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
@@ -147,9 +154,18 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
* extract results. This class executes BSON queries or updates, initiating iteration over {@link FindPublisher} and
* catching MongoDB exceptions and translating them to the generic, more informative exception hierarchy defined in the
* org.springframework.dao package. Can be used within a service implementation via direct instantiation with a
* {@link SimpleReactiveMongoDatabaseFactory} reference, or get prepared in an application context and given to services
* as bean reference. Note: The {@link SimpleReactiveMongoDatabaseFactory} should always be configured as a bean in the
* application context, in the first case given to the service directly, in the second case to the prepared template.
* {@link ReactiveMongoDatabaseFactory} reference, or get prepared in an application context and given to services as
* bean reference.
* <p>
* Note: The {@link ReactiveMongoDatabaseFactory} should always be configured as a bean in the application context, in
* the first case given to the service directly, in the second case to the prepared template.
* <h3>{@link ReadPreference} and {@link com.mongodb.ReadConcern}</h3>
* <p>
* {@code ReadPreference} and {@code ReadConcern} are generally considered from {@link Query} and
* {@link AggregationOptions} objects for the action to be executed on a particular {@link MongoCollection}.
* <p>
* You can also set the default {@link #setReadPreference(ReadPreference) ReadPreference} on the template level to
* generally apply a {@link ReadPreference}.
*
* @author Mark Paluch
* @author Christoph Strobl
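The precedence described in the new class-level Javadoc (query-level settings win over the template default) can be sketched as follows. This is an illustrative fragment, not code from the changeset: the `Person` document class, the `people` collection name, and the injected `template` are assumptions, and it presumes `Query.withReadPreference` as introduced by this branch.

```java
import com.mongodb.ReadPreference;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Flux;

class ReadPreferenceSketch {

	// Assumes "Person" is a mapped document class and "template" is configured elsewhere.
	Flux<Person> findOnSecondaries(ReactiveMongoTemplate template) {

		template.setReadPreference(ReadPreference.primary()); // template-wide default

		Query query = new Query(Criteria.where("lastname").is("Matthews"))
				.withReadPreference(ReadPreference.secondaryPreferred()); // per-query override

		// The query-level ReadPreference is applied to the MongoCollection for this read.
		return template.find(query, Person.class, "people");
	}
}
```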
@@ -250,9 +266,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
this.eventDelegate = new EntityLifecycleEventDelegate();
// We create indexes based on mapping events
if (this.mappingContext instanceof MongoMappingContext) {
MongoMappingContext mongoMappingContext = (MongoMappingContext) this.mappingContext;
if (this.mappingContext instanceof MongoMappingContext mongoMappingContext) {
if (mongoMappingContext.isAutoIndexCreation()) {
this.indexCreator = new ReactiveMongoPersistentEntityIndexCreator(mongoMappingContext, this::indexOps);
@@ -356,8 +370,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
setEntityCallbacks(ReactiveEntityCallbacks.create(applicationContext));
}
if (mappingContext instanceof ApplicationEventPublisherAware) {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
if (mappingContext instanceof ApplicationEventPublisherAware applicationEventPublisherAware) {
applicationEventPublisherAware.setApplicationEventPublisher(eventPublisher);
}
}
@@ -441,8 +455,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
}
if (context instanceof ConfigurableApplicationContext) {
((ConfigurableApplicationContext) context).addApplicationListener(indexCreatorListener);
if (context instanceof ConfigurableApplicationContext configurableApplicationContext) {
configurableApplicationContext.addApplicationListener(indexCreatorListener);
}
}
@@ -723,7 +737,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Mono<Void> dropCollection(Class<T> entityClass) {
return dropCollection(getCollectionName(entityClass));
}
@Override
public Mono<Void> dropCollection(String collectionName) {
@@ -734,6 +747,31 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}).then();
}
@Override
public ReactiveBulkOperations bulkOps(BulkMode mode, String collectionName) {
return bulkOps(mode, null, collectionName);
}
@Override
public ReactiveBulkOperations bulkOps(BulkMode mode, Class<?> entityClass) {
return bulkOps(mode, entityClass, getCollectionName(entityClass));
}
@Override
public ReactiveBulkOperations bulkOps(BulkMode mode, @Nullable Class<?> entityType, String collectionName) {
Assert.notNull(mode, "BulkMode must not be null");
Assert.hasText(collectionName, "Collection name must not be null or empty");
DefaultReactiveBulkOperations operations = new DefaultReactiveBulkOperations(this, collectionName,
new ReactiveBulkOperationContext(mode, Optional.ofNullable(getPersistentEntity(entityType)), queryMapper,
updateMapper, eventPublisher, entityCallbacks));
operations.setDefaultWriteConcern(writeConcern);
return operations;
}
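A minimal usage sketch of the reactive bulk API added above. Names (`Person`, the queued operations) are illustrative, and the fragment assumes `ReactiveBulkOperations` exposes `insert`, `updateOne`, and a `Mono`-returning `execute()` as in the blocking `BulkOperations` counterpart:

```java
import com.mongodb.bulk.BulkWriteResult;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.ReactiveBulkOperations;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import reactor.core.publisher.Mono;

class BulkSketch {

	// Queues an insert and an update, then triggers the batch via execute().
	Mono<BulkWriteResult> fixSpelling(ReactiveMongoTemplate template, Person newPerson) {

		ReactiveBulkOperations bulkOps = template.bulkOps(BulkMode.UNORDERED, Person.class);
		bulkOps.insert(newPerson);
		bulkOps.updateOne(new Query(Criteria.where("lastname").is("Mathews")),
				new Update().set("lastname", "Matthews"));

		// Nothing is sent to the server until the returned Mono is subscribed to.
		return bulkOps.execute();
	}
}
```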
@Override
public Flux<String> getCollectionNames() {
return createFlux(MongoDatabase::listCollectionNames);
@@ -756,8 +794,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Mono<T> findOne(Query query, Class<T> entityClass, String collectionName) {
if (ObjectUtils.isEmpty(query.getSortObject())) {
return doFindOne(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryFindPublisherPreparer(query, entityClass));
return doFindOne(collectionName, ReactiveCollectionPreparerDelegate.of(query), query.getQueryObject(),
query.getFieldsObject(), entityClass, new QueryFindPublisherPreparer(query, entityClass));
}
query.limit(1);
@@ -783,10 +821,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return createFlux(collectionName, collection -> {
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(query);
QueryContext queryContext = queryOperations.createQueryContext(query);
Document filter = queryContext.getMappedQuery(entityClass, this::getPersistentEntity);
FindPublisher<Document> findPublisher = collection.find(filter, Document.class)
FindPublisher<Document> findPublisher = collectionPreparer.prepare(collection).find(filter, Document.class)
.projection(new Document("_id", 1));
if (LOGGER.isDebugEnabled()) {
@@ -811,8 +850,53 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return findAll(entityClass, collectionName);
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryFindPublisherPreparer(query, entityClass));
return doFind(collectionName, ReactiveCollectionPreparerDelegate.of(query), query.getQueryObject(),
query.getFieldsObject(), entityClass, new QueryFindPublisherPreparer(query, entityClass));
}
@Override
public <T> Mono<Window<T>> scroll(Query query, Class<T> entityType) {
Assert.notNull(entityType, "Entity type must not be null");
return scroll(query, entityType, getCollectionName(entityType));
}
@Override
public <T> Mono<Window<T>> scroll(Query query, Class<T> entityType, String collectionName) {
return doScroll(query, entityType, entityType, collectionName);
}
<T> Mono<Window<T>> doScroll(Query query, Class<?> sourceClass, Class<T> targetClass, String collectionName) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(collectionName, "CollectionName must not be null");
Assert.notNull(sourceClass, "Entity type must not be null");
Assert.notNull(targetClass, "Target type must not be null");
EntityProjection<T, ?> projection = operations.introspectProjection(targetClass, sourceClass);
ProjectingReadCallback<?, T> callback = new ProjectingReadCallback<>(mongoConverter, projection, collectionName);
int limit = query.isLimited() ? query.getLimit() + 1 : Integer.MAX_VALUE;
if (query.hasKeyset()) {
KeysetScrollQuery keysetPaginationQuery = ScrollUtils.createKeysetPaginationQuery(query,
operations.getIdPropertyName(sourceClass));
Mono<List<T>> result = doFind(collectionName, ReactiveCollectionPreparerDelegate.of(query),
keysetPaginationQuery.query(), keysetPaginationQuery.fields(), sourceClass,
new QueryFindPublisherPreparer(query, keysetPaginationQuery.sort(), limit, 0, sourceClass), callback).collectList();
return result.map(it -> ScrollUtils.createWindow(query, it, sourceClass, operations));
}
Mono<List<T>> result = doFind(collectionName, ReactiveCollectionPreparerDelegate.of(query), query.getQueryObject(),
query.getFieldsObject(), sourceClass,
new QueryFindPublisherPreparer(query, query.getSortObject(), limit, query.getSkip(), sourceClass), callback)
.collectList();
return result.map(
it -> ScrollUtils.createWindow(it, query.getLimit(), OffsetScrollPosition.positionFunction(query.getSkip())));
}
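The scroll support above reads `limit + 1` documents so the resulting `Window` can report whether a further page exists. A hedged sketch of keyset scrolling against this API (the `Person` type is an assumption, and `ScrollPosition.keyset()` comes from Spring Data Commons):

```java
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.Window;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class ScrollSketch {

	// Fetches the first window of 10 entities using keyset pagination.
	Mono<Window<Person>> firstWindow(ReactiveMongoTemplate template) {

		Query query = new Query().limit(10).with(ScrollPosition.keyset());
		return template.scroll(query, Person.class);
	}

	// Continues from a previous window; positionAt(-1) points at its last element.
	Mono<Window<Person>> nextWindow(ReactiveMongoTemplate template, Window<Person> previous) {

		Query query = new Query().limit(10).with(previous.positionAt(-1));
		return template.scroll(query, Person.class);
	}
}
```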
@Override
@@ -825,7 +909,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
String idKey = operations.getIdPropertyName(entityClass);
return doFindOne(collectionName, new Document(idKey, id), null, entityClass, (Collation) null);
return doFindOne(collectionName, CollectionPreparer.identity(), new Document(idKey, id), null, entityClass,
(Collation) null);
}
@Override
@@ -850,6 +935,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Document mappedQuery = distinctQueryContext.getMappedQuery(entity);
String mappedFieldName = distinctQueryContext.getMappedFieldName(entity);
Class<T> mongoDriverCompatibleType = distinctQueryContext.getDriverCompatibleClass(resultClass);
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(query);
Flux<?> result = execute(collectionName, collection -> {
@@ -859,11 +945,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
FindPublisherPreparer preparer = new QueryFindPublisherPreparer(query, entityClass);
if (preparer.hasReadPreference()) {
collection = collection.withReadPreference(preparer.getReadPreference());
}
DistinctPublisher<T> publisher = collection.distinct(mappedFieldName, mappedQuery, mongoDriverCompatibleType);
DistinctPublisher<T> publisher = collectionPreparer.prepare(collection).distinct(mappedFieldName, mappedQuery,
mongoDriverCompatibleType);
distinctQueryContext.applyCollation(entityClass, publisher::collation);
return publisher;
});
@@ -929,7 +1013,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
boolean isOutOrMerge, AggregationOptions options, ReadDocumentCallback<O> readCallback,
@Nullable Class<?> inputType) {
AggregatePublisher<Document> cursor = collection.aggregate(pipeline, Document.class)
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(options);
AggregatePublisher<Document> cursor = collectionPreparer.prepare(collection).aggregate(pipeline, Document.class)
.allowDiskUse(options.isAllowDiskUse());
if (options.getCursorBatchSize() != null) {
@@ -987,8 +1072,19 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
GeoNearResultDocumentCallback<T> callback = new GeoNearResultDocumentCallback<>(distanceField,
new ProjectingReadCallback<>(mongoConverter, projection, collection), near.getMetric());
Builder optionsBuilder = AggregationOptions.builder();
if (near.hasReadPreference()) {
optionsBuilder.readPreference(near.getReadPreference());
}
if (near.hasReadConcern()) {
optionsBuilder.readConcern(near.getReadConcern());
}
optionsBuilder.collation(near.getCollation());
Aggregation $geoNear = TypedAggregation.newAggregation(entityClass, Aggregation.geoNear(near, distanceField))
.withOptions(AggregationOptions.builder().collation(near.getCollation()).build());
.withOptions(optionsBuilder.build());
return aggregate($geoNear, collection, Document.class) //
.concatMap(callback::doWith);
@@ -1028,8 +1124,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
operations.forType(entityClass).getCollation(query).ifPresent(optionsToUse::collation);
}
return doFindAndModify(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), entityClass, update, optionsToUse);
return doFindAndModify(collectionName, ReactiveCollectionPreparerDelegate.of(query), query.getQueryObject(),
query.getFieldsObject(), getMappedSortObject(query, entityClass), entityClass, update, optionsToUse);
}
@Override
@@ -1053,6 +1149,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Document mappedQuery = queryContext.getMappedQuery(entity);
Document mappedFields = queryContext.getMappedFields(entity, projection);
Document mappedSort = queryContext.getMappedSort(entity);
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(query);
return Mono.defer(() -> {
@@ -1070,8 +1167,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
mapped.getCollection()));
}).flatMap(it -> {
Mono<T> afterFindAndReplace = doFindAndReplace(it.getCollection(), mappedQuery, mappedFields, mappedSort,
queryContext.getCollation(entityType).orElse(null), entityType, it.getTarget(), options, projection);
Mono<T> afterFindAndReplace = doFindAndReplace(it.getCollection(), collectionPreparer, mappedQuery,
mappedFields, mappedSort, queryContext.getCollation(entityType).orElse(null), entityType, it.getTarget(),
options, projection);
return afterFindAndReplace.flatMap(saved -> {
maybeEmitEvent(new AfterSaveEvent<>(saved, it.getTarget(), it.getCollection()));
return maybeCallAfterSave(saved, it.getTarget(), it.getCollection());
@@ -1089,9 +1187,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Mono<T> findAndRemove(Query query, Class<T> entityClass, String collectionName) {
operations.forType(entityClass).getCollation(query);
return doFindAndRemove(collectionName, query.getQueryObject(), query.getFieldsObject(),
getMappedSortObject(query, entityClass), operations.forType(entityClass).getCollation(query).orElse(null),
entityClass);
return doFindAndRemove(collectionName, ReactiveCollectionPreparerDelegate.of(query), query.getQueryObject(),
query.getFieldsObject(), getMappedSortObject(query, entityClass),
operations.forType(entityClass).getCollation(query).orElse(null), entityClass);
}
/*
@@ -1685,7 +1783,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
deferredFilter = Mono.just(filter);
}
ReplaceOptions replaceOptions = updateContext.getReplaceOptions(entityClass);
com.mongodb.client.model.ReplaceOptions replaceOptions = updateContext.getReplaceOptions(entityClass);
return deferredFilter.flatMap(it -> Mono.from(collectionToUse.replaceOne(it, updateObj, replaceOptions)));
}
@@ -1799,12 +1897,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName, entityClass,
null, removeQuery);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(query);
return execute(collectionName, collection -> {
maybeEmitEvent(new BeforeDeleteEvent<>(removeQuery, entityClass, collectionName));
MongoCollection<Document> collectionToUse = prepareCollection(collection, writeConcernToUse);
MongoCollection<Document> collectionToUse = collectionPreparer
.prepare(prepareCollection(collection, writeConcernToUse));
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Remove using query: %s in collection: %s.", serializeToJsonSafely(removeQuery),
@@ -1839,8 +1939,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
@Override
public <T> Flux<T> findAll(Class<T> entityClass, String collectionName) {
return executeFindMultiInternal(new FindCallback(null), FindPublisherPreparer.NO_OP_PREPARER,
new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName), collectionName);
return executeFindMultiInternal(new FindCallback(CollectionPreparer.identity(), null),
FindPublisherPreparer.NO_OP_PREPARER, new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName),
collectionName);
}
@Override
@@ -1859,6 +1960,34 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return doFindAndDelete(collectionName, query, entityClass);
}
@Override
public <T> Mono<UpdateResult> replace(Query query, T replacement, ReplaceOptions options, String collectionName) {
Assert.notNull(replacement, "Replacement must not be null");
return replace(query, (Class<T>) ClassUtils.getUserClass(replacement), replacement, options, collectionName);
}
protected <S, T> Mono<UpdateResult> replace(Query query, Class<S> entityType, T replacement, ReplaceOptions options,
String collectionName) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityType);
UpdateContext updateContext = queryOperations.replaceSingleContext(query,
operations.forEntity(replacement).toMappedDocument(this.mongoConverter), options.isUpsert());
return createMono(collectionName, collection -> {
Document mappedUpdate = updateContext.getMappedUpdate(entity);
MongoAction action = new MongoAction(writeConcern, MongoActionOperation.REPLACE, collectionName, entityType,
mappedUpdate, updateContext.getQueryObject());
MongoCollection<Document> collectionToUse = createCollectionPreparer(query, action).prepare(collection);
return collectionToUse.replaceOne(updateContext.getMappedQuery(entity), mappedUpdate,
updateContext.getReplaceOptions(entityType, it -> it.upsert(options.isUpsert())));
});
}
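The new `replace` entry point maps to the driver's `replaceOne` and honors the query-derived collection preparation plus the resolved `WriteConcern`. An illustrative caller, assuming `ReplaceOptions` offers a `replaceOptions()` factory with an `upsert()` toggle analogous to `FindAndReplaceOptions`:

```java
import com.mongodb.client.result.UpdateResult;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.ReplaceOptions;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class ReplaceSketch {

	// Replaces the first document matching the query, inserting it when absent.
	Mono<UpdateResult> upsertPerson(ReactiveMongoTemplate template, Person person) {

		Query byId = new Query(Criteria.where("_id").is(person.getId()));
		return template.replace(byId, person, ReplaceOptions.replaceOptions().upsert());
	}
}
```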
@Override
public <T> Flux<T> tail(Query query, Class<T> entityClass) {
return tail(query, entityClass, getCollectionName(entityClass));
@@ -1867,17 +1996,19 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
@Override
public <T> Flux<T> tail(@Nullable Query query, Class<T> entityClass, String collectionName) {
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(query);
if (query == null) {
LOGGER.debug(String.format("Tail for class: %s in collection: %s", entityClass, collectionName));
return executeFindMultiInternal(
collection -> new FindCallback(null).doInCollection(collection).cursorType(CursorType.TailableAwait),
collection -> new FindCallback(collectionPreparer, null).doInCollection(collection)
.cursorType(CursorType.TailableAwait),
FindPublisherPreparer.NO_OP_PREPARER, new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName),
collectionName);
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
return doFind(collectionName, collectionPreparer, query.getQueryObject(), query.getFieldsObject(), entityClass,
new TailingQueryFindPublisherPreparer(query, entityClass));
}
@@ -1920,10 +2051,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Object filter = options.getFilter().orElse(Collections.emptyList());
if (filter instanceof Aggregation) {
Aggregation agg = (Aggregation) filter;
AggregationOperationContext context = agg instanceof TypedAggregation
? new TypeBasedAggregationOperationContext(((TypedAggregation<?>) agg).getInputType(),
if (filter instanceof Aggregation agg) {
AggregationOperationContext context = agg instanceof TypedAggregation typedAggregation
? new TypeBasedAggregationOperationContext(typedAggregation.getInputType(),
getConverter().getMappingContext(), queryMapper)
: new RelaxedTypeBasedAggregationOperationContext(Object.class, mappingContext, queryMapper);
@@ -1961,12 +2091,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
assertLocalFunctionNames(mapFunction, reduceFunction);
ReactiveCollectionPreparerDelegate collectionPreparer = ReactiveCollectionPreparerDelegate.of(filterQuery);
return createFlux(inputCollectionName, collection -> {
Document mappedQuery = queryMapper.getMappedObject(filterQuery.getQueryObject(),
mappingContext.getPersistentEntity(domainType));
MapReducePublisher<Document> publisher = collection.mapReduce(mapFunction, reduceFunction, Document.class);
MapReducePublisher<Document> publisher = collectionPreparer.prepare(collection).mapReduce(mapFunction,
reduceFunction, Document.class);
publisher.filter(mappedQuery);
@@ -1975,8 +2107,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
publisher.sort(mappedSort);
}
if (filterQuery.getMeta().getMaxTimeMsec() != null) {
publisher.maxTime(filterQuery.getMeta().getMaxTimeMsec(), TimeUnit.MILLISECONDS);
Meta meta = filterQuery.getMeta();
if (meta.hasMaxTime()) {
publisher.maxTime(meta.getRequiredMaxTimeMsec(), TimeUnit.MILLISECONDS);
}
if (filterQuery.getLimit() > 0 || (options.getLimit() != null)) {
@@ -2133,16 +2266,18 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* The query document is specified as a standard {@link Document} and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from.
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @param collation can be {@literal null}.
* @return a {@link Mono} emitting the converted object.
*/
protected <T> Mono<T> doFindOne(String collectionName, Document query, @Nullable Document fields,
protected <T> Mono<T> doFindOne(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query, @Nullable Document fields,
Class<T> entityClass, @Nullable Collation collation) {
return doFindOne(collectionName, query, fields, entityClass,
return doFindOne(collectionName, collectionPreparer, query, fields, entityClass,
findPublisher -> collation != null ? findPublisher.collation(collation.toMongoCollation()) : findPublisher);
}
@@ -2151,6 +2286,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* The query document is specified as a standard {@link Document} and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from.
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
@@ -2158,7 +2294,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @return a {@link Mono} emitting the converted object.
* @since 2.2
*/
protected <T> Mono<T> doFindOne(String collectionName, Document query, @Nullable Document fields,
protected <T> Mono<T> doFindOne(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query, @Nullable Document fields,
Class<T> entityClass, FindPublisherPreparer preparer) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
@@ -2173,7 +2310,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
}
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields, preparer),
return executeFindOneInternal(new FindOneCallback(collectionPreparer, mappedQuery, mappedFields, preparer),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
}
@@ -2182,13 +2319,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* query document is specified as a standard Document and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
* @param entityClass the parameterized type of the returned list.
* @return the {@link Flux} of converted objects.
*/
protected <T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<T> entityClass) {
return doFind(collectionName, query, fields, entityClass, null,
protected <T> Flux<T> doFind(String collectionName, CollectionPreparer<MongoCollection<Document>> collectionPreparer,
Document query, Document fields, Class<T> entityClass) {
return doFind(collectionName, collectionPreparer, query, fields, entityClass, null,
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName));
}
@@ -2198,6 +2337,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* specified as a standard Document and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from.
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
@@ -2205,14 +2345,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* the result set, (apply limits, skips and so on).
* @return the {@link Flux} of converted objects.
*/
protected <T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<T> entityClass,
FindPublisherPreparer preparer) {
return doFind(collectionName, query, fields, entityClass, preparer,
protected <T> Flux<T> doFind(String collectionName, CollectionPreparer<MongoCollection<Document>> collectionPreparer,
Document query, Document fields, Class<T> entityClass, FindPublisherPreparer preparer) {
return doFind(collectionName, collectionPreparer, query, fields, entityClass, preparer,
new ReadDocumentCallback<>(mongoConverter, entityClass, collectionName));
}
protected <S, T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<S> entityClass,
@Nullable FindPublisherPreparer preparer, DocumentCallback<T> objectCallback) {
protected <S, T> Flux<T> doFind(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query, Document fields,
Class<S> entityClass, @Nullable FindPublisherPreparer preparer, DocumentCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
@@ -2225,8 +2366,23 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
collectionName);
return executeFindMultiInternal(new FindCallback(collectionPreparer, mappedQuery, mappedFields), preparer,
objectCallback, collectionName);
}
CollectionPreparer<MongoCollection<Document>> createCollectionPreparer(Query query) {
return ReactiveCollectionPreparerDelegate.of(query);
}
CollectionPreparer<MongoCollection<Document>> createCollectionPreparer(Query query, @Nullable MongoAction action) {
CollectionPreparer<MongoCollection<Document>> collectionPreparer = createCollectionPreparer(query);
if (action == null) {
return collectionPreparer;
}
return collectionPreparer.andThen(collection -> {
WriteConcern writeConcern = prepareWriteConcern(action);
return writeConcern != null ? collection.withWriteConcern(writeConcern) : collection;
});
}
/**
@@ -2235,8 +2391,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*
* @since 2.0
*/
<S, T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<S> sourceClass,
Class<T> targetClass, FindPublisherPreparer preparer) {
<S, T> Flux<T> doFind(String collectionName, CollectionPreparer<MongoCollection<Document>> collectionPreparer,
Document query, Document fields, Class<S> sourceClass, Class<T> targetClass, FindPublisherPreparer preparer) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(sourceClass);
EntityProjection<T, S> projection = operations.introspectProjection(targetClass, sourceClass);
@@ -2250,7 +2406,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
serializeToJsonSafely(mappedQuery), mappedFields, sourceClass, collectionName));
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer,
return executeFindMultiInternal(new FindCallback(collectionPreparer, mappedQuery, mappedFields), preparer,
new ProjectingReadCallback<>(mongoConverter, projection, collectionName), collectionName);
}
@@ -2268,13 +2424,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* The first document that matches the query is returned and also removed from the collection in the database. <br />
* The query document is specified as a standard Document and so is the fields specification.
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param collation collation
* @param collectionName name of the collection to retrieve the objects from.
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param query the query document that specifies the criteria used to find a record.
* @param collation the collation to apply. Can be {@literal null}.
* @param entityClass the parameterized type of the returned list.
* @return a {@link Mono} emitting the removed object.
*/
protected <T> Mono<T> doFindAndRemove(String collectionName, Document query, Document fields, Document sort,
protected <T> Mono<T> doFindAndRemove(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query, Document fields, Document sort,
@Nullable Collation collation, Class<T> entityClass) {
if (LOGGER.isDebugEnabled()) {
@@ -2284,12 +2442,13 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindOneInternal(
new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort, collation),
return executeFindOneInternal(new FindAndRemoveCallback(collectionPreparer,
queryMapper.getMappedObject(query, entity), fields, sort, collation),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
}
protected <T> Mono<T> doFindAndModify(String collectionName, Document query, Document fields, Document sort,
protected <T> Mono<T> doFindAndModify(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query, Document fields, Document sort,
Class<T> entityClass, UpdateDefinition update, FindAndModifyOptions options) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
@@ -2310,7 +2469,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
return executeFindOneInternal(
new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate,
new FindAndModifyCallback(collectionPreparer, mappedQuery, fields, sort, mappedUpdate,
update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()), options),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
});
@@ -2320,6 +2479,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* Customize this part for findAndReplace.
*
* @param collectionName The name of the collection to perform the operation in.
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param mappedQuery the query to look up documents.
* @param mappedFields the fields to project the result to.
* @param mappedSort the sort to be applied when executing the query.
@@ -2332,20 +2492,22 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* {@literal false} and {@link FindAndReplaceOptions#isUpsert() upsert} is {@literal false}.
* @since 2.1
*/
protected <T> Mono<T> doFindAndReplace(String collectionName, Document mappedQuery, Document mappedFields,
protected <T> Mono<T> doFindAndReplace(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document mappedQuery, Document mappedFields,
Document mappedSort, com.mongodb.client.model.Collation collation, Class<?> entityType, Document replacement,
FindAndReplaceOptions options, Class<T> resultType) {
EntityProjection<T, ?> projection = operations.introspectProjection(resultType, entityType);
return doFindAndReplace(collectionName, mappedQuery, mappedFields, mappedSort, collation, entityType, replacement,
options, projection);
return doFindAndReplace(collectionName, collectionPreparer, mappedQuery, mappedFields, mappedSort, collation,
entityType, replacement, options, projection);
}
/**
* Customize this part for findAndReplace.
*
* @param collectionName The name of the collection to perform the operation in.
* @param collectionPreparer the preparer to prepare the collection for the actual use.
* @param mappedQuery the query to look up documents.
* @param mappedFields the fields to project the result to.
* @param mappedSort the sort to be applied when executing the query.
@@ -2358,7 +2520,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* {@literal false} and {@link FindAndReplaceOptions#isUpsert() upsert} is {@literal false}.
* @since 3.4
*/
private <T> Mono<T> doFindAndReplace(String collectionName, Document mappedQuery, Document mappedFields,
private <T> Mono<T> doFindAndReplace(String collectionName,
CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document mappedQuery, Document mappedFields,
Document mappedSort, com.mongodb.client.model.Collation collation, Class<?> entityType, Document replacement,
FindAndReplaceOptions options, EntityProjection<T, ?> projection) {
@@ -2372,8 +2535,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
serializeToJsonSafely(replacement), collectionName));
}
return executeFindOneInternal(
new FindAndReplaceCallback(mappedQuery, mappedFields, mappedSort, replacement, collation, options),
return executeFindOneInternal(new FindAndReplaceCallback(collectionPreparer, mappedQuery, mappedFields,
mappedSort, replacement, collation, options),
new ProjectingReadCallback<>(this.mongoConverter, projection, collectionName), collectionName);
});
@@ -2451,7 +2614,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @param collection
*/
protected MongoCollection<Document> prepareCollection(MongoCollection<Document> collection) {
return this.readPreference != null ? collection.withReadPreference(readPreference) : collection;
if (this.readPreference != null && this.readPreference != collection.getReadPreference()) {
return collection.withReadPreference(readPreference);
}
return collection;
}
/**
@@ -2494,7 +2662,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking)) {
if (wc == null || wc.getWObject() == null
|| (wc.getWObject() instanceof Number && ((Number) wc.getWObject()).intValue() < 1)) {
|| (wc.getWObject() instanceof Number concern && concern.intValue() < 1)) {
return WriteConcern.ACKNOWLEDGED;
}
}
@@ -2557,8 +2725,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return throwable -> {
if (throwable instanceof RuntimeException) {
return potentiallyConvertRuntimeException((RuntimeException) throwable, exceptionTranslator);
if (throwable instanceof RuntimeException runtimeException) {
return potentiallyConvertRuntimeException(runtimeException, exceptionTranslator);
}
return throwable;
@@ -2600,13 +2768,24 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return converter;
}
@Nullable
private Document getMappedSortObject(Query query, Class<?> type) {
if (query == null) {
return null;
}
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
return getMappedSortObject(query.getSortObject(), type);
}
@Nullable
private Document getMappedSortObject(Document sortObject, Class<?> type) {
if (ObjectUtils.isEmpty(sortObject)) {
return null;
}
return queryMapper.getMappedSort(sortObject, mappingContext.getPersistentEntity(type));
}
// Callback implementations
@@ -2621,11 +2800,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
private static class FindOneCallback implements ReactiveCollectionCallback<Document> {
private final CollectionPreparer<MongoCollection<Document>> collectionPreparer;
private final Document query;
private final Optional<Document> fields;
private final FindPublisherPreparer preparer;
FindOneCallback(Document query, @Nullable Document fields, FindPublisherPreparer preparer) {
FindOneCallback(CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query,
@Nullable Document fields, FindPublisherPreparer preparer) {
this.collectionPreparer = collectionPreparer;
this.query = query;
this.fields = Optional.ofNullable(fields);
this.preparer = preparer;
@@ -2635,14 +2817,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public Publisher<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(
String.format("findOne using query: %s fields: %s in db.collection: %s", serializeToJsonSafely(query),
serializeToJsonSafely(fields.orElseGet(Document::new)), collection.getNamespace().getFullName()));
}
FindPublisher<Document> publisher = preparer.initiateFind(collection, col -> col.find(query, Document.class));
FindPublisher<Document> publisher = preparer.initiateFind(collectionPreparer.prepare(collection),
col -> col.find(query, Document.class));
if (fields.isPresent()) {
publisher = publisher.projection(fields.get());
@@ -2660,15 +2836,17 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
private static class FindCallback implements ReactiveCollectionQueryCallback<Document> {
private final CollectionPreparer<MongoCollection<Document>> collectionPreparer;
private final @Nullable Document query;
private final @Nullable Document fields;
FindCallback(@Nullable Document query) {
this(query, null);
FindCallback(CollectionPreparer<MongoCollection<Document>> collectionPreparer, @Nullable Document query) {
this(collectionPreparer, query, null);
}
FindCallback(Document query, Document fields) {
FindCallback(CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query, Document fields) {
this.collectionPreparer = collectionPreparer;
this.query = query;
this.fields = fields;
}
@@ -2676,11 +2854,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
@Override
public FindPublisher<Document> doInCollection(MongoCollection<Document> collection) {
MongoCollection<Document> collectionToUse = collectionPreparer.prepare(collection);
FindPublisher<Document> findPublisher;
if (ObjectUtils.isEmpty(query)) {
findPublisher = collection.find(Document.class);
findPublisher = collectionToUse.find(Document.class);
} else {
findPublisher = collection.find(query, Document.class);
findPublisher = collectionToUse.find(query, Document.class);
}
if (ObjectUtils.isEmpty(fields)) {
@@ -2699,13 +2878,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
private static class FindAndRemoveCallback implements ReactiveCollectionCallback<Document> {
private final CollectionPreparer<MongoCollection<Document>> collectionPreparer;
private final Document query;
private final Document fields;
private final Document sort;
private final Optional<Collation> collation;
FindAndRemoveCallback(Document query, Document fields, Document sort, @Nullable Collation collation) {
FindAndRemoveCallback(CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query,
Document fields, Document sort, @Nullable Collation collation) {
this.collectionPreparer = collectionPreparer;
this.query = query;
this.fields = fields;
this.sort = sort;
@@ -2719,7 +2900,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
FindOneAndDeleteOptions findOneAndDeleteOptions = convertToFindOneAndDeleteOptions(fields, sort);
collation.map(Collation::toMongoCollation).ifPresent(findOneAndDeleteOptions::collation);
return collection.findOneAndDelete(query, findOneAndDeleteOptions);
return collectionPreparer.prepare(collection).findOneAndDelete(query, findOneAndDeleteOptions);
}
}
@@ -2728,6 +2909,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
private static class FindAndModifyCallback implements ReactiveCollectionCallback<Document> {
private final CollectionPreparer<MongoCollection<Document>> collectionPreparer;
private final Document query;
private final Document fields;
private final Document sort;
@@ -2735,9 +2917,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
FindAndModifyCallback(Document query, Document fields, Document sort, Object update, List<Document> arrayFilters,
FindAndModifyOptions options) {
FindAndModifyCallback(CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query,
Document fields, Document sort, Object update, List<Document> arrayFilters, FindAndModifyOptions options) {
this.collectionPreparer = collectionPreparer;
this.query = query;
this.fields = fields;
this.sort = sort;
@@ -2750,21 +2933,22 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public Publisher<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
MongoCollection<Document> collectionToUse = collectionPreparer.prepare(collection);
if (options.isRemove()) {
FindOneAndDeleteOptions findOneAndDeleteOptions = convertToFindOneAndDeleteOptions(fields, sort);
findOneAndDeleteOptions = options.getCollation().map(Collation::toMongoCollation)
.map(findOneAndDeleteOptions::collation).orElse(findOneAndDeleteOptions);
return collection.findOneAndDelete(query, findOneAndDeleteOptions);
return collectionToUse.findOneAndDelete(query, findOneAndDeleteOptions);
}
FindOneAndUpdateOptions findOneAndUpdateOptions = convertToFindOneAndUpdateOptions(options, fields, sort,
arrayFilters);
if (update instanceof Document) {
return collection.findOneAndUpdate(query, (Document) update, findOneAndUpdateOptions);
if (update instanceof Document document) {
return collectionToUse.findOneAndUpdate(query, document, findOneAndUpdateOptions);
} else if (update instanceof List) {
return collection.findOneAndUpdate(query, (List<Document>) update, findOneAndUpdateOptions);
return collectionToUse.findOneAndUpdate(query, (List<Document>) update, findOneAndUpdateOptions);
}
return Flux
@@ -2803,6 +2987,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
private static class FindAndReplaceCallback implements ReactiveCollectionCallback<Document> {
private final CollectionPreparer<MongoCollection<Document>> collectionPreparer;
private final Document query;
private final Document fields;
private final Document sort;
@@ -2810,9 +2995,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final @Nullable com.mongodb.client.model.Collation collation;
private final FindAndReplaceOptions options;
FindAndReplaceCallback(Document query, Document fields, Document sort, Document update,
com.mongodb.client.model.Collation collation, FindAndReplaceOptions options) {
FindAndReplaceCallback(CollectionPreparer<MongoCollection<Document>> collectionPreparer, Document query,
Document fields, Document sort, Document update, com.mongodb.client.model.Collation collation,
FindAndReplaceOptions options) {
this.collectionPreparer = collectionPreparer;
this.query = query;
this.fields = fields;
this.sort = sort;
@@ -2826,7 +3012,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
throws MongoException, DataAccessException {
FindOneAndReplaceOptions findOneAndReplaceOptions = convertToFindOneAndReplaceOptions(options, fields, sort);
return collection.findOneAndReplace(query, update, findOneAndReplaceOptions);
return collectionPreparer.prepare(collection).findOneAndReplace(query, update, findOneAndReplaceOptions);
}
private FindOneAndReplaceOptions convertToFindOneAndReplaceOptions(FindAndReplaceOptions options, Document fields,
@@ -3022,11 +3208,24 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
class QueryFindPublisherPreparer implements FindPublisherPreparer {
private final Query query;
private final Document sortObject;
private final int limit;
private final long skip;
private final @Nullable Class<?> type;
QueryFindPublisherPreparer(Query query, @Nullable Class<?> type) {
this(query, query.getSortObject(), query.getLimit(), query.getSkip(), type);
}
QueryFindPublisherPreparer(Query query, Document sortObject, int limit, long skip, @Nullable Class<?> type) {
this.query = query;
this.sortObject = sortObject;
this.limit = limit;
this.skip = skip;
this.type = type;
}
@@ -3041,23 +3240,23 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
HintFunction hintFunction = HintFunction.from(query.getHint());
Meta meta = query.getMeta();
if (query.getSkip() <= 0 && query.getLimit() <= 0 && ObjectUtils.isEmpty(query.getSortObject())
&& !hintFunction.isPresent() && !meta.hasValues()) {
if (skip <= 0 && limit <= 0 && ObjectUtils.isEmpty(sortObject) && hintFunction.isEmpty()
&& !meta.hasValues()) {
return findPublisherToUse;
}
try {
if (query.getSkip() > 0) {
findPublisherToUse = findPublisherToUse.skip((int) query.getSkip());
if (skip > 0) {
findPublisherToUse = findPublisherToUse.skip((int) skip);
}
if (query.getLimit() > 0) {
findPublisherToUse = findPublisherToUse.limit(query.getLimit());
if (limit > 0) {
findPublisherToUse = findPublisherToUse.limit(limit);
}
if (!ObjectUtils.isEmpty(query.getSortObject())) {
Document sort = type != null ? getMappedSortObject(query, type) : query.getSortObject();
if (!ObjectUtils.isEmpty(sortObject)) {
Document sort = type != null ? getMappedSortObject(sortObject, type) : sortObject;
findPublisherToUse = findPublisherToUse.sort(sort);
}
@@ -3068,12 +3267,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (meta.hasValues()) {
if (StringUtils.hasText(meta.getComment())) {
findPublisherToUse = findPublisherToUse.comment(meta.getComment());
if (meta.hasComment()) {
findPublisherToUse = findPublisherToUse.comment(meta.getRequiredComment());
}
if (meta.getMaxTimeMsec() != null) {
findPublisherToUse = findPublisherToUse.maxTime(meta.getMaxTimeMsec(), TimeUnit.MILLISECONDS);
if (meta.hasMaxTime()) {
findPublisherToUse = findPublisherToUse.maxTime(meta.getRequiredMaxTimeMsec(), TimeUnit.MILLISECONDS);
}
if (meta.getCursorBatchSize() != null) {
@@ -3092,11 +3291,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return findPublisherToUse;
}
@Override
public ReadPreference getReadPreference() {
return query.getMeta().getFlags().contains(CursorOption.SECONDARY_READS) ? ReadPreference.primaryPreferred()
: null;
}
}
class TailingQueryFindPublisherPreparer extends QueryFindPublisherPreparer {
@@ -3179,9 +3373,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
PersistentEntity<?, ?> entity = event.getPersistentEntity();
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
if (entity instanceof MongoPersistentEntity<?> mongoPersistentEntity) {
onCheckForIndexes((MongoPersistentEntity<?>) entity, subscriptionExceptionHandler);
onCheckForIndexes(mongoPersistentEntity, subscriptionExceptionHandler);
}
}
}


@@ -69,7 +69,7 @@ public interface ReactiveRemoveOperation {
/**
* Remove and return all matching documents. <br/>
* <strong>NOTE</strong> The entire list of documents will be fetched before sending the actual delete commands.
* <strong>NOTE:</strong> The entire list of documents will be fetched before sending the actual delete commands.
* Also, {@link org.springframework.context.ApplicationEvent}s will be published for each and every delete
* operation.
*


@@ -72,13 +72,30 @@ public interface ReactiveUpdateOperation {
Mono<T> findAndModify();
}
/**
* Trigger <a href="https://docs.mongodb.com/manual/reference/method/db.collection.replaceOne/">replaceOne</a>
* execution by calling one of the terminating methods.
*
* @author Christoph Strobl
* @since 4.2
*/
interface TerminatingReplace {
/**
* Find first and replace/upsert.
*
* @return never {@literal null}.
*/
Mono<UpdateResult> replaceFirst();
}
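
The fluent entry point for the new `replaceOne` support can be sketched as follows; `template`, the `Person` type, and the criteria are illustrative assumptions, not part of this change:

```java
// Hypothetical usage of TerminatingReplace via the fluent reactive API.
Mono<UpdateResult> result = template.update(Person.class)
		.matching(query(where("firstname").is("Tom")))
		.replaceWith(new Person("Tom", "Sawyer"))
		.replaceFirst();
```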
/**
* Compose findAndReplace execution by calling one of the terminating methods.
*
* @author Mark Paluch
* @since 2.1
*/
interface TerminatingFindAndReplace<T> {
interface TerminatingFindAndReplace<T> extends TerminatingReplace {
/**
* Find, replace and return the first matching document.
@@ -202,6 +219,22 @@ public interface ReactiveUpdateOperation {
TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options);
}
/**
* @author Christoph Strobl
* @since 4.2
*/
interface ReplaceWithOptions extends TerminatingReplace {
/**
* Explicitly define {@link ReplaceOptions}.
*
* @param options must not be {@literal null}.
* @return new instance of {@link TerminatingReplace}.
* @throws IllegalArgumentException if options is {@literal null}.
*/
TerminatingReplace withOptions(ReplaceOptions options);
}
/**
* Define {@link FindAndReplaceOptions}.
*
@@ -209,7 +242,7 @@ public interface ReactiveUpdateOperation {
* @author Christoph Strobl
* @since 2.1
*/
interface FindAndReplaceWithOptions<T> extends TerminatingFindAndReplace<T> {
interface FindAndReplaceWithOptions<T> extends TerminatingFindAndReplace<T>, ReplaceWithOptions {
/**
* Explicitly define {@link FindAndReplaceOptions} for the {@link Update}.


@@ -165,6 +165,17 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
replacement, targetType);
}
@Override
public TerminatingReplace withOptions(ReplaceOptions options) {
FindAndReplaceOptions target = new FindAndReplaceOptions();
if (options.isUpsert()) {
target.upsert();
}
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
target, replacement, targetType);
}
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {
@@ -174,6 +185,18 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, resultType);
}
@Override
public Mono<UpdateResult> replaceFirst() {
if (replacement != null) {
return template.replace(query, domainType, replacement,
findAndReplaceOptions != null ? findAndReplaceOptions : ReplaceOptions.none(), getCollectionName());
}
return template.replace(query, domainType, update,
findAndReplaceOptions != null ? findAndReplaceOptions : ReplaceOptions.none(), getCollectionName());
}
private Mono<UpdateResult> doUpdate(boolean multi, boolean upsert) {
return template.doUpdate(getCollectionName(), query, update, domainType, upsert, multi);
}


@@ -0,0 +1,46 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.lang.Nullable;
import com.mongodb.ReadConcern;
/**
* Interface to be implemented by any object that wishes to expose the {@link ReadConcern}.
* <p>
* Typically implemented by cursor or query preparer objects.
*
* @author Mark Paluch
* @since 4.1
* @see org.springframework.data.mongodb.core.query.Query
* @see org.springframework.data.mongodb.core.aggregation.AggregationOptions
*/
public interface ReadConcernAware {
/**
* @return {@literal true} if a {@link ReadConcern} is set.
*/
default boolean hasReadConcern() {
return getReadConcern() != null;
}
/**
* @return the {@link ReadConcern} to apply or {@literal null} if none set.
*/
@Nullable
ReadConcern getReadConcern();
}


@@ -27,6 +27,8 @@ import com.mongodb.ReadPreference;
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
* @see org.springframework.data.mongodb.core.query.Query
* @see org.springframework.data.mongodb.core.aggregation.AggregationOptions
*/
public interface ReadPreferenceAware {


@@ -0,0 +1,87 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.core.query.Query;
/**
* Options for {@link org.springframework.data.mongodb.core.MongoOperations#replace(Query, Object) replace operations}. Defaults to
* <dl>
* <dt>upsert</dt>
* <dd>false</dd>
* </dl>
*
* @author Jakub Zurawa
* @author Christoph Strobl
* @since 4.2
*/
public class ReplaceOptions {
private boolean upsert;
private static final ReplaceOptions NONE = new ReplaceOptions() {
private static final String ERROR_MSG = "ReplaceOptions.none() cannot be changed; Please use ReplaceOptions.replaceOptions() instead";
@Override
public ReplaceOptions upsert() {
throw new UnsupportedOperationException(ERROR_MSG);
}
};
/**
* Static factory method to create a {@link ReplaceOptions} instance.
* <dl>
* <dt>upsert</dt>
* <dd>false</dd>
* </dl>
*
* @return new instance of {@link ReplaceOptions}.
*/
public static ReplaceOptions replaceOptions() {
return new ReplaceOptions();
}
/**
* Static factory method returning an unmodifiable {@link ReplaceOptions} instance.
*
* @return unmodifiable {@link ReplaceOptions} instance.
*/
public static ReplaceOptions none() {
return NONE;
}
/**
* Insert a new document if none exists.
*
* @return this.
*/
public ReplaceOptions upsert() {
this.upsert = true;
return this;
}
/**
* Get the flag indicating whether to insert a new document if none exists.
*
* @return {@literal true} if set.
*/
public boolean isUpsert() {
return upsert;
}
}
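
A short sketch of how these options might be combined with the fluent API; the surrounding `template` and `Person` are illustrative assumptions:

```java
// replaceOptions() yields a mutable instance; none() rejects mutation.
ReplaceOptions upserting = ReplaceOptions.replaceOptions().upsert();

// Hypothetical fluent usage with an explicit upsert:
Mono<UpdateResult> result = template.update(Person.class)
		.matching(query(where("firstname").is("Tom")))
		.replaceWith(new Person("Dick", "Sawyer"))
		.withOptions(upserting)
		.replaceFirst();
```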


@@ -0,0 +1,268 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.function.IntFunction;
import org.bson.BsonNull;
import org.bson.Document;
import org.springframework.data.domain.KeysetScrollPosition;
import org.springframework.data.domain.ScrollPosition;
import org.springframework.data.domain.ScrollPosition.Direction;
import org.springframework.data.domain.Window;
import org.springframework.data.mongodb.core.EntityOperations.Entity;
import org.springframework.data.mongodb.core.query.Query;
/**
* Utilities to run scroll queries and create {@link Window} results.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 4.1
*/
class ScrollUtils {
/**
* Create the actual query to run keyset-based pagination. Affects projection, sorting, and the criteria.
*
* @param query
* @param idPropertyName
* @return
*/
static KeysetScrollQuery createKeysetPaginationQuery(Query query, String idPropertyName) {
KeysetScrollPosition keyset = query.getKeyset();
KeysetScrollDirector director = KeysetScrollDirector.of(keyset.getDirection());
Document sortObject = director.getSortObject(idPropertyName, query);
Document fieldsObject = director.getFieldsObject(query.getFieldsObject(), sortObject);
Document queryObject = director.createQuery(keyset, query.getQueryObject(), sortObject);
return new KeysetScrollQuery(queryObject, fieldsObject, sortObject);
}
static <T> Window<T> createWindow(Query query, List<T> result, Class<?> sourceType, EntityOperations operations) {
Document sortObject = query.getSortObject();
KeysetScrollPosition keyset = query.getKeyset();
Direction direction = keyset.getDirection();
KeysetScrollDirector director = KeysetScrollDirector.of(direction);
List<T> resultsToUse = director.postPostProcessResults(result, query.getLimit());
IntFunction<ScrollPosition> positionFunction = value -> {
T last = resultsToUse.get(value);
Entity<T> entity = operations.forEntity(last);
Map<String, Object> keys = entity.extractKeys(sortObject, sourceType);
return ScrollPosition.of(keys, direction);
};
return Window.from(resultsToUse, positionFunction, hasMoreElements(result, query.getLimit()));
}
static <T> Window<T> createWindow(List<T> result, int limit, IntFunction<? extends ScrollPosition> positionFunction) {
return Window.from(getSubList(result, limit), positionFunction, hasMoreElements(result, limit));
}
static boolean hasMoreElements(List<?> result, int limit) {
return !result.isEmpty() && result.size() > limit;
}
static <T> List<T> getSubList(List<T> result, int limit) {
if (limit > 0 && result.size() > limit) {
return result.subList(0, limit);
}
return result;
}
record KeysetScrollQuery(Document query, Document fields, Document sort) {
}
/**
* Director for keyset scrolling.
*/
static class KeysetScrollDirector {
private static final KeysetScrollDirector FORWARD = new KeysetScrollDirector();
private static final KeysetScrollDirector REVERSE = new ReverseKeysetScrollDirector();
/**
* Factory method to obtain the right {@link KeysetScrollDirector}.
*
* @param direction
* @return
*/
public static KeysetScrollDirector of(ScrollPosition.Direction direction) {
return direction == Direction.FORWARD ? FORWARD : REVERSE;
}
public Document getSortObject(String idPropertyName, Query query) {
Document sortObject = query.isSorted() ? query.getSortObject() : new Document();
sortObject.put(idPropertyName, 1);
return sortObject;
}
public Document getFieldsObject(Document fieldsObject, Document sortObject) {
// make sure we can extract the keyset
if (!fieldsObject.isEmpty()) {
for (String field : sortObject.keySet()) {
fieldsObject.put(field, 1);
}
}
return fieldsObject;
}
public Document createQuery(KeysetScrollPosition keyset, Document queryObject, Document sortObject) {
Map<String, Object> keysetValues = keyset.getKeys();
List<Document> or = (List<Document>) queryObject.getOrDefault("$or", new ArrayList<>());
List<String> sortKeys = new ArrayList<>(sortObject.keySet());
// first query doesn't come with a keyset
if (keysetValues.isEmpty()) {
return queryObject;
}
if (!keysetValues.keySet().containsAll(sortKeys)) {
throw new IllegalStateException("KeysetScrollPosition does not contain all keyset values");
}
// build matrix query for keyset paging that contains sort^2 queries
// reflecting a query that follows sort order semantics starting from the last returned keyset
for (int i = 0; i < sortKeys.size(); i++) {
Document sortConstraint = new Document();
for (int j = 0; j < sortKeys.size(); j++) {
String sortSegment = sortKeys.get(j);
int sortOrder = sortObject.getInteger(sortSegment);
Object o = keysetValues.get(sortSegment);
if (j >= i) { // tail segment
if (o instanceof BsonNull) {
throw new IllegalStateException(
"Cannot resume from KeysetScrollPosition. Offending key: '%s' is 'null'".formatted(sortSegment));
}
sortConstraint.put(sortSegment, new Document(getComparator(sortOrder), o));
break;
}
sortConstraint.put(sortSegment, o);
}
if (!sortConstraint.isEmpty()) {
or.add(sortConstraint);
}
}
if (!or.isEmpty()) {
queryObject.put("$or", or);
}
return queryObject;
}
protected String getComparator(int sortOrder) {
return sortOrder == 1 ? "$gt" : "$lt";
}
protected <T> List<T> postPostProcessResults(List<T> list, int limit) {
return getFirst(limit, list);
}
}
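The nested loops in `createQuery` build one `$or` branch per sort key: each branch pins the leading keys to the last-seen keyset values and applies a strict `$gt`/`$lt` comparison on the tail key. A self-contained sketch of that construction using plain maps instead of BSON documents (names such as `buildKeysetOr` are illustrative, not part of the Spring Data API):

```java
import java.util.*;

// Sketch of the keyset $or construction performed by KeysetScrollDirector.createQuery.
// The empty-keyset short-circuit and key validation from ScrollUtils are omitted for brevity.
public class KeysetOrSketch {

	// sortObject: field -> 1 or -1; keys: keyset values of the last returned document
	static List<Map<String, Object>> buildKeysetOr(Map<String, Integer> sortObject,
			Map<String, Object> keys) {

		List<String> sortKeys = new ArrayList<>(sortObject.keySet());
		List<Map<String, Object>> or = new ArrayList<>();

		for (int i = 0; i < sortKeys.size(); i++) {
			Map<String, Object> constraint = new LinkedHashMap<>();
			for (int j = 0; j < sortKeys.size(); j++) {
				String key = sortKeys.get(j);
				Object value = keys.get(key);
				if (j >= i) { // tail segment: strict comparison in sort direction
					String comparator = sortObject.get(key) == 1 ? "$gt" : "$lt";
					constraint.put(key, Map.of(comparator, value));
					break;
				}
				constraint.put(key, value); // head segments pinned to keyset values
			}
			or.add(constraint);
		}
		return or;
	}

	public static void main(String[] args) {
		Map<String, Integer> sort = new LinkedHashMap<>();
		sort.put("lastname", 1);
		sort.put("_id", 1);
		// Produces: [{lastname={$gt=Miller}}, {lastname=Miller, _id={$gt=42}}]
		System.out.println(buildKeysetOr(sort, Map.of("lastname", "Miller", "_id", 42)));
	}
}
```

The resulting branches follow sort-order semantics: the first branch advances past the keyset on the primary sort key; subsequent branches handle ties on the leading keys.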
/**
* Reverse scrolling director variant applying {@link Direction#BACKWARD}. In reverse scrolling,
* we need to flip directions for the actual query so that we do not get everything from the top position and apply
* the limit but rather flip the sort direction, apply the limit and then reverse the result to restore the actual
* sort order.
*/
private static class ReverseKeysetScrollDirector extends KeysetScrollDirector {
@Override
public Document getSortObject(String idPropertyName, Query query) {
Document sortObject = super.getSortObject(idPropertyName, query);
// flip sort direction for backward scrolling
for (String field : sortObject.keySet()) {
sortObject.put(field, sortObject.getInteger(field) == 1 ? -1 : 1);
}
return sortObject;
}
@Override
public <T> List<T> postPostProcessResults(List<T> list, int limit) {
// flip direction of the result list as we need to accommodate the flipped sort order for proper
// offset querying.
Collections.reverse(list);
return getLast(limit, list);
}
}
/**
* Return the first {@code count} items from the list.
*
* @param count
* @param list
* @return
* @param <T>
*/
static <T> List<T> getFirst(int count, List<T> list) {
if (count > 0 && list.size() > count) {
return list.subList(0, count);
}
return list;
}
/**
* Return the last {@code count} items from the list.
*
* @param count
* @param list
* @return
* @param <T>
*/
static <T> List<T> getLast(int count, List<T> list) {
if (count > 0 && list.size() > count) {
return list.subList(list.size() - count, list.size());
}
return list;
}
}


@@ -69,8 +69,8 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
@SuppressWarnings("unchecked")
private Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
if (value instanceof Field field) {
@@ -136,8 +136,8 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
List<Object> clone = new ArrayList<>((List<Object>) this.value);
if (value instanceof Collection && Expand.EXPAND_VALUES.equals(expandList)) {
clone.addAll((Collection<?>) value);
if (value instanceof Collection<?> collection && Expand.EXPAND_VALUES.equals(expandList)) {
clone.addAll(collection);
} else {
clone.add(value);
}


@@ -360,10 +360,8 @@ public class AccumulatorOperators {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDocument(((List<Object>) value).iterator().next(), context);
}
if (value instanceof List<?> list && list.size() == 1) {
return super.toDocument(list.iterator().next(), context);
}
return super.toDocument(value, context);
@@ -440,10 +438,8 @@ public class AccumulatorOperators {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDocument(((List<Object>) value).iterator().next(), context);
}
if (value instanceof List<?> list && list.size() == 1) {
return super.toDocument(list.iterator().next(), context);
}
return super.toDocument(value, context);
@@ -539,10 +535,8 @@ public class AccumulatorOperators {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDocument(((List<Object>) value).iterator().next(), context);
}
if (value instanceof List<?> list && list.size() == 1) {
return super.toDocument(list.iterator().next(), context);
}
return super.toDocument(value, context);
@@ -639,10 +633,8 @@ public class AccumulatorOperators {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDocument(((List<Object>) value).iterator().next(), context);
}
if (value instanceof List<?> list && list.size() == 1) {
return super.toDocument(list.iterator().next(), context);
}
return super.toDocument(value, context);
@@ -719,10 +711,8 @@ public class AccumulatorOperators {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDocument(((List<Object>) value).iterator().next(), context);
}
if (value instanceof List<?> list && list.size() == 1) {
return super.toDocument(list.iterator().next(), context);
}
return super.toDocument(value, context);
@@ -799,10 +789,8 @@ public class AccumulatorOperators {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDocument(((List<Object>) value).iterator().next(), context);
}
if (value instanceof List<?> list && list.size() == 1) {
return super.toDocument(list.iterator().next(), context);
}
return super.toDocument(value, context);

View File

@@ -148,7 +148,7 @@ public class AddFieldsOperation extends DocumentEnhancingOperation {
@Override
public AddFieldsOperationBuilder withValueOf(Object value) {
valueMap.put(field, value instanceof String ? Fields.fields((String) value) : value);
valueMap.put(field, value instanceof String stringValue ? Fields.fields(stringValue) : value);
return AddFieldsOperationBuilder.this;
}

View File

@@ -28,6 +28,7 @@ import org.springframework.data.mongodb.core.aggregation.AddFieldsOperation.AddF
import org.springframework.data.mongodb.core.aggregation.CountOperation.CountOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.FacetOperation.FacetOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder;
import org.springframework.data.mongodb.core.aggregation.LookupOperation.LookupOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.MergeOperation.MergeOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootDocumentOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootOperationBuilder;
@@ -50,6 +51,7 @@ import org.springframework.util.Assert;
* @author Nikolay Bogdanov
* @author Gustavo de Geus
* @author Jérôme Guyon
* @author Sangyong Choi
* @since 1.3
*/
public class Aggregation {
@@ -664,6 +666,23 @@ public class Aggregation {
return new LookupOperation(from, localField, foreignField, as);
}
/**
* Entry point for creating a {@link LookupOperation $lookup} stage using a fluent builder API.
* <pre class="code">
* Aggregation.lookup().from("restaurants")
* .localField("restaurant_name")
* .foreignField("name")
* .let(newVariable("orders_drink").forField("drink"))
* .pipeline(match(ctx -> new Document("$expr", new Document("$in", List.of("$$orders_drink", "$beverages")))))
* .as("matches")
* </pre>
* @return new instance of {@link LookupOperationBuilder}.
* @since 4.1
*/
public static LookupOperationBuilder lookup() {
return new LookupOperationBuilder();
}
/**
* Creates a new {@link CountOperationBuilder}.
*

View File

@@ -21,6 +21,10 @@ import org.springframework.data.mongodb.MongoExpression;
/**
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
* <p>
* The {@link AggregationExpression expression's} {@link #toDocument(AggregationOperationContext)} method is called
* during the mapping process to obtain the mapped, ready-to-use representation that can be handed over to the driver
* as part of an {@link AggregationOperation pipeline stage}.
*
* @author Thomas Darimont
* @author Oliver Gierke
@@ -39,11 +43,11 @@ public interface AggregationExpression extends MongoExpression {
*/
static AggregationExpression from(MongoExpression expression) {
if (expression instanceof AggregationExpression) {
return AggregationExpression.class.cast(expression);
if (expression instanceof AggregationExpression aggregationExpression) {
return aggregationExpression;
}
return (context) -> context.getMappedObject(expression.toDocument());
return context -> context.getMappedObject(expression.toDocument());
}
/**

View File

@@ -0,0 +1,58 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.EvaluationOperators.Expr;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
/**
* A {@link CriteriaDefinition criteria} to use {@code $expr} within a
* {@link org.springframework.data.mongodb.core.query.Query}.
*
* @author Christoph Strobl
* @since 4.1
*/
public class AggregationExpressionCriteria implements CriteriaDefinition {
private final AggregationExpression expression;
AggregationExpressionCriteria(AggregationExpression expression) {
this.expression = expression;
}
/**
* @param expression must not be {@literal null}.
* @return new instance of {@link AggregationExpressionCriteria}.
*/
public static AggregationExpressionCriteria whereExpr(AggregationExpression expression) {
return new AggregationExpressionCriteria(expression);
}
@Override
public Document getCriteriaObject() {
if (expression instanceof Expr expr) {
return new Document(getKey(), expr.get(0));
}
return new Document(getKey(), expression);
}
@Override
public String getKey() {
return "$expr";
}
}
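The `AggregationExpressionCriteria` above renders its expression under a `$expr` key. A sketch of the resulting filter-document shape, using plain maps in place of `org.bson.Document` (the `$gt` comparison is an invented example, not taken from the commit):

```java
import java.util.List;
import java.util.Map;

public class ExprShapeDemo {

    // Roughly the shape getCriteriaObject() produces for a "spent greater than budget" expression.
    static Map<String, Object> exprCriteria() {
        return Map.of("$expr", Map.of("$gt", List.of("$spent", "$budget")));
    }

    public static void main(String[] args) {
        System.out.println(exprCriteria());
    }
}
```

Such a document lets an aggregation expression act as a query filter, comparing two fields of the same document, which plain query criteria cannot express.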

View File

@@ -55,16 +55,15 @@ class AggregationOperationRenderer {
operationDocuments.addAll(operation.toPipelineStages(contextToUse));
if (operation instanceof FieldsExposingAggregationOperation) {
if (operation instanceof FieldsExposingAggregationOperation exposedFieldsOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
ExposedFields fields = exposedFieldsOperation.getFields();
if (operation instanceof InheritsFieldsAggregationOperation || exposedFieldsOperation.inheritsFields()) {
contextToUse = new InheritingExposedFieldsAggregationOperationContext(fields, contextToUse);
} else {
contextToUse = fields.exposesNoFields() ? DEFAULT_CONTEXT
: new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), contextToUse);
: new ExposedFieldsAggregationOperationContext(fields, contextToUse);
}
}
}

View File

@@ -19,11 +19,16 @@ import java.time.Duration;
import java.util.Optional;
import org.bson.Document;
import org.springframework.data.mongodb.core.ReadConcernAware;
import org.springframework.data.mongodb.core.ReadPreferenceAware;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of
* supported aggregation options can be found in the MongoDB reference documentation
@@ -39,7 +44,7 @@ import org.springframework.util.Assert;
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
public class AggregationOptions implements ReadConcernAware, ReadPreferenceAware {
private static final String BATCH_SIZE = "batchSize";
private static final String CURSOR = "cursor";
@@ -56,6 +61,10 @@ public class AggregationOptions {
private final Optional<Collation> collation;
private final Optional<String> comment;
private final Optional<Object> hint;
private Optional<ReadConcern> readConcern;
private Optional<ReadPreference> readPreference;
private Duration maxTime = Duration.ZERO;
private ResultOptions resultOptions = ResultOptions.READ;
private DomainTypeMapping domainTypeMapping = DomainTypeMapping.RELAXED;
@@ -123,6 +132,8 @@ public class AggregationOptions {
this.collation = Optional.ofNullable(collation);
this.comment = Optional.ofNullable(comment);
this.hint = Optional.ofNullable(hint);
this.readConcern = Optional.empty();
this.readPreference = Optional.empty();
}
/**
@@ -268,6 +279,26 @@ public class AggregationOptions {
return hint;
}
@Override
public boolean hasReadConcern() {
return readConcern.isPresent();
}
@Override
public ReadConcern getReadConcern() {
return readConcern.orElse(null);
}
@Override
public boolean hasReadPreference() {
return readPreference.isPresent();
}
@Override
public ReadPreference getReadPreference() {
return readPreference.orElse(null);
}
/**
* @return the time limit for processing. {@link Duration#ZERO} is used for the default unbounded behavior.
* @since 3.0
@@ -385,6 +416,8 @@ public class AggregationOptions {
private @Nullable Collation collation;
private @Nullable String comment;
private @Nullable Object hint;
private @Nullable ReadConcern readConcern;
private @Nullable ReadPreference readPreference;
private @Nullable Duration maxTime;
private @Nullable ResultOptions resultOptions;
private @Nullable DomainTypeMapping domainTypeMapping;
@@ -490,6 +523,32 @@ public class AggregationOptions {
return this;
}
/**
* Define a {@link ReadConcern} to apply to the aggregation.
*
* @param readConcern can be {@literal null}.
* @return this.
* @since 4.1
*/
public Builder readConcern(@Nullable ReadConcern readConcern) {
this.readConcern = readConcern;
return this;
}
/**
* Define a {@link ReadPreference} to apply to the aggregation.
*
* @param readPreference can be {@literal null}.
* @return this.
* @since 4.1
*/
public Builder readPreference(@Nullable ReadPreference readPreference) {
this.readPreference = readPreference;
return this;
}
/**
* Set the time limit for processing.
*
@@ -573,6 +632,12 @@ public class AggregationOptions {
if (domainTypeMapping != null) {
options.domainTypeMapping = domainTypeMapping;
}
if (readConcern != null) {
options.readConcern = Optional.of(readConcern);
}
if (readPreference != null) {
options.readPreference = Optional.of(readPreference);
}
return options;
}

View File

@@ -105,6 +105,6 @@ public class AggregationResults<T> implements Iterable<T> {
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");
return object instanceof String ? (String) object : null;
return object instanceof String stringValue ? stringValue : null;
}
}

View File

@@ -97,10 +97,8 @@ public class AggregationUpdate extends Aggregation implements UpdateDefinition {
super(pipeline);
for (AggregationOperation operation : pipeline) {
if (operation instanceof FieldsExposingAggregationOperation) {
((FieldsExposingAggregationOperation) operation).getFields().forEach(it -> {
keysTouched.add(it.getName());
});
if (operation instanceof FieldsExposingAggregationOperation exposingAggregationOperation) {
exposingAggregationOperation.getFields().forEach(it -> keysTouched.add(it.getName()));
}
}
}

View File

@@ -0,0 +1,133 @@
/*
* Copyright 2022-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* A special field that points to a {@code $$} variable expression.
*
* @author Christoph Strobl
* @since 4.1.3
*/
public interface AggregationVariable extends Field {
String PREFIX = "$$";
/**
* @return {@literal true} if the field's {@link #getName() name} does not match the defined {@link #getTarget()
* target}.
*/
@Override
default boolean isAliased() {
return !ObjectUtils.nullSafeEquals(getName(), getTarget());
}
@Override
default String getName() {
return getTarget();
}
@Override
default boolean isInternal() {
return false;
}
/**
* Create a new {@link AggregationVariable} for the given name.
* <p>
* Variable names start with {@code $$}. If the given value does not, it is prefixed with {@code $$}.
*
* @param value must not be {@literal null}.
* @return new instance of {@link AggregationVariable}.
* @throws IllegalArgumentException if given value is {@literal null}.
*/
static AggregationVariable variable(String value) {
Assert.notNull(value, "Value must not be null");
return new AggregationVariable() {
private final String val = AggregationVariable.prefixVariable(value);
@Override
public String getTarget() {
return val;
}
};
}
/**
* Create a new {@link #isInternal() local} {@link AggregationVariable} for the given name.
* <p>
* Variable names start with {@code $$}. If the given value does not, it is prefixed with {@code $$}.
*
* @param value must not be {@literal null}.
* @return new instance of {@link AggregationVariable}.
* @throws IllegalArgumentException if given value is {@literal null}.
*/
static AggregationVariable localVariable(String value) {
Assert.notNull(value, "Value must not be null");
return new AggregationVariable() {
private final String val = AggregationVariable.prefixVariable(value);
@Override
public String getTarget() {
return val;
}
@Override
public boolean isInternal() {
return true;
}
};
}
/**
* Check if the given field name reference may be a variable.
*
* @param fieldRef can be {@literal null}.
* @return {@literal true} if the given value matches the variable identification pattern.
*/
static boolean isVariable(@Nullable String fieldRef) {
return fieldRef != null && fieldRef.stripLeading().matches("^\\$\\$\\w.*");
}
/**
* Check if the given field may be a variable.
*
* @param field can be {@literal null}.
* @return {@literal true} if the given {@link Field field} is an {@link AggregationVariable} or if its value is a
* {@link #isVariable(String) variable}.
*/
static boolean isVariable(Field field) {
if (field instanceof AggregationVariable) {
return true;
}
return isVariable(field.getTarget());
}
private static String prefixVariable(String variable) {
var trimmed = variable.stripLeading();
return trimmed.startsWith(PREFIX) ? trimmed : (PREFIX + trimmed);
}
}
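`prefixVariable` and `isVariable(String)` above boil down to a prefix normalization and a regex check; a self-contained sketch using the same pattern (the class name is illustrative):

```java
public class VariableDemo {

    static final String PREFIX = "$$";

    // Prefix the name with "$$" unless it already starts with it; leading whitespace is ignored.
    static String prefixVariable(String variable) {
        String trimmed = variable.stripLeading();
        return trimmed.startsWith(PREFIX) ? trimmed : PREFIX + trimmed;
    }

    // Same pattern as the isVariable(String) check above: "$$" followed by a word character.
    static boolean isVariable(String fieldRef) {
        return fieldRef != null && fieldRef.stripLeading().matches("^\\$\\$\\w.*");
    }

    public static void main(String[] args) {
        System.out.println(prefixVariable("this"));   // $$this
        System.out.println(prefixVariable("$$this")); // $$this (unchanged)
        System.out.println(isVariable("$$value.x"));  // true
        System.out.println(isVariable("$field"));     // false
    }
}
```

Requiring a word character after `$$` keeps a lone `$$` (or `$$.`) from being treated as a variable reference.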

View File

@@ -79,7 +79,7 @@ public class ArrayOperators {
private final @Nullable String fieldReference;
private final @Nullable AggregationExpression expression;
private final @Nullable Collection values;
private final @Nullable Collection<?> values;
/**
* Creates new {@link ArrayOperatorFactory} for given {@literal fieldReference}.
@@ -214,6 +214,10 @@ public class ArrayOperators {
return Filter.filter(fieldReference);
}
if (usesExpression()) {
return Filter.filter(expression);
}
Assert.state(values != null, "Values must not be null");
return Filter.filter(new ArrayList<>(values));
}
@@ -317,7 +321,8 @@ public class ArrayOperators {
}
/**
* Creates new {@link AggregationExpression} that takes the associated array and sorts it by the given {@link Sort order}.
* Creates new {@link AggregationExpression} that takes the associated array and sorts it by the given {@link Sort
* order}.
*
* @return new instance of {@link SortArray}.
* @since 4.0
@@ -397,8 +402,8 @@ public class ArrayOperators {
}
/**
* Creates new {@link AggregationExpression} that return the last element in the given array.
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
Creates new {@link AggregationExpression} that returns the last element in the given array. <strong>NOTE:</strong>
Requires MongoDB 4.4 or later.
*
* @return new instance of {@link Last}.
* @since 3.4
@@ -649,6 +654,19 @@ public class ArrayOperators {
return new FilterExpressionBuilder().filter(field);
}
/**
* Set the {@link AggregationExpression} resolving to an array to apply the {@code $filter} to.
*
* @param expression must not be {@literal null}.
* @return never {@literal null}.
* @since 4.2
*/
public static AsBuilder filter(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null");
return new FilterExpressionBuilder().filter(expression);
}
/**
* Set the {@literal values} to apply the {@code $filter} to.
*
@@ -681,18 +699,27 @@ public class ArrayOperators {
}
private Object getMappedInput(AggregationOperationContext context) {
return input instanceof Field ? context.getReference((Field) input).toString() : input;
if (input instanceof Field field) {
return context.getReference(field).toString();
}
if (input instanceof AggregationExpression expression) {
return expression.toDocument(context);
}
return input;
}
private Object getMappedCondition(AggregationOperationContext context) {
if (!(condition instanceof AggregationExpression)) {
if (!(condition instanceof AggregationExpression aggregationExpression)) {
return condition;
}
NestedDelegatingExpressionAggregationOperationContext nea = new NestedDelegatingExpressionAggregationOperationContext(
context, Collections.singleton(as));
return ((AggregationExpression) condition).toDocument(nea);
return aggregationExpression.toDocument(nea);
}
/**
@@ -715,6 +742,15 @@ public class ArrayOperators {
* @return
*/
AsBuilder filter(Field field);
/**
* Set the {@link AggregationExpression} resolving to an array to apply the {@code $filter} to.
*
* @param expression must not be {@literal null}.
* @return
* @since 4.1.1
*/
AsBuilder filter(AggregationExpression expression);
}
/**
@@ -785,7 +821,7 @@ public class ArrayOperators {
public AsBuilder filter(List<?> array) {
Assert.notNull(array, "Array must not be null");
filter.input = new ArrayList<Object>(array);
filter.input = new ArrayList<>(array);
return this;
}
@@ -797,6 +833,14 @@ public class ArrayOperators {
return this;
}
@Override
public AsBuilder filter(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null");
filter.input = expression;
return this;
}
@Override
public ConditionBuilder as(String variableName) {
@@ -1292,10 +1336,10 @@ public class ArrayOperators {
if (value instanceof Document) {
return value;
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
} else if (value instanceof Field) {
return context.getReference(((Field) value)).toString();
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
} else if (value instanceof Field field) {
return context.getReference(field).toString();
} else {
return context.getMappedObject(new Document("###val###", value)).get("###val###");
}
@@ -1333,7 +1377,7 @@ public class ArrayOperators {
Assert.notNull(expressions, "PropertyExpressions must not be null");
return new Reduce(Fields.field(fieldReference), initialValue,
Arrays.<AggregationExpression>asList(expressions));
Arrays.<AggregationExpression> asList(expressions));
}
};
}
@@ -1471,24 +1515,15 @@ public class ArrayOperators {
}
}
public enum Variable implements Field {
public enum Variable implements AggregationVariable {
THIS {
@Override
public String getName() {
return "$$this";
}
@Override
public String getTarget() {
return "$$this";
}
@Override
public boolean isAliased() {
return false;
}
@Override
public String toString() {
return getName();
@@ -1496,27 +1531,23 @@ public class ArrayOperators {
},
VALUE {
@Override
public String getName() {
return "$$value";
}
@Override
public String getTarget() {
return "$$value";
}
@Override
public boolean isAliased() {
return false;
}
@Override
public String toString() {
return getName();
}
};
@Override
public boolean isInternal() {
return true;
}
/**
* Create a {@link Field} reference to a given {@literal property} prefixed with the {@link Variable} identifier.
* e.g. {@code $$value.product}
@@ -1548,6 +1579,16 @@ public class ArrayOperators {
}
};
}
public static boolean isVariable(Field field) {
for (Variable var : values()) {
if (field.getTarget().startsWith(var.getTarget())) {
return true;
}
}
return false;
}
}
}
@@ -1655,7 +1696,7 @@ public class ArrayOperators {
private ZipBuilder(Object sourceArray) {
this.sourceArrays = new ArrayList<Object>();
this.sourceArrays = new ArrayList<>();
this.sourceArrays.add(sourceArray);
}
@@ -1672,14 +1713,14 @@ public class ArrayOperators {
Assert.notNull(arrays, "Arrays must not be null");
for (Object value : arrays) {
if (value instanceof String) {
sourceArrays.add(Fields.field((String) value));
if (value instanceof String stringValue) {
sourceArrays.add(Fields.field(stringValue));
} else {
sourceArrays.add(value);
}
}
return new Zip(Collections.<String, Object>singletonMap("inputs", sourceArrays));
return new Zip(Collections.singletonMap("inputs", sourceArrays));
}
}
}
@@ -1690,7 +1731,7 @@ public class ArrayOperators {
* @author Christoph Strobl
* @author Shashank Sharma
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/in/">https://docs.mongodb.com/manual/reference/operator/aggregation/in/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/in/">https://docs.mongodb.com/manual/reference/operator/aggregation/in/</a>
* @since 2.2
*/
public static class In extends AbstractAggregationExpression {
@@ -1779,7 +1820,7 @@ public class ArrayOperators {
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/arrayToObject/">https://docs.mongodb.com/manual/reference/operator/aggregation/arrayToObject/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/arrayToObject/">https://docs.mongodb.com/manual/reference/operator/aggregation/arrayToObject/</a>
* @since 2.1
*/
public static class ArrayToObject extends AbstractAggregationExpression {
@@ -1976,7 +2017,7 @@ public class ArrayOperators {
/**
* Set the order to put elements in.
*
*
* @param sort must not be {@literal null}.
* @return new instance of {@link SortArray}.
*/

View File

@@ -80,7 +80,7 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
super(bucketOperation);
this.boundaries = new ArrayList<Object>(boundaries);
this.boundaries = new ArrayList<>(boundaries);
this.defaultBucket = defaultBucket;
}
@@ -129,7 +129,7 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
Assert.notNull(boundaries, "Boundaries must not be null");
Assert.noNullElements(boundaries, "Boundaries must not contain null values");
List<Object> newBoundaries = new ArrayList<Object>(this.boundaries.size() + boundaries.length);
List<Object> newBoundaries = new ArrayList<>(this.boundaries.size() + boundaries.length);
newBoundaries.addAll(this.boundaries);
newBoundaries.addAll(Arrays.asList(boundaries));

View File

@@ -324,7 +324,7 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
Assert.hasText(operation, "Operation must not be empty or null");
Assert.notNull(value, "Values must not be null");
List<Object> objects = new ArrayList<Object>(values.length + 1);
List<Object> objects = new ArrayList<>(values.length + 1);
objects.add(value);
objects.addAll(Arrays.asList(values));
return apply(new OperationOutput(operation, objects));
@@ -350,8 +350,8 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
*/
public T as(String alias) {
if (value instanceof OperationOutput) {
return this.operation.andOutput(((OperationOutput) this.value).withAlias(alias));
if (value instanceof OperationOutput operationOutput) {
return this.operation.andOutput(operationOutput.withAlias(alias));
}
if (value instanceof Field) {
@@ -520,7 +520,7 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
Assert.notNull(values, "Values must not be null");
this.operation = operation;
this.values = new ArrayList<Object>(values);
this.values = new ArrayList<>(values);
}
private OperationOutput(Field field, OperationOutput operationOutput) {
@@ -540,18 +540,18 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values != null ? values.size() : 1);
List<Object> result = new ArrayList<>(values != null ? values.size() : 1);
for (Object element : values) {
if (element instanceof Field) {
result.add(context.getReference((Field) element).toString());
} else if (element instanceof Fields) {
for (Field field : (Fields) element) {
if (element instanceof Field field) {
result.add(context.getReference(field).toString());
} else if (element instanceof Fields fields) {
for (Field field : fields) {
result.add(context.getReference(field).toString());
}
} else if (element instanceof AggregationExpression) {
result.add(((AggregationExpression) element).toDocument(context));
} else if (element instanceof AggregationExpression aggregationExpression) {
result.add(aggregationExpression.toDocument(context));
} else {
result.add(element);
}

View File

@@ -278,10 +278,10 @@ public class ConditionalOperators {
@Override
public Document toDocument(AggregationOperationContext context) {
List<Object> list = new ArrayList<Object>();
List<Object> list = new ArrayList<>();
if (condition instanceof Collection) {
for (Object val : ((Collection) this.condition)) {
if (condition instanceof Collection<?> collection) {
for (Object val : collection) {
list.add(mapCondition(val, context));
}
} else {
@@ -294,10 +294,10 @@ public class ConditionalOperators {
private Object mapCondition(Object condition, AggregationOperationContext context) {
if (condition instanceof Field) {
return context.getReference((Field) condition).toString();
} else if (condition instanceof AggregationExpression) {
return ((AggregationExpression) condition).toDocument(context);
if (condition instanceof Field field) {
return context.getReference(field).toString();
} else if (condition instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
} else {
return condition;
}
@@ -305,10 +305,10 @@ public class ConditionalOperators {
private Object resolve(Object value, AggregationOperationContext context) {
if (value instanceof Field) {
return context.getReference((Field) value).toString();
} else if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof Field field) {
return context.getReference(field).toString();
} else if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
} else if (value instanceof Document) {
return value;
}
@@ -482,7 +482,7 @@ public class ConditionalOperators {
public static Switch switchCases(List<CaseOperator> conditions) {
Assert.notNull(conditions, "Conditions must not be null");
return new Switch(Collections.<String, Object> singletonMap("branches", new ArrayList<CaseOperator>(conditions)));
return new Switch(Collections.singletonMap("branches", new ArrayList<>(conditions)));
}
/**
@@ -529,10 +529,10 @@ public class ConditionalOperators {
Document dbo = new Document("case", when.toDocument(context));
if (then instanceof AggregationExpression) {
dbo.put("then", ((AggregationExpression) then).toDocument(context));
} else if (then instanceof Field) {
dbo.put("then", context.getReference((Field) then).toString());
if (then instanceof AggregationExpression aggregationExpression) {
dbo.put("then", aggregationExpression.toDocument(context));
} else if (then instanceof Field field) {
dbo.put("then", context.getReference(field).toString());
} else {
dbo.put("then", then);
}
@@ -629,8 +629,8 @@ public class ConditionalOperators {
return resolve(context, value);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
return context.getMappedObject(new Document("$set", value)).get("$set");
@@ -642,13 +642,13 @@ public class ConditionalOperators {
return resolve(context, value);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
if (value instanceof CriteriaDefinition) {
if (value instanceof CriteriaDefinition criteriaDefinition) {
Document mappedObject = context.getMappedObject(((CriteriaDefinition) value).getCriteriaObject());
Document mappedObject = context.getMappedObject(criteriaDefinition.getCriteriaObject());
List<Object> clauses = getClauses(context, mappedObject);
return clauses.size() == 1 ? clauses.get(0) : clauses;
}
@@ -659,7 +659,7 @@ public class ConditionalOperators {
private List<Object> getClauses(AggregationOperationContext context, Document mappedObject) {
List<Object> clauses = new ArrayList<Object>();
List<Object> clauses = new ArrayList<>();
for (String key : mappedObject.keySet()) {
@@ -672,23 +672,20 @@ public class ConditionalOperators {
private List<Object> getClauses(AggregationOperationContext context, String key, Object predicate) {
List<Object> clauses = new ArrayList<Object>();
List<Object> clauses = new ArrayList<>();
if (predicate instanceof List) {
if (predicate instanceof List<?> predicates) {
List<?> predicates = (List<?>) predicate;
List<Object> args = new ArrayList<Object>(predicates.size());
List<Object> args = new ArrayList<>(predicates.size());
for (Object clause : (List<?>) predicate) {
if (clause instanceof Document) {
args.addAll(getClauses(context, (Document) clause));
for (Object clause : predicates) {
if (clause instanceof Document document) {
args.addAll(getClauses(context, document));
}
}
clauses.add(new Document(key, args));
} else if (predicate instanceof Document) {
Document nested = (Document) predicate;
} else if (predicate instanceof Document nested) {
for (String s : nested.keySet()) {
@@ -696,14 +693,14 @@ public class ConditionalOperators {
continue;
}
List<Object> args = new ArrayList<Object>(2);
List<Object> args = new ArrayList<>(2);
args.add("$" + key);
args.add(nested.get(s));
clauses.add(new Document(s, args));
}
} else if (!isKeyword(key)) {
List<Object> args = new ArrayList<Object>(2);
List<Object> args = new ArrayList<>(2);
args.add("$" + key);
args.add(predicate);
clauses.add(new Document("$eq", args));
@@ -724,8 +721,8 @@ public class ConditionalOperators {
private Object resolve(AggregationOperationContext context, Object value) {
if (value instanceof Document) {
return context.getMappedObject((Document) value);
if (value instanceof Document document) {
return context.getMappedObject(document);
}
return context.getReference((Field) value).toString();
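Most hunks in this compare range apply the same mechanical cleanup: an `instanceof` check followed by an explicit cast is replaced by a Java 16 pattern variable. A minimal standalone sketch of the before/after (placeholder types, not the Spring Data ones):

```java
// Illustrates the recurring refactor in these commits: an instanceof test
// followed by an explicit cast becomes a pattern variable (JEP 394, Java 16+)
// that binds the narrowed value directly.
public class PatternMatchDemo {

    // Before: test, then cast the same reference again.
    static String describeOld(Object value) {
        if (value instanceof Number) {
            return "number: " + ((Number) value).intValue();
        }
        return "other";
    }

    // After: the pattern variable 'number' is the cast result.
    static String describeNew(Object value) {
        if (value instanceof Number number) {
            return "number: " + number.intValue();
        }
        return "other";
    }

    public static void main(String[] args) {
        System.out.println(describeOld(42));   // number: 42
        System.out.println(describeNew(42));   // number: 42
        System.out.println(describeNew("hi")); // other
    }
}
```

Both forms are behaviorally identical; the pattern form removes the duplicated type name and the chance of casting to the wrong type after the check.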

View File

@@ -159,7 +159,7 @@ public class DateOperators {
* +/-[hh], e.g. "+03"</td>
* </tr>
* </table>
* <strong>NOTE: </strong>Support for timezones in aggregations Requires MongoDB 3.6 or later.
* <strong>NOTE:</strong> Support for timezones in aggregations requires MongoDB 3.6 or later.
*
* @author Christoph Strobl
* @author Mark Paluch
@@ -985,8 +985,8 @@ public class DateOperators {
java.util.Map<String, Object> args;
if (source instanceof Map) {
args = new LinkedHashMap<>((Map) source);
if (source instanceof Map map) {
args = new LinkedHashMap<>(map);
} else {
args = new LinkedHashMap<>(2);
args.put("date", source);
@@ -1877,12 +1877,12 @@ public class DateOperators {
java.util.Map<String, Object> clone = new LinkedHashMap<>(argumentMap());
if (value instanceof Timezone) {
if (value instanceof Timezone timezone) {
if (ObjectUtils.nullSafeEquals(value, Timezone.none())) {
clone.remove("timezone");
} else {
clone.put("timezone", ((Timezone) value).value);
clone.put("timezone", timezone.value);
}
} else {
clone.put(key, value);

View File

@@ -84,21 +84,21 @@ abstract class DocumentEnhancingOperation implements InheritsFieldsAggregationOp
return exposedFields;
}
private ExposedFields add(Object field) {
private ExposedFields add(Object fieldValue) {
if (field instanceof Field) {
return exposedFields.and(new ExposedField((Field) field, true));
if (fieldValue instanceof Field field) {
return exposedFields.and(new ExposedField(field, true));
}
if (field instanceof String) {
return exposedFields.and(new ExposedField(Fields.field((String) field), true));
if (fieldValue instanceof String fieldName) {
return exposedFields.and(new ExposedField(Fields.field(fieldName), true));
}
throw new IllegalArgumentException(String.format("Expected %s to be a field/property", field));
throw new IllegalArgumentException(String.format("Expected %s to be a field/property", fieldValue));
}
private static Document toSetEntry(Entry<Object, Object> entry, AggregationOperationContext context) {
String field = entry.getKey() instanceof String ? context.getReference((String) entry.getKey()).getRaw()
String field = entry.getKey() instanceof String key ? context.getReference(key).getRaw()
: context.getReference((Field) entry.getKey()).getRaw();
Object value = computeValue(entry.getValue(), context);
@@ -108,20 +108,20 @@ abstract class DocumentEnhancingOperation implements InheritsFieldsAggregationOp
private static Object computeValue(Object value, AggregationOperationContext context) {
if (value instanceof Field) {
return context.getReference((Field) value).toString();
if (value instanceof Field field) {
return context.getReference(field).toString();
}
if (value instanceof ExpressionProjection) {
return ((ExpressionProjection) value).toExpression(context);
if (value instanceof ExpressionProjection expressionProjection) {
return expressionProjection.toExpression(context);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
if (value instanceof Collection) {
return ((Collection<?>) value).stream().map(it -> computeValue(it, context)).collect(Collectors.toList());
if (value instanceof Collection<?> collection) {
return collection.stream().map(it -> computeValue(it, context)).collect(Collectors.toList());
}
return value;

View File

@@ -393,13 +393,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
return true;
}
if (!(obj instanceof DirectFieldReference)) {
if (!(obj instanceof DirectFieldReference fieldReference)) {
return false;
}
DirectFieldReference that = (DirectFieldReference) obj;
return this.field.equals(that.field);
return this.field.equals(fieldReference.field);
}
@Override
@@ -460,12 +458,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
return true;
}
if (!(obj instanceof ExpressionFieldReference)) {
if (!(obj instanceof ExpressionFieldReference fieldReference)) {
return false;
}
ExpressionFieldReference that = (ExpressionFieldReference) obj;
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
return ObjectUtils.nullSafeEquals(this.delegate, fieldReference.delegate);
}
@Override

View File

@@ -96,6 +96,15 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
return exposedField;
}
if (rootContext instanceof RelaxedTypeBasedAggregationOperationContext) {
if (field != null) {
return new DirectFieldReference(new ExposedField(field, true));
}
return new DirectFieldReference(new ExposedField(name, true));
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'", name));
}

View File

@@ -67,7 +67,7 @@ public final class Fields implements Iterable<Field> {
Assert.notNull(names, "Field names must not be null");
List<Field> fields = new ArrayList<Field>();
List<Field> fields = new ArrayList<>();
for (String name : names) {
fields.add(field(name));
@@ -114,7 +114,7 @@ public final class Fields implements Iterable<Field> {
private static List<Field> verify(List<Field> fields) {
Map<String, Field> reference = new HashMap<String, Field>();
Map<String, Field> reference = new HashMap<>();
for (Field field : fields) {
@@ -133,7 +133,7 @@ public final class Fields implements Iterable<Field> {
private Fields(Fields existing, Field tail) {
this.fields = new ArrayList<Field>(existing.fields.size() + 1);
this.fields = new ArrayList<>(existing.fields.size() + 1);
this.fields.addAll(existing.fields);
this.fields.add(tail);
}
@@ -245,7 +245,7 @@ public final class Fields implements Iterable<Field> {
private static String cleanUp(String source) {
if (SystemVariable.isReferingToSystemVariable(source)) {
if (AggregationVariable.isVariable(source)) {
return source;
}
@@ -253,10 +253,12 @@ public final class Fields implements Iterable<Field> {
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}
@Override
public String getName() {
return name;
}
@Override
public String getTarget() {
if (isLocalVar() || pointsToDBRefId()) {
@@ -308,13 +310,11 @@ public final class Fields implements Iterable<Field> {
return true;
}
if (!(obj instanceof AggregationField)) {
if (!(obj instanceof AggregationField field)) {
return false;
}
AggregationField that = (AggregationField) obj;
return this.name.equals(that.name) && ObjectUtils.nullSafeEquals(this.target, that.target);
return this.name.equals(field.name) && ObjectUtils.nullSafeEquals(this.target, field.target);
}
@Override

View File

@@ -84,14 +84,14 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
graphLookup.put("from", from);
List<Object> mappedStartWith = new ArrayList<Object>(startWith.size());
List<Object> mappedStartWith = new ArrayList<>(startWith.size());
for (Object startWithElement : startWith) {
if (startWithElement instanceof AggregationExpression) {
mappedStartWith.add(((AggregationExpression) startWithElement).toDocument(context));
} else if (startWithElement instanceof Field) {
mappedStartWith.add(context.getReference((Field) startWithElement).toString());
if (startWithElement instanceof AggregationExpression aggregationExpression) {
mappedStartWith.add(aggregationExpression.toDocument(context));
} else if (startWithElement instanceof Field field) {
mappedStartWith.add(context.getReference(field).toString());
} else {
mappedStartWith.add(startWithElement);
}
@@ -237,7 +237,7 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
Assert.notNull(fieldReferences, "FieldReferences must not be null");
Assert.noNullElements(fieldReferences, "FieldReferences must not contain null elements");
List<Object> fields = new ArrayList<Object>(fieldReferences.length);
List<Object> fields = new ArrayList<>(fieldReferences.length);
for (String fieldReference : fieldReferences) {
fields.add(Fields.field(fieldReference));
@@ -269,14 +269,14 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
private List<Object> verifyAndPotentiallyTransformStartsWithTypes(Object... expressions) {
List<Object> expressionsToUse = new ArrayList<Object>(expressions.length);
List<Object> expressionsToUse = new ArrayList<>(expressions.length);
for (Object expression : expressions) {
assertStartWithType(expression);
if (expression instanceof String) {
expressionsToUse.add(Fields.field((String) expression));
if (expression instanceof String stringValue) {
expressionsToUse.add(Fields.field(stringValue));
} else {
expressionsToUse.add(expression);
}
@@ -333,7 +333,7 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
String connectTo) {
this.from = from;
this.startWith = new ArrayList<Object>(startWith);
this.startWith = new ArrayList<>(startWith);
this.connectFrom = Fields.field(connectFrom);
this.connectTo = Fields.field(connectTo);
}

View File

@@ -514,8 +514,8 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
if (reference == null) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
return value;

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
/**
@@ -22,6 +23,7 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldRefe
* {@link AggregationOperationContext}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.9
*/
class InheritingExposedFieldsAggregationOperationContext extends ExposedFieldsAggregationOperationContext {
@@ -43,6 +45,11 @@ class InheritingExposedFieldsAggregationOperationContext extends ExposedFieldsAg
this.previousContext = previousContext;
}
@Override
public Document getMappedObject(Document document) {
return previousContext.getMappedObject(document);
}
@Override
protected FieldReference resolveExposedField(Field field, String name) {

View File

@@ -15,28 +15,44 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.function.Supplier;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* Encapsulates the aggregation framework {@code $lookup}-operation. We recommend to use the static factory method
* {@link Aggregation#lookup(String, String, String, String)} instead of creating instances of this class directly.
* Encapsulates the aggregation framework {@code $lookup}-operation. We recommend using the builder provided via
* {@link #newLookup()} instead of creating instances of this class directly.
*
* @author Alessio Fachechi
* @author Christoph Strobl
* @author Mark Paluch
* @author Sangyong Choi
* @since 1.9
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/lookup/">MongoDB Aggregation Framework:
* $lookup</a>
*/
public class LookupOperation implements FieldsExposingAggregationOperation, InheritsFieldsAggregationOperation {
private final Field from;
private final String from;
@Nullable //
private final Field localField;
@Nullable //
private final Field foreignField;
@Nullable //
private final Let let;
@Nullable //
private final AggregationPipeline pipeline;
private final ExposedField as;
/**
@@ -48,16 +64,55 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
* @param as must not be {@literal null}.
*/
public LookupOperation(Field from, Field localField, Field foreignField, Field as) {
this(((Supplier<String>) () -> {
Assert.notNull(from, "From must not be null");
return from.getTarget();
}).get(), localField, foreignField, null, null, as);
}
/**
* Creates a new {@link LookupOperation} for the given combination of {@link Field}s and {@link AggregationPipeline
* pipeline}.
*
* @param from must not be {@literal null}.
* @param let must not be {@literal null}.
* @param as must not be {@literal null}.
* @since 4.1
*/
public LookupOperation(String from, @Nullable Let let, AggregationPipeline pipeline, Field as) {
this(from, null, null, let, pipeline, as);
}
/**
* Creates a new {@link LookupOperation} for the given combination of {@link Field}s and {@link AggregationPipeline
* pipeline}.
*
* @param from must not be {@literal null}.
* @param localField can be {@literal null} if {@literal pipeline} is present.
* @param foreignField can be {@literal null} if {@literal pipeline} is present.
* @param let can be {@literal null} if {@literal localField} and {@literal foreignField} are present.
* @param as must not be {@literal null}.
* @since 4.1
*/
public LookupOperation(String from, @Nullable Field localField, @Nullable Field foreignField, @Nullable Let let,
@Nullable AggregationPipeline pipeline, Field as) {
Assert.notNull(from, "From must not be null");
Assert.notNull(localField, "LocalField must not be null");
Assert.notNull(foreignField, "ForeignField must not be null");
if (pipeline == null) {
Assert.notNull(localField, "LocalField must not be null");
Assert.notNull(foreignField, "ForeignField must not be null");
} else if (localField == null && foreignField == null) {
Assert.notNull(pipeline, "Pipeline must not be null");
}
Assert.notNull(as, "As must not be null");
this.from = from;
this.localField = localField;
this.foreignField = foreignField;
this.as = new ExposedField(as, true);
this.let = let;
this.pipeline = pipeline;
}
@Override
@@ -70,9 +125,20 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
Document lookupObject = new Document();
lookupObject.append("from", from.getTarget());
lookupObject.append("localField", localField.getTarget());
lookupObject.append("foreignField", foreignField.getTarget());
lookupObject.append("from", from);
if (localField != null) {
lookupObject.append("localField", localField.getTarget());
}
if (foreignField != null) {
lookupObject.append("foreignField", foreignField.getTarget());
}
if (let != null) {
lookupObject.append("let", let.toDocument(context).get("$let", Document.class).get("vars"));
}
if (pipeline != null) {
lookupObject.append("pipeline", pipeline.toDocuments(context));
}
lookupObject.append("as", as.getTarget());
return new Document(getOperator(), lookupObject);
@@ -101,7 +167,7 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
LocalFieldBuilder from(String name);
}
public static interface LocalFieldBuilder {
public static interface LocalFieldBuilder extends PipelineBuilder {
/**
* @param name the field from the documents input to the {@code $lookup} stage, must not be {@literal null} or
@@ -120,7 +186,67 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
AsBuilder foreignField(String name);
}
public static interface AsBuilder {
/**
* @since 4.1
* @author Christoph Strobl
*/
public interface LetBuilder {
/**
* Specifies {@link Let#getVariableNames() variables} that can be used in the
* {@link PipelineBuilder#pipeline(AggregationOperation...) pipeline stages}.
*
* @param let must not be {@literal null}.
* @return never {@literal null}.
* @see PipelineBuilder
*/
PipelineBuilder let(Let let);
/**
* Specifies {@link Let#getVariableNames() variables} that can be used in the
* {@link PipelineBuilder#pipeline(AggregationOperation...) pipeline stages}.
*
* @param variables must not be {@literal null}.
* @return never {@literal null}.
* @see PipelineBuilder
*/
default PipelineBuilder let(ExpressionVariable... variables) {
return let(Let.just(variables));
}
}
/**
* @since 4.1
* @author Christoph Strobl
*/
public interface PipelineBuilder extends LetBuilder {
/**
* Specifies the {@link AggregationPipeline pipeline} that determines the resulting documents.
*
* @param pipeline must not be {@literal null}.
* @return never {@literal null}.
*/
AsBuilder pipeline(AggregationPipeline pipeline);
/**
* Specifies the {@link AggregationPipeline#getOperations() stages} that determine the resulting documents.
*
* @param stages must not be {@literal null}; can be empty.
* @return never {@literal null}.
*/
default AsBuilder pipeline(AggregationOperation... stages) {
return pipeline(AggregationPipeline.of(stages));
}
/**
* @param name the name of the new array field to add to the input documents, must not be {@literal null} or empty.
* @return new instance of {@link LookupOperation}.
*/
LookupOperation as(String name);
}
public static interface AsBuilder extends PipelineBuilder {
/**
* @param name the name of the new array field to add to the input documents, must not be {@literal null} or empty.
@@ -138,10 +264,12 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
public static final class LookupOperationBuilder
implements FromBuilder, LocalFieldBuilder, ForeignFieldBuilder, AsBuilder {
private @Nullable Field from;
private @Nullable String from;
private @Nullable Field localField;
private @Nullable Field foreignField;
private @Nullable ExposedField as;
private @Nullable Let let;
private @Nullable AggregationPipeline pipeline;
/**
* Creates new builder for {@link LookupOperation}.
@@ -156,18 +284,10 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
public LocalFieldBuilder from(String name) {
Assert.hasText(name, "'From' must not be null or empty");
from = Fields.field(name);
from = name;
return this;
}
@Override
public LookupOperation as(String name) {
Assert.hasText(name, "'As' must not be null or empty");
as = new ExposedField(Fields.field(name), true);
return new LookupOperation(from, localField, foreignField, as);
}
@Override
public AsBuilder foreignField(String name) {
@@ -183,5 +303,29 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
localField = Fields.field(name);
return this;
}
@Override
public PipelineBuilder let(Let let) {
Assert.notNull(let, "Let must not be null");
this.let = let;
return this;
}
@Override
public AsBuilder pipeline(AggregationPipeline pipeline) {
Assert.notNull(pipeline, "Pipeline must not be null");
this.pipeline = pipeline;
return this;
}
@Override
public LookupOperation as(String name) {
Assert.hasText(name, "'As' must not be null or empty");
as = new ExposedField(Fields.field(name), true);
return new LookupOperation(from, localField, foreignField, let, pipeline, as);
}
}
}
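Putting the new `toDocument` branches together: a pipeline-style lookup built through the extended builder renders a stage of roughly this shape (collection, variable, and field names here are illustrative, not taken from the commits):

```json
{
  "$lookup": {
    "from": "orders",
    "let": { "order_qty": "$quantity" },
    "pipeline": [
      { "$match": { "$expr": { "$gte": [ "$$order_qty", 5 ] } } }
    ],
    "as": "matches"
  }
}
```

With a classic `localField`/`foreignField` lookup the `let` and `pipeline` keys are simply absent, which is why the new `toDocument` implementation appends each key only when the corresponding member is non-null.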

View File

@@ -339,8 +339,8 @@ public class MergeOperation implements FieldsExposingAggregationOperation, Inher
Document toDocument(AggregationOperationContext context) {
if (value instanceof Aggregation) {
return new Document("whenMatched", ((Aggregation) value).toPipeline(context));
if (value instanceof Aggregation aggregation) {
return new Document("whenMatched", aggregation.toPipeline(context));
}
return new Document("whenMatched", value);

View File

@@ -245,12 +245,8 @@ public class ObjectOperators {
@SuppressWarnings("unchecked")
private Object potentiallyExtractSingleValue(Object value) {
if (value instanceof Collection) {
Collection<Object> collection = ((Collection<Object>) value);
if (collection.size() == 1) {
return collection.iterator().next();
}
if (value instanceof Collection<?> collection && collection.size() == 1) {
return collection.iterator().next();
}
return value;
}

View File

@@ -115,8 +115,8 @@ public class PrefixingDelegatingAggregationOperationContext implements Aggregati
List<Object> prefixed = new ArrayList<>(sourceCollection.size());
for (Object o : sourceCollection) {
if (o instanceof Document) {
prefixed.add(doPrefix((Document) o));
if (o instanceof Document document) {
prefixed.add(doPrefix(document));
} else {
prefixed.add(o);
}

View File

@@ -207,10 +207,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
for (Object value : values) {
if (value instanceof Field) {
builder.and((Field) value);
} else if (value instanceof AggregationExpression) {
builder.and((AggregationExpression) value);
if (value instanceof Field field) {
builder.and(field);
} else if (value instanceof AggregationExpression aggregationExpression) {
builder.and(aggregationExpression);
} else {
builder.and(value);
}
@@ -330,7 +330,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
* @param parameters must not be {@literal null}.
*/
public ExpressionProjectionOperationBuilder(String expression, ProjectionOperation operation, Object[] parameters) {
@@ -347,7 +347,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.length + 1);
List<Object> result = new ArrayList<>(values.length + 1);
result.add(ExpressionProjection.toMongoExpression(context,
ExpressionProjectionOperationBuilder.this.expression, ExpressionProjectionOperationBuilder.this.params));
result.addAll(Arrays.asList(values));
@@ -1455,19 +1455,19 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
List<Object> result = new ArrayList<>(values.size());
result.add(context.getReference(getField()).toString());
for (Object element : values) {
if (element instanceof Field) {
result.add(context.getReference((Field) element).toString());
} else if (element instanceof Fields) {
for (Field field : (Fields) element) {
if (element instanceof Field field) {
result.add(context.getReference(field).toString());
} else if (element instanceof Fields fields) {
for (Field field : fields) {
result.add(context.getReference(field).toString());
}
} else if (element instanceof AggregationExpression) {
result.add(((AggregationExpression) element).toDocument(context));
} else if (element instanceof AggregationExpression aggregationExpression) {
result.add(aggregationExpression.toDocument(context));
} else {
result.add(element);
}
@@ -1734,6 +1734,29 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
}
}
/**
* A {@link Projection} rendering a filter condition for a single element of a projected array.
*
* @since 2.2
* @author Christoph Strobl
*/
static class FilterProjection extends Projection {
public static final String FILTER_ELEMENT = "filterElement";
private final Object value;
FilterProjection(String fieldReference, Object value) {
super(Fields.field(FILTER_ELEMENT + "." + fieldReference));
this.value = value;
}
@Override
public Document toDocument(AggregationOperationContext context) {
return new Document(getExposedField().getName(), value);
}
}
/**
* Builder for {@code array} projections.
*
@@ -1829,20 +1852,16 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private Object toArrayEntry(Object projection, AggregationOperationContext ctx) {
if (projection instanceof Field) {
return ctx.getReference((Field) projection).toString();
if (projection instanceof Field field) {
return ctx.getReference(field).toString();
}
if (projection instanceof AggregationExpression) {
return ((AggregationExpression) projection).toDocument(ctx);
if (projection instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(ctx);
}
if (projection instanceof FieldProjection) {
return ctx.getReference(((FieldProjection) projection).getExposedField().getTarget()).toString();
}
if (projection instanceof Projection) {
((Projection) projection).toDocument(ctx);
if (projection instanceof FieldProjection fieldProjection) {
return ctx.getReference(fieldProjection.getExposedField().getTarget()).toString();
}
return projection;

View File

@@ -226,14 +226,14 @@ public class RedactOperation implements AggregationOperation {
private ThenBuilder when() {
if (when instanceof CriteriaDefinition) {
return ConditionalOperators.Cond.when((CriteriaDefinition) when);
if (when instanceof CriteriaDefinition criteriaDefinition) {
return ConditionalOperators.Cond.when(criteriaDefinition);
}
if (when instanceof AggregationExpression) {
return ConditionalOperators.Cond.when((AggregationExpression) when);
if (when instanceof AggregationExpression aggregationExpression) {
return ConditionalOperators.Cond.when(aggregationExpression);
}
if (when instanceof Document) {
return ConditionalOperators.Cond.when((Document) when);
if (when instanceof Document document) {
return ConditionalOperators.Cond.when(document);
}
throw new IllegalArgumentException(String.format(

View File

@@ -21,7 +21,6 @@ import java.util.Collections;
import java.util.List;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.expression.spel.ast.Projection;
import org.springframework.util.Assert;
@@ -249,9 +248,9 @@ public class ReplaceRootOperation implements FieldsExposingAggregationOperation
public ReplaceRootDocumentOperation as(String fieldName) {
if (value instanceof AggregationExpression) {
if (value instanceof AggregationExpression aggregationExpression) {
return new ReplaceRootDocumentOperation(currentOperation,
ReplacementDocument.forExpression(fieldName, (AggregationExpression) value));
ReplacementDocument.forExpression(fieldName, aggregationExpression));
}
return new ReplaceRootDocumentOperation(currentOperation, ReplacementDocument.forSingleValue(fieldName, value));
@@ -431,6 +430,7 @@ public class ReplaceRootOperation implements FieldsExposingAggregationOperation
* @param context will never be {@literal null}.
* @return never {@literal null}.
*/
@Override
Document toDocument(AggregationOperationContext context);
}

View File

@@ -62,23 +62,23 @@ public class ReplaceWithOperation extends ReplaceRootOperation {
public static ReplaceWithOperation replaceWithValueOf(Object value) {
Assert.notNull(value, "Value must not be null");
return new ReplaceWithOperation((ctx) -> {
return new ReplaceWithOperation(ctx -> {
Object target = value instanceof String ? Fields.field((String) value) : value;
Object target = value instanceof String stringValue ? Fields.field(stringValue) : value;
return computeValue(target, ctx);
});
}
private static Object computeValue(Object value, AggregationOperationContext context) {
if (value instanceof Field) {
return context.getReference((Field) value).toString();
if (value instanceof Field field) {
return context.getReference(field).toString();
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
if (value instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
if (value instanceof Collection) {
return ((Collection) value).stream().map(it -> computeValue(it, context)).collect(Collectors.toList());
if (value instanceof Collection<?> collection) {
return collection.stream().map(it -> computeValue(it, context)).collect(Collectors.toList());
}
return value;

View File

@@ -140,7 +140,7 @@ public class SetOperation extends DocumentEnhancingOperation {
@Override
public SetOperation toValueOf(Object value) {
valueMap.put(field, value instanceof String ? Fields.fields((String) value) : value);
valueMap.put(field, value instanceof String stringValue ? Fields.fields(stringValue) : value);
return FieldAppender.this.build();
}

View File

@@ -78,10 +78,10 @@ public class SetWindowFieldsOperation
Document $setWindowFields = new Document();
if (partitionBy != null) {
if (partitionBy instanceof AggregationExpression) {
$setWindowFields.append("partitionBy", ((AggregationExpression) partitionBy).toDocument(context));
} else if (partitionBy instanceof Field) {
$setWindowFields.append("partitionBy", context.getReference((Field) partitionBy).toString());
if (partitionBy instanceof AggregationExpression aggregationExpression) {
$setWindowFields.append("partitionBy", aggregationExpression.toDocument(context));
} else if (partitionBy instanceof Field field) {
$setWindowFields.append("partitionBy", context.getReference(field).toString());
} else {
$setWindowFields.append("partitionBy", partitionBy);
}

View File

@@ -259,7 +259,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
private Document createOperationObjectAndAddToPreviousArgumentsIfNecessary(
AggregationExpressionTransformationContext<OperatorNode> context, OperatorNode currentNode) {
Document nextDocument = new Document(currentNode.getMongoOperator(), new ArrayList<Object>());
Document nextDocument = new Document(currentNode.getMongoOperator(), new ArrayList<>());
if (!context.hasPreviousOperation()) {
return nextDocument;
@@ -282,7 +282,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
@Nullable Object leftResult) {
Object result = leftResult instanceof Number ? leftResult
: new Document("$multiply", Arrays.<Object> asList(Integer.valueOf(-1), leftResult));
: new Document("$multiply", Arrays.asList(Integer.valueOf(-1), leftResult));
if (leftResult != null && context.hasPreviousOperation()) {
context.addToPreviousOperation(result);
@@ -453,7 +453,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
}
else {
List<Object> argList = new ArrayList<Object>();
List<Object> argList = new ArrayList<>();
for (ExpressionNode childNode : node) {
argList.add(transform(childNode, context));
@@ -516,7 +516,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
protected Object convert(AggregationExpressionTransformationContext<NotOperatorNode> context) {
NotOperatorNode node = context.getCurrentNode();
List<Object> args = new ArrayList<Object>();
List<Object> args = new ArrayList<>();
for (ExpressionNode childNode : node) {
args.add(transform(childNode, context));

View File

@@ -24,7 +24,7 @@ import org.springframework.lang.Nullable;
* @author Christoph Strobl
* @see <a href="https://docs.mongodb.com/manual/reference/aggregation-variables">Aggregation Variables</a>.
*/
public enum SystemVariable {
public enum SystemVariable implements AggregationVariable {
/**
* Variable for the current datetime.
@@ -82,8 +82,6 @@ public enum SystemVariable {
*/
SEARCH_META;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
@@ -93,13 +91,12 @@ public enum SystemVariable {
*/
public static boolean isReferingToSystemVariable(@Nullable String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
String candidate = variableNameFrom(fieldRef);
if (candidate == null) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
candidate = candidate.startsWith(PREFIX) ? candidate.substring(2) : candidate;
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
@@ -113,4 +110,20 @@ public enum SystemVariable {
public String toString() {
return PREFIX.concat(name());
}
@Override
public String getTarget() {
return toString();
}
@Nullable
static String variableNameFrom(@Nullable String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return null;
}
int indexOfFirstDot = fieldRef.indexOf('.');
return indexOfFirstDot == -1 ? fieldRef : fieldRef.substring(2, indexOfFirstDot);
}
}

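The refactoring above extracts the `$$`-prefix parsing into a `variableNameFrom` helper that `isReferingToSystemVariable` now delegates to. A self-contained sketch of that parsing logic as shown in the hunk (plain Java; treat the exact return values as inferred from the diff, not the published API):

```java
// Mirrors the extracted helper from the hunk above: returns null unless the
// reference starts with "$$" and has content after it; for a dotted path such
// as "$$NOW.ts" the leading "$$" is stripped and only the head segment ("NOW")
// is returned, while an undotted reference like "$$NOW" is returned as-is.
public class VariableNameSketch {

    private static final String PREFIX = "$$";

    static String variableNameFrom(String fieldRef) {
        if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
            return null;
        }
        int indexOfFirstDot = fieldRef.indexOf('.');
        return indexOfFirstDot == -1 ? fieldRef : fieldRef.substring(2, indexOfFirstDot);
    }

    public static void main(String[] args) {
        System.out.println(variableNameFrom("$$NOW"));    // $$NOW
        System.out.println(variableNameFrom("$$NOW.ts")); // NOW
        System.out.println(variableNameFrom("total"));    // null
    }
}
```

The asymmetry (prefix kept for undotted input, stripped for dotted input) is why the caller still normalizes the candidate with `candidate.startsWith(PREFIX) ? candidate.substring(2) : candidate` before comparing against the enum names.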
@@ -133,7 +133,7 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
protected FieldReference getReferenceFor(Field field) {
if(entity.getNullable() == null) {
if(entity.getNullable() == null || AggregationVariable.isVariable(field)) {
return new DirectFieldReference(new ExposedField(field, true));
}

@@ -138,12 +138,12 @@ public class UnionWithOperation implements AggregationOperation {
private AggregationOperationContext computeContext(AggregationOperationContext source) {
if (source instanceof TypeBasedAggregationOperationContext) {
return ((TypeBasedAggregationOperationContext) source).continueOnMissingFieldReference(domainType != null ? domainType : Object.class);
if (source instanceof TypeBasedAggregationOperationContext aggregationOperationContext) {
return aggregationOperationContext.continueOnMissingFieldReference(domainType != null ? domainType : Object.class);
}
if (source instanceof ExposedFieldsAggregationOperationContext) {
return computeContext(((ExposedFieldsAggregationOperationContext) source).getRootContext());
if (source instanceof ExposedFieldsAggregationOperationContext aggregationOperationContext) {
return computeContext(aggregationOperationContext.getRootContext());
}
return source;

@@ -96,8 +96,8 @@ public class UnsetOperation implements InheritsFieldsAggregationOperation {
List<String> fieldNames = new ArrayList<>(fields.size());
for (Object it : fields) {
if (it instanceof Field) {
fieldNames.add(((Field) it).getName());
if (it instanceof Field field) {
fieldNames.add(field.getName());
} else {
fieldNames.add(it.toString());
}
@@ -123,16 +123,16 @@ public class UnsetOperation implements InheritsFieldsAggregationOperation {
private Object computeFieldName(Object field, AggregationOperationContext context) {
if (field instanceof Field) {
return context.getReference((Field) field).getRaw();
if (field instanceof Field fieldObject) {
return context.getReference(fieldObject).getRaw();
}
if (field instanceof AggregationExpression) {
return ((AggregationExpression) field).toDocument(context);
if (field instanceof AggregationExpression aggregationExpression) {
return aggregationExpression.toDocument(context);
}
if (field instanceof String) {
return context.getReference((String) field).getRaw();
if (field instanceof String stringValue) {
return context.getReference(stringValue).getRaw();
}
return field;

@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
@@ -175,8 +174,8 @@ public class VariableOperators {
exposedFields, context);
Document input;
if (sourceArray instanceof Field) {
input = new Document("input", context.getReference((Field) sourceArray).toString());
if (sourceArray instanceof Field field) {
input = new Document("input", context.getReference(field).toString());
} else {
input = new Document("input", ((AggregationExpression) sourceArray).toDocument(context));
}
@@ -224,28 +223,41 @@ public class VariableOperators {
public static class Let implements AggregationExpression {
private final List<ExpressionVariable> vars;
@Nullable //
private final AggregationExpression expression;
private Let(List<ExpressionVariable> vars, AggregationExpression expression) {
private Let(List<ExpressionVariable> vars, @Nullable AggregationExpression expression) {
this.vars = vars;
this.expression = expression;
}
/**
* Create a new {@link Let} holding just the given {@literal variables}.
*
* @param variables must not be {@literal null}.
* @return new instance of {@link Let}.
* @since 4.1
*/
public static Let just(ExpressionVariable... variables) {
return new Let(List.of(variables), null);
}
/**
* Start creating new {@link Let} by defining the variables for {@code $vars}.
*
* @param variables must not be {@literal null}.
* @return
*/
public static LetBuilder define(final Collection<ExpressionVariable> variables) {
public static LetBuilder define(Collection<ExpressionVariable> variables) {
Assert.notNull(variables, "Variables must not be null");
return new LetBuilder() {
@Override
public Let andApply(final AggregationExpression expression) {
public Let andApply(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null");
return new Let(new ArrayList<ExpressionVariable>(variables), expression);
@@ -259,19 +271,10 @@ public class VariableOperators {
* @param variables must not be {@literal null}.
* @return
*/
public static LetBuilder define(final ExpressionVariable... variables) {
public static LetBuilder define(ExpressionVariable... variables) {
Assert.notNull(variables, "Variables must not be null");
return new LetBuilder() {
@Override
public Let andApply(final AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null");
return new Let(Arrays.asList(variables), expression);
}
};
return define(List.of(variables));
}
public interface LetBuilder {
@@ -283,10 +286,11 @@ public class VariableOperators {
* @return
*/
Let andApply(AggregationExpression expression);
}
@Override
public Document toDocument(final AggregationOperationContext context) {
public Document toDocument(AggregationOperationContext context) {
return toLet(ExposedFields.synthetic(Fields.fields(getVariableNames())), context);
}
@@ -312,16 +316,22 @@ public class VariableOperators {
}
letExpression.put("vars", mappedVars);
letExpression.put("in", getMappedIn(operationContext));
if (expression != null) {
letExpression.put("in", getMappedIn(operationContext));
}
return new Document("$let", letExpression);
}
private Document getMappedVariable(ExpressionVariable var, AggregationOperationContext context) {
return new Document(var.variableName,
var.expression instanceof AggregationExpression ? ((AggregationExpression) var.expression).toDocument(context)
: var.expression);
if (var.expression instanceof AggregationExpression expression) {
return new Document(var.variableName, expression.toDocument(context));
}
if (var.expression instanceof Field field) {
return new Document(var.variableName, context.getReference(field).toString());
}
return new Document(var.variableName, var.expression);
}
private Object getMappedIn(AggregationOperationContext context) {
@@ -373,6 +383,10 @@ public class VariableOperators {
return new ExpressionVariable(variableName, expression);
}
public ExpressionVariable forField(String fieldRef) {
return new ExpressionVariable(variableName, Fields.field(fieldRef));
}
/**
* Create a new {@link ExpressionVariable} with current name and given {@literal expressionObject}.
*

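With the new `Let.just(...)` factory, the `expression` field may be null, so the `$let` document is now assembled with a conditional `in` entry. A rough plain-Java illustration of the resulting document shape (Maps in place of `org.bson.Document`; variable values simplified to raw objects, so this is an approximation of the real mapping):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketches how the $let document is assembled after the change above:
// "vars" is always present, "in" only when an expression was supplied
// (Let.define(...).andApply(...)) and omitted for Let.just(...).
public class LetSketch {

    static Map<String, Object> toLet(Map<String, Object> vars, Object inExpression) {
        Map<String, Object> letExpression = new LinkedHashMap<>();
        letExpression.put("vars", vars);
        if (inExpression != null) {
            letExpression.put("in", inExpression); // skipped for Let.just(...)
        }
        Map<String, Object> document = new LinkedHashMap<>();
        document.put("$let", letExpression);
        return document;
    }

    public static void main(String[] args) {
        Map<String, Object> vars = new LinkedHashMap<>();
        vars.put("low", 1);
        System.out.println(toLet(vars, null));   // {$let={vars={low=1}}}
        System.out.println(toLet(vars, "EXPR")); // {$let={vars={low=1}, in=EXPR}}
    }
}
```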
@@ -177,10 +177,10 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<Bson> implements M
return Alias.NONE;
}
if (source instanceof Document) {
return Alias.ofNullable(((Document) source).get(typeKey));
} else if (source instanceof DBObject) {
return Alias.ofNullable(((DBObject) source).get(typeKey));
if (source instanceof Document document) {
return Alias.ofNullable(document.get(typeKey));
} else if (source instanceof DBObject dbObject) {
return Alias.ofNullable(dbObject.get(typeKey));
}
throw new IllegalArgumentException("Cannot read alias from " + source.getClass());
@@ -190,10 +190,10 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<Bson> implements M
if (typeKey != null) {
if (sink instanceof Document) {
((Document) sink).put(typeKey, alias);
} else if (sink instanceof DBObject) {
((DBObject) sink).put(typeKey, alias);
if (sink instanceof Document document) {
document.put(typeKey, alias);
} else if (sink instanceof DBObject dbObject) {
dbObject.put(typeKey, alias);
}
}
}

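Many hunks in this diff, including the type-alias lookup above, replace the test-then-cast idiom with Java 16+ pattern matching for `instanceof`. A standalone before/after sketch (plain Java; `Map` stands in for `Document`/`DBObject`, names hypothetical):

```java
import java.util.Map;

public class PatternMatchSketch {

    // Pre-Java-16 style: test the type, then cast again to use it.
    static Object aliasLegacy(Object source, String typeKey) {
        if (source instanceof Map) {
            return ((Map<?, ?>) source).get(typeKey);
        }
        return null;
    }

    // Pattern matching for instanceof (JEP 394): the binding variable
    // replaces the repeated cast and is scoped to the true branch.
    static Object aliasPattern(Object source, String typeKey) {
        if (source instanceof Map<?, ?> map) {
            return map.get(typeKey);
        }
        return null;
    }

    public static void main(String[] args) {
        Object doc = Map.of("_class", "com.example.Person");
        System.out.println(aliasPattern(doc, "_class"));          // com.example.Person
        System.out.println(aliasPattern("not a document", "_class")); // null
    }
}
```

Both methods behave identically; the pattern form merely removes the duplicated type name and the cast, which is the whole point of these mechanical hunks.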
@@ -108,8 +108,8 @@ public class DefaultReferenceResolver implements ReferenceResolver {
private Object createLazyLoadingProxy(MongoPersistentProperty property, Object source,
ReferenceLookupDelegate referenceLookupDelegate, LookupFunction lookupFunction, MongoEntityReader entityReader) {
return proxyFactory.createLazyLoadingProxy(property, it -> {
return referenceLookupDelegate.readReference(it, source, lookupFunction, entityReader);
}, source instanceof DocumentReferenceSource ? ((DocumentReferenceSource)source).getTargetSource() : source);
return proxyFactory.createLazyLoadingProxy(property,
it -> referenceLookupDelegate.readReference(it, source, lookupFunction, entityReader),
source instanceof DocumentReferenceSource documentSource ? documentSource.getTargetSource() : source);
}
}

@@ -169,8 +169,8 @@ class DocumentAccessor {
Object existing = BsonUtils.asMap(source).get(key);
if (existing instanceof Document) {
return (Document) existing;
if (existing instanceof Document document) {
return document;
}
Document nested = new Document();

@@ -73,8 +73,8 @@ class DocumentPointerFactory {
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
MongoPersistentProperty property, Object value, Class<?> typeHint) {
if (value instanceof LazyLoadingProxy) {
return () -> ((LazyLoadingProxy) value).getSource();
if (value instanceof LazyLoadingProxy proxy) {
return proxy::getSource;
}
if (conversionService.canConvert(typeHint, DocumentPointer.class)) {
@@ -94,8 +94,8 @@ class DocumentPointerFactory {
return () -> conversionService.convert(idValue, idProperty.getFieldType());
}
if (idValue instanceof String && ObjectId.isValid((String) idValue)) {
return () -> new ObjectId((String) idValue);
if (idValue instanceof String stringValue && ObjectId.isValid((String) idValue)) {
return () -> new ObjectId(stringValue);
}
return () -> idValue;
@@ -210,13 +210,13 @@ class DocumentPointerFactory {
lookup, entry.getKey()));
}
if (entry.getValue() instanceof Document) {
if (entry.getValue() instanceof Document document) {
MongoPersistentProperty persistentProperty = persistentEntity.getPersistentProperty(entry.getKey());
if (persistentProperty != null && persistentProperty.isEntity()) {
MongoPersistentEntity<?> nestedEntity = mappingContext.getPersistentEntity(persistentProperty.getType());
target.put(entry.getKey(), updatePlaceholders((Document) entry.getValue(), new Document(), mappingContext,
target.put(entry.getKey(), updatePlaceholders(document, new Document(), mappingContext,
nestedEntity, nestedEntity.getPropertyAccessor(propertyAccessor.getProperty(persistentProperty))));
} else {
target.put(entry.getKey(), updatePlaceholders((Document) entry.getValue(), new Document(), mappingContext,

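The `DocumentPointerFactory` hunk above returns document pointers as lazy suppliers: a `String` id that parses as an `ObjectId` is wrapped lazily, everything else is supplied as-is. A dependency-free sketch of that branch (`looksLikeObjectId` is a hypothetical stand-in for `ObjectId.isValid`, and the wrapped value here is a plain string rather than a real `org.bson.types.ObjectId`):

```java
import java.util.function.Supplier;

public class IdPointerSketch {

    // Rough stand-in for ObjectId.isValid: a 24-character hex string.
    static boolean looksLikeObjectId(String value) {
        return value != null && value.matches("[0-9a-fA-F]{24}");
    }

    // Mirrors the branch above: the conversion is deferred behind a Supplier,
    // so no work happens until the pointer is actually resolved.
    static Supplier<Object> pointerTo(Object idValue) {
        if (idValue instanceof String stringValue && looksLikeObjectId(stringValue)) {
            return () -> "ObjectId(" + stringValue + ")"; // a real bson ObjectId in the actual code
        }
        return () -> idValue;
    }

    public static void main(String[] args) {
        System.out.println(pointerTo("61f1c3e4a1b2c3d4e5f60718").get());
        System.out.println(pointerTo(42L).get()); // 42
    }
}
```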
@@ -69,7 +69,7 @@ public class DocumentReferenceSource {
*/
@Nullable
static Object getTargetSource(Object source) {
return source instanceof DocumentReferenceSource ? ((DocumentReferenceSource) source).getTargetSource() : source;
return source instanceof DocumentReferenceSource referenceSource ? referenceSource.getTargetSource() : source;
}
/**
@@ -79,6 +79,6 @@ public class DocumentReferenceSource {
* @return
*/
static Object getSelf(Object self) {
return self instanceof DocumentReferenceSource ? ((DocumentReferenceSource) self).getSelf() : self;
return self instanceof DocumentReferenceSource referenceSource ? referenceSource.getSelf() : self;
}
}
