Compare commits

..

265 Commits

Author SHA1 Message Date
Christoph Strobl
d94d273010 Fix test 2023-01-25 15:05:56 +01:00
Christoph Strobl
b9f6463337 decrypt? 2023-01-25 14:32:38 +01:00
Christoph Strobl
095022e71d reactive FLE encryption works -> next decrypt 2023-01-25 14:32:35 +01:00
Christoph Strobl
329b4b2881 Hacking - Reactive FLE
experiment with resolving reactive types in document
2023-01-25 14:24:35 +01:00
Christoph Strobl
73aeb7a425 Test encryption during update 2023-01-25 14:24:35 +01:00
Christoph Strobl
4b8ac4d249 Some changes that allow reading the alt key from a field
Typically this is only supported with automatic schemas, but it is neat to have here as well, e.g. for a customer data cipher keyed on the username.
Also make sure to translate decryption exceptions.
2023-01-25 14:24:35 +01:00
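
For illustration, a hedged sketch of reading the key alt name from another field; the annotation and attribute names below are assumptions modeled on the later explicit-encryption API, not taken from this commit:

[source,java]
----
class Customer {

    @Id String id;

    String username;

    // Assumed annotation and attributes: the leading "/" marks a field reference,
    // so the data encryption key alt name is taken from the username value.
    @ExplicitEncrypted(
            algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic",
            keyAltName = "/username")
    String ssn;
}
----
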
Christoph Strobl
1a7157fa7c Encrypt collection of complex types. 2023-01-25 14:24:35 +01:00
Christoph Strobl
10a089fe77 Encrypt collection of simple values 2023-01-25 14:24:35 +01:00
Christoph Strobl
7b93379165 Enable full encryption of nested documents. 2023-01-25 14:24:35 +01:00
Christoph Strobl
0361c3acc9 Hacking 2023-01-25 14:24:35 +01:00
Christoph Strobl
a6641e0c01 Prepare issue branch. 2023-01-25 14:24:34 +01:00
Mark Paluch
33902b5061 Polishing.
Move QuerydslPredicateExecutor hints to RepositoryRuntimeHints.

See #4244
Original pull request: #4245
2023-01-23 14:08:43 +01:00
Christoph Strobl
d00db4bd40 Add missing hints for Querydsl integration.
This commit adds missing reflection configuration for Querydsl integration. We now also make sure to call the queryMixing getter instead of reading the field via reflection.

Closes #4244
Original pull request: #4245
2023-01-23 14:08:43 +01:00
Christoph Strobl
a5dcbf043a Update links in reference documentation.
We now use the springDocsUrl attribute provided via spring-projects/spring-data-build#1895 to resolve links to framework documentation.

Original Pull Request: #4267
2023-01-18 14:23:22 +01:00
robeatoz
c31203582f Fix parameter and method name in reference documentation.
Closes: #4247
2023-01-16 11:22:29 +01:00
Emre Uygun
f146afecdc Fix typo in reference documentation.
Closes: #4250
2023-01-16 11:19:32 +01:00
Christoph Strobl
324a541a64 Polishing.
Original Pull Request: #4255
2023-01-16 11:17:49 +01:00
Michael Krog
6b71d773d7 Fixes return in Javadoc.
Closes: #4255
2023-01-16 11:15:14 +01:00
Patouche
10447afe0c Fix typo in reference documentation.
Closes: #4268
2023-01-16 10:49:42 +01:00
soumyaPrakashB
c9dfd60f0f Add missing Nullable annotation.
For one of constructor arguments of the AggregationOptions the Nullable annotation for the cursor argument is missing.

Closes: #4256
2023-01-16 10:47:28 +01:00
Mark Paluch
26a8fafd03 Upgrade to MongoDB driver 4.8.2.
Closes #4270
2023-01-13 10:30:01 +01:00
Mark Paluch
00f652a094 Polishing.
Add missing package-info.

See #4248
Original pull request: #4249
2023-01-12 08:47:16 +01:00
Christoph Strobl
d050ae5732 Exclude mongodb and data.mongodb namespaces from reflection contribution.
In some cases the users domain model may hold references to spring data or MongoDB specific types which should not be included in the reflection configuration as they are part of the static runtime hints configuration.

Closes #4248
Original pull request: #4249
2023-01-12 08:47:16 +01:00
Christoph Strobl
8bcab93588 Avoid multiple mapping iterations.
A 2nd pass is no longer needed as the context already does all the work.

Closes: #4043
Original pull request: #4240
2023-01-11 16:04:36 +01:00
Mark Paluch
1839f55055 Polishing.
Introduce HintFunction to encapsulate how hints are applied and to remove code duplications.

See #4238
Original pull request: #4243
2023-01-11 16:02:14 +01:00
Christoph Strobl
4220df5bf8 Accept index names as hint for aggregations.
Closes #4238
Original pull request: #4243
2023-01-11 16:02:05 +01:00
Christoph Strobl
95c6d1531f Fix invalid format specifier in debug statement.
Closes #4241
Original pull request: #4246
2023-01-11 15:29:22 +01:00
Christoph Strobl
b7ed099e06 Update broken links in reference documentation.
Original Pull Request: #4267
2023-01-11 13:46:50 +01:00
Maksymilian Babarowski
7e2e546e55 Update links to Spring Framework reference docs.
Closes: #4267
2023-01-11 13:46:38 +01:00
yangwenjie008
7ce2ebe26e Fix class loader issue with LazyLoadingProxyInterceptor.
Restore original behaviour that was unintentionally changed by modifications related to #4148.

Closes: #4260
Original Pull Request: #4261
2023-01-11 08:49:38 +01:00
Mark Paluch
fbf4d1baa8 Extend license header copyright years to 2023.
See #4264
2023-01-02 09:53:33 +01:00
Christoph Strobl
187f260fe4 Upgrade to MongoDB driver 4.8.1
Closes: #4251
2022-12-12 13:45:20 +01:00
Mark Paluch
04411075b4 Update CI properties.
See #4235
2022-11-18 15:31:10 +01:00
Mark Paluch
459a9c191b After release cleanups.
See #4209
2022-11-18 14:30:20 +01:00
Mark Paluch
137cba8bbb Prepare next development iteration.
See #4209
2022-11-18 14:30:19 +01:00
Mark Paluch
548cbd87b6 Release version 4.0 GA (2022.0.0).
See #4209
2022-11-18 14:26:22 +01:00
Mark Paluch
02647ad125 Prepare 4.0 GA (2022.0.0).
See #4209
2022-11-18 14:26:12 +01:00
Christoph Strobl
fe549f7254 Add Nullable annotation to parameter of overridden equals method.
Closes: #4226
Original pull request: #4277
2022-11-16 10:39:47 +01:00
Christoph Strobl
c069e094e6 Implement equals, hashCode and toString for CollectionOptions.
Closes: #4210
Original pull request: #4277
2022-11-16 10:39:14 +01:00
Christoph Strobl
62f3656ebb Replace upgrading section in documentation with links to the release notes.
Closes: #4228
2022-11-15 09:16:48 +01:00
Christoph Strobl
cbc718e03d Upgrade to MongoDB driver 4.8.0
Closes: #4201
2022-11-15 07:37:11 +01:00
Christoph Strobl
23be69b9ce Upgrade to MongoDB driver 4.8.0-rc0
See: #4201
2022-11-14 08:46:59 +01:00
Christoph Strobl
b7b5b085b3 Remove micrometer docs plugin.
See: spring-projects/spring-data-build#1836
2022-11-14 08:46:59 +01:00
Mark Paluch
2d292c50b0 After release cleanups.
See #4215
2022-11-04 15:26:39 +01:00
Mark Paluch
f959a77890 Prepare next development iteration.
See #4215
2022-11-04 15:26:37 +01:00
Mark Paluch
c04c3d66a3 Release version 4.0 RC2 (2022.0.0).
See #4215
2022-11-04 15:23:17 +01:00
Mark Paluch
b204c1b33e Prepare 4.0 RC2 (2022.0.0).
See #4215
2022-11-04 15:23:06 +01:00
Christoph Strobl
dfa029b341 Guard transaction proxy creation hints.
Closes: #4225
Related: #4221
2022-11-03 12:48:49 +01:00
Christoph Strobl
04a8c47cda Follow API changes in data-commons
Update imports of moved AOT processing types and update reactive wrapper coordinates to new location.

Closes: #4224
See: spring-projects/spring-data-commons#2708
2022-11-02 11:51:11 +01:00
Christoph Strobl
88ea57f2be Provide native hints to create transaction proxies at runtime.
Closes: #4221
2022-11-02 10:26:45 +01:00
Mark Paluch
521bbd2535 Update CI properties.
See #4215
2022-10-31 10:36:34 +01:00
Mark Paluch
0474632640 Upgrade to Java 17.0.4.1_1 and pin base image distribution.
See #4215
2022-10-31 10:27:36 +01:00
Mark Paluch
8aab5e5a01 Use correct boolean type for JSON Schema creation.
We now use the correct JSON type boolean again when creating schemas. Furthermore, we use the bool type for MongoDB $type queries.

Closes #4220
2022-10-27 10:16:58 +02:00
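
As a quick illustration using plain org.bson.Document (field names illustrative): JSON Schema validation expects the JSON type name boolean, while $type queries use the BSON alias bool:

[source,java]
----
Document schema = new Document("$jsonSchema",
        new Document("properties",
                new Document("active", new Document("type", "boolean")))); // JSON type name

Document typeQuery = new Document("active", new Document("$type", "bool")); // BSON alias
----
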
Christoph Strobl
3db5fc728e Polishing.
See: #4211
Original pull request: #4212
2022-10-24 15:11:25 +02:00
Christoph Strobl
b027f15a4c Add missing runtime hint for QuerydslMongoPredicateExecutor.
Closes: #4211
Original pull request: #4212
2022-10-24 15:11:10 +02:00
Mark Paluch
fd0a554d59 Polishing.
Use existing constants.

See #4218
2022-10-24 15:04:50 +02:00
Marcin Grzejszczak
d4daa305a8 Align the context propagation entries with the rest of the portfolio.
Closes #4218
2022-10-24 15:04:33 +02:00
Mark Paluch
2d63d6006d Align conventions with OpenTelemetry spec.
See: #4216
2022-10-21 11:49:03 +02:00
Mark Paluch
5007e68cc1 Polishing.
See: #4216
2022-10-21 11:49:03 +02:00
Mark Paluch
3ea4e0f9dd Update documentation.
See: #4216
2022-10-21 11:49:03 +02:00
Greg L. Turnquist
e9ac77c058 Improve configuration support for Observability integration.
Closes: #4216
2022-10-21 11:27:58 +02:00
Christoph Strobl
daef8b6e8e Add missing reflection hints for generated cglib proxies.
Closes: #4217
2022-10-20 15:59:06 +02:00
Mark Paluch
f671a9bd43 After release cleanups.
See #4175
2022-10-13 17:31:16 +02:00
Mark Paluch
57b52862c8 Prepare next development iteration.
See #4175
2022-10-13 17:31:15 +02:00
Mark Paluch
af917b9465 Release version 4.0 RC1 (2022.0.0).
See #4175
2022-10-13 17:24:25 +02:00
Mark Paluch
ee545487b8 Prepare 4.0 RC1 (2022.0.0).
See #4175
2022-10-13 17:24:03 +02:00
Mark Paluch
240e53794c Fix Javadoc.
See #4139
Original pull request: #4182.
2022-10-12 15:59:33 +02:00
Christoph Strobl
7b6a06888d Update javadoc.
See: #4184
See: #4197
Original pull request: #4203.
2022-10-12 15:25:13 +02:00
Christoph Strobl
034a3528af Preserve given Id on insert.
This commit fixes an issue where an existing id got replaced with a generated one when using the MongoId annotation.

Closes: #4184
Closes: #4197
Original pull request: #4203.
2022-10-12 15:24:50 +02:00
Christoph Strobl
ff28789507 Polishing.
See #4139
Original pull request: #4182.
2022-10-12 15:12:42 +02:00
Christoph Strobl
cdfdeafdac Update aggregation reference documentation.
See #4139
Original pull request: #4182.
2022-10-12 15:12:38 +02:00
Christoph Strobl
16a35e0329 Add support for $densify aggregation stage.
See #4139
Original pull request: #4182.
2022-10-12 15:12:38 +02:00
Christoph Strobl
79f05c3d7f Move Expr operator one level up.
The Expr operator should be held within ExpressionOperators not its factory.

See #4139
Original pull request: #4182.
2022-10-12 15:12:37 +02:00
Christoph Strobl
9217821472 Add support for $locf aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:12:37 +02:00
Christoph Strobl
8d223abd05 Add support for $tsSecond aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:12:37 +02:00
Christoph Strobl
6a973b245f Add support for $tsIncrement aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:12:37 +02:00
Christoph Strobl
714b23e0ce Add support for $sortArray aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:12:37 +02:00
Christoph Strobl
d4a6614c11 Add support for $setField aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:11:48 +02:00
Christoph Strobl
82ce0abe1a Add support for $getField aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:11:46 +02:00
Christoph Strobl
dec7c125d6 Add support for $dateTrunc aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:11:43 +02:00
Christoph Strobl
db12c4ba5a Add support for $dateSubtract aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:11:41 +02:00
Christoph Strobl
5bbe481e98 Add support for $minN aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:11:38 +02:00
Christoph Strobl
fb39c31986 Add support for $maxN aggregation operator.
See #4139
Original pull request: #4182.
2022-10-12 15:11:31 +02:00
Christoph Strobl
dd446472bc Polishing.
See #4139
Original pull request: #4182.
2022-10-12 15:11:22 +02:00
Christoph Strobl
72d82d3083 Add support for $top & $topN aggregation operators.
Closes #4139
Original pull request: #4182.
2022-10-12 15:11:20 +02:00
Christoph Strobl
cdfe2a0b59 Add support for $lastN aggregation operator.
Closes #4139
Original pull request: #4182.
2022-10-12 15:11:18 +02:00
Christoph Strobl
5525a4fbf9 Add support for $firstN aggregation operator.
Closes #4139
Original pull request: #4182.
2022-10-12 15:11:15 +02:00
Christoph Strobl
59464f3b3c Add support for $bottomN aggregation operator.
Closes #4139
Original pull request: #4182.
2022-10-12 15:11:09 +02:00
Christoph Strobl
052cfdfd45 Add support for $bottom aggregation operator.
Closes #4139
Original pull request: #4182.
2022-10-12 15:10:40 +02:00
Christoph Strobl
b31c21bb91 Upgrade to MongoDB driver 4.8.0-beta0.
Closes: #4200
2022-10-11 16:47:23 +02:00
Christoph Strobl
5b6e5ca568 Update tests.
Original Pull Request: #4196
2022-10-11 11:53:26 +02:00
gongxuanzhang
1a9136c0c1 Fix json schema type name for boolean.
Was boolean, should have been bool.

Closes: #4196
2022-10-11 11:53:25 +02:00
Mark Paluch
59753bb55a Adapt to changed AOT packages in Spring Data Commons.
Closes #4199
2022-10-11 11:39:00 +02:00
Christoph Strobl
8d963fc5da Add option to configure change stream behaviour at collection creation time.
Introduce CollectionChangeStreamOptions, which allows defining the changeStreamPreAndPostImages option of the createCollection command.

Original Pull Request: #4193
2022-10-10 11:54:32 +02:00
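
For reference, the server-side command these options target looks roughly as follows; the MongoTemplate variable is illustrative:

[source,java]
----
// createCollection command enabling pre- and post-images for change streams
Document createCommand = new Document("create", "orders")
        .append("changeStreamPreAndPostImages", new Document("enabled", true));

template.executeCommand(createCommand);
----
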
Christoph Strobl
c1de745014 Polishing.
Update javadoc, format imports and add issue references.

Original Pull Request: #4193
2022-10-10 11:54:08 +02:00
myroslav.kosinskyi
aa35aaeb70 Add fullDocumentBeforeChange support for change streams.
Closes: #4187
Original Pull Request: #4193
2022-10-10 11:53:39 +02:00
Mark Paluch
a5725806f5 Remove references to ClassTypeInformation from TypeInformation.
Closes #4195
2022-10-06 16:21:30 +02:00
Christoph Strobl
d715414683 Switch to micrometer 1.10 snapshots.
Follow signature changes.

See: #4191
See: spring-projects/spring-data-build#1810
2022-10-06 15:44:51 +02:00
Christoph Strobl
f2c2451b7d Add hint how to use $search aggregation operator to reference documentation.
Closes: #4183
2022-10-06 13:56:57 +02:00
Christoph Strobl
5b8d0d08ee Update reactive transaction sample in reference documentation.
Closes: #4190
2022-10-06 13:03:56 +02:00
Christoph Ahlers
18186f26e2 Remove unused imports.
Closes: #4178
2022-10-06 10:21:50 +02:00
Christoph Ahlers
10acc14c14 Fix javadoc parameter names.
Closes: #4179
2022-10-04 12:31:34 +02:00
Wan Bachtiar
87effb9013 Fix typo in reference documentation.
Closes: #4180
2022-10-04 12:24:23 +02:00
Mark Paluch
19819680f9 Adapt to SLF4J 2.0 upgrade.
Exclude transitive Micrometer Test dependencies that ship outdated SLF4J implementations.

Closes #4189
2022-09-30 13:38:52 +02:00
Mark Paluch
2d2f67cc93 Prefer Java configuration over XML.
Closes #4186
2022-09-28 15:29:10 +02:00
Seungwoo Jo
e9818fe11a Fix documentation typo in BasicQuery.
Closes #4169
Original pull request: #4170.
2022-09-21 11:15:12 +02:00
Christoph Strobl
a63db5586c Add missing aggregation system variables.
Move the inner class SystemVariable to the top level and add missing values (NOW, CLUSTER_TIME, DESCEND, PRUNE, KEEP & SEARCH_META).

Original pull request: #4176.
Closes #4145
2022-09-21 10:51:00 +02:00
Mark Paluch
68ab74a5bf Polishing.
Reformat code.

See #4004
Original pull request: #4006.
2022-09-21 10:48:22 +02:00
Christoph Strobl
de33734118 Polishing
Update Javadoc to mention the unit of measure for min/maxDistance depending on the usage of GeoJSON.
Also remove unused imports from tests.

See #4004
Original pull request: #4006.
2022-09-21 10:48:22 +02:00
Christoph Strobl
c272c7317e Fix rewrite of near & nearSphere count queries using GeoJSON to geoWithin.
$near and $nearSphere queries are not supported by countDocuments and the aggregation $match stage used there, so they need to be rewritten to $geoWithin. The existing logic did not cover the usage of GeoJSON types, which is fixed now. For nearSphere it is also required to convert the $maxDistance argument (given in meters for GeoJSON) to the radians used by $geoWithin/$centerSphere.

Closes #4004
Original pull request: #4006.
Related to #2925
2022-09-21 10:48:21 +02:00
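
A hedged sketch of the conversion mentioned above; the Earth radius constant is an assumption (MongoDB's documentation works with roughly the equatorial radius):

[source,java]
----
double maxDistanceInMeters = 500;          // $nearSphere maxDistance with GeoJSON
double earthRadiusInMeters = 6_378_100d;   // assumed constant, approx. equatorial radius
double centerSphereRadians = maxDistanceInMeters / earthRadiusInMeters; // for $geoWithin/$centerSphere
----
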
Spring Builds
d7fc605f7b After release cleanups.
See #4117
2022-09-19 14:39:07 +00:00
Spring Builds
3b805b9e03 Prepare next development iteration.
See #4117
2022-09-19 14:38:55 +00:00
Spring Builds
91cca3f2c4 Release version 4.0 M6 (2022.0.0).
See #4117
2022-09-19 14:15:22 +00:00
Spring Builds
2de6384d0f Prepare 4.0 M6 (2022.0.0).
See #4117
2022-09-19 14:12:56 +00:00
Christoph Strobl
ab1c0ff7b8 Apply conversion on document reference lookup using nested property.
Closes #4033
Original pull request: #4044.
2022-09-19 09:57:24 +02:00
Christoph Strobl
ae2846c5bf Generate and convert id on insert if explicitly defined.
We now make sure to provide an id value that matches the desired target type when no id is set and the property defines an explicit conversion target.
Previously, a new ObjectId would have been generated, which led to type inconsistencies when querying for _id.

Closes #4026
Original pull request: #4057.
2022-09-19 09:47:18 +02:00
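
A minimal sketch of the scenario, assuming the @MongoId target-type mechanism; the entity and field names are illustrative:

[source,java]
----
class Order {

    // Declared as String but stored as ObjectId. A generated id now honors this
    // explicit conversion target instead of leaving a mismatching value behind.
    @MongoId(FieldType.OBJECT_ID)
    String id;
}
----
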
Christoph Strobl
e88c9cf791 Fix issue with reference conversion in updates.
We now make sure to convert references in update operations targeting collection-like fields when using e.g. the $push modifier.

Closes #4041
Original pull request: #4045.
2022-09-19 08:54:26 +02:00
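
A short sketch of the affected case, with illustrative entity and field names:

[source,java]
----
// Pushing a @DocumentReference value; the reference is now converted as on save.
Update update = new Update().push("publications", savedPublication);

template.updateFirst(
        Query.query(Criteria.where("_id").is(author.getId())), update, Author.class);
----
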
Christoph Strobl
fadca10f62 Support @DocumentReference via Querydsl.
Closes #4037
Original pull request: #4069.
2022-09-16 15:57:30 +02:00
Mark Paluch
40320136f3 Polishing.
See #4061
Original pull request: #4062.
2022-09-16 14:52:00 +02:00
Christoph Strobl
bc575de3b0 Improve exception message when deriving collection name from type.
We now provide a better worded exception message when trying to derive the collection name for a type that is not considered a user type (such as org.bson.Document).
Update the Javadoc to hint at the error.

Closes #4061
Original pull request: #4062.
2022-09-16 14:51:54 +02:00
Christoph Strobl
09b2afa79d Initialize lists with size where possible.
Closes #3941
Original pull request: #3974.
2022-09-16 14:45:34 +02:00
Mark Paluch
96b564eb9a Polishing.
Reformat code.

See #4167.
Original pull request: #4168.
2022-09-16 14:40:34 +02:00
Christoph Strobl
38390d3475 Fix usage of change stream option startAfter.
We now make sure to apply the token to the startAfter method of the driver. Before this change, it had been incorrectly applied to resumeAfter.

Closes #4167.
Original pull request: #4168.
2022-09-16 14:40:27 +02:00
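
A hedged sketch of using the option; the resume token is assumed to be a BsonValue captured from a previous change event, and the builder method shape follows the commit description:

[source,java]
----
ChangeStreamOptions options = ChangeStreamOptions.builder()
        .startAfter(resumeToken) // now mapped to the driver's startAfter, not resumeAfter
        .build();
// pass the options to the (reactive) template's changeStream(...) methods
----
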
Mark Paluch
6937bb519b Polishing.
Move off more deprecated API.

See #4164
Original pull request: #4165.
2022-09-16 11:11:17 +02:00
Christoph Strobl
6e4d463053 Move off deprecated API.
Closes #4164
Original pull request: #4165.
2022-09-16 11:11:12 +02:00
Mark Paluch
a9d2050806 Polishing.
Fix generics. Add warning suppressions for nullability checks.

See: #4104
Original pull request: #4156.
2022-09-14 14:06:44 +02:00
Christoph Strobl
6676389062 Fix GeoJson polygon conversion for polygons with inner ring.
Closes: #4104
Original pull request: #4156.
2022-09-14 14:06:35 +02:00
Mark Paluch
81f85b8cca Polishing.
Tweak Javadoc, make ViewOptions.collation final.

See: #2594
Original pull request: #4142.
2022-09-14 11:31:14 +02:00
Christoph Strobl
77f318bd77 Add support to create views via reactive/template API.
This commit introduces support to create MongoDB Views directly via the Reactive-/MongoOperations API.

Closes: #2594
Original pull request: #4142.
2022-09-14 11:30:24 +02:00
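
A sketch of what such a call could look like; the exact createView(...) signature here is an assumption based on the commit description:

[source,java]
----
// Assumed method shape: create a view named "firstNames" backed by an aggregation over Person.
template.createView("firstNames", Person.class,
        Aggregation.newAggregation(Aggregation.project("firstName")));
----
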
Brian Clozel
7c7e70418f Replace deprecated StreamUtils API
As of spring-projects/spring-framework#29125,
`StreamUtils.emptyInput()` is deprecated in favor of
`InputStream.nullInputStream()` from the JDK.

Closes: #4160
2022-09-14 08:14:06 +02:00
Mark Paluch
44a1123034 Adapt to changed Mockk artifact name.
Closes #4161
2022-09-12 14:08:14 +02:00
Mark Paluch
e487c08b0c Polishing.
Reformat pom.xml

See #4161
2022-09-12 10:18:38 +02:00
Mark Paluch
a002d30aa9 Polishing.
Reformat pom.xml

See #4161
2022-09-12 10:17:58 +02:00
Tommy Ludwig
36ddd26edc Adapt to SampleTestRunner refactor.
See: micrometer-metrics/tracing#57
Closes: #4159
2022-09-09 19:13:57 +02:00
Tommy Ludwig
6197655e98 Adapt to ObservationConvention location change
See: micrometer-metrics/micrometer#3387
Closes: #4158
2022-09-09 19:06:28 +02:00
Christoph Strobl
929faea88b Add snapshot plugin repository for micrometer docs.
See: #4151.
2022-09-07 14:51:57 +02:00
Greg L. Turnquist
1fe1c13531 Upgrade to Micrometer 1.10.0-SNAPSHOT.
Closes #4151.
2022-09-07 12:17:16 +02:00
Kirill Gavrilov
838ddb5d26 Align signature of Kotlin extension functions to match Java API.
Closes: #4153
Original Pull Request: #4154
Related issues: #2602 #3187
2022-09-07 09:48:48 +02:00
Christoph Strobl
33c7f0980f Remove usage SynthesizedAnnotation.
Closes: #4155
2022-09-07 08:54:41 +02:00
Mark Paluch
4bbc443a0e Polishing.
Refine assertions.

See #4132
Original pull request: #4147.
2022-08-25 15:45:29 +02:00
Christoph Strobl
655dbc9783 Favor relaxed type mapping over strict one for aggregateStream.
Align aggregation context usage of aggregate and aggregate stream methods.

Closes #4132
Original pull request: #4147.
2022-08-25 15:45:13 +02:00
Christoph Strobl
0d752fd6e6 Introduce dedicated Collation annotation.
The Collation annotation mainly serves as a meta annotation that provides common access to collation values for annotated queries, aggregations, etc.

Original Pull Request: #4131
2022-08-25 09:03:43 +02:00
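
A hedged sketch of the annotation on repository query methods; the parameter-binding syntax shown is an assumption:

[source,java]
----
interface PersonRepository extends Repository<Person, String> {

    @Collation("en_US") // collation applied to the derived query
    List<Person> findByLastname(String lastname);

    @Collation("?1")    // assumed: bind the collation from a method parameter
    List<Person> findByFirstname(String firstname, String collation);
}
----
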
Christoph Strobl
8aabf2fa5e Polishing.
Resolve collation from template expression & update issue references + Javadoc.

Original Pull Request: #4131
2022-08-25 09:00:51 +02:00
Stefan Tirea
ff9d338bd7 Add collation for an index via @CompoundIndex and @Index annotations.
Closes #3002, closes #4130

Original Pull Request: #4131
2022-08-25 09:00:17 +02:00
Mark Paluch
2a4ee12363 Document BulkOperations limitations.
Closes #4082
2022-08-23 15:39:59 +02:00
Christoph Strobl
a66438fc20 Resolve cglib proxies during AOT processing.
We now make sure to run the enhancer during AOT processing, which allows the infrastructure to pick up the generated types.
Along the way, we removed the no-longer-supported assertions for class proxies and followed changes in Framework 6.

Closes: #4148
2022-08-23 11:48:23 +02:00
Mark Paluch
0ccc037b8e Polishing.
Introduce JUnit extension to declare tests that dirty or provide their state.

See #3817
Original pull request: #3987.
2022-08-23 10:00:16 +02:00
Christoph Strobl
00792192c3 Close clients created during tests.
See #3817
Original pull request: #3987.
2022-08-23 09:54:29 +02:00
Christoph Strobl
e064b505c9 Prevent sync client from being created in reactive test config.
Closes #3817
Original pull request: #3987.
2022-08-23 09:53:58 +02:00
Christoph Strobl
7df2bdf8ff Upgrade to MongoDB driver 4.7.1
Closes: #4144
2022-08-22 07:59:57 +02:00
Mark Paluch
2f9fc1618e Use mongosh instead of mongo CLI.
Switch from the deprecated command to its replacement.

See #4138
2022-08-17 12:56:26 +02:00
Mark Paluch
9e2aecf4ae Polishing.
Fix required Java version.

See #4140
2022-08-17 10:58:13 +02:00
Mark Paluch
c32c4beb59 Remove new & noteworthy section in favor of our release notes.
The release notes now outline new and noteworthy changes.

Closes #4140
2022-08-17 10:51:01 +02:00
Mark Paluch
d48f3ec535 Polishing.
Use && syntax to catch commands that exit with non-success exit codes.

See #4139
2022-08-17 10:44:11 +02:00
Mark Paluch
5f16aecd13 Assert compatibility with MongoDB 6.0.
Closes #4138
2022-08-17 10:42:57 +02:00
Mark Paluch
5fc49b1649 Polishing.
Encapsulate nested object lookup. Refine method signatures and tweak Javadoc.

See #4098
Original pull request: #4133.
2022-08-05 15:59:31 +02:00
Christoph Strobl
1e7dc7ce66 Fix non-association mapping when id value matches already resolved instance of same type.
This commit ensures that non-association values are fully resolved from the given source document instead of attempting a by-id lookup in already resolved instances.

Closes: #4098
Original pull request: #4133.
2022-08-05 15:59:04 +02:00
Christoph Strobl
234783f442 Allow referencing the $id field of dbrefs within an aggregation pipeline.
Closes: #4123
Original pull request: #4125.
2022-08-05 14:19:29 +02:00
Sojin
3429350964 Fix AKNOWLEDGED typo in reference documentation.
Two typos found have been fixed.

Closes #4135
2022-08-05 14:07:54 +02:00
Mark Paluch
d130984bdc Allow disabling entity lifecycle events.
We now support disabling lifecycle events through the Template API to reduce the framework overhead when events are not needed.

Closes #4107
2022-07-20 16:07:01 +02:00
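
A minimal sketch; the setter name is assumed from the feature description:

[source,java]
----
MongoTemplate template = new MongoTemplate(databaseFactory, converter);
// assumed setter: skip publishing BeforeConvert/BeforeSave/AfterSave events
template.setEntityLifecycleEventsEnabled(false);
----
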
Christoph Strobl
f5378bf825 Upgrade to MongoDB driver 4.7.0
Closes: #4124
2022-07-20 08:13:53 +02:00
Mark Paluch
21057c3d17 Fix DTO projection instantiation.
We now correctly instantiate DTO projection classes by using the actual constructor argument type. Previously, we did not update the conversion context to fetch the correct type but used the type of the DTO projection class instead of the constructor argument.

Closes #4120
2022-07-19 11:17:40 +02:00
Christoph Strobl
d0a98eb71d After release cleanups.
See #4054
2022-07-15 15:30:53 +02:00
Christoph Strobl
bc95c4d390 Prepare next development iteration.
See #4054
2022-07-15 15:30:45 +02:00
Christoph Strobl
d56a4ea77d Release version 4.0 M5 (2022.0.0).
See #4054
2022-07-15 15:18:29 +02:00
Christoph Strobl
5a09626cbf Prepare 4.0 M5 (2022.0.0).
See #4054
2022-07-15 15:17:56 +02:00
John Blum
029291a1dd Adapt to repackaging of the AOT RuntimeHintsPredicate.
Closes #4111.
2022-07-12 18:05:46 -07:00
Mark Paluch
989a2596cb Upgrade to MongoDB driver 4.7.0-beta0.
Closes #4110
2022-07-12 15:40:36 +02:00
Greg L. Turnquist
f5c520dbc8 Upgrade Micrometer's tracing artifact to micrometer-tracing.
Closes #4106.
2022-07-11 08:24:41 -05:00
Mark Paluch
80c843eb20 Update README.adoc
See #4054
2022-07-11 15:06:12 +02:00
Christoph Strobl
9b136537c0 Simplify auditing configuration.
Use IsNewAwareAuditingHandler factory method to avoid exposing additional beans.

See: #4022
2022-07-08 08:57:34 +02:00
Mark Paluch
d334c5a44c Polishing.
Adapt to Framework changes.
Simplify auditing bean registration.
Remove ImportRuntimeHints in EnableMongoAuditing.
Refine ManagedTypes bean definitions.
Consistently use mongo as bean name prefix. Depend on store-specific ManagedTypes.
Rewrite ReactiveMongoAuditingRegistrar to avoid inner beans.
Reduce AOT processor visibility. Cleanup imports. Improve type naming. Update Javadoc.

Original Pull Request: #4093
2022-07-05 09:55:41 +02:00
Christoph Strobl
cfd55be95b Add AOT repository support
We now use the AOT infrastructure of Spring Framework 6 and Spring Data Commons to provide AOT support, building the foundation for native image compilation.
Additionally, we register hints for GraalVM native images.

See: #4022
Original Pull Request: #4093
2022-07-05 09:55:10 +02:00
Mark Paluch
079c5a95aa Adapt test to Spring Framework 6 changes.
See #4054
2022-07-05 07:38:35 +02:00
Mark Paluch
3f6821f11f Adapt to Reactor 2022.0.0-M4 changes.
Closes #4100
2022-07-04 14:28:55 +02:00
Mark Paluch
1a868ae35e Avoid duplicate bean registrations in MappingMongoConverterParser.
We now ensure to not override `ValidatingMongoEventListener` and `LocalValidatorFactoryBean` bean definitions by avoiding duplicate registrations and checking whether a bean with the given name is already registered.

Closes #4087
2022-06-28 10:24:56 +02:00
Mark Paluch
248bcfa177 Polishing.
Simplify code.

Original pull request: #4059.
See #4038
2022-06-27 15:42:55 +02:00
Christoph Strobl
ee076ec02f Simplify usage of user provided aggregation operations.
Introduce Aggregation.stage, which allows a plain JSON String or any valid Bson representation to be used as an aggregation pipeline stage without having to implement AggregationOperation directly.
The change also allows making use of the driver's native builder API for aggregations.

Original pull request: #4059.
Closes #4038
2022-06-27 15:42:55 +02:00
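
For illustration, mixing a typed stage with raw stages; the Bson overload shown follows the commit description:

[source,java]
----
Aggregation aggregation = Aggregation.newAggregation(
        Aggregation.match(Criteria.where("status").is("ACTIVE")),
        Aggregation.stage("{ $project: { name: 1, score: 1 } }"),   // plain JSON stage
        Aggregation.stage(Aggregates.sample(5)));                   // driver builder (Bson) stage
----
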
Mark Paluch
1184d6ee2d Upgrade to Kotlin 1.7.
Adapt to stricter nullability checks.

Closes #4096
2022-06-24 11:53:12 +02:00
Christoph Strobl
062b4e8757 Provide Module Identifier via MongoRepositoryConfigurationExtension
Closes: #4092
2022-06-21 08:00:06 +02:00
Christoph Strobl
30a417d810 Retain parameter type when binding parameters in annotated Query/Aggregation.
This commit ensures the parameter type is preserved when binding parameters used within the value of the Query or Aggregation annotation.

Closes: #4089
2022-06-20 10:37:49 +02:00
Christoph Strobl
1671f960b6 Upgrade to MongoDB driver 4.6.1
Closes: #4081
2022-06-20 09:10:40 +02:00
Mark Paluch
d4cce9ac00 Wrap SpEL documentation with admonition.
Closes #4085
2022-06-14 09:11:46 +02:00
Mark Paluch
8f9576aa42 Polishing.
Reformat asciidoc source.

See #4085
2022-06-14 09:04:20 +02:00
John Blum
f15fd2a418 Remove punctuation in Exception messages.
Closes #4079.
2022-06-08 15:21:22 -07:00
Mark Paluch
01656db002 Upgrade to Maven Wrapper 3.8.5.
See #4073
2022-06-03 09:32:40 +02:00
John Blum
84faff6bd4 Remove Docker Registry login.
Closes #4056.
2022-05-16 12:55:12 -07:00
Greg L. Turnquist
d72e1531d3 Adapt to changes in Micrometer APIs.
Micrometer has updated some of its APIs and we must adjust.

Closes: #4055
2022-05-16 14:34:50 -05:00
Mark Paluch
ac59cf930a Update driver compatibility matrix.
Closes #4052
2022-05-16 15:11:38 +02:00
Christoph Strobl
7bdc8d3aac Fix pom.xml formatting.
This commit reverts formatting changes introduced via 140fb2e9ea.

See #4005
2022-05-13 12:59:40 +02:00
Christoph Strobl
47548a21ea After release cleanups.
See #4005
2022-05-13 10:53:26 +02:00
Christoph Strobl
5aaa8f79e7 Prepare next development iteration.
See #4005
2022-05-13 10:53:23 +02:00
Christoph Strobl
1a77b1bc56 Release version 4.0 M4 (2022.0.0).
See #4005
2022-05-13 10:43:59 +02:00
Christoph Strobl
140fb2e9ea Prepare 4.0 M4 (2022.0.0).
See #4005
2022-05-13 10:43:20 +02:00
Jay Bryant
b571c8958d Editing pass for new content in reference documentation.
Closes: #4049
2022-05-11 05:38:24 +02:00
Christoph Strobl
8d54cae54d Polishing.
Update Query javadoc.

Original Pull Request: #3999
2022-05-10 16:33:19 +02:00
Raul Mello Silva
14a71f0498 Update Query.limit javadoc.
This commit documents the usage of Query.limit(int), where a zero or negative value means an unlimited result.

Closes: #3999
2022-05-10 16:19:07 +02:00
Christoph Strobl
14c265f3a1 Provide additional meta information via pom.xml
Add scm & issueManagement.

Closes: #4048
2022-05-10 12:31:29 +02:00
nniesen
440a289ac6 Update spring.io project urls.
This commit updates outdated projects.spring.io links to spring.io/projects.

Closes: #4042
2022-05-09 13:57:49 +02:00
John Blum
9663a2227b Adapt to API changes in PropertyValueConverters.
Closes #4040.
2022-05-02 17:19:17 -07:00
Mark Paluch
b134e1916d Upgrade to MongoDB driver 4.6.0.
Closes #4027
2022-04-19 10:05:37 +02:00
Greg L. Turnquist
65b02f92b4 Use updated coordinates for Hibernate Validator.
See #4024.
2022-04-15 10:45:58 -05:00
Greg L. Turnquist
667b71e073 Switch to Micrometer 1.10's tracing APIs.
Micrometer Tracing 1.10 has some breaking API changes.

See #4023.
2022-04-15 10:04:41 -05:00
Mark Paluch
225dbee15f Simplify dependency version arrangement.
We now inherit the version number and repositories from the parent pom.

See #4017
2022-04-07 09:53:10 +02:00
Mark Paluch
c04ceb163b Polishing.
Reformat code.

See #4017
2022-04-07 09:44:42 +02:00
Greg L. Turnquist
711ac343fe Fix Micrometer-based deployment issues.
When deploying to Artifactory, a Micrometer-based plugin can't be found.

See #4017.
2022-04-06 09:25:44 -05:00
Mark Paluch
852a4ecc59 Polishing.
Refine default conversions creation.

See #4014
Original pull request: #4015.
2022-04-05 10:07:51 +02:00
Christoph Strobl
7ab2428c64 Make sure to initialize PropertyValueConversions in Converter setup.
Closes #4014
Original pull request: #4015.
2022-04-05 10:07:46 +02:00
Oliver Drotbohm
350acf66bc Adapt to API changes in Spring Data Commons.
spring-projects/spring-data-commons#2518 introduced TypeInformation.getTypeDescriptor() which we need to implement in our custom FieldTypeInformation.
2022-04-04 18:21:04 +02:00
Christoph Strobl
ab94a94b2e Upgrade to MongoDB driver 4.5.1
Resolves: #4013
2022-04-04 10:20:20 +02:00
Christoph Strobl
4c77763cd3 Introduce Observability with Micrometer and Micrometer Tracing.
See #3942.
2022-03-29 13:09:07 -05:00
Christoph Strobl
f197953480 Update build triggers.
See: #4005
2022-03-24 13:53:25 +01:00
Mark Paluch
44afd4939e After release cleanups.
See #4003
2022-03-22 14:07:38 +01:00
Mark Paluch
575917435e Prepare next development iteration.
See #4003
2022-03-22 14:07:36 +01:00
Mark Paluch
2db55ab0aa Release version 4.0 M3 (2022.0.0).
See #4003
2022-03-22 14:00:23 +01:00
Mark Paluch
79602b7dbe Prepare 4.0 M3 (2022.0.0).
See #4003
2022-03-22 14:00:02 +01:00
Mark Paluch
d5d2371b9e After release cleanups.
See #3937
2022-03-21 16:44:41 +01:00
Mark Paluch
c95e8a5748 Prepare next development iteration.
See #3937
2022-03-21 16:44:39 +01:00
Mark Paluch
f0c0a86118 Release version 4.0 M2 (2022.0.0).
See #3937
2022-03-21 16:35:08 +01:00
Mark Paluch
3d82e12e6b Prepare 4.0 M2 (2022.0.0).
See #3937
2022-03-21 16:34:37 +01:00
Mark Paluch
8672808222 Polishing.
Reformat code. Tweak documentation wording.

See #3596
Original pull request: #3982.
2022-03-21 09:20:47 +01:00
Christoph Strobl
29fb085d8b Add support for PropertyValueConverters.
Closes: #3596
Original pull request: #3982.
2022-03-21 09:20:34 +01:00
Mark Paluch
15cac49f9c Polishing.
Refine API naming towards merge/property instead of combine/specify. Tweak documentation. Introduce Resolution.ofValue(…) for easier creation.

See #3870
Original pull request: #3986.
2022-03-18 14:11:35 +01:00
Christoph Strobl
946deac48c Support generating JsonSchema for Polymorphic fields.
This commit introduces MergedJsonSchema and MergedJsonSchemaProperty, which can be used to merge properties of multiple objects into one as long as the additions do not conflict with one another (e.g. due to the usage of different types).
To resolve such conflicts, a ConflictResolutionFunction must be provided.

Closes #3870
Original pull request: #3986.
2022-03-18 14:11:35 +01:00
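
A hedged sketch; the merge method name below is an assumption derived from the description:

[source,java]
----
MongoJsonSchema customer = MongoJsonSchema.builder()
        .properties(JsonSchemaProperty.string("name")).build();
MongoJsonSchema company = MongoJsonSchema.builder()
        .properties(JsonSchemaProperty.int32("employees")).build();

// assumed method name; conflicting property definitions would need a ConflictResolutionFunction
MongoJsonSchema merged = customer.mergeWith(company);
----
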
Mark Paluch
02229f291c Polishing.
Reformat code.

See #3998
2022-03-16 16:44:06 +01:00
Mark Paluch
1009491920 Create a new conversion context for projection properties.
We now create a new conversion context to ensure that we use the correct property type to avoid type retention when mapping complex objects within a projection.

Closes #3998
2022-03-16 16:29:14 +01:00
Mark Paluch
05730ded1b Fix CI trigger.
Use correct Spring Data Commons version as build trigger.

See #3973
2022-03-15 14:38:54 +01:00
Mark Paluch
612845f59c Polishing.
Extract CreateCollectionOptions conversion to EntityOperations to unify collection creation. Adapt tests.

See #3984
Original pull request: #3990.
2022-03-11 15:21:05 +01:00
Mark Paluch
1f06954952 Polishing.
Add missing Override annotations to template API methods.

See #3984
2022-03-11 15:20:20 +01:00
Christoph Strobl
7bcf0322d2 Propagate time series options correctly.
This commit fixes an issue where potential time series information was lost when creating a collection via MongoTemplate without passing type information.

Closes #3984
Original pull request: #3990.
2022-03-11 15:18:58 +01:00
Mark Paluch
7dd2f350eb Polishing.
Remove duplicate dependency declaration.

See: #3522
2022-03-11 14:10:49 +01:00
Mark Paluch
e433375cac Polishing.
Reorder methods. Add links to Javadoc. Tweak wording.

See: #3522
Original pull request: #3951.
2022-03-11 14:09:58 +01:00
Christoph Strobl
d16013aa6b Allow to estimate document count.
This commit introduces an option that allows users to opt in to using estimatedDocumentCount instead of countDocuments when the filter query is empty.
To still be able to retrieve the exact number of matching documents, we also introduced MongoTemplate#exactCount.

Closes: #3522
Original pull request: #3951.
2022-03-11 14:06:30 +01:00
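
A short sketch of the two counting flavors; the method shapes follow the commit description and may differ slightly:

[source,java]
----
long estimate = template.estimatedCount(Person.class);       // metadata-based, no filter possible
long exact = template.exactCount(new Query(), Person.class); // always uses countDocuments
----
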
Christoph Strobl
dab5473740 Modify visibility of methods in TypedJsonSchemaObject.
Change visibility to public as it should have been in the first place.

Closes: #3989
2022-03-10 09:23:25 +01:00
sangyongchoi
e6fce75dfd Remove duplicate condition in GeoConverters.
Closes: #3981
2022-03-03 13:29:24 +01:00
Mark Paluch
e75f022844 Update CI properties.
See #3937
2022-02-23 14:33:13 -06:00
Christoph Strobl
611ece049b Serialize values for debug output safely in AbstractMongoEventListener.
We now make sure that codec configuration will not cause an exception when debug logging is turned on.

Resolves: #3968
Original Pull Request: #3970
2022-02-18 10:15:28 +01:00
Christoph Strobl
be2286edf7 Update copyright year to 2022.
See: #3966
2022-02-17 10:49:44 +01:00
Christoph Strobl
b99648672b Introduce Update annotation.
Switch update execution to an annotation-based model that allows usage of both the classic update and the aggregation pipeline variant. Add the reactive variant as well.
Make sure to allow parameter binding for update expressions and verify method return types.
Update Javadoc and reference documentation.

See: #2107
Original Pull Request: #284
2022-02-17 10:31:46 +01:00
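
For illustration, a derived query method combined with the new Update annotation; the entity, field, and method names are illustrative:

[source,java]
----
interface PersonRepository extends CrudRepository<Person, String> {

    // The derived part selects the documents, the annotated update is applied to them;
    // parameter binding (?0, ?1) works inside the update expression.
    @Update("{ '$inc' : { 'visits' : ?1 } }")
    long findAndIncrementVisitsByLastname(String lastname, int increment);
}
----
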
Thomas Darimont
28708ce24e Add support for modifying documents via repository method.
We now support findAndModify operations on derived query methods.

Closes: #2107
Original Pull Request: #284
2022-02-17 10:29:37 +01:00
Christoph Strobl
1c6c703640 Deprecate mapReduce.
Closes: #3945
2022-02-16 14:45:01 +01:00
blu10ph
67b1fe5fbc Avoid obtaining mapped sort multiple times for mapReduce.
Apply the already mapped sort for mapReduce instead of running the source document through the mapping layer again.

Closes: #3960
2022-02-16 14:42:57 +01:00
Christoph Strobl
4f6501f140 Update GeoJson section in reference documentation.
Mention the relation of Point/GeoJsonPoint x/y coordinates to longitude/latitude.

Original Pull Request: #3956
2022-02-16 14:42:19 +01:00
sangyongchoi
2a3f746cb6 Update GeoJsonPoint Javadoc.
Mention x -> longitude, y -> latitude relation.

Closes: #3956
2022-02-16 14:41:33 +01:00
Mark Paluch
90b8ba7246 Polishing.
Extract docker credentials into properties file.
Use tabs for indentation instead of spaces.

See #3949
2022-02-16 13:36:39 +01:00
Oliver Drotbohm
f12648af4c Adapt to API changes regarding object creation metadata. 2022-02-15 17:37:58 +01:00
Greg L. Turnquist
e812f89b47 Update CI properties.
See #3937
2022-02-15 09:00:21 -06:00
Christoph Strobl
f96d700d8d Favor Base64Utils over bson internal Base64 type.
org.bson.internal.Base64 is no longer available in MongoDB driver 4.5.0.

Related to: #3962
2022-02-14 11:09:26 +01:00
Christoph Strobl
32e7f2032d Upgrade to MongoDB driver 4.5.0
Closes: #3962
2022-02-14 11:09:14 +01:00
Mark Paluch
43ac1984ab Adapt repository to List-based interface variants.
Closes #3964
2022-02-14 11:07:27 +01:00
Greg L. Turnquist
28f262309c Use Harbor Proxy for containers.
Leverage internal infrastructure for pulling Docker container images. Reduces pressure on Docker Hub and reduces risk of hitting rate limits.

See #3954.
Related https://github.com/spring-projects/spring-data-build/issues/1630.
2022-02-07 10:59:50 -06:00
Mark Paluch
eefe6b3b21 Update CI properties.
See #3937
2022-02-07 09:32:15 +01:00
Mark Paluch
d0f2ca9efc Polishing.
Refine build script.

See #3949
2022-02-04 08:49:49 +01:00
Mark Paluch
e6c8ee037a Polishing.
Extract docker credentials into properties file.
Use tabs for indentation instead of spaces.

See #3949
2022-02-04 08:46:05 +01:00
Christoph Strobl
e63013deac Remove previously deprecated API.
This commit removes previously deprecated API and moves off deprecated API elsewhere.
Additionally, some blocks were deprecated due to changes in the MongoDB server API.

Resolves: #3952
2022-02-03 16:44:03 +01:00
Mark Paluch
2367379b6d Use Java 8 Stream as return type for Template operations returning a stream.
We now use Stream instead of CloseableIterator for easier stream creation.

Closes: #3944
Original Pull Request: #3946
2022-02-03 08:30:27 +01:00
Mark Paluch
a1c483f2e1 After release cleanups.
See #3927
2022-02-03 08:08:50 +01:00
Mark Paluch
64b8b500ae Prepare next development iteration.
See #3927
2022-02-03 08:08:50 +01:00
Mark Paluch
2d15e37bc7 Release version 4.0 M1 (2022.0.0).
See #3927
2022-02-03 08:08:50 +01:00
Mark Paluch
54655b88c0 Prepare 4.0 M1 (2022.0.0).
See #3927
2022-02-03 08:08:49 +01:00
Jens Schauder
cd395e3324 Remove Eclipse Non-Javadoc comments.
Closes #3924
2022-02-03 08:08:00 +01:00
Mark Paluch
33bdbbe851 Polishing. 2022-02-03 08:08:00 +01:00
Christoph Strobl
9f1448df44 Drop support for RxJava 1 and 2.
Closes: #3839
2022-02-03 08:08:00 +01:00
Christoph Strobl
e3a4bada63 Move to Jakarta EE9
Closes: #3830
2022-02-03 08:08:00 +01:00
Christoph Strobl
dcdf3a2365 Prepare Spring Data MongoDB 4.x branch.
Upgrade to data-commons 3.0 and Java 17 (still source level 16 due to ASM).
Remove support for ThreeTen and Joda-Time.
Transition to PersistentEntitiesFactoryBean from data-commons.
Update the build to MongoDB 4.4 and 5 with Java 17. Remove the Java 8 setup.
Fix a Javadoc tooling error caused by the CDI 1 vs. 2 version mix.
Disable internal package cycle analysis as this requires a transition to ArchUnit.
2022-02-03 08:07:59 +01:00
562 changed files with 11930 additions and 13140 deletions


@@ -1,2 +1,2 @@
#Fri Jun 03 09:39:35 CEST 2022
#Fri Jun 03 09:32:40 CEST 2022
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.5/apache-maven-3.8.5-bin.zip

Jenkinsfile

@@ -9,7 +9,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/2.7.x", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/main", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -20,64 +20,7 @@ pipeline {
stages {
stage("Docker images") {
parallel {
stage('Publish JDK (main) + MongoDB 4.0') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-4.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.0.version']} ci/openjdk8-mongodb-4.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (main) + MongoDB 4.4') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-4.4/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk8-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (main) + MongoDB 5.0') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk8-mongodb-5.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (LTS) + MongoDB 4.4') {
stage('Publish JDK (Java 17) + MongoDB 4.4') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-4.4/**"
@@ -89,7 +32,45 @@ pipeline {
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.lts.tag']}", "--build-arg BASE=${p['docker.java.lts.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 5.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk17-mongodb-5.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 6.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-6.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.6.0.version']} ci/openjdk17-mongodb-6.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
@@ -99,7 +80,7 @@ pipeline {
}
}
stage("test: baseline (main)") {
stage("test: baseline (Java 17)") {
when {
beforeAgent(true)
anyOf {
@@ -116,7 +97,7 @@ pipeline {
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
@@ -137,29 +118,8 @@ pipeline {
}
}
parallel {
stage("test: mongodb 4.4 (main)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
stage("test: mongodb 5.0 (main)") {
stage("test: MongoDB 5.0 (Java 17)") {
agent {
label 'data'
}
@@ -181,7 +141,7 @@ pipeline {
}
}
stage("test: baseline (LTS)") {
stage("test: MongoDB 6.0 (Java 17)") {
agent {
label 'data'
}
@@ -191,11 +151,11 @@ pipeline {
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.lts.tag']}").inside(p['docker.java.inside.basic']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'mongosh --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}


@@ -93,142 +93,11 @@ and declare the appropriate dependency version.
</repository>
----
== Upgrading from 2.x
[[upgrading]]
== Upgrading
The 4.0 MongoDB Java Driver does no longer support certain features that have already been deprecated in one of the last minor versions.
Some of the changes affect the initial setup configuration as well as compile/runtime features. We summarized the most typical changes one might encounter.
=== XML Namespace
.Changed XML Namespace Elements and Attributes:
|===
| Element / Attribute | 2.x | 3.x
| `<mongo:mongo-client />`
| Used to create a `com.mongodb.MongoClient`
| Now exposes a `com.mongodb.client.MongoClient`
| `<mongo:mongo-client replica-set="..." />`
| Was a comma delimited list of replica set members (host/port)
| Now defines the replica set name. +
Use `<mongo:client-settings cluster-hosts="..." />` instead
| `<mongo:db-factory writeConcern="..." />`
| NONE, NORMAL, SAFE, FSYNC_SAFE, REPLICAS_SAFE, MAJORITY
| W1, W2, W3, UNACKNOWLEDGED, ACKNOWLEDGED, JOURNALED, MAJORITY
|===
.Removed XML Namespace Elements and Attributes:
|===
| Element / Attribute | Replacement in 3.x | Comment
| `<mongo:db-factory mongo-ref="..." />`
| `<mongo:db-factory mongo-client-ref="..." />`
| Referencing a `com.mongodb.client.MongoClient`.
| `<mongo:mongo-client credentials="..." />`
| `<mongo:mongo-client credential="..." />`
| Single authentication data instead of list.
| `<mongo:client-options />`
| `<mongo:client-settings />`
| See `com.mongodb.MongoClientSettings` for details.
|===
.New XML Namespace Elements and Attributes:
|===
| Element | Comment
| `<mongo:db-factory mongo-client-ref="..." />`
| Replacement for `<mongo:db-factory mongo-ref="..." />`
| `<mongo:db-factory connection-string="..." />`
| Replacement for `uri` and `client-uri`.
| `<mongo:mongo-client connection-string="..." />`
| Replacement for `uri` and `client-uri`.
| `<mongo:client-settings />`
| Namespace element for `com.mongodb.MongoClientSettings`.
|===
=== Java Configuration
.Java API changes
|===
| Type | Comment
| `MongoClientFactoryBean`
| Creates `com.mongodb.client.MongoClient` instead of `com.mongodb.MongoClient` +
Uses `MongoClientSettings` instead of `MongoClientOptions`.
| `MongoDataIntegrityViolationException`
| Uses `WriteConcernResult` instead of `WriteResult`.
| `BulkOperationException`
| Uses `MongoBulkWriteException` and `com.mongodb.bulk.BulkWriteError` instead of `BulkWriteException` and `com.mongodb.BulkWriteError`
| `ReactiveMongoClientFactoryBean`
| Uses `com.mongodb.MongoClientSettings` instead of `com.mongodb.async.client.MongoClientSettings`
| `ReactiveMongoClientSettingsFactoryBean`
| Now produces `com.mongodb.MongoClientSettings` instead of `com.mongodb.async.client.MongoClientSettings`
|===
.Removed Java API:
|===
| 2.x | Replacement in 3.x | Comment
| `MongoClientOptionsFactoryBean`
| `MongoClientSettingsFactoryBean`
| Creating a `com.mongodb.MongoClientSettings`.
| `AbstractMongoConfiguration`
| `AbstractMongoClientConfiguration` +
(Available since 2.1)
| Using `com.mongodb.client.MongoClient`.
| `MongoDbFactory#getLegacyDb()`
| -
| -
| `SimpleMongoDbFactory`
| `SimpleMongoClientDbFactory` +
(Available since 2.1)
|
| `MapReduceOptions#getOutputType()`
| `MapReduceOptions#getMapReduceAction()`
| Returns `MapReduceAction` instead of `MapReduceCommand.OutputType`.
| `Meta\|Query` maxScan & snapshot
|
|
|===
=== Other Changes
==== UUID Types
The MongoDB UUID representation can now be configured with different formats.
This has to be done via `MongoClientSettings` as shown in the snippet below.
.UUID Codec Configuration
====
[source,java]
----
static class Config extends AbstractMongoClientConfiguration {
@Override
public void configureClientSettings(MongoClientSettings.Builder builder) {
builder.uuidRepresentation(UuidRepresentation.STANDARD);
}
// ...
}
----
====
Instructions for how to upgrade from earlier versions of Spring Data are provided on the project https://github.com/spring-projects/spring-data-commons/wiki[wiki].
Follow the links in the https://github.com/spring-projects/spring-data-commons/wiki#release-notes[release notes section] to find the version that you want to upgrade to.
[[getting-help]]
== Getting Help
@@ -277,12 +146,12 @@ and accessible from Maven using the Maven configuration noted <<maven-configurat
NOTE: Configuration for Gradle is similar to Maven.
The best way to get started is by creating a Spring Boot project using MongoDB on https://start.spring.io[start.spring.io].
Follow this https://start.spring.io/#type=maven-project&language=java&platformVersion=2.5.4&packaging=jar&jvmVersion=1.8&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb[link]
to build an imperative application and this https://start.spring.io/#type=maven-project&language=java&platformVersion=2.5.4&packaging=jar&jvmVersion=1.8&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb-reactive[link]
Follow this https://start.spring.io/#type=maven-project&language=java&platformVersion=3.0.0&packaging=jar&jvmVersion=17&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb[link]
to build an imperative application and this https://start.spring.io/#type=maven-project&language=java&platformVersion=3.0.0&packaging=jar&jvmVersion=17&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb-reactive[link]
to build a reactive one.
However, if you want to try out the latest and greatest, Spring Data MongoDB can be easily built with the https://github.com/takari/maven-wrapper[Maven wrapper]
and minimally, JDK 8 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
and minimally, JDK 17 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
In order to build Spring Data MongoDB, you will need to https://www.mongodb.com/try/download/community[download]
and https://docs.mongodb.com/manual/installation/[install a MongoDB distribution].
@@ -313,7 +182,7 @@ To initialize the replica set, start a mongo client:
[source,bash]
----
$ $MONGODB_HOME/bin/mongo
MongoDB server version: 5.0.0
MongoDB server version: 6.0.0
...
----
@@ -341,7 +210,7 @@ Now you are ready to build Spring Data MongoDB. Simply enter the following `mvnw
$ ./mvnw clean install
----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.8.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular, please sign
the https://cla.pivotal.io/sign/spring[Contributors Agreement] before your first non-trivial change._


@@ -9,7 +9,7 @@ ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 5.0 release signing key


@@ -9,11 +9,13 @@ ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 6.0 release signing key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | apt-key add - && \
# Needed when MongoDB creates a 6.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-6.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \


@@ -1,22 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4 && \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,24 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
ln -T /bin/true /usr/bin/systemctl && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
rm /usr/bin/systemctl && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,20 +1,16 @@
# Java versions
java.main.tag=8u362-b09-jdk-focal
java.next.tag=11.0.18_10-jdk-focal
java.lts.tag=17.0.6_10-jdk-focal
java.main.tag=17.0.5_8-jdk-focal
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.next.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.next.tag}
docker.java.lts.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.lts.tag}
# Supported versions of MongoDB
docker.mongodb.4.0.version=4.0.28
docker.mongodb.4.4.version=4.4.18
docker.mongodb.5.0.version=5.0.14
docker.mongodb.4.4.version=4.4.17
docker.mongodb.5.0.version=5.0.13
docker.mongodb.6.0.version=6.0.2
# Supported versions of Redis
docker.redis.6.version=6.2.10
docker.redis.6.version=6.2.6
# Supported versions of Cassandra
docker.cassandra.3.version=3.11.14

pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.4.8</version>
<version>4.1.x-MANUAL-ENCRYPTION-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.7.8</version>
<version>3.1.0-SNAPSHOT</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.7.8</springdata.commons>
<mongo>4.6.1</mongo>
<springdata.commons>3.1.0-SNAPSHOT</springdata.commons>
<mongo>4.8.2</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -145,8 +145,11 @@
<repositories>
<repository>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.4.8</version>
<version>4.1.x-MANUAL-ENCRYPTION-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -322,7 +322,7 @@ public class AbstractMicrobenchmark {
try {
ResultsWriter.forUri(uri).write(results);
} catch (Exception e) {
System.err.println(String.format("Cannot save benchmark results to '%s'. Error was %s.", uri, e));
System.err.println(String.format("Cannot save benchmark results to '%s'; Error was %s", uri, e));
}
}
}

View File

@@ -1,6 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -14,7 +15,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.4.8</version>
<version>4.1.x-MANUAL-ENCRYPTION-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -34,7 +35,8 @@
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<mongo-reactivestreams>${mongo.reactivestreams}
</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
@@ -43,4 +45,15 @@
</build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-plugins-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</pluginRepository>
</pluginRepositories>
</project>

View File

@@ -1,5 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.4.8</version>
<version>4.1.x-MANUAL-ENCRYPTION-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -110,6 +112,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-crypt</artifactId>
<version>1.6.1</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
@@ -122,27 +131,6 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava3</groupId>
<artifactId>rxjava</artifactId>
@@ -152,12 +140,6 @@
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jcdi_2.0_spec</artifactId>
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
@@ -167,31 +149,48 @@
</dependency>
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<groupId>jakarta.enterprise</groupId>
<artifactId>jakarta.enterprise.cdi-api</artifactId>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
<version>${javax-annotation-api}</version>
<groupId>jakarta.annotation</groupId>
<artifactId>jakarta.annotation-api</artifactId>
<version>${jakarta-annotation-api}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-se</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-spi</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-impl</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<!-- JSR 303 Validation -->
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<version>${validation}</version>
<optional>true</optional>
</dependency>
@@ -204,30 +203,37 @@
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-observation</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.hibernate.validator</groupId>
<artifactId>hibernate-validator</artifactId>
<version>5.4.3.Final</version>
<version>7.0.1.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jakarta.el</groupId>
<artifactId>jakarta.el-api</artifactId>
<version>4.0.0</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<artifactId>jakarta.el</artifactId>
<version>4.0.2</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
@@ -272,9 +278,9 @@
</dependency>
<dependency>
<groupId>javax.transaction</groupId>
<artifactId>jta</artifactId>
<version>1.1</version>
<groupId>jakarta.transaction</groupId>
<artifactId>jakarta.transaction-api</artifactId>
<version>2.0.0</version>
<scope>test</scope>
</dependency>
@@ -310,6 +316,29 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>com.github.tomakehurst</groupId>
<artifactId>wiremock-jre8-standalone</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing-integration-test</artifactId>
<scope>test</scope>
</dependency>
<!-- jMolecules -->
<dependency>
@@ -343,8 +372,11 @@
<goal>test-process</goal>
</goals>
<configuration>
<outputDirectory>target/generated-test-sources</outputDirectory>
<processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
<outputDirectory>target/generated-test-sources
</outputDirectory>
<processor>
org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
</processor>
</configuration>
</execution>
</executions>
@@ -364,7 +396,9 @@
<exclude>**/ReactivePerformanceTests.java</exclude>
</excludes>
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
<java.util.logging.config.file>
src/test/resources/logging.properties
</java.util.logging.config.file>
<reactor.trace.cancel>true</reactor.trace.cancel>
</systemPropertyVariables>
</configuration>

View File

@@ -103,19 +103,11 @@ public class BindableMongoExpression implements MongoExpression {
return new BindableMongoExpression(expressionString, codecRegistryProvider, args);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoExpression#toDocument()
*/
@Override
public Document toDocument() {
return target.get();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "BindableMongoExpression{" + "expressionString='" + expressionString + '\'' + ", args="

View File

@@ -62,7 +62,7 @@ public interface CodecRegistryProvider {
*/
default <T> Optional<Codec<T>> getCodecFor(Class<T> type) {
Assert.notNull(type, "Type must not be null!");
Assert.notNull(type, "Type must not be null");
try {
return Optional.of(getCodecRegistry().get(type));

View File

@@ -102,7 +102,7 @@ public class MongoDatabaseUtils {
private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "Factory must not be null!");
Assert.notNull(factory, "Factory must not be null");
if (sessionSynchronization == SessionSynchronization.NEVER
|| !TransactionSynchronizationManager.isSynchronizationActive()) {
@@ -193,19 +193,11 @@ public class MongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
@@ -214,10 +206,6 @@ public class MongoDatabaseUtils {
}
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override
public void afterCompletion(int status) {
@@ -228,10 +216,6 @@ public class MongoDatabaseUtils {
super.afterCompletion(status);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -1,57 +0,0 @@
/*
* Copyright 2011-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 3.0, use {@link MongoDatabaseFactory} instead.
*/
@Deprecated
public interface MongoDbFactory extends MongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
*
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase()} instead.
*/
@Deprecated
default MongoDatabase getDb() throws DataAccessException {
return getMongoDatabase();
}
/**
* Obtain a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase(String)} instead.
*/
@Deprecated
default MongoDatabase getDb(String dbName) throws DataAccessException {
return getMongoDatabase(dbName);
}
}

View File

@@ -0,0 +1,81 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.Arrays;
import java.util.function.Consumer;
import org.springframework.data.domain.ManagedTypes;
/**
* @author Christoph Strobl
* @since 4.0
*/
public final class MongoManagedTypes implements ManagedTypes {
private final ManagedTypes delegate;
private MongoManagedTypes(ManagedTypes types) {
this.delegate = types;
}
/**
* Wraps an existing {@link ManagedTypes} object with {@link MongoManagedTypes}.
*
* @param managedTypes source {@link ManagedTypes} to wrap; must not be {@literal null}.
* @return new instance of {@link MongoManagedTypes}.
*/
public static MongoManagedTypes from(ManagedTypes managedTypes) {
return new MongoManagedTypes(managedTypes);
}
/**
* Factory method used to construct {@link MongoManagedTypes} from the given array of {@link Class types}.
*
* @param types array of {@link Class types} used to initialize the {@link ManagedTypes}; must not be {@literal null}.
* @return new instance of {@link MongoManagedTypes} initialized from {@link Class types}.
*/
public static MongoManagedTypes from(Class<?>... types) {
return fromIterable(Arrays.asList(types));
}
/**
* Factory method used to construct {@link MongoManagedTypes} from the given, required {@link Iterable} of
* {@link Class types}.
*
* @param types {@link Iterable} of {@link Class types} used to initialize the {@link ManagedTypes}; must not be
* {@literal null}.
* @return new instance of {@link MongoManagedTypes} initialized the given, required {@link Iterable} of {@link Class
* types}.
*/
public static MongoManagedTypes fromIterable(Iterable<? extends Class<?>> types) {
return from(ManagedTypes.fromIterable(types));
}
/**
* Factory method to return an empty {@link MongoManagedTypes} object.
*
* @return an empty {@link MongoManagedTypes} object.
*/
public static MongoManagedTypes empty() {
return from(ManagedTypes.empty());
}
@Override
public void forEach(Consumer<Class<?>> action) {
delegate.forEach(action);
}
}
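
For orientation, a minimal usage sketch of the factory methods shown above; `Person` and `Order` are hypothetical placeholder classes, not types from this change.

[source,java]
----
import org.springframework.data.domain.ManagedTypes;
import org.springframework.data.mongodb.MongoManagedTypes;

import java.util.List;

class MongoManagedTypesSketch {

	public static void main(String[] args) {

		// Build managed types directly from an array of entity classes.
		MongoManagedTypes fromClasses = MongoManagedTypes.from(Person.class, Order.class);

		// Or wrap an already existing ManagedTypes instance.
		MongoManagedTypes wrapped = MongoManagedTypes.from(ManagedTypes.fromIterable(List.of(Person.class)));

		// Iterate the managed types, e.g. to hand them to a mapping context.
		fromClasses.forEach(type -> System.out.println("managed type: " + type.getName()));
	}

	// Hypothetical domain classes used only for this sketch.
	static class Person {}
	static class Order {}
}
----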

View File

@@ -68,7 +68,7 @@ class MongoResourceHolder extends ResourceHolderSupport {
ClientSession session = getSession();
if (session == null) {
throw new IllegalStateException("No session available!");
throw new IllegalStateException("No session available");
}
return session;

View File

@@ -100,16 +100,12 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
*/
public MongoTransactionManager(MongoDatabaseFactory dbFactory, @Nullable TransactionOptions options) {
Assert.notNull(dbFactory, "DbFactory must not be null!");
Assert.notNull(dbFactory, "DbFactory must not be null");
this.dbFactory = dbFactory;
this.options = options;
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override
protected Object doGetTransaction() throws TransactionException {
@@ -118,19 +114,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return new MongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
@@ -160,10 +148,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override
protected Object doSuspend(Object transaction) throws TransactionException {
@@ -173,19 +157,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected final void doCommit(DefaultTransactionStatus status) throws TransactionException {
@@ -236,10 +212,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
@@ -259,10 +231,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
@@ -270,10 +238,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.getRequiredResourceHolder().setRollbackOnly();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override
protected void doCleanupAfterCompletion(Object transaction) {
@@ -302,7 +266,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
*/
public void setDbFactory(MongoDatabaseFactory dbFactory) {
Assert.notNull(dbFactory, "DbFactory must not be null!");
Assert.notNull(dbFactory, "DbFactory must not be null");
this.dbFactory = dbFactory;
}
@@ -325,19 +289,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return dbFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override
public MongoDatabaseFactory getResourceFactory() {
return getRequiredDbFactory();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDbFactory();
@@ -359,7 +315,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
private MongoDatabaseFactory getRequiredDbFactory() {
Assert.state(dbFactory != null,
"MongoTransactionManager operates upon a MongoDbFactory. Did you forget to provide one? It's required.");
"MongoTransactionManager operates upon a MongoDbFactory; Did you forget to provide one; It's required");
return dbFactory;
}
@@ -494,30 +450,22 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
private MongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present. o_O");
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present; o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null.");
Assert.state(session != null, "A Session is required but it turned out to be null");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
TransactionSynchronizationUtils.triggerFlush();

View File

@@ -136,7 +136,7 @@ public class ReactiveMongoDatabaseUtils {
private static Mono<MongoDatabase> doGetMongoDatabase(@Nullable String dbName, ReactiveMongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "DatabaseFactory must not be null!");
Assert.notNull(factory, "DatabaseFactory must not be null");
if (sessionSynchronization == SessionSynchronization.NEVER) {
return getMongoDatabaseOrDefault(dbName, factory);
@@ -214,19 +214,11 @@ public class ReactiveMongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) {
@@ -237,10 +229,6 @@ public class ReactiveMongoDatabaseUtils {
return Mono.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#afterCompletion(int)
*/
@Override
public Mono<Void> afterCompletion(int status) {
@@ -256,10 +244,6 @@ public class ReactiveMongoDatabaseUtils {
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -104,16 +104,12 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
public ReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory,
@Nullable TransactionOptions options) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
Assert.notNull(databaseFactory, "DatabaseFactory must not be null");
this.databaseFactory = databaseFactory;
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doGetTransaction(org.springframework.transaction.reactive.TransactionSynchronizationManager)
*/
@Override
protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager)
throws TransactionException {
@@ -123,19 +119,11 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return new ReactiveMongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doBegin(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction,
TransactionDefinition definition) throws TransactionException {
@@ -175,10 +163,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSuspend(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction)
throws TransactionException {
@@ -192,10 +176,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doResume(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> doResume(TransactionSynchronizationManager synchronizationManager, @Nullable Object transaction,
Object suspendedResources) {
@@ -203,10 +183,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
.fromRunnable(() -> synchronizationManager.bindResource(getRequiredDatabaseFactory(), suspendedResources));
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCommit(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected final Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -243,10 +219,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doRollback(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doRollback(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) {
@@ -268,10 +240,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSetRollbackOnly(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doSetRollbackOnly(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -282,10 +250,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCleanupAfterCompletion(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager synchronizationManager,
Object transaction) {
@@ -317,7 +281,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
*/
public void setDatabaseFactory(ReactiveMongoDatabaseFactory databaseFactory) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
Assert.notNull(databaseFactory, "DatabaseFactory must not be null");
this.databaseFactory = databaseFactory;
}
@@ -340,10 +304,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return databaseFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDatabaseFactory();
@@ -363,7 +323,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
private ReactiveMongoDatabaseFactory getRequiredDatabaseFactory() {
Assert.state(databaseFactory != null,
"ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory. Did you forget to provide one? It's required.");
"ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory; Did you forget to provide one; It's required");
return databaseFactory;
}
@@ -498,30 +458,22 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
private ReactiveMongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present. o_O");
Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present; o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null.");
Assert.state(session != null, "A Session is required but it turned out to be null");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
throw new UnsupportedOperationException("flush() not supported");

View File

@@ -76,13 +76,13 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
Class<D> databaseType, ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
ClientSessionOperator<C> collectionDecorator) {
Assert.notNull(session, "ClientSession must not be null!");
Assert.notNull(target, "Target must not be null!");
Assert.notNull(sessionType, "SessionType must not be null!");
Assert.notNull(databaseType, "Database type must not be null!");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null!");
Assert.notNull(collectionType, "Collection type must not be null!");
Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null!");
Assert.notNull(session, "ClientSession must not be null");
Assert.notNull(target, "Target must not be null");
Assert.notNull(sessionType, "SessionType must not be null");
Assert.notNull(databaseType, "Database type must not be null");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null");
Assert.notNull(collectionType, "Collection type must not be null");
Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null");
this.session = session;
this.target = target;
@@ -95,10 +95,6 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.sessionType = sessionType;
}
/*
* (non-Javadoc)
* @see org.aopalliance.intercept.MethodInterceptor(org.aopalliance.intercept.MethodInvocation)
*/
@Nullable
@Override
public Object invoke(MethodInvocation methodInvocation) throws Throwable {

View File

@@ -0,0 +1,110 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.TypeReference;
import org.springframework.core.ResolvableType;
import org.springframework.core.annotation.AnnotatedElementUtils;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.data.annotation.Reference;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.DocumentReference;
import org.springframework.data.util.TypeUtils;
/**
* @author Christoph Strobl
* @since 4.0
*/
public class LazyLoadingProxyAotProcessor {
private boolean generalLazyLoadingProxyContributed = false;
public void registerLazyLoadingProxyIfNeeded(Class<?> type, GenerationContext generationContext) {
Set<Field> refFields = getFieldsWithAnnotationPresent(type, Reference.class);
if (refFields.isEmpty()) {
return;
}
refFields.stream() //
.filter(LazyLoadingProxyAotProcessor::isLazyLoading) //
.forEach(field -> {
if (!generalLazyLoadingProxyContributed) {
generationContext.getRuntimeHints().proxies().registerJdkProxy(
TypeReference.of(org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class),
TypeReference.of(org.springframework.aop.SpringProxy.class),
TypeReference.of(org.springframework.aop.framework.Advised.class),
TypeReference.of(org.springframework.core.DecoratingProxy.class));
generalLazyLoadingProxyContributed = true;
}
if (field.getType().isInterface()) {
List<Class<?>> interfaces = new ArrayList<>(
TypeUtils.resolveTypesInSignature(ResolvableType.forField(field, type)));
interfaces.add(0, org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class);
interfaces.add(org.springframework.aop.SpringProxy.class);
interfaces.add(org.springframework.aop.framework.Advised.class);
interfaces.add(org.springframework.core.DecoratingProxy.class);
generationContext.getRuntimeHints().proxies().registerJdkProxy(interfaces.toArray(Class[]::new));
} else {
Class<?> proxyClass = LazyLoadingProxyFactory.resolveProxyType(field.getType(),
() -> LazyLoadingInterceptor.none());
// see: spring-projects/spring-framework/issues/29309
generationContext.getRuntimeHints().reflection().registerType(proxyClass,
MemberCategory.INVOKE_DECLARED_CONSTRUCTORS, MemberCategory.INVOKE_DECLARED_METHODS, MemberCategory.DECLARED_FIELDS);
}
});
}
private static boolean isLazyLoading(Field field) {
if (AnnotatedElementUtils.isAnnotated(field, DBRef.class)) {
return AnnotatedElementUtils.findMergedAnnotation(field, DBRef.class).lazy();
}
if (AnnotatedElementUtils.isAnnotated(field, DocumentReference.class)) {
return AnnotatedElementUtils.findMergedAnnotation(field, DocumentReference.class).lazy();
}
return false;
}
private static Set<Field> getFieldsWithAnnotationPresent(Class<?> type, Class<? extends Annotation> annotation) {
Set<Field> fields = new LinkedHashSet<>();
for (Field field : type.getDeclaredFields()) {
if (MergedAnnotations.from(field).get(annotation).isPresent()) {
fields.add(field);
}
}
return fields;
}
}
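
As a rough illustration of what the processor above inspects, a domain type with lazy references; `Book`, `Publisher`, `Author`, and `Library` are made-up names used only for this sketch.

[source,java]
----
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.DocumentReference;

@Document
class Book {

	@DBRef(lazy = true)
	Publisher publisher;            // lazy @DBRef: isLazyLoading(..) is true, proxy hints get registered

	@DocumentReference(lazy = true)
	Author author;                  // lazy @DocumentReference: also triggers proxy registration

	@DBRef
	Library library;                // eager reference: no lazy-loading proxy hint required
}

class Publisher {}
class Author {}
class Library {}
----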

View File

@@ -0,0 +1,45 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.util.function.Predicate;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.util.ReactiveWrappers;
import org.springframework.data.util.ReactiveWrappers.ReactiveLibrary;
import org.springframework.data.util.TypeUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* @author Christoph Strobl
* @since 4.0
*/
public class MongoAotPredicates {
public static final Predicate<Class<?>> IS_SIMPLE_TYPE = (type) -> MongoSimpleTypes.HOLDER.isSimpleType(type) || TypeUtils.type(type).isPartOf("org.bson");
public static final Predicate<ReactiveLibrary> IS_REACTIVE_LIBARARY_AVAILABLE = (lib) -> ReactiveWrappers.isAvailable(lib);
public static final Predicate<ClassLoader> IS_SYNC_CLIENT_PRESENT = (classLoader) -> ClassUtils.isPresent("com.mongodb.client.MongoClient", classLoader);
public static boolean isReactorPresent() {
return IS_REACTIVE_LIBARARY_AVAILABLE.test(ReactiveWrappers.ReactiveLibrary.PROJECT_REACTOR);
}
public static boolean isSyncClientPresent(@Nullable ClassLoader classLoader) {
return IS_SYNC_CLIENT_PRESENT.test(classLoader);
}
}

View File

@@ -0,0 +1,56 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.core.ResolvableType;
import org.springframework.data.aot.ManagedTypesBeanRegistrationAotProcessor;
import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* @author Christoph Strobl
* @since 4.0
*/
class MongoManagedTypesBeanRegistrationAotProcessor extends ManagedTypesBeanRegistrationAotProcessor {
private final LazyLoadingProxyAotProcessor lazyLoadingProxyAotProcessor = new LazyLoadingProxyAotProcessor();
public MongoManagedTypesBeanRegistrationAotProcessor() {
setModuleIdentifier("mongo");
}
@Override
protected boolean isMatch(@Nullable Class<?> beanType, @Nullable String beanName) {
return isMongoManagedTypes(beanType) || super.isMatch(beanType, beanName);
}
protected boolean isMongoManagedTypes(@Nullable Class<?> beanType) {
return beanType != null && ClassUtils.isAssignable(MongoManagedTypes.class, beanType);
}
@Override
protected void contributeType(ResolvableType type, GenerationContext generationContext) {
if (MongoAotPredicates.IS_SIMPLE_TYPE.test(type.toClass())) {
return;
}
super.contributeType(type, generationContext);
lazyLoadingProxyAotProcessor.registerLazyLoadingProxyIfNeeded(type.toClass(), generationContext);
}
}

View File

@@ -0,0 +1,83 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import static org.springframework.data.mongodb.aot.MongoAotPredicates.*;
import java.util.Arrays;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.RuntimeHints;
import org.springframework.aot.hint.RuntimeHintsRegistrar;
import org.springframework.aot.hint.TypeReference;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeSaveCallback;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* {@link RuntimeHintsRegistrar} for repository types and entity callbacks.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 4.0
*/
class MongoRuntimeHints implements RuntimeHintsRegistrar {
@Override
public void registerHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
hints.reflection().registerTypes(
Arrays.asList(TypeReference.of(BeforeConvertCallback.class), TypeReference.of(BeforeSaveCallback.class),
TypeReference.of(AfterConvertCallback.class), TypeReference.of(AfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
registerTransactionProxyHints(hints, classLoader);
if (isReactorPresent()) {
hints.reflection()
.registerTypes(Arrays.asList(TypeReference.of(ReactiveBeforeConvertCallback.class),
TypeReference.of(ReactiveBeforeSaveCallback.class), TypeReference.of(ReactiveAfterConvertCallback.class),
TypeReference.of(ReactiveAfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
}
}
private static void registerTransactionProxyHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
if (MongoAotPredicates.isSyncClientPresent(classLoader)
&& ClassUtils.isPresent("org.springframework.aop.SpringProxy", classLoader)) {
hints.proxies().registerJdkProxy(TypeReference.of("com.mongodb.client.MongoDatabase"),
TypeReference.of("org.springframework.aop.SpringProxy"),
TypeReference.of("org.springframework.core.DecoratingProxy"));
hints.proxies().registerJdkProxy(TypeReference.of("com.mongodb.client.MongoCollection"),
TypeReference.of("org.springframework.aop.SpringProxy"),
TypeReference.of("org.springframework.core.DecoratingProxy"));
}
}
}

View File

@@ -25,9 +25,7 @@ import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.lang.Nullable;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
@@ -80,30 +78,12 @@ public abstract class AbstractMongoClientConfiguration extends MongoConfiguratio
return new SimpleMongoClientDatabaseFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoClientConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext(MongoCustomConversions)}. Will get {@link #customConversions()} applied.
* {@link #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext(MongoCustomConversions)
* @see #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)
* @see #mongoDbFactory()
*/
@Bean

View File

@@ -84,10 +84,10 @@ public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurat
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #reactiveMongoDbFactory()} and
* {@link #mongoMappingContext(MongoCustomConversions)}. Will get {@link #customConversions()} applied.
* {@link #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext(MongoCustomConversions)
* @see #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)
* @see #reactiveMongoDbFactory()
* @return never {@literal null}.
*/

View File

@@ -30,10 +30,6 @@ import com.mongodb.ConnectionString;
*/
public class ConnectionStringPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String connectionString) {

View File

@@ -34,10 +34,6 @@ import org.w3c.dom.Element;
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -46,10 +42,6 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -80,7 +80,7 @@ import org.w3c.dom.Element;
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("jakarta.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
/* (non-Javadoc)
@@ -253,7 +253,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured",
element);
return;
}
@@ -374,10 +374,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
this.delegates = new HashSet<>(Arrays.asList(filters));
}
/*
* (non-Javadoc)
* @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
*/
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory)
throws IOException {

View File

@@ -47,28 +47,16 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEntityCallback.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {

View File

@@ -18,11 +18,10 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.core.Ordered;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
@@ -36,68 +35,42 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport implements Ordered {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
protected void postProcess(BeanDefinitionBuilder builder, AuditingConfiguration configuration,
BeanDefinitionRegistry registry) {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
super.registerBeanDefinitions(annotationMetadata, registry);
builder.setFactoryMethod("from").addConstructorArgReference("mongoMappingContext");
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
Assert.notNull(configuration, "AuditingConfiguration must not be null");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
return configureDefaultAuditHandlerAttributes(configuration,
BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class));
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null");
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEntityCallback.class);
@@ -108,4 +81,8 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
AuditingEntityCallback.class.getName(), registry);
}
@Override
public int getOrder() {
return Ordered.LOWEST_PRECEDENCE;
}
}
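For orientation, a hedged sketch of how the auditing registration above is typically consumed: a configuration class enabling Mongo auditing plus an entity carrying the auditing annotations. AuditConfig and Account are illustrative names only, not part of this change.

import java.time.Instant;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing // imports MongoAuditingRegistrar, which registers "mongoAuditingHandler"
class AuditConfig {}

class Account {
	@CreatedDate Instant createdAt; // set on first persist
	@LastModifiedDate Instant modifiedAt; // refreshed on every save
}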

View File

@@ -35,10 +35,6 @@ import org.w3c.dom.Element;
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);

View File

@@ -30,6 +30,7 @@ import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions.MongoConverterConfigurationAdapter;
import org.springframework.data.mongodb.core.mapping.Document;
@@ -76,14 +77,13 @@ public abstract class MongoConfigurationSupport {
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext(MongoCustomConversions customConversions)
throws ClassNotFoundException {
public MongoMappingContext mongoMappingContext(MongoCustomConversions customConversions,
MongoManagedTypes mongoManagedTypes) {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setManagedTypes(mongoManagedTypes);
mappingContext.setSimpleTypeHolder(customConversions.getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
@@ -91,6 +91,16 @@ public abstract class MongoConfigurationSupport {
return mappingContext;
}
/**
* @return new instance of {@link MongoManagedTypes}.
* @throws ClassNotFoundException
* @since 4.0
*/
@Bean
public MongoManagedTypes mongoManagedTypes() throws ClassNotFoundException {
return MongoManagedTypes.fromIterable(getInitialEntitySet());
}
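As a hedged illustration of where the new mongoManagedTypes bean gets its input from, a minimal configuration subclass; AppConfig, the database name, and the com.example.domain package are assumptions made for the example.

import java.util.Collection;
import java.util.Collections;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

@Configuration
class AppConfig extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "app-db";
	}

	@Override
	protected Collection<String> getMappingBasePackages() {
		// scanned for @Document classes; the result feeds both mongoManagedTypes()
		// and, through it, the MongoMappingContext defined above
		return Collections.singleton("com.example.domain");
	}
}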
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the

View File

@@ -51,10 +51,6 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String OPTIONS_DELIMITER = "?";
private static final String OPTION_VALUE_DELIMITER = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String text) throws IllegalArgumentException {
@@ -121,7 +117,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'", authMechanism));
}
}
} else {
@@ -198,7 +194,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
@@ -213,21 +209,21 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'!");
"Credentials need to specify username and password like in 'username:password@database'");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username!");
throw new IllegalArgumentException("Credentials need to specify username");
}
}
@@ -235,7 +231,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
try {
return URLDecoder.decode(it, "UTF-8");
} catch (UnsupportedEncodingException e) {
throw new IllegalArgumentException("o_O UTF-8 not supported!", e);
throw new IllegalArgumentException("o_O UTF-8 not supported", e);
}
}
}
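The editor parses credential strings of the shape username:password@database (optionally followed by ?key=value options), as the messages above suggest. A minimal, hypothetical usage sketch; the concrete type behind getValue() is deliberately left as Object here rather than asserted.

import java.beans.PropertyEditor;
import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

class CredentialParsingSketch {

	public static void main(String[] args) {
		PropertyEditor editor = new MongoCredentialPropertyEditor();
		editor.setAsText("jon:s3cr3t@admin"); // username:password@database
		Object credentials = editor.getValue(); // the parsed com.mongodb.MongoCredential values
		System.out.println(credentials);
	}
}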

View File

@@ -62,10 +62,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -74,10 +70,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
@@ -171,7 +163,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error("Configure either MongoDB " + type + " or details individually!",
parserContext.getReaderContext().error("Configure either MongoDB " + type + " or details individually",
parserContext.extractSource(element));
}

View File

@@ -26,10 +26,6 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());

View File

@@ -40,7 +40,6 @@ import org.w3c.dom.Element;
* @author Christoph Strobl
* @author Mark Paluch
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}

View File

@@ -39,10 +39,6 @@ import org.w3c.dom.Element;
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -51,10 +47,6 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -41,19 +41,11 @@ public class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEnti
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;

View File

@@ -18,11 +18,9 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
@@ -34,56 +32,42 @@ import org.springframework.util.Assert;
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableReactiveMongoAuditing} annotation.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 3.1
*/
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected void postProcess(BeanDefinitionBuilder builder, AuditingConfiguration configuration,
BeanDefinitionRegistry registry) {
builder.setFactoryMethod("from").addConstructorArgReference("mongoMappingContext");
}
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
Assert.notNull(configuration, "AuditingConfiguration must not be null");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
return configureDefaultAuditHandlerAttributes(configuration,
BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class));
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);

View File

@@ -32,10 +32,6 @@ import com.mongodb.ReadConcernLevel;
*/
public class ReadConcernPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
public void setAsText(@Nullable String readConcernString) {

View File

@@ -29,10 +29,6 @@ import com.mongodb.ReadPreference;
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException {

View File

@@ -43,13 +43,9 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
* A port is a number without a leading 0 at the end of the address that is preceded by just a single :.
*/
private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address %s '%s'. Check your replica set configuration!";
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address %s '%s'; Check your replica set configuration";
private static final Log LOG = LogFactory.getLog(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String replicaSetString) {
@@ -72,7 +68,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
if (serverAddresses.isEmpty()) {
throw new IllegalArgumentException(
"Could not resolve at least one server of the replica set configuration! Validate your config!");
"Could not resolve at least one server of the replica set configuration; Validate your config");
}
setValue(serverAddresses.toArray(new ServerAddress[serverAddresses.size()]));
@@ -129,7 +125,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
*/
private String[] extractHostAddressAndPort(String addressAndPortSource) {
Assert.notNull(addressAndPortSource, "Address and port source must not be null!");
Assert.notNull(addressAndPortSource, "Address and port source must not be null");
String[] hostAndPort = addressAndPortSource.split(HOST_PORT_SPLIT_PATTERN);
String hostAddress = hostAndPort[0];

View File

@@ -26,10 +26,6 @@ import com.mongodb.WriteConcern;
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public WriteConcern convert(String source) {
WriteConcern writeConcern = WriteConcern.valueOf(source);

View File

@@ -29,10 +29,6 @@ import org.springframework.util.StringUtils;
*/
public class UUidRepresentationPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String value) {

View File

@@ -66,11 +66,12 @@ class AggregationUtil {
if (!(aggregation instanceof TypedAggregation)) {
if (inputType == null) {
if(inputType == null) {
return untypedMappingContext.get();
}
if (domainTypeMapping == DomainTypeMapping.STRICT && !aggregation.getPipeline().containsUnionWith()) {
if (domainTypeMapping == DomainTypeMapping.STRICT
&& !aggregation.getPipeline().containsUnionWith()) {
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
@@ -78,7 +79,8 @@ class AggregationUtil {
}
inputType = ((TypedAggregation<?>) aggregation).getInputType();
if (domainTypeMapping == DomainTypeMapping.STRICT && !aggregation.getPipeline().containsUnionWith()) {
if (domainTypeMapping == DomainTypeMapping.STRICT
&& !aggregation.getPipeline().containsUnionWith()) {
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}

View File

@@ -36,21 +36,29 @@ import com.mongodb.client.model.changestream.OperationType;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Myroslav Kosinskyi
* @since 2.1
*/
public class ChangeStreamEvent<T> {
@SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "converted");
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_FULL_DOCUMENT_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "convertedFullDocument");
@SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "convertedFullDocumentBeforeChange");
private final @Nullable ChangeStreamDocument<Document> raw;
private final Class<T> targetType;
private final MongoConverter converter;
// accessed through CONVERTED_UPDATER.
private volatile @Nullable T converted;
// accessed through CONVERTED_FULL_DOCUMENT_UPDATER.
private volatile @Nullable T convertedFullDocument;
// accessed through CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER.
private volatile @Nullable T convertedFullDocumentBeforeChange;
/**
* @param raw can be {@literal null}.
@@ -147,27 +155,43 @@ public class ChangeStreamEvent<T> {
@Nullable
public T getBody() {
if (raw == null) {
if (raw == null || raw.getFullDocument() == null) {
return null;
}
Document fullDocument = raw.getFullDocument();
return getConvertedFullDocument(raw.getFullDocument());
}
if (fullDocument == null) {
return targetType.cast(fullDocument);
/**
* Get the potentially converted {@link ChangeStreamDocument#getFullDocumentBeforeChange() document} before being changed.
*
* @return {@literal null} when {@link #getRaw()} or {@link ChangeStreamDocument#getFullDocumentBeforeChange()} is
* {@literal null}.
* @since 4.0
*/
@Nullable
public T getBodyBeforeChange() {
if (raw == null || raw.getFullDocumentBeforeChange() == null) {
return null;
}
return getConverted(fullDocument);
return getConvertedFullDocumentBeforeChange(raw.getFullDocumentBeforeChange());
}
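To show how the new getBodyBeforeChange() accessor is meant to be read next to getBody(), a hedged sketch using the reactive template; Person is a made-up domain type, and the watched collection is assumed to have pre- and post-images enabled.

import org.springframework.data.mongodb.core.ChangeStreamEvent;
import org.springframework.data.mongodb.core.ChangeStreamOptions;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import reactor.core.publisher.Flux;

class ChangeStreamSketch {

	Flux<ChangeStreamEvent<Person>> watch(ReactiveMongoTemplate template) {
		return template.changeStream("person",
				ChangeStreamOptions.builder().returnFullDocumentBeforeChange().build(), Person.class)
			.doOnNext(event -> {
				Person after = event.getBody(); // post-image, may be null
				Person before = event.getBodyBeforeChange(); // pre-image, null unless available
				// react to the difference between before and after here
			});
	}

	record Person(String id, String name) {}
}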
@SuppressWarnings("unchecked")
private T getConverted(Document fullDocument) {
return (T) doGetConverted(fullDocument);
private T getConvertedFullDocumentBeforeChange(Document fullDocument) {
return (T) doGetConverted(fullDocument, CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER);
}
private Object doGetConverted(Document fullDocument) {
@SuppressWarnings("unchecked")
private T getConvertedFullDocument(Document fullDocument) {
return (T) doGetConverted(fullDocument, CONVERTED_FULL_DOCUMENT_UPDATER);
}
Object result = CONVERTED_UPDATER.get(this);
private Object doGetConverted(Document fullDocument, AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> updater) {
Object result = updater.get(this);
if (result != null) {
return result;
@@ -176,23 +200,19 @@ public class ChangeStreamEvent<T> {
if (ClassUtils.isAssignable(Document.class, fullDocument.getClass())) {
result = converter.read(targetType, fullDocument);
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
return updater.compareAndSet(this, null, result) ? result : updater.get(this);
}
if (converter.getConversionService().canConvert(fullDocument.getClass(), targetType)) {
result = converter.getConversionService().convert(fullDocument, targetType);
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
return updater.compareAndSet(this, null, result) ? result : updater.get(this);
}
throw new IllegalArgumentException(
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';

View File

@@ -32,6 +32,7 @@ import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.model.changestream.FullDocumentBeforeChange;
/**
* Options applicable to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Streams</a>. Intended
@@ -40,6 +41,7 @@ import com.mongodb.client.model.changestream.FullDocument;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Myroslav Kosinskyi
* @since 2.1
*/
public class ChangeStreamOptions {
@@ -47,6 +49,7 @@ public class ChangeStreamOptions {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable FullDocumentBeforeChange fullDocumentBeforeChangeLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED;
@@ -74,6 +77,14 @@ public class ChangeStreamOptions {
return Optional.ofNullable(fullDocumentLookup);
}
/**
* @return {@link Optional#empty()} if not set.
* @since 4.0
*/
public Optional<FullDocumentBeforeChange> getFullDocumentBeforeChangeLookup() {
return Optional.ofNullable(fullDocumentBeforeChangeLookup);
}
/**
* @return {@link Optional#empty()} if not set.
*/
@@ -148,7 +159,7 @@ public class ChangeStreamOptions {
}
throw new IllegalArgumentException(
"o_O that should actually not happen. The timestamp should be an Instant or a BsonTimestamp but was "
"o_O that should actually not happen; The timestamp should be an Instant or a BsonTimestamp but was "
+ ObjectUtils.nullSafeClassName(timestamp));
}
@@ -170,6 +181,9 @@ public class ChangeStreamOptions {
if (!ObjectUtils.nullSafeEquals(this.fullDocumentLookup, that.fullDocumentLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.fullDocumentBeforeChangeLookup, that.fullDocumentBeforeChangeLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.collation, that.collation)) {
return false;
}
@@ -184,6 +198,7 @@ public class ChangeStreamOptions {
int result = ObjectUtils.nullSafeHashCode(filter);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeToken);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentBeforeChangeLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeTimestamp);
result = 31 * result + ObjectUtils.nullSafeHashCode(resume);
@@ -220,6 +235,7 @@ public class ChangeStreamOptions {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable FullDocumentBeforeChange fullDocumentBeforeChangeLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED;
@@ -234,7 +250,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder collation(Collation collation) {
Assert.notNull(collation, "Collation must not be null nor empty!");
Assert.notNull(collation, "Collation must not be null nor empty");
this.collation = collation;
return this;
@@ -258,7 +274,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder filter(Aggregation filter) {
Assert.notNull(filter, "Filter must not be null!");
Assert.notNull(filter, "Filter must not be null");
this.filter = filter;
return this;
@@ -287,7 +303,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder resumeToken(BsonValue resumeToken) {
Assert.notNull(resumeToken, "ResumeToken must not be null!");
Assert.notNull(resumeToken, "ResumeToken must not be null");
this.resumeToken = resumeToken;
@@ -316,12 +332,38 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder fullDocumentLookup(FullDocument lookup) {
Assert.notNull(lookup, "Lookup must not be null!");
Assert.notNull(lookup, "Lookup must not be null");
this.fullDocumentLookup = lookup;
return this;
}
/**
* Set the {@link FullDocumentBeforeChange} lookup to use.
*
* @param lookup must not be {@literal null}.
* @return this.
* @since 4.0
*/
public ChangeStreamOptionsBuilder fullDocumentBeforeChangeLookup(FullDocumentBeforeChange lookup) {
Assert.notNull(lookup, "Lookup must not be null");
this.fullDocumentBeforeChangeLookup = lookup;
return this;
}
/**
* Return the full document before being changed if it is available.
*
* @return this.
* @since 4.0
* @see #fullDocumentBeforeChangeLookup(FullDocumentBeforeChange)
*/
public ChangeStreamOptionsBuilder returnFullDocumentBeforeChange() {
return fullDocumentBeforeChangeLookup(FullDocumentBeforeChange.WHEN_AVAILABLE);
}
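Besides the WHEN_AVAILABLE shortcut above, the lookup can be set explicitly on the builder. A short hedged sketch; REQUIRED makes the change stream fail when a pre-image is unavailable, UPDATE_LOOKUP fetches the current post-image for update events.

import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.model.changestream.FullDocumentBeforeChange;
import org.springframework.data.mongodb.core.ChangeStreamOptions;

class ChangeStreamOptionsSketch {

	ChangeStreamOptions explicitLookups() {
		return ChangeStreamOptions.builder()
				.fullDocumentLookup(FullDocument.UPDATE_LOOKUP) // post-image for update events
				.fullDocumentBeforeChangeLookup(FullDocumentBeforeChange.REQUIRED) // insist on a pre-image
				.build();
	}
}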
/**
* Set the cluster time to resume from.
*
@@ -330,7 +372,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder resumeAt(Instant resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null");
this.resumeTimestamp = resumeTimestamp;
return this;
@@ -345,7 +387,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null");
this.resumeTimestamp = resumeTimestamp;
return this;
@@ -391,6 +433,7 @@ public class ChangeStreamOptions {
options.filter = this.filter;
options.resumeToken = this.resumeToken;
options.fullDocumentLookup = this.fullDocumentLookup;
options.fullDocumentBeforeChangeLookup = this.fullDocumentBeforeChangeLookup;
options.collation = this.collation;
options.resumeTimestamp = this.resumeTimestamp;
options.resume = this.resume;

View File

@@ -47,23 +47,11 @@ public class CollectionOptions {
private @Nullable Collation collation;
private ValidationOptions validationOptions;
private @Nullable TimeSeriesOptions timeSeriesOptions;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated. Can be {@literal null}.
* @param maxDocuments the maximum number of documents in the collection. Can be {@literal null}.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise. Can be {@literal null}.
* @deprecated since 2.0 please use {@link CollectionOptions#empty()} as entry point.
*/
@Deprecated
public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
this(size, maxDocuments, capped, null, ValidationOptions.none(), null);
}
private @Nullable CollectionChangeStreamOptions changeStreamOptions;
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
@Nullable Collation collation, ValidationOptions validationOptions, @Nullable TimeSeriesOptions timeSeriesOptions) {
@Nullable Collation collation, ValidationOptions validationOptions, @Nullable TimeSeriesOptions timeSeriesOptions,
@Nullable CollectionChangeStreamOptions changeStreamOptions) {
this.maxDocuments = maxDocuments;
this.size = size;
@@ -71,6 +59,7 @@ public class CollectionOptions {
this.collation = collation;
this.validationOptions = validationOptions;
this.timeSeriesOptions = timeSeriesOptions;
this.changeStreamOptions = changeStreamOptions;
}
/**
@@ -82,9 +71,9 @@ public class CollectionOptions {
*/
public static CollectionOptions just(Collation collation) {
Assert.notNull(collation, "Collation must not be null!");
Assert.notNull(collation, "Collation must not be null");
return new CollectionOptions(null, null, null, collation, ValidationOptions.none(), null);
return new CollectionOptions(null, null, null, collation, ValidationOptions.none(), null, null);
}
/**
@@ -94,7 +83,7 @@ public class CollectionOptions {
* @since 2.0
*/
public static CollectionOptions empty() {
return new CollectionOptions(null, null, null, null, ValidationOptions.none(), null);
return new CollectionOptions(null, null, null, null, ValidationOptions.none(), null, null);
}
/**
@@ -111,6 +100,18 @@ public class CollectionOptions {
return empty().timeSeries(TimeSeriesOptions.timeSeries(timeField));
}
/**
* Quick way to set up {@link CollectionOptions} for emitting (pre & post) change events.
*
* @return new instance of {@link CollectionOptions}.
* @see #changeStream(CollectionChangeStreamOptions)
* @see CollectionChangeStreamOptions#preAndPostImages(boolean)
* @since 4.0
*/
public static CollectionOptions emitChangedRevisions() {
return empty().changeStream(CollectionChangeStreamOptions.preAndPostImages(true));
}
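Since pre- and post-images have to be enabled on the collection before a change stream can surface them, the shortcut above pairs naturally with collection creation. A hedged sketch; the injected MongoTemplate and the Person type are assumptions.

import org.springframework.data.mongodb.core.CollectionOptions;
import org.springframework.data.mongodb.core.MongoTemplate;

class CollectionSetupSketch {

	void createWithPreAndPostImages(MongoTemplate template) {
		// shorthand for empty().changeStream(CollectionChangeStreamOptions.preAndPostImages(true))
		template.createCollection(Person.class, CollectionOptions.emitChangedRevisions());
	}

	static class Person {}
}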
/**
* Create new {@link CollectionOptions} with already given settings and capped set to {@literal true}. <br />
* <strong>NOTE</strong> Using capped collections requires defining {@link #size(long)}.
@@ -119,7 +120,8 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions capped() {
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions, null);
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
@@ -130,7 +132,8 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions maxDocuments(long maxDocuments) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
@@ -141,7 +144,8 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions size(long size) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
@@ -152,7 +156,8 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions collation(@Nullable Collation collation) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
@@ -244,7 +249,7 @@ public class CollectionOptions {
*/
public CollectionOptions schemaValidationLevel(ValidationLevel validationLevel) {
Assert.notNull(validationLevel, "ValidationLevel must not be null!");
Assert.notNull(validationLevel, "ValidationLevel must not be null");
return validation(validationOptions.validationLevel(validationLevel));
}
@@ -258,7 +263,7 @@ public class CollectionOptions {
*/
public CollectionOptions schemaValidationAction(ValidationAction validationAction) {
Assert.notNull(validationAction, "ValidationAction must not be null!");
Assert.notNull(validationAction, "ValidationAction must not be null");
return validation(validationOptions.validationAction(validationAction));
}
@@ -271,8 +276,9 @@ public class CollectionOptions {
*/
public CollectionOptions validation(ValidationOptions validationOptions) {
Assert.notNull(validationOptions, "ValidationOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
Assert.notNull(validationOptions, "ValidationOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
@@ -284,8 +290,23 @@ public class CollectionOptions {
*/
public CollectionOptions timeSeries(TimeSeriesOptions timeSeriesOptions) {
Assert.notNull(timeSeriesOptions, "TimeSeriesOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
Assert.notNull(timeSeriesOptions, "TimeSeriesOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
* Create new {@link CollectionOptions} with the given {@link CollectionChangeStreamOptions}.
*
* @param changeStreamOptions must not be {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @since 4.0
*/
public CollectionOptions changeStream(CollectionChangeStreamOptions changeStreamOptions) {
Assert.notNull(changeStreamOptions, "ChangeStreamOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
@@ -346,11 +367,21 @@ public class CollectionOptions {
return Optional.ofNullable(timeSeriesOptions);
}
/**
* Get the {@link CollectionChangeStreamOptions} if available.
*
* @return {@link Optional#empty()} if not specified.
* @since 4.0
*/
public Optional<CollectionChangeStreamOptions> getChangeStreamOptions() {
return Optional.ofNullable(changeStreamOptions);
}
@Override
public String toString() {
return "CollectionOptions{" + "maxDocuments=" + maxDocuments + ", size=" + size + ", capped=" + capped
+ ", collation=" + collation + ", validationOptions=" + validationOptions + ", timeSeriesOptions="
+ timeSeriesOptions + ", disableValidation="
+ timeSeriesOptions + ", changeStreamOptions=" + changeStreamOptions + ", disableValidation="
+ disableValidation() + ", strictValidation=" + strictValidation() + ", moderateValidation="
+ moderateValidation() + ", warnOnValidationError=" + warnOnValidationError() + ", failOnValidationError="
+ failOnValidationError() + '}';
@@ -382,7 +413,10 @@ public class CollectionOptions {
if (!ObjectUtils.nullSafeEquals(validationOptions, that.validationOptions)) {
return false;
}
return ObjectUtils.nullSafeEquals(timeSeriesOptions, that.timeSeriesOptions);
if (!ObjectUtils.nullSafeEquals(timeSeriesOptions, that.timeSeriesOptions)) {
return false;
}
return ObjectUtils.nullSafeEquals(changeStreamOptions, that.changeStreamOptions);
}
@Override
@@ -393,6 +427,7 @@ public class CollectionOptions {
result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(validationOptions);
result = 31 * result + ObjectUtils.nullSafeHashCode(timeSeriesOptions);
result = 31 * result + ObjectUtils.nullSafeHashCode(changeStreamOptions);
return result;
}
@@ -526,6 +561,58 @@ public class CollectionOptions {
}
}
/**
* Encapsulation of options applied to define a collection's change stream behaviour.
*
* @author Christoph Strobl
* @since 4.0
*/
public static class CollectionChangeStreamOptions {
private final boolean preAndPostImages;
private CollectionChangeStreamOptions(boolean emitChangedRevisions) {
this.preAndPostImages = emitChangedRevisions;
}
/**
* Output the version of a document before and after changes (the document pre- and post-images).
*
* @return new instance of {@link CollectionChangeStreamOptions}.
*/
public static CollectionChangeStreamOptions preAndPostImages(boolean emitChangedRevisions) {
return new CollectionChangeStreamOptions(emitChangedRevisions);
}
public boolean getPreAndPostImages() {
return preAndPostImages;
}
@Override
public String toString() {
return "CollectionChangeStreamOptions{" + "preAndPostImages=" + preAndPostImages + '}';
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
CollectionChangeStreamOptions that = (CollectionChangeStreamOptions) o;
return preAndPostImages == that.preAndPostImages;
}
@Override
public int hashCode() {
return (preAndPostImages ? 1 : 0);
}
}
/**
* Options applicable to Time Series collections.
*
@@ -544,7 +631,7 @@ public class CollectionOptions {
private TimeSeriesOptions(String timeField, @Nullable String metaField, GranularityDefinition granularity) {
Assert.hasText(timeField, "Time field must not be empty or null!");
Assert.hasText(timeField, "Time field must not be empty or null");
this.timeField = timeField;
this.metaField = metaField;

View File

@@ -187,7 +187,7 @@ class CountQuery {
criteria.addAll(andElements);
} else {
throw new IllegalArgumentException(
"Cannot rewrite query as it contains an '$and' element that is not a Collection!: Offending element: "
"Cannot rewrite query as it contains an '$and' element that is not a Collection: Offending element: "
+ $and);
}
} else {
@@ -208,9 +208,7 @@ class CountQuery {
return (Number) source.get("$maxDistance");
}
if ($near instanceof Document) {
Document nearDoc = (Document) $near;
if ($near instanceof Document nearDoc) {
if (nearDoc.containsKey("$maxDistance")) {
@@ -248,8 +246,7 @@ class CountQuery {
return Arrays.asList(((Point) value).getX(), ((Point) value).getY());
}
if (value instanceof Document) {
Document document = (Document) value;
if (value instanceof Document document) {
if (document.containsKey("x")) {
return Arrays.asList(document.get("x"), document.get("y"));

View File

@@ -61,8 +61,8 @@ public interface CursorPreparer extends ReadPreferenceAware {
default FindIterable<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindIterable<Document>> find) {
Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null!");
Assert.notNull(collection, "Collection must not be null");
Assert.notNull(find, "Find function must not be null");
if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference());

View File

@@ -90,9 +90,9 @@ class DefaultBulkOperations implements BulkOperations {
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
@@ -109,14 +109,10 @@ class DefaultBulkOperations implements BulkOperations {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null!");
Assert.notNull(document, "Document must not be null");
maybeEmitEvent(new BeforeConvertEvent<>(document, collectionName));
Object source = maybeInvokeBeforeConvertCallback(document);
@@ -125,42 +121,30 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null!");
Assert.notNull(documents, "Documents must not be null");
documents.forEach(this::insert);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
Assert.notNull(updates, "Updates must not be null");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
@@ -169,28 +153,20 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
Assert.notNull(updates, "Updates must not be null");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
@@ -199,19 +175,11 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
@@ -222,14 +190,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
@@ -239,14 +203,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null!");
Assert.notNull(removes, "Removals must not be null");
for (Query query : removes) {
remove(query);
@@ -255,16 +215,12 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#replaceOne(org.springframework.data.mongodb.core.query.Query, java.lang.Object, org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(options, "Options must not be null!");
Assert.notNull(query, "Query must not be null");
Assert.notNull(replacement, "Replacement must not be null");
Assert.notNull(options, "Options must not be null");
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(options.isUpsert());
@@ -278,10 +234,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public com.mongodb.bulk.BulkWriteResult execute() {
@@ -289,7 +241,7 @@ class DefaultBulkOperations implements BulkOperations {
com.mongodb.bulk.BulkWriteResult result = mongoOperations.execute(collectionName, this::bulkWriteTo);
Assert.state(result != null, "Result must not be null.");
Assert.state(result != null, "Result must not be null");
models.forEach(this::maybeEmitAfterSaveEvent);
models.forEach(this::maybeInvokeAfterSaveCallback);
@@ -356,8 +308,8 @@ class DefaultBulkOperations implements BulkOperations {
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
UpdateOptions options = computeUpdateOptions(query, update, upsert);
@@ -518,7 +470,7 @@ class DefaultBulkOperations implements BulkOperations {
return options.ordered(false);
}
throw new IllegalStateException("BulkMode was null!");
throw new IllegalStateException("BulkMode was null");
}
/**

View File

@@ -83,9 +83,9 @@ public class DefaultIndexOperations implements IndexOperations {
public DefaultIndexOperations(MongoDatabaseFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
@Nullable Class<?> type) {
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null");
Assert.notNull(collectionName, "Collection name can not be null");
Assert.notNull(queryMapper, "QueryMapper must not be null");
this.collectionName = collectionName;
this.mapper = queryMapper;
@@ -103,8 +103,8 @@ public class DefaultIndexOperations implements IndexOperations {
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, @Nullable Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "Collection name must not be null or empty");
this.mongoOperations = mongoOperations;
this.mapper = new QueryMapper(mongoOperations.getConverter());
@@ -112,10 +112,6 @@ public class DefaultIndexOperations implements IndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public String ensureIndex(final IndexDefinition indexDefinition) {
return execute(collection -> {
@@ -150,10 +146,6 @@ public class DefaultIndexOperations implements IndexOperations {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
execute(collection -> {
@@ -163,18 +155,10 @@ public class DefaultIndexOperations implements IndexOperations {
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return execute(new CollectionCallback<List<IndexInfo>>() {
@@ -206,7 +190,7 @@ public class DefaultIndexOperations implements IndexOperations {
@Nullable
public <T> T execute(CollectionCallback<T> callback) {
Assert.notNull(callback, "CollectionCallback must not be null!");
Assert.notNull(callback, "CollectionCallback must not be null");
if (type != null) {
return mongoOperations.execute(type, callback);

View File

@@ -42,10 +42,6 @@ class DefaultIndexOperationsProvider implements IndexOperationsProvider {
this.mapper = mapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperationsProvider#reactiveIndexOps(java.lang.String)
*/
@Override
public IndexOperations indexOps(String collectionName, Class<?> type) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper, type);

View File

@@ -76,9 +76,9 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
private DefaultReactiveIndexOperations(ReactiveMongoOperations mongoOperations, String collectionName,
QueryMapper queryMapper, Optional<Class<?>> type) {
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null!");
Assert.notNull(collectionName, "Collection must not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null");
Assert.notNull(collectionName, "Collection must not be null");
Assert.notNull(queryMapper, "QueryMapper must not be null");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
@@ -86,10 +86,6 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, collection -> {
@@ -119,26 +115,14 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
.orElse(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) //

View File

@@ -31,7 +31,6 @@ import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
@@ -65,41 +64,29 @@ class DefaultScriptOperations implements ScriptOperations {
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(mongoOperations, "MongoOperations must not be null");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
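For completeness, a hedged sketch of the (deprecated, server-side JavaScript based) script API these methods belong to, assuming the usual MongoOperations#scriptOps() entry point; the echo function is purely illustrative.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptSketch {

	void run(MongoOperations operations) {
		ScriptOperations scripts = operations.scriptOps();
		NamedMongoScript echo = scripts.register(new ExecutableMongoScript("function(x) { return x; }"));
		Object result = scripts.call(echo.getName(), 1); // calls the stored function by name
	}
}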
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
Assert.notNull(script, "Script must not be null");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
Assert.notNull(script, "Script must not be null");
return mongoOperations.execute(new DbCallback<Object>() {
@@ -115,14 +102,10 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
Assert.hasText(scriptName, "ScriptName must not be null or empty");
return mongoOperations.execute(new DbCallback<Object>() {
@@ -135,22 +118,14 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
Assert.hasText(scriptName, "ScriptName must not be null or empty");
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {

View File

@@ -0,0 +1,60 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.lang.Nullable;
/**
* Delegate class to encapsulate lifecycle event configuration and publishing.
*
* @author Mark Paluch
* @since 4.0
* @see ApplicationEventPublisher
*/
class EntityLifecycleEventDelegate {
private @Nullable ApplicationEventPublisher publisher;
private boolean eventsEnabled = true;
public void setPublisher(@Nullable ApplicationEventPublisher publisher) {
this.publisher = publisher;
}
public boolean isEventsEnabled() {
return eventsEnabled;
}
public void setEventsEnabled(boolean eventsEnabled) {
this.eventsEnabled = eventsEnabled;
}
/**
* Publish an application event if event publishing is enabled.
*
* @param event the application event.
*/
public void publishEvent(Object event) {
if (canPublishEvent()) {
publisher.publishEvent(event);
}
}
private boolean canPublishEvent() {
return publisher != null && eventsEnabled;
}
}
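A hedged sketch of what the delegate is for, namely switching lifecycle event publication on or off without changing call sites. Because the class is package-private, the sketch assumes same-package placement; the event and publisher wiring are illustrative.

import org.springframework.context.ApplicationEventPublisher;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;

class LifecycleEventSketch {

	void publishUnlessDisabled(ApplicationEventPublisher publisher, Object entity) {
		EntityLifecycleEventDelegate delegate = new EntityLifecycleEventDelegate();
		delegate.setPublisher(publisher);
		delegate.setEventsEnabled(false); // e.g. skip event overhead during bulk imports
		delegate.publishEvent(new BeforeConvertEvent<>(entity, "person")); // silently dropped while disabled
	}
}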

View File

@@ -21,7 +21,6 @@ import java.util.Map;
import java.util.Optional;
import org.bson.Document;
import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.convert.CustomConversions;
@@ -57,6 +56,7 @@ import org.springframework.util.MultiValueMap;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.ChangeStreamPreAndPostImagesOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.TimeSeriesGranularity;
import com.mongodb.client.model.ValidationOptions;
@@ -112,7 +112,7 @@ class EntityOperations {
@SuppressWarnings({ "unchecked", "rawtypes" })
<T> Entity<T> forEntity(T entity) {
Assert.notNull(entity, "Bean must not be null!");
Assert.notNull(entity, "Bean must not be null");
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
@@ -135,8 +135,8 @@ class EntityOperations {
@SuppressWarnings({ "unchecked", "rawtypes" })
<T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
Assert.notNull(entity, "Bean must not be null!");
Assert.notNull(conversionService, "ConversionService must not be null!");
Assert.notNull(entity, "Bean must not be null");
Assert.notNull(conversionService, "ConversionService must not be null");
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
@@ -171,7 +171,7 @@ class EntityOperations {
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException(
"No class parameter provided, entity collection can't be determined!");
"No class parameter provided, entity collection can't be determined");
}
MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(entityClass);
@@ -208,7 +208,7 @@ class EntityOperations {
*/
public String getIdPropertyName(Class<?> type) {
Assert.notNull(type, "Type must not be null!");
Assert.notNull(type, "Type must not be null");
MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(type);
@@ -247,12 +247,12 @@ class EntityOperations {
try {
return Document.parse(source);
} catch (org.bson.json.JsonParseException o_O) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
throw new MappingException("Could not parse given String to save into a JSON document", o_O);
} catch (RuntimeException o_O) {
// legacy 3.x exception
if (ClassUtils.matchesTypeName(o_O.getClass(), "JSONParseException")) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
throw new MappingException("Could not parse given String to save into a JSON document", o_O);
}
throw o_O;
}
@@ -287,12 +287,12 @@ class EntityOperations {
}
/**
* Convert given {@link CollectionOptions} to a document and take the domain type information into account when
* creating a mapped schema for validation.
* Convert {@link CollectionOptions} to {@link CreateCollectionOptions} using {@link Class entityType} to obtain
* mapping metadata.
*
* @param collectionOptions can be {@literal null}.
* @param entityType must not be {@literal null}. Use {@link Object} type instead.
* @return the converted {@link CreateCollectionOptions}.
* @param collectionOptions
* @param entityType
* @return
* @since 3.4
*/
public CreateCollectionOptions convertToCreateCollectionOptions(@Nullable CollectionOptions collectionOptions,
@@ -341,6 +341,9 @@ class EntityOperations {
result.timeSeriesOptions(options);
});
collectionOptions.getChangeStreamOptions().ifPresent(it -> result
.changeStreamPreAndPostImagesOptions(new ChangeStreamPreAndPostImagesOptions(it.getPreAndPostImages())));
return result;
}
@@ -499,37 +502,21 @@ class EntityOperations {
this.map = map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return ID_FIELD;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return map.get(ID_FIELD);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
return Query.query(Criteria.where(ID_FIELD).is(map.get(ID_FIELD)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
@@ -539,19 +526,11 @@ class EntityOperations {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion()
*/
@Override
public Query getQueryForVersion() {
throw new MappingException("Cannot query for version on plain Documents!");
throw new MappingException("Cannot query for version on plain Documents");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
return MappedDocument.of(map instanceof Document //
@@ -559,47 +538,27 @@ class EntityOperations {
: new Document(map));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#incrementVersion()
*/
@Override
public T incrementVersion() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return map.get(ID_FIELD) != null;
@@ -612,10 +571,6 @@ class EntityOperations {
super(map);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
@SuppressWarnings("unchecked")
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -655,33 +610,21 @@ class EntityOperations {
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return entity.getRequiredIdProperty().getFieldName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return idAccessor.getRequiredIdentifier();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
if (!entity.hasIdProperty()) {
throw new MappingException("No id property found for object of type " + entity.getType() + "!");
throw new MappingException("No id property found for object of type " + entity.getType());
}
MongoPersistentProperty idProperty = entity.getRequiredIdProperty();
@@ -689,10 +632,6 @@ class EntityOperations {
return Query.query(Criteria.where(idProperty.getName()).is(getId()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion(java.lang.Object)
*/
@Override
public Query getQueryForVersion() {
@@ -703,10 +642,6 @@ class EntityOperations {
.and(versionProperty.getName()).is(getVersion()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -722,10 +657,6 @@ class EntityOperations {
return MappedDocument.of(document);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#assertUpdateableIdIfNotSet()
*/
public void assertUpdateableIdIfNotSet() {
if (!entity.hasIdProperty()) {
@@ -741,43 +672,27 @@ class EntityOperations {
if (!MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(property.getType())) {
throw new InvalidDataAccessApiUsageException(
String.format("Cannot autogenerate id of type %s for entity of type %s!", property.getType().getName(),
String.format("Cannot autogenerate id of type %s for entity of type %s", property.getType().getName(),
entity.getType().getName()));
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#isVersionedEntity()
*/
@Override
public boolean isVersionedEntity() {
return entity.hasVersionProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getVersion()
*/
@Override
@Nullable
public Object getVersion() {
return propertyAccessor.getProperty(entity.getRequiredVersionProperty());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return entity.isNew(propertyAccessor.getBean());
@@ -812,10 +727,6 @@ class EntityOperations {
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
@@ -837,10 +748,6 @@ class EntityOperations {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MappedEntity#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
@@ -850,10 +757,6 @@ class EntityOperations {
return propertyAccessor.getProperty(versionProperty, Number.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
@@ -868,10 +771,6 @@ class EntityOperations {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#incrementVersion()
*/
@Override
public T incrementVersion() {
@@ -943,19 +842,11 @@ class EntityOperations {
return (TypedOperations) INSTANCE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {
@@ -990,19 +881,11 @@ class EntityOperations {
this.entity = entity;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.ofNullable(entity.getCollation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {

View File

@@ -15,9 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.util.CloseableIterator;
/**
* {@link ExecutableAggregationOperation} allows creation and execution of MongoDB aggregation operations in a fluent
@@ -88,12 +89,12 @@ public interface ExecutableAggregationOperation {
/**
* Apply pipeline operations as specified and stream all matching elements. <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable}
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable}
*
* @return a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
* Never {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
*/
CloseableIterator<T> stream();
Stream<T> stream();
}
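The reworked javadoc makes the contract explicit: the returned Stream is backed by a server-side cursor and must be closed once consumed. A hedged sketch of the fluent aggregation API under that contract; the MongoTemplate instance, the Order type and the "status" criteria are assumptions, not part of this change:

import java.util.stream.Stream;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Criteria;

void streamOpenOrders(MongoTemplate template) {
    Aggregation aggregation = Aggregation.newAggregation(
            Aggregation.match(Criteria.where("status").is("OPEN")));

    // try-with-resources closes the underlying cursor once the Stream has been processed.
    try (Stream<Order> open = template.aggregateAndReturn(Order.class)
            .by(aggregation)
            .stream()) {
        open.forEach(order -> System.out.println("open order: " + order));
    }
}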
/**

View File

@@ -15,10 +15,11 @@
*/
package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -37,14 +38,10 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ExecutableAggregation<T> aggregateAndReturn(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ExecutableAggregationSupport<>(template, domainType, null, null);
}
@@ -69,45 +66,29 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithCollection#inCollection(java.lang.String)
*/
@Override
public AggregationWithAggregation<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty!");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithAggregation#by(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override
public TerminatingAggregation<T> by(Aggregation aggregation) {
Assert.notNull(aggregation, "Aggregation must not be null!");
Assert.notNull(aggregation, "Aggregation must not be null");
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#all()
*/
@Override
public AggregationResults<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#stream()
*/
@Override
public CloseableIterator<T> stream() {
public Stream<T> stream() {
return template.aggregateStream(aggregation, getCollectionName(aggregation), domainType);
}

View File

@@ -118,8 +118,8 @@ public interface ExecutableFindOperation {
/**
* Stream all matching elements.
*
* @return a {@link Stream} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed. Never
* {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
*/
Stream<T> stream();
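The find variant carries the same close-once-processed contract. A short sketch against the fluent find API (imports as in the aggregation sketch above, plus org.springframework.data.mongodb.core.query.Query); Person and the "lastname" field are again placeholders:

void printDoeFirstnames(MongoTemplate template) {
    try (Stream<Person> people = template.query(Person.class)
            .matching(Query.query(Criteria.where("lastname").is("Doe")))
            .stream()) {
        people.map(Person::getFirstname).forEach(System.out::println);
    }
}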

View File

@@ -20,12 +20,11 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.util.CloseableIterator;
import org.springframework.data.util.StreamUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -51,14 +50,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation#query(java.lang.Class)
*/
@Override
public <T> ExecutableFind<T> query(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ExecutableFindSupport<>(template, domainType, domainType, null, ALL_QUERY);
}
@@ -74,11 +69,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
private final MongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
@Nullable private final String collection;
private final @Nullable String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
@Nullable String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
@@ -86,46 +81,30 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override
public FindWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
Assert.hasText(collection, "Collection name must not be null nor empty");
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithProjection#as(Class)
*/
@Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
Assert.notNull(returnType, "ReturnType must not be null!");
Assert.notNull(returnType, "ReturnType must not be null");
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingFind<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#oneValue()
*/
@Override
public T oneValue() {
@@ -136,16 +115,12 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
}
if (result.size() > 1) {
throw new IncorrectResultSizeDataAccessException("Query " + asString() + " returned non unique result.", 1);
throw new IncorrectResultSizeDataAccessException("Query " + asString() + " returned non unique result", 1);
}
return result.iterator().next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#firstValue()
*/
@Override
public T firstValue() {
@@ -154,60 +129,36 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return ObjectUtils.isEmpty(result) ? null : result.iterator().next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#all()
*/
@Override
public List<T> all() {
return doFind(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#stream()
*/
@Override
public Stream<T> stream() {
return StreamUtils.createStreamFromIterator(doStream());
return doStream();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#count()
*/
@Override
public long count() {
return template.count(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#exists()
*/
@Override
public boolean exists() {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindDistinct#distinct(java.lang.String)
*/
@SuppressWarnings("unchecked")
@Override
public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null!");
Assert.notNull(field, "Field must not be null");
return new DistinctOperationSupport(this, field);
}
@@ -227,7 +178,7 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private CloseableIterator<T> doStream() {
private Stream<T> doStream() {
return template.doStream(query, domainType, getCollectionName(), returnType);
}
@@ -257,10 +208,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.CursorPreparer#prepare(com.mongodb.clientFindIterable)
*/
@Override
public FindIterable<Document> prepare(FindIterable<Document> iterable) {
@@ -295,35 +242,23 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingDistinct#all()
*/
@Override
public List<T> all() {
return delegate.doFindDistinct(field);

View File

@@ -40,14 +40,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.coreExecutableInsertOperation#insert(java.lan.Class)
*/
@Override
public <T> ExecutableInsert<T> insert(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ExecutableInsertSupport<>(template, domainType, null, null);
}
@@ -71,63 +67,43 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.bulkMode = bulkMode;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#insert(java.lang.Class)
*/
@Override
public T one(T object) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(object, "Object must not be null");
return template.insert(object, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override
public Collection<T> all(Collection<? extends T> objects) {
Assert.notNull(objects, "Objects must not be null!");
Assert.notNull(objects, "Objects must not be null");
return template.insert(objects, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingBulkInsert#bulk(java.util.Collection)
*/
@Override
public BulkWriteResult bulk(Collection<? extends T> objects) {
Assert.notNull(objects, "Objects must not be null!");
Assert.notNull(objects, "Objects must not be null");
return template.bulkOps(bulkMode != null ? bulkMode : BulkMode.ORDERED, domainType, getCollectionName())
.insert(new ArrayList<>(objects)).execute();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override
public InsertWithBulkMode<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty.");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithBulkMode#withBulkMode(org.springframework.data.mongodb.core.BulkMode)
*/
@Override
public TerminatingBulkInsert<T> withBulkMode(BulkMode bulkMode) {
Assert.notNull(bulkMode, "BulkMode must not be null!");
Assert.notNull(bulkMode, "BulkMode must not be null");
return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode);
}

View File

@@ -187,7 +187,9 @@ public interface ExecutableMapReduceOperation {
*
* @author Christoph Strobl
* @since 2.1
* @deprecated since 4.0 in favor of {@link org.springframework.data.mongodb.core.aggregation}.
*/
@Deprecated
interface MapReduceWithOptions<T> {
/**

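The deprecation points to the aggregation framework as the replacement for map-reduce. A hedged sketch of that migration for a simple count-per-key map-reduce; the "orders" collection, the "customerId" field and the OrdersPerCustomer result type are made-up names:

void countOrdersPerCustomer(MongoTemplate template) {
    Aggregation countPerCustomer = Aggregation.newAggregation(
            Aggregation.group("customerId").count().as("orders"));

    // One way to express the former map-reduce as an aggregation pipeline.
    AggregationResults<OrdersPerCustomer> results =
            template.aggregate(countPerCustomer, "orders", OrdersPerCustomer.class);
    results.forEach(System.out::println);
}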
View File

@@ -37,7 +37,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
ExecutableMapReduceOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
Assert.notNull(template, "Template must not be null");
this.template = template;
}
@@ -48,7 +48,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public <T> ExecutableMapReduceSupport<T> mapReduce(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ExecutableMapReduceSupport<>(template, domainType, domainType, null, ALL_QUERY, null, null, null);
}
@@ -101,7 +101,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public MapReduceWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
Assert.hasText(collection, "Collection name must not be null nor empty");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -114,7 +114,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public TerminatingMapReduce<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -127,7 +127,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public <R> MapReduceWithQuery<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new ExecutableMapReduceSupport<>(template, domainType, resultType, collection, query, mapFunction,
reduceFunction, options);
@@ -140,7 +140,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public ExecutableMapReduce<T> with(MapReduceOptions options) {
Assert.notNull(options, "Options must not be null! Please consider empty MapReduceOptions#options() instead.");
Assert.notNull(options, "Options must not be null; Please consider empty MapReduceOptions#options() instead");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -153,7 +153,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public MapReduceWithReduceFunction<T> map(String mapFunction) {
Assert.hasText(mapFunction, "MapFunction name must not be null nor empty!");
Assert.hasText(mapFunction, "MapFunction name must not be null nor empty");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -166,7 +166,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override
public ExecutableMapReduce<T> reduce(String reduceFunction) {
Assert.hasText(reduceFunction, "ReduceFunction name must not be null nor empty!");
Assert.hasText(reduceFunction, "ReduceFunction name must not be null nor empty");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);

View File

@@ -41,14 +41,10 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.tempate = tempate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation#remove(java.lang.Class)
*/
@Override
public <T> ExecutableRemove<T> remove(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ExecutableRemoveSupport<>(tempate, domainType, ALL_QUERY, null);
}
@@ -71,52 +67,32 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithCollection#inCollection(java.lang.String)
*/
@Override
public RemoveWithQuery<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty!");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ExecutableRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingRemove<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ExecutableRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#all()
*/
@Override
public DeleteResult all() {
return template.doRemove(getCollectionName(), query, domainType, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#one()
*/
@Override
public DeleteResult one() {
return template.doRemove(getCollectionName(), query, domainType, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override
public List<T> findAndRemove() {

View File

@@ -40,14 +40,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation#update(java.lang.Class)
*/
@Override
public <T> ExecutableUpdate<T> update(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ExecutableUpdateSupport<>(template, domainType, ALL_QUERY, null, null, null, null, null, domainType);
}
@@ -85,128 +81,84 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override
public TerminatingUpdate<T> apply(UpdateDefinition update) {
Assert.notNull(update, "Update must not be null!");
Assert.notNull(update, "Update must not be null");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override
public UpdateWithQuery<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty!");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
Assert.notNull(options, "Options must not be null!");
Assert.notNull(options, "Options must not be null");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, options,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#replaceWith(Object)
*/
@Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(replacement, "Replacement must not be null");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
Assert.notNull(options, "Options must not be null!");
Assert.notNull(options, "Options must not be null");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
options, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public UpdateWithUpdate<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, resultType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#all()
*/
@Override
public UpdateResult all() {
return doUpdate(true, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#first()
*/
@Override
public UpdateResult first() {
return doUpdate(false, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#upsert()
*/
@Override
public UpdateResult upsert() {
return doUpdate(true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndModify#findAndModifyValue()
*/
@Override
public @Nullable T findAndModifyValue() {
@@ -215,10 +167,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndReplace#findAndReplaceValue()
*/
@Override
public @Nullable T findAndReplaceValue() {

View File

@@ -35,7 +35,7 @@ public class FindAndModifyOptions {
private static final FindAndModifyOptions NONE = new FindAndModifyOptions() {
private static final String ERROR_MSG = "FindAndModifyOptions.none() cannot be changed. Please use FindAndModifyOptions.options() instead.";
private static final String ERROR_MSG = "FindAndModifyOptions.none() cannot be changed; Please use FindAndModifyOptions.options() instead";
@Override
public FindAndModifyOptions returnNew(boolean returnNew) {

View File

@@ -38,7 +38,7 @@ public class FindAndReplaceOptions {
private static final FindAndReplaceOptions NONE = new FindAndReplaceOptions() {
private static final String ERROR_MSG = "FindAndReplaceOptions.none() cannot be changed. Please use FindAndReplaceOptions.options() instead.";
private static final String ERROR_MSG = "FindAndReplaceOptions.none() cannot be changed; Please use FindAndReplaceOptions.options() instead";
@Override
public FindAndReplaceOptions returnNew() {

View File

@@ -61,8 +61,8 @@ public interface FindPublisherPreparer extends ReadPreferenceAware {
default FindPublisher<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindPublisher<Document>> find) {
Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null!");
Assert.notNull(collection, "Collection must not be null");
Assert.notNull(find, "Find function must not be null");
if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference());

View File

@@ -39,7 +39,7 @@ class GeoCommandStatistics {
*/
private GeoCommandStatistics(Document source) {
Assert.notNull(source, "Source document must not be null!");
Assert.notNull(source, "Source document must not be null");
this.source = source;
}
@@ -51,7 +51,7 @@ class GeoCommandStatistics {
*/
public static GeoCommandStatistics from(Document commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Assert.notNull(commandResult, "Command result must not be null");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((Document) stats);

View File

@@ -0,0 +1,102 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.function.Function;
import org.bson.conversions.Bson;
import org.springframework.data.mongodb.CodecRegistryProvider;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
/**
* Function object to apply a query hint. Can be an index name or a BSON document.
*
* @author Mark Paluch
* @since 4.1
*/
class HintFunction {
private static final HintFunction EMPTY = new HintFunction(null);
private final @Nullable Object hint;
private HintFunction(@Nullable Object hint) {
this.hint = hint;
}
/**
* Return an empty hint function.
*
* @return
*/
static HintFunction empty() {
return EMPTY;
}
/**
* Create a {@link HintFunction} from a {@link Bson document} or {@link String index name}.
*
* @param hint
* @return
*/
static HintFunction from(@Nullable Object hint) {
return new HintFunction(hint);
}
/**
* Return whether a hint is present.
*
* @return
*/
public boolean isPresent() {
return (hint instanceof String hintString && StringUtils.hasText(hintString)) || hint instanceof Bson;
}
/**
* Apply the hint to consumers depending on the hint format.
*
* @param registryProvider
* @param stringConsumer
* @param bsonConsumer
* @return
* @param <R>
*/
public <R> R apply(@Nullable CodecRegistryProvider registryProvider, Function<String, R> stringConsumer,
Function<Bson, R> bsonConsumer) {
if (!isPresent()) {
throw new IllegalStateException("No hint present");
}
if (hint instanceof Bson bson) {
return bsonConsumer.apply(bson);
}
if (hint instanceof String hintString) {
if (BsonUtils.isJsonDocument(hintString)) {
return bsonConsumer.apply(BsonUtils.parse(hintString, registryProvider));
}
return stringConsumer.apply(hintString);
}
throw new IllegalStateException(
"Unable to read hint of type %s".formatted(hint != null ? hint.getClass() : "null"));
}
}
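HintFunction routes String hints (including parseable JSON) and Bson hints to different consumers. A usage sketch, assuming a com.mongodb.client.FindIterable<Document> and a made-up index name; the driver's hintString(String) and hint(Bson) setters match the two function shapes, and the CodecRegistryProvider argument may be null as the signature allows. The class is package-private, so such a caller would live in org.springframework.data.mongodb.core:

import org.bson.Document;
import com.mongodb.client.FindIterable;

FindIterable<Document> applyHint(FindIterable<Document> iterable) {
    HintFunction hint = HintFunction.from("lastname_idx");

    if (hint.isPresent()) {
        // Plain index names are routed to hintString(..); Bson documents and JSON strings to hint(..).
        return hint.apply(null, iterable::hintString, iterable::hint);
    }
    return iterable;
}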

View File

@@ -122,55 +122,31 @@ public class MappedDocument {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getUpdateObject()
*/
@Override
public Document getUpdateObject() {
return delegate.getUpdateObject();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#modifies(java.lang.String)
*/
@Override
public boolean modifies(String key) {
return delegate.modifies(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#inc(java.lang.String)
*/
@Override
public void inc(String version) {
delegate.inc(version);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
*/
@Override
public Boolean isIsolated() {
return delegate.isIsolated();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#hasArrayFilters()
*/
@Override
public boolean hasArrayFilters() {
return delegate.hasArrayFilters();

View File

@@ -40,7 +40,7 @@ import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.MongoJsonSchemaBuilder;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
@@ -81,7 +81,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
Predicate<JsonSchemaPropertyContext> filter, LinkedMultiValueMap<String, Class<?>> mergeProperties) {
Assert.notNull(converter, "Converter must not be null!");
Assert.notNull(converter, "Converter must not be null");
this.converter = converter;
this.mappingContext = mappingContext;
this.filter = filter;
@@ -115,10 +115,6 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, clone);
}
/*
* (non-Javadoc)
* org.springframework.data.mongodb.core.MongoJsonSchemaCreator#createSchemaFor(java.lang.Class)
*/
@Override
public MongoJsonSchema createSchemaFor(Class<?> type) {
@@ -271,7 +267,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
}
private boolean isSpecificType(MongoPersistentProperty property) {
return !ClassTypeInformation.OBJECT.equals(property.getTypeInformation().getActualType());
return !TypeInformation.OBJECT.equals(property.getTypeInformation().getActualType());
}
private JsonSchemaProperty applyEncryptionDataIfNecessary(MongoPersistentProperty property,

View File

@@ -57,8 +57,8 @@ public class MongoAction {
public MongoAction(@Nullable WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, @Nullable Class<?> entityType, @Nullable Document document, @Nullable Document query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(mongoActionOperation, "MongoActionOperation must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty");
Assert.notNull(mongoActionOperation, "MongoActionOperation must not be null");
this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation;

View File

@@ -42,29 +42,20 @@ public class MongoAdmin implements MongoAdminOperations {
*/
public MongoAdmin(MongoClient client) {
Assert.notNull(client, "Client must not be null!");
Assert.notNull(client, "Client must not be null");
this.mongoClient = client;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).drop();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale", 1024)).toJson();

View File

@@ -119,27 +119,15 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends MongoClient> getObjectType() {
return MongoClient.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClient createInstance() throws Exception {
return createMongoClient(computeClientSetting());
@@ -158,7 +146,7 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
protected MongoClientSettings computeClientSetting() {
if (connectionString != null && (StringUtils.hasText(host) || port != null)) {
throw new IllegalStateException("ConnectionString and host/port configuration exclude one another!");
throw new IllegalStateException("ConnectionString and host/port configuration exclude one another");
}
ConnectionString connectionString = this.connectionString != null ? this.connectionString
@@ -336,10 +324,6 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
return !fromConnectionStringIsDefault ? fromConnectionString : defaultValue;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(@Nullable MongoClient instance) throws Exception {
@@ -353,6 +337,11 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
}
private String getOrDefault(Object value, String defaultValue) {
return !StringUtils.isEmpty(value) ? value.toString() : defaultValue;
if(value == null) {
return defaultValue;
}
String sValue = value.toString();
return StringUtils.hasText(sValue) ? sValue : defaultValue;
}
}

View File

@@ -44,8 +44,8 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
super(message);
Assert.notNull(writeResult, "WriteResult must not be null!");
Assert.notNull(actionOperation, "MongoActionOperation must not be null!");
Assert.notNull(writeResult, "WriteResult must not be null");
Assert.notNull(actionOperation, "MongoActionOperation must not be null");
this.writeResult = writeResult;
this.actionOperation = actionOperation;

View File

@@ -64,10 +64,10 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
protected MongoDatabaseFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) {
Assert.notNull(mongoClient, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
Assert.notNull(mongoClient, "MongoClient must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs");
this.mongoClient = mongoClient;
this.databaseName = databaseName;
@@ -84,22 +84,14 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
public MongoDatabase getMongoDatabase() throws DataAccessException {
return getMongoDatabase(getDefaultDatabaseName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty!");
Assert.hasText(dbName, "Database name must not be empty");
MongoDatabase db = doGetMongoDatabase(dbName);
@@ -118,28 +110,16 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
*/
protected abstract MongoDatabase doGetMongoDatabase(String dbName);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
closeClient();
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.Session)
*/
public MongoDatabaseFactory withSession(ClientSession session) {
return new MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session, this);
}
@@ -180,55 +160,31 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
@Override
public MongoDatabase getMongoDatabase() throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase(dbName));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoDatabaseFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#isTransactionActive()
*/
@Override
public boolean isTransactionActive() {
return session != null && session.hasActiveTransaction();

View File

@@ -1,50 +0,0 @@
/*
* Copyright 2018-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.support.PersistenceExceptionTranslator;
/**
* Common base class for usage with both {@link com.mongodb.client.MongoClients} defining common properties such as
* database name and exception translator.
* <br />
* Not intended to be used directly.
*
* @author Christoph Strobl
* @author Mark Paluch
* @param <C> Client type.
* @since 2.1
* @see SimpleMongoClientDatabaseFactory
* @deprecated since 3.0, use {@link MongoDatabaseFactorySupport} instead.
*/
@Deprecated
public abstract class MongoDbFactorySupport<C> extends MongoDatabaseFactorySupport<C> {
/**
* Create a new {@link MongoDbFactorySupport} object given {@code mongoClient}, {@code databaseName},
* {@code mongoInstanceCreated} and {@link PersistenceExceptionTranslator}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated {@literal true} if the client instance was created by a subclass of
* {@link MongoDbFactorySupport} to close the client on {@link #destroy()}.
* @param exceptionTranslator must not be {@literal null}.
*/
protected MongoDbFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) {
super(mongoClient, databaseName, mongoInstanceCreated, exceptionTranslator);
}
}

View File

@@ -88,10 +88,6 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
this.schemaMap = schemaMap;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public AutoEncryptionSettings getObject() {
@@ -109,10 +105,6 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
return source != null ? source : Collections.emptyMap();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return AutoEncryptionSettings.class;

View File

@@ -68,10 +68,8 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
private static final Set<String> SECURITY_EXCEPTIONS = Set.of("MongoCryptException");
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
@@ -135,6 +133,8 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new ClientSessionException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isTransactionFailureCode(code)) {
return new MongoTransactionException(ex.getMessage(), ex);
} else if(ex.getCause() != null && SECURITY_EXCEPTIONS.contains(ClassUtils.getShortName(ex.getCause().getClass()))) {
return new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);

View File

@@ -192,7 +192,7 @@ public interface MongoJsonSchemaCreator {
*/
static MongoJsonSchemaCreator create(MongoConverter mongoConverter) {
Assert.notNull(mongoConverter, "MongoConverter must not be null!");
Assert.notNull(mongoConverter, "MongoConverter must not be null");
return new MappingMongoJsonSchemaCreator(mongoConverter);
}

View File

@@ -20,20 +20,21 @@ import java.util.List;
import java.util.Set;
import java.util.function.Consumer;
import java.util.function.Supplier;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.AggregationUpdate;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.BasicQuery;
@@ -42,7 +43,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.CloseableIterator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -181,7 +181,7 @@ public interface MongoOperations extends FluentMongoOperations {
*/
default SessionScoped withSession(Supplier<ClientSession> sessionProvider) {
Assert.notNull(sessionProvider, "SessionProvider must not be null!");
Assert.notNull(sessionProvider, "SessionProvider must not be null");
return new SessionScoped() {
@@ -220,34 +220,34 @@ public interface MongoOperations extends FluentMongoOperations {
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link com.mongodb.client.FindIterable}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to
* be closed.
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param <T> element return type
* @return will never be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
<T> Stream<T> stream(Query query, Class<T> entityType);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link com.mongodb.client.FindIterable}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to
* be closed.
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @param <T> element return type
* @return will never be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 1.10
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
<T> Stream<T> stream(Query query, Class<T> entityType, String collectionName);
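Because the cursor-backed result type changes from CloseableIterator to java.util.stream.Stream, callers should close the stream once consumed; a minimal sketch where template, Person and the criteria are placeholders. The aggregateStream(...) variants further down follow the same pattern:

// Sketch: the returned Stream wraps a live cursor, so close it via try-with-resources.
Query query = new Query(Criteria.where("lastname").is("Matthews"));
try (Stream<Person> people = template.stream(query, Person.class)) {
    people.forEach(System.out::println);
}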
/**
* Create an uncapped collection with a name based on the provided entity class.
@@ -283,6 +283,56 @@ public interface MongoOperations extends FluentMongoOperations {
*/
MongoCollection<Document> createCollection(String collectionName, @Nullable CollectionOptions collectionOptions);
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationOperation pipeline
* stages} on another collection or view identified by the given {@link #getCollectionName(Class) source type}.
*
* @param name the name of the view to create.
* @param source the type defining the views source collection.
* @param stages the {@link AggregationOperation aggregation pipeline stages} defining the view content.
* @since 4.0
*/
default MongoCollection<Document> createView(String name, Class<?> source, AggregationOperation... stages) {
return createView(name, source, AggregationPipeline.of(stages));
}
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
* another collection or view identified by the given {@link #getCollectionName(Class) source type}.
*
* @param name the name of the view to create.
* @param source the type defining the views source collection.
* @param pipeline the {@link AggregationPipeline} defining the view content.
* @since 4.0
*/
default MongoCollection<Document> createView(String name, Class<?> source, AggregationPipeline pipeline) {
return createView(name, source, pipeline, null);
}
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
* another collection or view identified by the given {@link #getCollectionName(Class) source type}.
*
* @param name the name of the view to create.
* @param source the type defining the views source collection.
* @param pipeline the {@link AggregationPipeline} defining the view content.
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
MongoCollection<Document> createView(String name, Class<?> source, AggregationPipeline pipeline, @Nullable ViewOptions options);
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
* another collection or view identified by the given source.
*
* @param name the name of the view to create.
* @param source the name of the collection or view defining the to be created views source.
* @param pipeline the {@link AggregationPipeline} defining the view content.
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
MongoCollection<Document> createView(String name, String source, AggregationPipeline pipeline, @Nullable ViewOptions options);
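A brief sketch of the new createView(...) API introduced above; the view, field and type names are placeholders:

// Sketch: create a read-only view over the collection backing Person,
// keeping only the firstname field in ascending order.
template.createView("first-names", Person.class,
        Aggregation.project("firstname"),
        Aggregation.sort(Sort.Direction.ASC, "firstname"));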
/**
* A set of collection names.
*
@@ -416,43 +466,6 @@ public interface MongoOperations extends FluentMongoOperations {
*/
<T> List<T> findAll(Class<T> entityClass, String collectionName);
/**
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes int account the initial document structure as well as any finalize functions.
*
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parametrized type of the returned list
* @return The results of the group operation
* @deprecated since 2.2. The {@code group} command has been removed in MongoDB Server 4.2.0. <br />
* Please use {@link #aggregate(TypedAggregation, String, Class) } with a
* {@link org.springframework.data.mongodb.core.aggregation.GroupOperation} instead.
*/
@Deprecated
<T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes int account the initial document structure
* as well as any finalize functions.
*
* @param criteria The criteria that restricts the row that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parametrized type of the returned list
* @return The results of the group operation
* @deprecated since 2.2. The {@code group} command has been removed in MongoDB Server 4.2.0. <br />
* Please use {@link #aggregate(TypedAggregation, String, Class) } with a
* {@link org.springframework.data.mongodb.core.aggregation.GroupOperation} and
* {@link org.springframework.data.mongodb.core.aggregation.MatchOperation} instead.
*/
@Deprecated
<T> GroupByResults<T> group(@Nullable Criteria criteria, String inputCollectionName, GroupBy groupBy,
Class<T> entityClass);
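The deprecation notes suggest replacing the removed group(...) methods with an aggregation; a hedged sketch of a roughly equivalent call, where Person and the field names are placeholders:

// Sketch: $match plus $group replacing the removed group(...) operations.
TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class,
        Aggregation.match(Criteria.where("age").gte(18)),
        Aggregation.group("lastname").count().as("count"));
AggregationResults<Document> results = template.aggregate(aggregation, Document.class);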
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
@@ -507,9 +520,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class. The name of the inputCollection is
* derived from the inputType of the aggregation.
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class. The name of the inputCollection is derived from
* the inputType of the aggregation.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
@@ -518,31 +531,37 @@ public interface MongoOperations extends FluentMongoOperations {
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
<O> Stream<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}. <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class and are returned as stream. The name
* of the inputCollection is derived from the inputType of the aggregation. <br />
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class and are returned as stream. The name of the
* inputCollection is derived from the inputType of the aggregation.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
<O> Stream<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}. <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class. <br />
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
@@ -551,15 +570,18 @@ public interface MongoOperations extends FluentMongoOperations {
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
<O> Stream<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}. <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class. <br />
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
@@ -568,10 +590,11 @@ public interface MongoOperations extends FluentMongoOperations {
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
<O> Stream<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
@@ -1019,7 +1042,7 @@ public interface MongoOperations extends FluentMongoOperations {
@Nullable
default <T> T findAndReplace(Query query, T replacement, FindAndReplaceOptions options, String collectionName) {
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(replacement, "Replacement must not be null");
return findAndReplace(query, replacement, options, (Class<T>) ClassUtils.getUserClass(replacement), collectionName);
}
@@ -1214,7 +1237,7 @@ public interface MongoOperations extends FluentMongoOperations {
*/
default long estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
Assert.notNull(entityClass, "Entity class must not be null");
return estimatedCount(getCollectionName(entityClass));
}
@@ -1600,7 +1623,7 @@ public interface MongoOperations extends FluentMongoOperations {
DeleteResult remove(Object object, String collectionName);
/**
* Remove all documents that match the provided query document criteria from the the collection used to store the
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query the query document that specifies the criteria used to remove a record.
@@ -1613,7 +1636,7 @@ public interface MongoOperations extends FluentMongoOperations {
DeleteResult remove(Query query, Class<?> entityClass);
/**
* Remove all documents that match the provided query document criteria from the the collection used to store the
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query the query document that specifies the criteria used to remove a record.
@@ -1663,9 +1686,9 @@ public interface MongoOperations extends FluentMongoOperations {
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass);
/**
* Returns and removes all documents that match the provided query document criteria from the the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
* Returns and removes all documents that match the provided query document criteria from the collection used to store
* the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the
* query.
*
* @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass class of the pojo to be operated on.

View File

@@ -60,7 +60,6 @@ import org.springframework.data.projection.EntityProjection;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.DeleteOptions;
@@ -567,14 +566,11 @@ class QueryOperations {
if (query.getSkip() > 0) {
options.skip((int) query.getSkip());
}
if (StringUtils.hasText(query.getHint())) {
String hint = query.getHint();
if (BsonUtils.isJsonDocument(hint)) {
options.hint(BsonUtils.parse(hint, codecRegistryProvider));
} else {
options.hintString(hint);
}
HintFunction hintFunction = HintFunction.from(query.getHint());
if (hintFunction.isPresent()) {
options = hintFunction.apply(codecRegistryProvider, options::hintString, options::hint);
}
if (callback != null) {
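For context, a hedged sketch of the behaviour the new HintFunction call encapsulates, reconstructed from the replaced lines; the full HintFunction API is not shown in this diff:

// Sketch of the replaced logic: a hint that parses as a JSON document is applied
// as a Bson hint, anything else as a plain hint string.
static <R> R applyHint(String hint, CodecRegistryProvider codecRegistryProvider,
        Function<String, R> hintString, Function<Bson, R> hintDocument) {
    return BsonUtils.isJsonDocument(hint)
            ? hintDocument.apply(BsonUtils.parse(hint, codecRegistryProvider))
            : hintString.apply(hint);
}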

View File

@@ -41,19 +41,15 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
*/
ReactiveAggregationOperationSupport(ReactiveMongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
Assert.notNull(template, "Template must not be null");
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ReactiveAggregation<T> aggregateAndReturn(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveAggregationSupport<>(template, domainType, null, null);
}
@@ -75,34 +71,22 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.AggregationOperationWithCollection#inCollection(java.lang.String)
*/
@Override
public AggregationOperationWithAggregation<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty!");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ReactiveAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.AggregationOperationWithAggregation#by(org.springframework.data.mongodb.core.Aggregation)
*/
@Override
public TerminatingAggregationOperation<T> by(Aggregation aggregation) {
Assert.notNull(aggregation, "Aggregation must not be null!");
Assert.notNull(aggregation, "Aggregation must not be null");
return new ReactiveAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.TerminatingAggregationOperation#all()
*/
@Override
public Flux<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType);
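The fluent API polished above is typically used as follows; template is an existing ReactiveMongoTemplate, and Person, the collection name and the criteria are placeholders:

// Sketch: fluent reactive aggregation via ReactiveAggregationOperation.
Flux<Person> adults = template.aggregateAndReturn(Person.class)
        .inCollection("people")
        .by(Aggregation.newAggregation(Aggregation.match(Criteria.where("age").gte(18))))
        .all();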

View File

@@ -46,14 +46,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation#changeStream(java.lang.Class)
*/
@Override
public <T> ReactiveChangeStream<T> changeStream(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveChangeStreamSupport<>(template, domainType, domainType, null, null);
}
@@ -76,34 +72,22 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithCollection#watchCollection(java.lang.String)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> watchCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
Assert.hasText(collection, "Collection name must not be null nor empty");
return new ReactiveChangeStreamSupport<>(template, domainType, returnType, collection, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithCollection#watchCollection(java.lang.Class)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> watchCollection(Class<?> entityClass) {
Assert.notNull(entityClass, "Collection type not be null!");
Assert.notNull(entityClass, "Collection type not be null");
return watchCollection(template.getCollectionName(entityClass));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#resumeAt(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> resumeAt(Object token) {
@@ -117,10 +101,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#resumeAfter(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> resumeAfter(Object token) {
@@ -129,10 +109,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> builder.resumeAfter((BsonValue) token));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#startAfter(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> startAfter(Object token) {
@@ -141,10 +117,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> builder.startAfter((BsonValue) token));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithOptions#withOptions(java.util.function.Consumer)
*/
@Override
public ReactiveChangeStreamSupport<T> withOptions(Consumer<ChangeStreamOptionsBuilder> optionsConsumer) {
@@ -154,31 +126,19 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, returnType, collection, builder.build());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithProjection#as(java.lang.Class)
*/
@Override
public <R> ChangeStreamWithFilterAndProjection<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new ReactiveChangeStreamSupport<>(template, domainType, resultType, collection, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithFilter#filter(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> filter(Aggregation filter) {
return withOptions(builder -> builder.filter(filter));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithFilter#filter(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> filter(CriteriaDefinition by) {
@@ -188,10 +148,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return filter(aggregation);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.TerminatingChangeStream#listen()
*/
@Override
public Flux<ChangeStreamEvent<T>> listen() {
return template.changeStream(collection, options != null ? options : ChangeStreamOptions.empty(), returnType);
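A short sketch of the fluent change-stream API touched above; Person, the collection name and the criteria are placeholders:

// Sketch: watch a collection and filter the emitted change events.
Flux<ChangeStreamEvent<Person>> events = template.changeStream(Person.class)
        .watchCollection("people")
        .filter(Criteria.where("age").gte(18))
        .listen();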

View File

@@ -44,14 +44,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation#query(java.lang.Class)
*/
@Override
public <T> ReactiveFind<T> query(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveFindSupport<>(template, domainType, domainType, null, ALL_QUERY);
}
@@ -81,46 +77,30 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override
public FindWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
Assert.hasText(collection, "Collection name must not be null nor empty");
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection#as(java.lang.Class)
*/
@Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
Assert.notNull(returnType, "ReturnType must not be null!");
Assert.notNull(returnType, "ReturnType must not be null");
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingFind<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#first()
*/
@Override
public Mono<T> first() {
@@ -130,10 +110,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return result.next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#one()
*/
@Override
public Mono<T> one() {
@@ -148,66 +124,42 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
if (it.size() > 1) {
return Mono.error(
new IncorrectResultSizeDataAccessException("Query " + asString() + " returned non unique result.", 1));
new IncorrectResultSizeDataAccessException("Query " + asString() + " returned non unique result", 1));
}
return Mono.just(it.get(0));
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#all()
*/
@Override
public Flux<T> all() {
return doFind(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#tail()
*/
@Override
public Flux<T> tail() {
return doFind(template.new TailingQueryFindPublisherPreparer(query, domainType));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#count()
*/
@Override
public Mono<Long> count() {
return template.count(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#exists()
*/
@Override
public Mono<Boolean> exists() {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindDistinct#distinct(java.lang.String)
*/
@Override
public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null!");
Assert.notNull(field, "Field must not be null");
return new DistinctOperationSupport<>(this, field);
}
@@ -255,35 +207,23 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new DistinctOperationSupport<>((ReactiveFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
@SuppressWarnings("unchecked")
public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new DistinctOperationSupport<>((ReactiveFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core..ReactiveFindOperation.TerminatingDistinct#all()
*/
@Override
public Flux<T> all() {
return delegate.doFindDistinct(field);
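For reference, a minimal sketch of the fluent find API these support classes back; Person, the collection name and the query are placeholders:

// Sketch: fluent reactive find, expecting exactly one match.
Mono<Person> person = template.query(Person.class)
        .inCollection("people")
        .matching(Query.query(Criteria.where("firstname").is("luke")))
        .one();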

View File

@@ -38,14 +38,10 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation#insert(java.lang.Class)
*/
@Override
public <T> ReactiveInsert<T> insert(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveInsertSupport<>(template, domainType, null);
}
@@ -63,38 +59,26 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.TerminatingInsert#one(java.lang.Object)
*/
@Override
public Mono<T> one(T object) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(object, "Object must not be null");
return template.insert(object, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override
public Flux<T> all(Collection<? extends T> objects) {
Assert.notNull(objects, "Objects must not be null!");
Assert.notNull(objects, "Objects must not be null");
return template.insert(objects, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override
public ReactiveInsert<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty.");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ReactiveInsertSupport<>(template, domainType, collection);
}
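A minimal sketch of the fluent insert API above; Person and the collection name are placeholders:

// Sketch: fluent reactive insert into an explicit collection.
Mono<Person> saved = template.insert(Person.class)
        .inCollection("people")
        .one(new Person("luke"));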

View File

@@ -46,7 +46,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public <T> ReactiveMapReduceSupport<T> mapReduce(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveMapReduceSupport<>(template, domainType, domainType, null, ALL_QUERY, null, null, null);
}
@@ -100,7 +100,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public MapReduceWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
Assert.hasText(collection, "Collection name must not be null nor empty");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -113,7 +113,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public TerminatingMapReduce<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -126,7 +126,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public <R> MapReduceWithQuery<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new ReactiveMapReduceSupport<>(template, domainType, resultType, collection, query, mapFunction,
reduceFunction, options);
@@ -139,7 +139,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public ReactiveMapReduce<T> with(MapReduceOptions options) {
Assert.notNull(options, "Options must not be null! Please consider empty MapReduceOptions#options() instead.");
Assert.notNull(options, "Options must not be null Please consider empty MapReduceOptions#options() instead");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -152,7 +152,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public MapReduceWithReduceFunction<T> map(String mapFunction) {
Assert.hasText(mapFunction, "MapFunction name must not be null nor empty!");
Assert.hasText(mapFunction, "MapFunction name must not be null nor empty");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
@@ -165,7 +165,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
@Override
public ReactiveMapReduce<T> reduce(String reduceFunction) {
Assert.hasText(reduceFunction, "ReduceFunction name must not be null nor empty!");
Assert.hasText(reduceFunction, "ReduceFunction name must not be null nor empty");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
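A hedged sketch of the fluent map-reduce API these assertions guard; Person, the collection name and the JavaScript functions are placeholders:

// Sketch: fluent reactive map-reduce with inline map and reduce functions.
Flux<Person> mapped = template.mapReduce(Person.class)
        .inCollection("people")
        .map("function() { emit(this.lastname, 1); }")
        .reduce("function(key, values) { return Array.sum(values); }")
        .all();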

View File

@@ -115,7 +115,7 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
}
throw new IllegalStateException(
"Cannot create MongoClients. One of the following is required: mongoClientSettings, connectionString or host/port");
"Cannot create MongoClients; One of the following is required: mongoClientSettings, connectionString or host/port");
}
@Override

View File

@@ -70,8 +70,8 @@ public class ReactiveMongoContext {
*/
public static Context setSession(Context context, Publisher<ClientSession> session) {
Assert.notNull(context, "Context must not be null!");
Assert.notNull(session, "Session publisher must not be null!");
Assert.notNull(context, "Context must not be null");
Assert.notNull(session, "Session publisher must not be null");
return context.put(SESSION_KEY, Mono.from(session));
}
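setSession is normally applied through the Reactor subscriber context; a hedged sketch where template, session and id are placeholders:

// Sketch: bind a ClientSession to the subscriber context of a reactive pipeline.
Mono<Person> result = template.findById(id, Person.class)
        .contextWrite(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(session)));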

View File

@@ -25,10 +25,13 @@ import java.util.function.Supplier;
import org.bson.Document;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationPipeline;
import org.springframework.data.mongodb.core.aggregation.AggregationUpdate;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@@ -42,7 +45,6 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.transaction.reactive.TransactionalOperator;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -58,8 +60,7 @@ import com.mongodb.reactivestreams.client.MongoCollection;
* <p>
* Implemented by {@link ReactiveMongoTemplate}. Not often used but a useful option for extensibility and testability
* (as it can be easily mocked, stubbed, or be the target of a JDK proxy). Command execution using
* {@link ReactiveMongoOperations} is deferred until subscriber subscribes to the {@link Publisher}.
* <br />
* {@link ReactiveMongoOperations} is deferred until subscriber subscribes to the {@link Publisher}. <br />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
@@ -120,8 +121,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Document> executeCommand(Document command, @Nullable ReadPreference readPreference);
/**
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary.
* <br />
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. Must not
@@ -132,8 +132,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> execute(ReactiveDatabaseCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class.
* <br />
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
@@ -144,8 +143,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name.
* <br />
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param collectionName the name of the collection that specifies which {@link MongoCollection} instance will be
@@ -158,8 +156,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* <br />
* provided by the given {@link Supplier} to each and every command issued against MongoDB. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
@@ -170,15 +167,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
default ReactiveSessionScoped withSession(Supplier<ClientSession> sessionProvider) {
Assert.notNull(sessionProvider, "SessionProvider must not be null!");
Assert.notNull(sessionProvider, "SessionProvider must not be null");
return withSession(Mono.fromSupplier(sessionProvider));
}
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding a new {@link ClientSession}
* with given {@literal sessionOptions} to each and every command issued against MongoDB.
* <br />
* with given {@literal sessionOptions} to each and every command issued against MongoDB. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
@@ -204,48 +200,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
ReactiveSessionScoped withSession(Publisher<ClientSession> sessionProvider);
/**
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
* <br />
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle.
*
* @param session must not be {@literal null}.
* @return {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
* @since 2.1
*/
ReactiveMongoOperations withSession(ClientSession session);
/**
* Initiate a new {@link ClientSession} and obtain a {@link ClientSession session} bound instance of
* {@link ReactiveSessionScoped}. Starts the transaction and adds the {@link ClientSession} to each and every command
* issued against MongoDB.
* <br />
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @deprecated since 2.2. Use {@code @Transactional} or {@link TransactionalOperator}.
*/
@Deprecated
ReactiveSessionScoped inTransaction();
/**
* Obtain a {@link ClientSession session} bound instance of {@link ReactiveSessionScoped}, start the transaction and
* bind the {@link ClientSession} provided by the given {@link Publisher} to each and every command issued against
* MongoDB.
* <br />
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @param sessionProvider must not be {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
* @deprecated since 2.2. Use {@code @Transactional} or {@link TransactionalOperator}.
*/
@Deprecated
ReactiveSessionScoped inTransaction(Publisher<ClientSession> sessionProvider);
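The removed inTransaction(...) methods point to TransactionalOperator as the replacement; a hedged sketch where databaseFactory and Person are placeholders:

// Sketch: programmatic reactive transactions replacing the removed inTransaction().
ReactiveMongoTransactionManager txManager = new ReactiveMongoTransactionManager(databaseFactory);
TransactionalOperator operator = TransactionalOperator.create(txManager);
Mono<Person> saved = operator.transactional(template.insert(new Person("luke")));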
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -281,6 +243,56 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
Mono<MongoCollection<Document>> createCollection(String collectionName, CollectionOptions collectionOptions);
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationOperation pipeline
* stages} on another collection or view identified by the given {@link #getCollectionName(Class) source type}.
*
* @param name the name of the view to create.
* @param source the type defining the views source collection.
* @param stages the {@link AggregationOperation aggregation pipeline stages} defining the view content.
* @since 4.0
*/
default Mono<MongoCollection<Document>> createView(String name, Class<?> source, AggregationOperation... stages) {
return createView(name, source, AggregationPipeline.of(stages));
}
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
* another collection or view identified by the given {@link #getCollectionName(Class) source type}.
*
* @param name the name of the view to create.
* @param source the type defining the views source collection.
* @param pipeline the {@link AggregationPipeline} defining the view content.
* @since 4.0
*/
default Mono<MongoCollection<Document>> createView(String name, Class<?> source, AggregationPipeline pipeline) {
return createView(name, source, pipeline, null);
}
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
* another collection or view identified by the given {@link #getCollectionName(Class) source type}.
*
* @param name the name of the view to create.
* @param source the type defining the views source collection.
* @param pipeline the {@link AggregationPipeline} defining the view content.
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
Mono<MongoCollection<Document>> createView(String name, Class<?> source, AggregationPipeline pipeline, @Nullable ViewOptions options);
/**
* Create a view with the provided name. The view content is defined by the {@link AggregationPipeline pipeline} on
* another collection or view identified by the given source.
*
* @param name the name of the view to create.
* @param source the name of the collection or view defining the to be created views source.
* @param pipeline the {@link AggregationPipeline} defining the view content.
* @param options additional settings to apply when creating the view. Can be {@literal null}.
* @since 4.0
*/
Mono<MongoCollection<Document>> createView(String name, String source, AggregationPipeline pipeline, @Nullable ViewOptions options);
/**
* A set of collection names.
*
@@ -292,8 +304,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Get a {@link MongoCollection} by name. The returned collection may not exists yet (except in local memory) and is
* created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first.
* <br />
* exists} first. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection.
@@ -302,8 +313,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<MongoCollection<Document>> getCollection(String collectionName);
/**
* Check to see if a collection with a name indicated by the entity class exists.
* <br />
* Check to see if a collection with a name indicated by the entity class exists. <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the name of the collection. Must not be {@literal null}.
@@ -312,8 +322,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<Boolean> collectionExists(Class<T> entityClass);
/**
* Check to see if a collection with a given name exists.
* <br />
* Check to see if a collection with a given name exists. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -322,8 +331,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Boolean> collectionExists(String collectionName);
/**
* Drop the collection with the name indicated by the entity class.
* <br />
* Drop the collection with the name indicated by the entity class. <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the collection to drop/delete. Must not be {@literal null}.
@@ -331,8 +339,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<Void> dropCollection(Class<T> entityClass);
/**
* Drop the collection with the given name.
* <br />
* Drop the collection with the given name. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
@@ -340,11 +347,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Void> dropCollection(String collectionName);
/**
* Query for a {@link Flux} of objects of type T from the collection used by the entity class.
* <br />
* Query for a {@link Flux} of objects of type T from the collection used by the entity class. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -354,11 +359,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> findAll(Class<T> entityClass);
/**
* Query for a {@link Flux} of objects of type T from the specified collection.
* <br />
* Query for a {@link Flux} of objects of type T from the specified collection. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -370,11 +373,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type.
* <br />
* specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -387,11 +388,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type.
* <br />
* type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -437,8 +436,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a {@link Flux} of the specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -450,11 +448,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> find(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type.
* <br />
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -565,11 +561,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation.
* <br />
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given entity class and are returned as stream. The name of the
* inputCollection is derived from the {@link TypedAggregation#getInputType() aggregation input type}.
* <br />
* inputCollection is derived from the {@link TypedAggregation#getInputType() aggregation input type}. <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size cause
* {@link IllegalArgumentException}.
@@ -583,11 +577,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation.
* <br />
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given {@code ouputType}. The name of the inputCollection is derived from the
* {@code inputType}.
* <br />
* {@code inputType}. <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size cause
* {@link IllegalArgumentException}.
@@ -603,10 +595,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation.
* <br />
* The raw results will be mapped to the given entity class.
* <br />
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given entity class. <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size cause
* {@link IllegalArgumentException}.
@@ -823,7 +813,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
default <T> Mono<T> findAndReplace(Query query, T replacement, FindAndReplaceOptions options, String collectionName) {
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(replacement, "Replacement must not be null");
return findAndReplace(query, replacement, options, (Class<T>) ClassUtils.getUserClass(replacement), collectionName);
}
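The default method above only derives the user class of the replacement before delegating. A usage sketch, with the Person type and the "people" collection assumed for illustration:

import org.springframework.data.mongodb.core.FindAndReplaceOptions;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class FindAndReplaceExample {

    record Person(String id, String firstname) {}

    Mono<Person> rename(ReactiveMongoOperations operations) {
        Query query = new Query(Criteria.where("firstname").is("Tom"));
        // Replaces the first match and, with default options, emits the document as it was before replacement.
        return operations.findAndReplace(query, new Person(null, "Thomas"),
                FindAndReplaceOptions.options(), "people");
    }
}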
@@ -907,10 +897,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
* database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* database. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -926,8 +914,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1005,8 +992,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <br />
* based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
@@ -1018,13 +1004,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
default Mono<Long> estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
Assert.notNull(entityClass, "Entity class must not be null");
return estimatedCount(getCollectionName(entityClass));
}
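A brief usage sketch of estimatedCount as documented above; Person is an assumed mapped type whose collection name is resolved via getCollectionName:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Mono;

class EstimatedCountExample {

    record Person(String id) {}

    Mono<Long> roughPersonCount(ReactiveMongoOperations operations) {
        // Uses collection statistics rather than scanning documents; note the caveats
        // about sharded clusters and transactions mentioned in the Javadoc above.
        return operations.estimatedCount(Person.class);
    }
}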
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <br />
* Estimate the number of documents in the given collection based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
@@ -1106,17 +1091,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Long> exactCount(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* If your object has an {@literal Id} property which holds a {@literal null} value, it will be set with the generated
* Id from MongoDB. If your Id property is a String then MongoDB ObjectId will be used to populate that string.
* Otherwise, the conversion from ObjectId to your property type will be handled by Spring's BeanWrapper class that
* leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details.
* <br />
* Type Conversion"</a> for more details. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1130,11 +1112,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<T> insert(T objectToSave);
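A sketch of the insert contract, assuming a hypothetical Person class with a String id that MongoDB populates on insert:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Mono;

class InsertExample {

    static class Person {
        String id;          // null before insert; populated from the generated ObjectId
        String firstname;

        Person(String firstname) {
            this.firstname = firstname;
        }
    }

    Mono<Person> createPerson(ReactiveMongoOperations operations) {
        // insert stores a new document; to update an existing one, use save instead.
        return operations.insert(new Person("Tom"));
    }
}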
/**
* Insert the object into the specified collection.
* <br />
* Insert the object into the specified collection. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1178,16 +1158,13 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> insertAll(Collection<? extends T> objectsToSave);
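insertAll accepts a whole batch; a minimal sketch under the same assumed Person type:

import java.util.List;

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class InsertAllExample {

    record Person(String id, String firstname) {}

    Flux<Person> createPeople(ReactiveMongoOperations operations) {
        // Each inserted object is emitted again once its id has been populated.
        return operations.insertAll(List.of(new Person(null, "Tom"), new Person(null, "Anna")));
    }
}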
/**
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details.
* <br />
* Type Conversion"</a> for more details. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1226,17 +1203,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <br />
* object is not already present, that is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details.
* <br />
* Type Conversion"</a> for more details. <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1249,15 +1223,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <br />
* is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1268,15 +1241,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <br />
* object is not already present, that is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation"> Spring's Type Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation"> Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the saved object.
@@ -1287,15 +1259,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <br />
* is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1510,7 +1481,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collectionName);
/**
* Remove all documents that match the provided query document criteria from the the collection used to store the
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query the query document that specifies the criteria used to remove a record.
@@ -1522,7 +1493,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<DeleteResult> remove(Query query, Class<?> entityClass);
/**
* Remove all documents that match the provided query document criteria from the the collection used to store the
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query the query document that specifies the criteria used to remove a record.
@@ -1567,9 +1538,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass);
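A sketch contrasting remove and findAllAndRemove as documented above; Person and the criteria are illustrative assumptions. remove only reports the outcome, while findAllAndRemove also emits the removed documents:

import com.mongodb.client.result.DeleteResult;

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class RemoveExamples {

    record Person(String id, boolean inactive) {}

    Mono<DeleteResult> removeInactive(ReactiveMongoOperations operations) {
        // The entity class determines the collection and helps convert any id used in the query.
        return operations.remove(new Query(Criteria.where("inactive").is(true)), Person.class);
    }

    Flux<Person> evictInactive(ReactiveMongoOperations operations) {
        // Matching documents are returned to the subscriber and removed from the collection.
        return operations.findAllAndRemove(new Query(Criteria.where("inactive").is(true)), Person.class);
    }
}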
/**
* Returns and removes all documents that match the provided query document criteria from the the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
* Returns and removes all documents that match the provided query document criteria from the collection used to store
* the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the
* query.
*
* @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass class of the pojo to be operated on.
@@ -1582,11 +1553,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1603,11 +1572,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
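For the tailable-cursor contract described in the two Javadoc blocks above (the tail(Query, Class) methods), a sketch assuming a capped collection mapped to a hypothetical LogEntry type; all names besides the operations API are illustrative:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Flux;

class TailExample {

    record LogEntry(String id, String level, String message) {}

    Flux<LogEntry> errors(ReactiveMongoOperations operations) {
        // Requires a capped collection; the Flux stays open until the Subscription is cancelled.
        return operations.tail(new Query(Criteria.where("level").is("ERROR")), LogEntry.class);
    }
}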
@@ -1623,11 +1590,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> for all events in
* the configured default database via the reactive infrastructure. Use the optional provided {@link Aggregation} to
* filter events. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumseToken}
* for resuming change streams.
*
@@ -1647,11 +1612,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> for all events in
* the given collection via the reactive infrastructure. Use the optional provided {@link Aggregation} to filter
* events. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumseToken}
* for resuming change streams.
*
@@ -1672,11 +1635,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> via the reactive
* infrastructure. Use the optional provided {@link Aggregation} to filter events. The stream will not be completed
* unless the {@link org.reactivestreams.Subscription} is {@link Subscription#cancel() canceled}.
* <br />
* unless the {@link org.reactivestreams.Subscription} is {@link Subscription#cancel() canceled}. <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumseToken}
* for resuming change streams.
*

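A sketch of subscribing to the change stream contract documented above; ChangeStreamOptions.empty() listens to all events, and the Person type plus the "people" collection are assumptions for illustration:

import org.springframework.data.mongodb.core.ChangeStreamEvent;
import org.springframework.data.mongodb.core.ChangeStreamOptions;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class ChangeStreamExample {

    record Person(String id, String firstname) {}

    Flux<ChangeStreamEvent<Person>> watchPeople(ReactiveMongoOperations operations) {
        // getBody() carries the event mapped to Person, getRaw() the unmodified change stream document.
        return operations.changeStream("people", ChangeStreamOptions.empty(), Person.class);
    }
}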

@@ -41,14 +41,10 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
this.tempate = tempate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation#remove(java.lang.Class)
*/
@Override
public <T> ReactiveRemove<T> remove(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveRemoveSupport<>(tempate, domainType, ALL_QUERY, null);
}
@@ -68,34 +64,22 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.RemoveWithCollection#inCollection(String)
*/
@Override
public RemoveWithQuery<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty!");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ReactiveRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.Query)
*/
@Override
public TerminatingRemove<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ReactiveRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.TerminatingRemove#all()
*/
@Override
public Mono<DeleteResult> all() {
@@ -104,10 +88,6 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return template.doRemove(collectionName, query, domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override
public Flux<T> findAndRemove() {

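The support class above backs the fluent remove API exposed through ReactiveMongoOperations; a usage sketch in which Person, the collection name, and the query are assumptions:

import com.mongodb.client.result.DeleteResult;

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class FluentRemoveExample {

    record Person(String id, boolean inactive) {}

    Mono<DeleteResult> removeInactive(ReactiveMongoOperations operations) {
        // Each step returns a new immutable ReactiveRemoveSupport instance, as shown in the diff above.
        return operations.remove(Person.class)
                .inCollection("people")
                .matching(new Query(Criteria.where("inactive").is(true)))
                .all();
    }
}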

@@ -42,14 +42,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation#update(java.lang.Class)
*/
@Override
public <T> ReactiveUpdate<T> update(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
Assert.notNull(domainType, "DomainType must not be null");
return new ReactiveUpdateSupport<>(template, domainType, ALL_QUERY, null, null, null, null, null, domainType);
}
@@ -83,54 +79,34 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override
public TerminatingUpdate<T> apply(org.springframework.data.mongodb.core.query.UpdateDefinition update) {
Assert.notNull(update, "Update must not be null!");
Assert.notNull(update, "Update must not be null");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override
public UpdateWithQuery<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty!");
Assert.hasText(collection, "Collection must not be null nor empty");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#first()
*/
@Override
public Mono<UpdateResult> first() {
return doUpdate(false, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#upsert()
*/
@Override
public Mono<UpdateResult> upsert() {
return doUpdate(true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndModify#findAndModify()
*/
@Override
public Mono<T> findAndModify() {
@@ -141,10 +117,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndReplace#findAndReplace()
*/
@Override
public Mono<T> findAndReplace() {
return template.findAndReplace(query, replacement,
@@ -152,75 +124,51 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
getCollectionName(), targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.Query)
*/
@Override
public UpdateWithUpdate<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query, "Query must not be null");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#all()
*/
@Override
public Mono<UpdateResult> all() {
return doUpdate(true, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
Assert.notNull(options, "Options must not be null!");
Assert.notNull(options, "Options must not be null");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, options,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#replaceWith(java.lang.Object)
*/
@Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(replacement, "Replacement must not be null");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
Assert.notNull(options, "Options must not be null!");
Assert.notNull(options, "Options must not be null");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, options,
replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
Assert.notNull(resultType, "ResultType must not be null");
return new ReactiveUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, resultType);

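Similarly, the update support class above carries query, update, options, replacement, and target type from step to step; a usage sketch of the fluent update API, where the domain type, field names, and collection are illustrative only:

import com.mongodb.client.result.UpdateResult;

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import reactor.core.publisher.Mono;

class FluentUpdateExample {

    record Person(String id, String firstname, boolean active) {}

    Mono<UpdateResult> deactivate(ReactiveMongoOperations operations) {
        // matching(...) narrows the query, apply(...) supplies the UpdateDefinition, all() runs a multi-update.
        return operations.update(Person.class)
                .inCollection("people")
                .matching(new Query(Criteria.where("firstname").is("Tom")))
                .apply(new Update().set("active", false))
                .all();
    }
}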
Some files were not shown because too many files have changed in this diff.