Compare commits

..

207 Commits

Author SHA1 Message Date
Greg L. Turnquist
7ced6fa7d9 Authenticate with artifactory.
See #3616.
2021-04-22 15:06:46 -05:00
Mark Paluch
f9405ed2fd Updated changelog.
See #3598
2021-03-31 18:30:45 +02:00
Mark Paluch
14c874e701 Updated changelog.
See #3595
2021-03-31 17:26:05 +02:00
Mark Paluch
7273fe410d Updated changelog.
See #3558
2021-03-17 11:31:29 +01:00
Mark Paluch
23a54e3a5e Updated changelog.
See #3561
2021-03-17 11:03:40 +01:00
Mark Paluch
10bec0fa36 Updated changelog.
See #3556
2021-03-17 10:35:13 +01:00
Mark Paluch
39803621b1 Polishing.
Reformat code. Add since tags.

See #3395
Original pull request: #3554.
2021-02-18 15:10:04 +01:00
Christoph Strobl
154f691fbd Fix case-insensitive derived In queries on String properties.
We now consider the IgnoreCase part of a derived query when used along with In. Strings are quoted to prevent malicious input from being handed to the server as a regular expression to evaluate.

See #3395
Original pull request: #3554.
2021-02-18 15:10:02 +01:00
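A minimal sketch of the kind of derived query the fix above affects, assuming a hypothetical Person document with a lastname property:

[source,java]
----
import java.util.Collection;
import java.util.List;

import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<Person, String> {

    // IgnoreCase combined with In: each given String is now quoted before it
    // becomes part of the case-insensitive regular expression, so user input
    // cannot inject regular expression operators.
    List<Person> findByLastnameInIgnoreCase(Collection<String> lastnames);
}
----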
Christoph Strobl
770c28835f Updated changelog.
See #3560
2021-02-18 11:37:44 +01:00
Christoph Strobl
5fb1cabfbb Updated changelog.
See #3557
2021-02-18 11:18:27 +01:00
Mark Paluch
f6a23b5e3d Include Bintray plugin repository.
Closes #3559.
2021-02-17 15:38:18 +01:00
Christoph Strobl
fc271a37f8 Updated changelog.
See #3537
2021-02-17 14:20:35 +01:00
Christoph Strobl
6ef697f6b7 Updated changelog.
See #3536
2021-02-17 13:49:17 +01:00
Christoph Strobl
17e958e696 Updated changelog.
See #3520
2021-02-17 11:34:23 +01:00
Christoph Strobl
8318a78fd0 After release cleanups.
See #3519
2021-02-17 10:55:33 +01:00
Christoph Strobl
89a4b08b36 Prepare next development iteration.
See #3519
2021-02-17 10:55:29 +01:00
Christoph Strobl
52730a5f90 Release version 2.2.13 (Moore SR13).
See #3519
2021-02-17 10:34:59 +01:00
Christoph Strobl
bf3d118dbd Prepare 2.2.13 (Moore SR13).
See #3519
2021-02-17 10:34:05 +01:00
Christoph Strobl
72d0e69b3b Updated changelog.
See #3519
2021-02-17 10:33:59 +01:00
Christoph Strobl
fc3930663a Fix Criteria chaining for Criteria.alike().
This commit fixes an issue where an Example probe would not be added to the criteria chain.

Closes #3544
Original pull request: #3549.
2021-02-01 09:41:21 +01:00
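A short sketch of the chaining repaired by the fix above; the Person probe is hypothetical:

[source,java]
----
import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class AlikeSample {

    Query activePeopleLike(Person probe) {

        // Before the fix, the Example criteria chained via alike(…) was not
        // added to the criteria chain; now both conditions end up in the query.
        Criteria criteria = Criteria.where("active").is(true).alike(Example.of(probe));
        return new Query(criteria);
    }
}
----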
Christoph Strobl
9c17ee09bb Fix method names in full text query documentation.
Closes #3525
2021-01-20 08:32:03 +01:00
Christoph Strobl
be84b18376 Updated changelog.
See #3521
2021-01-13 15:49:41 +01:00
Christoph Strobl
d7ac0c8260 Updated changelog.
See #3477
2021-01-13 15:16:22 +01:00
Christoph Strobl
40a50e02e5 Update issue tracker references after GitHub issues migration.
See: #3529
2021-01-12 13:59:16 +01:00
Mark Paluch
c93db7b42b Update copyright year to 2021.
Closes #3534
2021-01-12 11:55:42 +01:00
Mark Paluch
99091ad42c DATAMONGO-2653 - Updated changelog. 2020-12-09 16:47:41 +01:00
Mark Paluch
7d1cfd84ad DATAMONGO-2649 - Updated changelog. 2020-12-09 15:33:21 +01:00
Mark Paluch
158abf9d74 DATAMONGO-2647 - Updated changelog. 2020-12-09 12:42:26 +01:00
Mark Paluch
d90653d7f6 DATAMONGO-2646 - After release cleanups. 2020-12-09 09:53:32 +01:00
Mark Paluch
29571159fd DATAMONGO-2646 - Prepare next development iteration. 2020-12-09 09:53:28 +01:00
Mark Paluch
47f1b53ca0 DATAMONGO-2646 - Release version 2.2.12 (Moore SR12). 2020-12-09 09:37:36 +01:00
Mark Paluch
c10d92a0c3 DATAMONGO-2646 - Prepare 2.2.12 (Moore SR12). 2020-12-09 09:37:07 +01:00
Mark Paluch
1f18220dfa DATAMONGO-2646 - Updated changelog. 2020-12-09 09:37:03 +01:00
Mark Paluch
a4a2d6e29c DATAMONGO-2663 - Document Spring Data to MongoDB compatibility.
Original Pull Request: #895
2020-12-07 14:39:49 +01:00
Mark Paluch
786fbba428 DATAMONGO-2648 - Updated changelog. 2020-11-11 12:34:39 +01:00
Mark Paluch
8743270035 DATAMONGO-2639 - Updated changelog. 2020-10-28 16:28:00 +01:00
Mark Paluch
73d9b2e585 DATAMONGO-2625 - Updated changelog. 2020-10-28 15:03:04 +01:00
Mark Paluch
092c7e7260 DATAMONGO-2624 - After release cleanups. 2020-10-28 12:06:21 +01:00
Mark Paluch
8b3761656f DATAMONGO-2624 - Prepare next development iteration. 2020-10-28 12:06:18 +01:00
Mark Paluch
c89f707966 DATAMONGO-2624 - Release version 2.2.11 (Moore SR11). 2020-10-28 11:44:36 +01:00
Mark Paluch
b1764160c5 DATAMONGO-2624 - Prepare 2.2.11 (Moore SR11). 2020-10-28 11:44:06 +01:00
Mark Paluch
7e1e33d51f DATAMONGO-2624 - Updated changelog. 2020-10-28 11:43:53 +01:00
Mark Paluch
2a81418738 DATAMONGO-2641 - Updated changelog. 2020-10-28 11:32:30 +01:00
Robin Dupret
f72a961bb4 DATAMONGO-2638 - Fix list item rendering in reference documentation.
Original Pull Request: #885
2020-10-27 13:33:40 +01:00
LiangYong
e398bf8a24 DATAMONGO-2638 - Fix aggregation input parameter syntax in reference documentation.
Original Pull Request: #881
2020-10-27 13:33:36 +01:00
Christoph Strobl
7787a5cf0e DATAMONGO-2626 - Updated changelog. 2020-10-14 14:51:52 +02:00
Mark Paluch
0786969ef2 DATAMONGO-2616 - Polishing.
Reformat code. Merge if-statements.

Original pull request: #889.
2020-10-07 11:36:01 +02:00
Christoph Strobl
98040ad44e DATAMONGO-2616 - Short circuit id value assignment in MongoConverter.
Original pull request: #889.
2020-10-07 11:36:01 +02:00
Christoph Strobl
ad4ab65a9f DATAMONGO-2633 - Fix json parsing of nested arrays in ParameterBindingDocumentCodec.
Original pull request: #888.
2020-10-05 15:42:07 +02:00
Mark Paluch
e05113ea27 DATAMONGO-2608 - Updated changelog. 2020-09-16 14:12:07 +02:00
Mark Paluch
c78898edf7 DATAMONGO-2609 - Updated changelog. 2020-09-16 12:16:36 +02:00
Mark Paluch
9f6be1f471 DATAMONGO-2593 - After release cleanups. 2020-09-16 11:19:26 +02:00
Mark Paluch
f8a48637ec DATAMONGO-2593 - Prepare next development iteration. 2020-09-16 11:19:23 +02:00
Mark Paluch
df73c567ce DATAMONGO-2593 - Release version 2.2.10 (Moore SR10). 2020-09-16 10:53:21 +02:00
Mark Paluch
47ce8258b0 DATAMONGO-2593 - Prepare 2.2.10 (Moore SR10). 2020-09-16 10:52:53 +02:00
Mark Paluch
e523d9a1cb DATAMONGO-2593 - Updated changelog. 2020-09-16 10:52:37 +02:00
Mark Paluch
d65632c097 DATAMONGO-2592 - Updated changelog. 2020-09-16 10:39:00 +02:00
Christoph Strobl
29bf4f72f3 DATAMONGO-2618 - Fix visibility of ReplaceRootDocumentOperation. 2020-09-14 13:46:20 +02:00
Michal Kurcius
ee79b9939b DATAMONGO-2613 - Fix single element ArrayJsonSchemaObject to document mapping.
Now toDocument calls toDocument on items correctly.

Original Pull Request: #883
2020-08-20 09:13:09 +02:00
Mark Paluch
156ec5b441 DATAMONGO-2594 - Updated changelog. 2020-08-12 13:25:53 +02:00
Mark Paluch
931028b43e DATAMONGO-2579 - Updated changelog. 2020-08-12 12:01:24 +02:00
Mark Paluch
687c4d35ed DATAMONGO-2599 - Eagerly consider enum types as simple types.
MongoSimpleTypes now eagerly checks if a type is a simple one to avoid PersistentEntity registration for ChronoUnit.
2020-07-30 16:36:37 +02:00
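A minimal sketch of the situation described above; the Reminder document is hypothetical:

[source,java]
----
import java.time.temporal.ChronoUnit;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// The ChronoUnit enum property is treated as a simple type (stored as a String)
// instead of triggering PersistentEntity registration for the enum type.
@Document
class Reminder {

    @Id String id;
    ChronoUnit repeatUnit;
}
----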
Mark Paluch
71ac036dc3 DATAMONGO-2568 - Updated changelog. 2020-07-22 10:38:03 +02:00
Mark Paluch
da3ab42f10 DATAMONGO-2567 - After release cleanups. 2020-07-22 10:07:19 +02:00
Mark Paluch
537fc1caad DATAMONGO-2567 - Prepare next development iteration. 2020-07-22 10:07:16 +02:00
Mark Paluch
0ec0eaf937 DATAMONGO-2567 - Release version 2.2.9 (Moore SR9). 2020-07-22 09:55:55 +02:00
Mark Paluch
1babda2e69 DATAMONGO-2567 - Prepare 2.2.9 (Moore SR9). 2020-07-22 09:55:29 +02:00
Mark Paluch
e319ea82d0 DATAMONGO-2567 - Updated changelog. 2020-07-22 09:55:18 +02:00
Mark Paluch
63c8f63acf DATAMONGO-2566 - Updated changelog. 2020-07-22 09:44:31 +02:00
Mark Paluch
4f7d3131c5 DATAMONGO-2571 - Polishing.
Fix test method visibility.
2020-07-15 15:45:11 +02:00
Christoph Strobl
6a8609c7f8 DATAMONGO-2571 - Fix regular expression parameter binding for String-based queries.
Original pull request: #873.
2020-07-15 15:33:57 +02:00
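A minimal sketch of a String-based query whose bound parameter is used as a regular expression; the Person document is hypothetical:

[source,java]
----
import java.util.List;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<Person, String> {

    // Binding the ?0 placeholder inside the $regex operator is what this fix repairs.
    @Query("{ 'lastname' : { '$regex' : ?0 } }")
    List<Person> findByLastnameMatching(String pattern);
}
----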
Mark Paluch
d9477501ae DATAMONGO-2490 - Polishing.
Remove unnecessary code. Reuse session-associated collection when logging to avoid unqualified calls to MongoDbFactory.getMongoDatabase(). Create collection before transaction in test for compatibility with older MongoDB servers.

Original pull request: #875.
2020-07-15 15:17:26 +02:00
Christoph Strobl
be46540959 DATAMONGO-2490 - Fix dbref fetching during ongoing transaction.
Original pull request: #875.
2020-07-15 15:16:20 +02:00
Mark Paluch
773f20f861 DATAMONGO-2544 - Updated changelog. 2020-06-25 12:00:21 +02:00
Mark Paluch
171386baaf DATAMONGO-2570 - Polishing.
Add assertions. Use method references where possible.

Original pull request: #871.
2020-06-22 10:40:16 +02:00
Mehran Behnam
aa2716c2c0 DATAMONGO-2570 - Change type of count variable to primitive.
Avoiding unintentional unboxing.

Original pull request: #871.
2020-06-22 10:40:16 +02:00
Mark Paluch
1904a809a4 DATAMONGO-2572 - Remove usage of Oppressive Language.
Replace blacklist with denylist and introduce the meta keyword SECONDARY_READS, as we no longer use the MongoDB API based on the initial replication concept.

Original Pull Request: #870
2020-06-17 13:49:04 +02:00
Mark Paluch
f8dddc366b DATAMONGO-2543 - Updated changelog. 2020-06-10 14:31:01 +02:00
Mark Paluch
f7bc5228ec DATAMONGO-2533 - After release cleanups. 2020-06-10 12:29:17 +02:00
Mark Paluch
4782900da5 DATAMONGO-2533 - Prepare next development iteration. 2020-06-10 12:29:14 +02:00
Mark Paluch
c011ff81b9 DATAMONGO-2533 - Release version 2.2.8 (Moore SR8). 2020-06-10 12:11:27 +02:00
Mark Paluch
cb768ce953 DATAMONGO-2533 - Prepare 2.2.8 (Moore SR8). 2020-06-10 12:11:00 +02:00
Mark Paluch
5d08c040b9 DATAMONGO-2533 - Updated changelog. 2020-06-10 12:10:46 +02:00
Mark Paluch
7a0f38d10e DATAMONGO-2532 - Updated changelog. 2020-06-10 11:22:53 +02:00
Mark Paluch
1e9ab02e3f DATAMONGO-2565 - Polishing.
Add unit test to verify behavior. Cleanup code.

Original pull request: #869.
2020-06-10 10:16:03 +02:00
BraveLeeLee
5e378bd21b DATAMONGO-2565 - Evaluate correct expression when obtaining collation from MongoPersistentEntity.
Original pull request: #869.
2020-06-10 10:15:29 +02:00
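A sketch of an entity whose collation is defined via an expression, which is the kind of declaration the fix above evaluates correctly; the entity and the exact expression form are illustrative assumptions:

[source,java]
----
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// The collation attribute may hold an expression; the fix makes sure the
// correct expression is evaluated when the collation is obtained from the
// MongoPersistentEntity.
@Document(collation = "#{T(java.util.Locale).US.toLanguageTag()}")
class Person {

    @Id String id;
    String lastname;
}
----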
Mark Paluch
a2e5009f42 DATAMONGO-2542 - Polishing.
Fix nullable annotation.

Original pull request: #863.
2020-05-26 10:32:43 +02:00
Christoph Strobl
44cc0dad94 DATAMONGO-2542 - Shortcut PersistentPropertyPath resolution during query mapping.
By shortcutting the path resolution we avoid checking keywords like $in against a potential path expression.

Original pull request: #863.
2020-05-26 10:32:43 +02:00
Mark Paluch
68de8e57be DATAMONGO-2545 - Polishing.
Fix warnings and typos.

Original pull request: #864.
2020-05-26 10:24:31 +02:00
Christoph Strobl
fde59411da DATAMONGO-2545 - Fix full Query Document binding resulting from SpEL.
We reenabled annotated queries using a SpEL expression resulting in the actual query document.

Original pull request: #864.
2020-05-26 10:18:46 +02:00
Christoph Strobl
7d48dd0ce4 DATAMONGO-2545 - Fix regression in String query SpEL parameter binding.
We reenabled parameter binding within SpEL using query parameter placeholders ?0, ?1,... instead of their array index [0],[1],...

Original pull request: #864.
2020-05-26 10:18:44 +02:00
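A minimal sketch of SpEL-based parameter binding in a String query (Person is hypothetical); the fix re-enables the ?0 placeholder style inside such expressions in addition to the [0] array-index style shown here:

[source,java]
----
import java.util.List;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<Person, String> {

    // The ?#{ … } block is a SpEL expression binding the first method argument.
    @Query("{ 'lastname' : ?#{ [0] } }")
    List<Person> findByLastnameViaExpression(String lastname);
}
----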
Christoph Strobl
699a7a00f5 DATAMONGO-2547 - Use target class ClassLoader instead of default CL when creating proxy instances.
Original pull request: #865.
2020-05-26 08:50:30 +02:00
Mark Paluch
16d55c8973 DATAMONGO-2509 - Add missing Query constructor in reference documentation.
Original pull request: #846.
2020-05-06 11:52:53 +02:00
Christoph Strobl
be0b9415b4 DATAMONGO-2509 - Update Javadoc.
Original pull request: #853.
2020-05-06 11:41:25 +02:00
Greg Turnquist
93df58a8d5 DATAMONGO-2535 - Remove Travis CI. 2020-05-04 15:09:18 -05:00
Mark Paluch
999e24902c DATAMONGO-2503 - Updated changelog. 2020-04-28 15:12:28 +02:00
Mark Paluch
d19208abec DATAMONGO-2500 - After release cleanups. 2020-04-28 14:45:30 +02:00
Mark Paluch
8f54676b97 DATAMONGO-2500 - Prepare next development iteration. 2020-04-28 14:45:29 +02:00
Mark Paluch
61bbc9ab7f DATAMONGO-2500 - Release version 2.2.7 (Moore SR7). 2020-04-28 14:35:55 +02:00
Mark Paluch
9e802a59c7 DATAMONGO-2500 - Prepare 2.2.7 (Moore SR7). 2020-04-28 14:35:24 +02:00
Mark Paluch
38e1e632a7 DATAMONGO-2500 - Updated changelog. 2020-04-28 14:35:16 +02:00
Mark Paluch
89cf78cc4a DATAMONGO-2484 - Updated changelog. 2020-04-28 11:59:03 +02:00
Mark Paluch
cecd47d679 DATAMONGO-2529 - Ensure that MappingMongoConverter.read(…) is never called with null.
Previously, various methods attempted to pass a null argument as the source for the converter. The API is non-null, and implementations relying on that constraint were easily broken.

We now make sure that the source is never null.
2020-04-23 15:25:45 +02:00
Mark Paluch
ed35e577af DATAMONGO-2504 - Polishing.
Update equals/hashCode implementation to use the Spring Data form. Make fields final where possible. Use diamond syntax. Reorder methods. Reformat code. Extend tests.

Original pull request: #848.
2020-04-23 15:25:45 +02:00
ddebray
f54cf40eda DATAMONGO-2504 - Add hashCode and equals to TextCriteria and Term.
Original pull request: #848.
2020-04-23 12:29:07 +02:00
Christoph Strobl
5314e6f8bb DATAMONGO-2513 - Fix Eq aggregation operator comparing collection values.
Original pull request: #855.
2020-04-22 11:47:02 +02:00
Mark Paluch
b7b2709177 DATAMONGO-2523 - Polishing.
Reformat code.

Original pull request: #859.
2020-04-22 10:08:33 +02:00
Christoph Strobl
34c47e84c0 DATAMONGO-2523 - Fix Json binding of SpEL expressions in arrays.
The closing bracket must not have a leading whitespace.

Original pull request: #859.
2020-04-22 10:08:30 +02:00
Mark Paluch
f7d91184a0 DATAMONGO-2517 - Polishing.
Reformat code.

Original pull request: #857.
2020-04-21 16:02:36 +02:00
Christoph Strobl
eeddc860f7 DATAMONGO-2517 - Fix invalid entity creation for text queries.
Fix a glitch in the MappingMongoConverter that used the single String-argument constructor (since it matches the given input string in type and parameter count) to falsely instantiate an entity when it should not.

Original pull request: #857.
2020-04-21 16:02:24 +02:00
Christoph Strobl
bcefdd209b DATAMONGO-2506 - Provide meaningful error message when using unsupported return type in repository aggregation method.
We improved the error message for unsupported return types instead of running into an IllegalArgumentException for unique results.

Original pull request: #851.
2020-04-07 14:59:30 +02:00
Mark Paluch
3f1fea2d19 DATAMONGO-2502 - Polishing.
Extend tests. Fix generics. Consistently use compiled patterns for positional placeholder removal.

Original pull request: #847.
2020-04-07 14:47:42 +02:00
Christoph Strobl
665322a69a DATAMONGO-2502 - Fix nested array path mapping for updates.
Original pull request: #847.
2020-04-07 14:17:05 +02:00
Mark Paluch
3e59bc3b38 DATAMONGO-2492 - Updated changelog. 2020-03-31 15:08:52 +02:00
Mark Paluch
1752931dde DATAMONGO-2485 - After release cleanups. 2020-03-25 10:58:27 +01:00
Mark Paluch
4b9bae1656 DATAMONGO-2485 - Prepare next development iteration. 2020-03-25 10:58:26 +01:00
Mark Paluch
74c08fa8aa DATAMONGO-2485 - Release version 2.2.6 (Moore SR6). 2020-03-25 10:46:02 +01:00
Mark Paluch
628aad8f64 DATAMONGO-2485 - Prepare 2.2.6 (Moore SR6). 2020-03-25 10:45:35 +01:00
Mark Paluch
39c8672e6d DATAMONGO-2485 - Updated changelog. 2020-03-25 10:45:28 +01:00
Christoph Strobl
620991ddee DATAMONGO-2300 - Polishing.
Move null check to event publishing logic.

Original Pull Request: #763
2020-03-23 10:11:41 +01:00
Heesu Jung
ba8f28f623 DATAMONGO-2300 - Add check rawType is null in readMap.
Original Pull Request: #763
2020-03-23 10:01:50 +01:00
Mark Paluch
6389055d3a DATAMONGO-2497 - Update documentation regarding @Transient properties usage in the persistence constructor. 2020-03-19 15:37:37 +01:00
Christoph Strobl
4465ed9819 DATAMONGO-2445 - Deprecate ReactiveGridFsOperations using AsyncInputStream.
Methods using AsyncInputStream will be removed in 3.0. Please use the ones accepting a Publisher.

Original pull request: #843.
2020-03-19 09:42:49 +01:00
Mark Paluch
8dc97e5d01 DATAMONGO-2488 - Polishing.
Simplify conditional entity check.

Original pull request: #841.
2020-03-11 14:38:17 +01:00
Christoph Strobl
a037c50961 DATAMONGO-2488 - Fix nested array path field name mapping.
Original pull request: #841.
2020-03-11 14:38:16 +01:00
Jens Schauder
28d5f02e15 DATAMONGO-2473 - Updated changelog. 2020-03-11 09:59:36 +01:00
Mark Paluch
e65a353fc4 DATAMONGO-2453 - After release cleanups. 2020-02-26 11:54:09 +01:00
Mark Paluch
42400e7836 DATAMONGO-2453 - Prepare next development iteration. 2020-02-26 11:54:08 +01:00
Mark Paluch
cdd7a2008b DATAMONGO-2453 - Release version 2.2.5 (Moore SR5). 2020-02-26 11:38:17 +01:00
Mark Paluch
c2b80fddd8 DATAMONGO-2453 - Prepare 2.2.5 (Moore SR5). 2020-02-26 11:37:58 +01:00
Mark Paluch
8eaa8119e6 DATAMONGO-2453 - Updated changelog. 2020-02-26 11:37:53 +01:00
Mark Paluch
5fccadd41e DATAMONGO-2452 - Updated changelog. 2020-02-26 11:31:49 +01:00
Christoph Strobl
5d0ab340e3 DATAMONGO-2478 - Fix NPE in Query.of when given a proxied source.
Original pull request: #836.
2020-02-24 11:34:36 +01:00
Christoph Strobl
5d7e9199de DATAMONGO-2476 - Fix Json parsing for unquoted placeholders in arrays.
Original pull request: #835.
2020-02-24 11:07:56 +01:00
Mark Paluch
5e2c65a650 DATAMONGO-2456 - Updated changelog. 2020-02-12 15:05:04 +01:00
Mark Paluch
1f5553d2d8 DATAMONGO-2079 - MappingMongoConverter no longer implements ValueResolver.
MappingMongoConverter no longer implements a package-private interface so that converter instances can be proxied.

Original Pull Request: #832
2020-02-04 14:52:43 +01:00
Mark Paluch
40d5ab050f DATAMONGO-2464 - Polishing.
Apply fix also to reactive MongoDB repository documentation.

Original pull request: #816.
2020-02-03 11:33:42 +01:00
LiangYong
1629ba11b2 DATAMONGO-2464 - Fix code examples in reference documentation.
Fixed a missing "{" in a repository code example.

Original pull request: #816.
2020-02-03 11:31:33 +01:00
Mark Paluch
6e94f138d5 DATAMONGO-2460 - Polishing.
Reformat code. Use diamond syntax.

Original pull request: #830.
2020-02-03 11:25:36 +01:00
Christoph Strobl
1b7273db42 DATAMONGO-2460 - Fix target type computation for complex id properties with @Field annotation.
We now set the target type to org.bson.Document for id properties annotated with @Field whose implicit target type is derived from the annotation. Along the way, we fixed the warning message emitted when an id property with an explicit (unsupported) field name is detected.

Original pull request: #830.
2020-02-03 11:25:36 +01:00
Mark Paluch
8857903831 DATAMONGO-2406 - Polishing.
Add optimization for Mono.

Original pull request: #825.
2020-01-29 09:58:14 +01:00
Christoph Strobl
69cacb5fe3 DATAMONGO-2406 - Derived reactive deleteBy query execution should allow Mono<Void> result.
Mono<Void> is now a supported return type of derived reactive deleteBy queries like:

    Mono<Void> deleteByLastname(String lastname);

Original pull request: #825.
2020-01-29 09:58:14 +01:00
Mark Paluch
ba4b958114 DATAMONGO-2457 - Polishing.
Slightly tweak wording.

Original pull request: #829.
2020-01-29 09:52:02 +01:00
Christoph Strobl
6d971ef2c8 DATAMONGO-2459 - Add sample for passing on limit and offset using reactive repositories.
Original pull request: #829.
2020-01-29 09:46:43 +01:00
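The sample added above boils down to passing a Pageable into a Flux-returning query method; a minimal sketch, with Person hypothetical:

[source,java]
----
import org.springframework.data.domain.Pageable;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;

import reactor.core.publisher.Flux;

public interface ReactivePersonRepository extends ReactiveCrudRepository<Person, String> {

    // Reactive repositories do not return Page, but a Pageable still applies
    // limit and offset to the emitted elements.
    Flux<Person> findByLastname(String lastname, Pageable pageable);
}

// Usage: second page of 20 results.
// repository.findByLastname("Paluch", PageRequest.of(1, 20));
----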
Christoph Strobl
03ff37db92 DATAMONGO-2457 - Fix id type explanation in repository documentation.
Original pull request: #829.
2020-01-29 09:46:41 +01:00
Mark Paluch
428126ef75 DATAMONGO-2454 - Updated changelog. 2020-01-17 09:58:36 +01:00
Mark Paluch
aee242c52a DATAMONGO-2383 - Updated changelog. 2020-01-16 16:12:40 +01:00
Mark Paluch
1299f78e80 DATAMONGO-2432 - After release cleanups. 2020-01-15 12:48:23 +01:00
Mark Paluch
fc3d13d5bc DATAMONGO-2432 - Prepare next development iteration. 2020-01-15 12:48:22 +01:00
Mark Paluch
6d2f7b0c9e DATAMONGO-2432 - Release version 2.2.4 (Moore SR4). 2020-01-15 12:37:44 +01:00
Mark Paluch
84128cab28 DATAMONGO-2432 - Prepare 2.2.4 (Moore SR4). 2020-01-15 12:37:18 +01:00
Mark Paluch
8f2d2784f8 DATAMONGO-2432 - Updated changelog. 2020-01-15 12:37:09 +01:00
Christoph Strobl
37ee677a67 DATAMONGO-2451 - Fix value conversion for id properties used in sort expression.
Previously, we falsely converted the sort value (1/-1) into the id type's target value when a @Field annotation was present.

Original pull request: #822.
2020-01-15 11:52:06 +01:00
Mark Paluch
1729744a9e DATAMONGO-2431 - Updated changelog. 2020-01-15 10:36:38 +01:00
Mark Paluch
b59903c890 DATAMONGO-2450 - Polishing.
Replace stream with for-loop.

Original pull request: #820.
2020-01-15 10:15:23 +01:00
Christoph Strobl
5063e68562 DATAMONGO-2450 - Apply array filters to bulk write.
Original pull request: #820.
2020-01-15 10:15:23 +01:00
Christoph Strobl
3cbb572c7c DATAMONGO-2442 - Fix thenValueOf in $cond builder.
We now create a field reference when calling the builder instead of using the value as is.

Original pull request: #818.
2020-01-08 15:18:47 +01:00
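A sketch of the affected builder with assumed field names; after the fix, thenValueOf("discountedPrice") resolves to the field reference $discountedPrice instead of the literal string:

[source,java]
----
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationExpression;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
import org.springframework.data.mongodb.core.query.Criteria;

class CondSample {

    Aggregation pricePipeline() {

        // $cond: when qty >= 100, emit the value of the discountedPrice field,
        // otherwise the literal value 0.
        AggregationExpression price = ConditionalOperators
                .when(Criteria.where("qty").gte(100))
                .thenValueOf("discountedPrice")
                .otherwise(0);

        return Aggregation.newAggregation(Aggregation.project("title").and(price).as("price"));
    }
}
----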
Mark Paluch
bb4f11a239 DATAMONGO-2440 - Polishing.
Iterate over List instead of using the Stream API.

Original pull request: #817.
2020-01-08 15:18:47 +01:00
Christoph Strobl
8a1750691b DATAMONGO-2440 - Fix field target type conversion for collection values.
We now preserve the collection nature of the source type when applying custom target type conversions. Prior to this change, collection values were collapsed to a single element, causing query errors in MongoDB when using $in queries.

Original pull request: #817.
2020-01-08 15:18:47 +01:00
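A minimal sketch of a property affected by this fix, assuming a hypothetical Post document; the explicit target type is applied per element while the collection shape is preserved, so $in queries against relatedPostIds keep working:

[source,java]
----
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.FieldType;

@Document
class Post {

    @Id String id;

    // Each element is converted to an ObjectId; the list itself stays a list.
    @Field(targetType = FieldType.OBJECT_ID)
    List<String> relatedPostIds;
}
----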
Christoph Strobl
5cffaf9d73 DATAMONGO-2423 - Polishing.
Update nullability annotations and reduce visibility in tests.

Original pull request: #815.
2020-01-08 13:25:07 +01:00
Christoph Strobl
a9e50b28df DATAMONGO-2423 - Nullability refinements for Update.
Ease non null restrictions for operators that may use null values like $set.

Original pull request: #815.
2020-01-08 13:24:50 +01:00
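A one-line sketch of what the relaxed nullability allows; the field name is illustrative:

[source,java]
----
import org.springframework.data.mongodb.core.query.Update;

class UpdateSample {

    Update clearMiddleName() {
        // Renders as { "$set" : { "middleName" : null } }; a null value is now accepted.
        return new Update().set("middleName", null);
    }
}
----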
Mark Paluch
24512639fb DATAMONGO-2444 - Update copyright years to 2020. 2020-01-07 08:59:35 +01:00
Jens Schauder
1355d043b6 DATAMONGO-2422 - After release cleanups. 2019-12-04 14:29:17 +01:00
Jens Schauder
c7600ef9ed DATAMONGO-2422 - Prepare next development iteration. 2019-12-04 14:29:14 +01:00
Jens Schauder
a4cb0d6432 DATAMONGO-2422 - Release version 2.2.3 (Moore SR3). 2019-12-04 14:12:24 +01:00
Jens Schauder
437368bedb DATAMONGO-2422 - Prepare 2.2.3 (Moore SR3). 2019-12-04 14:11:44 +01:00
Jens Schauder
4057b193d6 DATAMONGO-2422 - Updated changelog. 2019-12-04 14:11:36 +01:00
Jens Schauder
89b1b5c7f6 DATAMONGO-2421 - Updated changelog. 2019-12-04 12:09:48 +01:00
Mark Paluch
60d3438277 DATAMONGO-2430 - Upgrade to mongo-java-driver 3.11.2. 2019-12-04 11:42:04 +01:00
Mark Paluch
a578a10b5b DATAMONGO-2418 - Polishing.
Reformat code.

Original pull request: #814.
2019-12-04 10:31:33 +01:00
Christoph Strobl
5cf24af00b DATAMONGO-2418 - Obtain EvaluationContextExtensions lazily when parsing Bson queries.
An eager evaluation of the context extension can lead to errors when, for example, the security context is not present.

Original pull request: #814.
2019-12-04 10:31:33 +01:00
Mark Paluch
6c0e455146 DATAMONGO-2410 - Polishing.
Simplify cast. Extend test with DBObject interface.

Original pull request: #813.
2019-12-04 08:56:54 +01:00
Christoph Strobl
932a946868 DATAMONGO-2410 - Fix Document to BasicDBObject conversion.
Original pull request: #813.
2019-12-04 08:56:54 +01:00
Mark Paluch
99a1cfbff9 DATAMONGO-2402 - After release cleanups. 2019-11-18 12:42:05 +01:00
Mark Paluch
d7bbdde1e7 DATAMONGO-2402 - Prepare next development iteration. 2019-11-18 12:42:04 +01:00
Mark Paluch
7dba98dce8 DATAMONGO-2402 - Release version 2.2.2 (Moore SR2). 2019-11-18 12:32:12 +01:00
Mark Paluch
c1ae30bd82 DATAMONGO-2402 - Prepare 2.2.2 (Moore SR2). 2019-11-18 12:31:53 +01:00
Mark Paluch
6aa5aea424 DATAMONGO-2402 - Updated changelog. 2019-11-18 12:31:44 +01:00
Mark Paluch
bc1b00813c DATAMONGO-2401 - Updated changelog. 2019-11-18 12:16:28 +01:00
Mark Paluch
d1ad3ab301 DATAMONGO-2414 - Polishing.
Use longer timeout to cater for slower CI environments.

Original Pull Request: #807
2019-11-14 11:54:43 +01:00
Mark Paluch
923134bbdc DATAMONGO-2414 - Guard drain loop in AsyncInputStreamHandler with state switch.
We now use a non-blocking state switch to determine whether to invoke drainLoop(…) from Subscriber completion.

Previously, we relied on same-thread identification, assuming that if the subscription thread and the completion thread were the same, we were already running inside the drain loop.
It turns out that an I/O thread can also run in event-loop mode, where subscription and completion happen on the same thread with some processing in between, so the call to completion is a delayed signal and is not made on the same stack as drainLoop(…).
The same-thread assumption was in place to avoid StackOverflow caused by infinite recursions.

We now use a state lock to enter the drain loop. Concurrent attempts to re-enter the drain loop from Subscriber completion are now prevented, making sure that we continue draining without causing stack recursion.

Original Pull Request: #807
2019-11-14 11:54:30 +01:00
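The guarded drain-loop idea, sketched generically (this is not the actual AsyncInputStreamHandler code): a work-in-progress counter lets only one caller run the loop, while concurrent completion signals simply enqueue work and return:

[source,java]
----
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicInteger;

class DrainLoopSketch<T> {

    private final Queue<T> queue = new ConcurrentLinkedQueue<>();
    private final AtomicInteger wip = new AtomicInteger();

    void onSignal(T value) {
        queue.offer(value);
        drain();
    }

    private void drain() {

        // Non-blocking state switch: only the caller that moves wip from 0 to 1
        // runs the loop; everyone else just returns, avoiding recursion and
        // unbounded stack growth.
        if (wip.getAndIncrement() != 0) {
            return;
        }

        int missed = 1;
        do {
            T next;
            while ((next = queue.poll()) != null) {
                process(next);
            }
            missed = wip.addAndGet(-missed);
        } while (missed != 0);
    }

    void process(T value) {
        // handle the signal
    }
}
----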
Mark Paluch
e211f69df5 DATAMONGO-2409 - Polishing.
Adapt also ExecutableFindOperation.DistinctWithProjection.asType() to return the appropriate TerminatingDistinct.

Original pull request: #805.
2019-11-11 10:22:37 +01:00
Christoph Strobl
fc35d706a0 DATAMONGO-2409 - Fix return type of Kotlin extension function for ReactiveFindOperation.DistinctWithProjection.asType().
Original pull request: #805.
2019-11-11 10:22:37 +01:00
Mark Paluch
82894e6aff DATAMONGO-2403 - Polishing.
Use handle(…) to skip values instead of flatMap(…) to reduce overhead.

Original pull request: #804.
2019-11-08 13:51:20 +01:00
Christoph Strobl
7356f157bb DATAMONGO-2403 - Fix aggregation simple type result retrieval from empty document.
Projections used within an aggregation pipeline can result in empty documents emitted by the driver. We now guard those cases and skip such documents within a Flux, or simply return an empty Mono, depending on the method's signature.

Original pull request: #804.
2019-11-08 13:48:10 +01:00
Christoph Strobl
783fc6268a DATAMONGO-2382 - After release cleanups. 2019-11-04 15:34:27 +01:00
Christoph Strobl
360b17f299 DATAMONGO-2382 - Prepare next development iteration. 2019-11-04 15:34:26 +01:00
Christoph Strobl
2cfcdaff7c DATAMONGO-2382 - Release version 2.2.1 (Moore SR1). 2019-11-04 14:55:02 +01:00
Christoph Strobl
9d9cf46e47 DATAMONGO-2382 - Prepare 2.2.1 (Moore SR1). 2019-11-04 14:54:16 +01:00
Christoph Strobl
98661cf9a2 DATAMONGO-2382 - Updated changelog. 2019-11-04 14:54:10 +01:00
Christoph Strobl
cc50cd5e3a DATAMONGO-2381 - Updated changelog. 2019-11-04 10:34:52 +01:00
Christoph Strobl
d8399d2d23 DATAMONGO-2393 - Remove capturing lambdas and extract methods.
Original Pull Request: #799
2019-10-31 12:57:06 +01:00
Mark Paluch
f2134fb2f8 DATAMONGO-2393 - Support configurable chunk size.
We now allow consuming GridFS files using a configurable chunk size. The default chunk size is now 256kb.

Original Pull Request: #799
2019-10-31 12:56:59 +01:00
Mark Paluch
ec3ccc004e DATAMONGO-2393 - Polishing.
Extract read requests into inner class.

Original Pull Request: #799
2019-10-31 12:56:50 +01:00
Mark Paluch
6cb246c18a DATAMONGO-2393 - Fix BufferOverflow in GridFS upload.
AsyncInputStreamAdapter now properly splits and buffers incoming DataBuffers according to the read requests of AsyncInputStream.read(…) calls.
Previously, the adapter used the input buffer size as the output buffer size, so a DataBuffer larger than the transfer buffer handed in through read(…) caused a BufferOverflow.

Original Pull Request: #799
2019-10-31 12:56:42 +01:00
Mark Paluch
e73cea0ecf DATAMONGO-2393 - Use drain loop for same-thread processing in GridFS download stream.
We now rely on an outer drain loop when GridFS reads complete on the same thread, instead of using recursive subscriptions, to avoid StackOverflow. Previously, we recursively invoked subscriptions, which led to increased stack depth.

Original Pull Request: #799
2019-10-31 12:56:28 +01:00
Christoph Strobl
c69e185a2a DATAMONGO-2399 - Upgrade to MongoDB Java Driver 3.11.1 2019-10-30 10:49:43 +01:00
Mark Paluch
5789f59222 DATAMONGO-2388 - Polishing.
Use StringJoiner to create comma-delimited String. Add nullability annotations.

Original pull request: #797.
2019-10-28 10:55:58 +01:00
Christoph Strobl
5178eeb340 DATAMONGO-2388 - Fix CodecConfigurationException when reading index info that contains DbRef.
Provide the default CodecRegistry when converting partial index data to its String representation used in IndexInfo.

Original pull request: #797.
2019-10-28 10:47:57 +01:00
Mark Paluch
bc5e7fa4a2 DATAMONGO-2394 - Polishing.
Reformat code.

Original pull request: #798.
2019-10-28 09:32:14 +01:00
Christoph Strobl
c28ace6d40 DATAMONGO-2394 - Fix distance conversion for derived finder using near along with GeoJSON.
GeoJSON requires the distance to be in meters instead of radians, so we now make sure to convert it correctly.

Original pull request: #798.
2019-10-28 09:32:10 +01:00
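A minimal sketch of a derived near query that this conversion fix concerns; Store and its location property are hypothetical:

[source,java]
----
import java.util.List;

import org.springframework.data.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.repository.CrudRepository;

public interface StoreRepository extends CrudRepository<Store, String> {

    // With a GeoJsonPoint, the maximum Distance is now sent to MongoDB in
    // meters (as GeoJSON-based $nearSphere expects) rather than radians.
    List<Store> findByLocationNear(GeoJsonPoint point, Distance maxDistance);
}

// Usage: stores within 5 kilometers of the given point.
// repository.findByLocationNear(new GeoJsonPoint(8.6821, 50.1109), new Distance(5, Metrics.KILOMETERS));
----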
Mark Paluch
de4fae37e1 DATAMONGO-2392 - Polishing.
Add author tags. Move integration tests to existing test class.
Apply more appropriate assertions in existing tests. Use diamond syntax.

Original pull request: #796.
2019-10-16 13:55:12 +02:00
Mark Paluch
2f1aff3ec3 DATAMONGO-2392 - Consistently use GridFS file Id instead of ObjectId.
We now consistently use GridFSFile.getId() to allow custom Id usage instead of enforcing the Id to be an ObjectId. Using the native Id allows interaction with files that use a custom Id type.

Original pull request: #796.
2019-10-16 13:55:12 +02:00
Nick Stolwijk
6970f934bd DATAMONGO-2392 - Fix handling in ReactiveGridFsTemplate of GridFS files with custom id type.
Original pull request: #796.
2019-10-16 13:55:12 +02:00
Greg Turnquist
6b5168e102 DATAMONGO-2334 - Create CI job. 2019-09-30 14:36:06 -05:00
Mark Paluch
4420edb4dc DATAMONGO-2334 - After release cleanups. 2019-09-30 16:17:54 +02:00
Mark Paluch
c2fae95fee DATAMONGO-2334 - Prepare next development iteration. 2019-09-30 16:17:53 +02:00
1160 changed files with 28111 additions and 70509 deletions

View File

@@ -6,6 +6,7 @@ Make sure that:
-->
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).

View File

@@ -1,47 +0,0 @@
# GitHub Actions to automate GitHub issues for Spring Data Project Management
name: Spring Data GitHub Issues
on:
issues:
types: [opened, edited, reopened]
issue_comment:
types: [created]
pull_request_target:
types: [opened, edited, reopened]
jobs:
Inbox:
runs-on: ubuntu-latest
if: github.repository_owner == 'spring-projects' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request == null
steps:
- name: Create or Update Issue Card
uses: peter-evans/create-or-update-project-card@v1.1.2
with:
project-name: 'Spring Data'
column-name: 'Inbox'
project-location: 'spring-projects'
token: ${{ secrets.GH_ISSUES_TOKEN_SPRING_DATA }}
Pull-Request:
runs-on: ubuntu-latest
if: github.repository_owner == 'spring-projects' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request != null
steps:
- name: Create or Update Pull Request Card
uses: peter-evans/create-or-update-project-card@v1.1.2
with:
project-name: 'Spring Data'
column-name: 'Review pending'
project-location: 'spring-projects'
issue-number: ${{ github.event.pull_request.number }}
token: ${{ secrets.GH_ISSUES_TOKEN_SPRING_DATA }}
Feedback-Provided:
runs-on: ubuntu-latest
if: github.repository_owner == 'spring-projects' && github.event_name == 'issue_comment' && github.event.action == 'created' && github.actor != 'spring-projects-issues' && github.event.pull_request == null && github.event.issue.state == 'open' && contains(toJSON(github.event.issue.labels), 'waiting-for-feedback')
steps:
- name: Update Project Card
uses: peter-evans/create-or-update-project-card@v1.1.2
with:
project-name: 'Spring Data'
column-name: 'Feedback provided'
project-location: 'spring-projects'
token: ${{ secrets.GH_ISSUES_TOKEN_SPRING_DATA }}

View File

@@ -1,2 +1 @@
#Mon Jan 30 10:48:12 CET 2023
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.7/apache-maven-3.8.7-bin.zip
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.5.4/apache-maven-3.5.4-bin.zip

View File

@@ -1,6 +1,6 @@
= Continuous Integration
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Moore%20(main)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]

27
CODE_OF_CONDUCT.adoc Normal file
View File

@@ -0,0 +1,27 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing other's private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the https://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at https://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

View File

@@ -1,3 +1,3 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/main/CONTRIBUTING.adoc[here].
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

212
Jenkinsfile vendored
View File

@@ -1,15 +1,9 @@
def p = [:]
node {
checkout scm
p = readProperties interpolate: true, file: 'ci/pipeline.properties'
}
pipeline {
agent none
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/main", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/2.2.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -20,146 +14,136 @@ pipeline {
stages {
stage("Docker images") {
parallel {
stage('Publish JDK (Java 17) + MongoDB 4.4') {
stage('Publish JDK 8 + MongoDB 4.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-4.4/**"
changeset "ci/pipeline.properties"
}
changeset "ci/openjdk8-mongodb-4.0/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
def image = docker.build("springci/spring-data-openjdk8-with-mongodb-4.0", "ci/openjdk8-mongodb-4.0/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 5.0') {
stage('Publish JDK 8 + MongoDB 4.1') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
changeset "ci/openjdk8-mongodb-4.1/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk17-mongodb-5.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
def image = docker.build("springci/spring-data-openjdk8-with-mongodb-4.1", "ci/openjdk8-mongodb-4.1/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 6.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-6.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
stage('Publish JDK 8 + MongoDB 4.2') {
when {
changeset "ci/openjdk8-mongodb-4.2/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.6.0.version']} ci/openjdk17-mongodb-6.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
steps {
script {
def image = docker.build("springci/spring-data-openjdk8-with-mongodb-4.2", "ci/openjdk8-mongodb-4.2/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
}
}
}
}
}
stage("test: baseline (Java 17)") {
stage("test: baseline") {
when {
beforeAgent(true)
anyOf {
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
branch '2.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
agent {
label 'data'
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.2:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Dsort -U -B'
}
}
stage("Test other configurations") {
when {
beforeAgent(true)
allOf {
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
anyOf {
branch '2.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
parallel {
stage("test: MongoDB 5.0 (Java 17)") {
stage("test: mongodb 4.0") {
agent {
label 'data'
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.0:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Dsort -U -B'
}
}
stage("test: MongoDB 6.0 (Java 17)") {
stage("test: mongodb 4.1") {
agent {
label 'data'
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.1:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongosh --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Dsort -U -B'
}
}
}
@@ -167,36 +151,62 @@ pipeline {
stage('Release to artifactory') {
when {
beforeAgent(true)
anyOf {
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
branch '2.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
agent {
label 'data'
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -v'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
stage('Publish documentation') {
when {
branch '2.2.x'
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.distribution-repository=temp-private-local " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
}

View File

@@ -1,19 +1,17 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://spring.io/projects/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://spring.io/projects/spring-data-mongodb#quick-start]
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://spring.io/projects/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
[[code-of-conduct]]
== Code of Conduct
This project is governed by the https://github.com/spring-projects/.github/blob/e3cc2ff230d8f1dca06535aa6b5a4a23815861d4/CODE_OF_CONDUCT.md[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
[[getting-started]]
== Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java:
@@ -52,7 +50,12 @@ public class MyService {
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoClientConfiguration {
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
@@ -61,7 +64,6 @@ class ApplicationConfig extends AbstractMongoClientConfiguration {
}
----
[[maven-configuration]]
=== Maven configuration
Add the Maven dependency:
@@ -71,35 +73,27 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}</version>
<version>${version}.RELEASE</version>
</dependency>
----
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository
and declare the appropriate dependency version.
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}-SNAPSHOT</version>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-snapshot</id>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>https://repo.spring.io/snapshot</url>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
----
[[upgrading]]
== Upgrading
Instructions for how to upgrade from earlier versions of Spring Data are provided on the project https://github.com/spring-projects/spring-data-commons/wiki[wiki].
Follow the links in the https://github.com/spring-projects/spring-data-commons/wiki#release-notes[release notes section] to find the version that you want to upgrade to.
[[getting-help]]
== Getting Help
Having trouble with Spring Data? We'd love to help!
@@ -113,7 +107,6 @@ If you are just starting out with Spring, try one of the https://spring.io/guide
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://github.com/spring-projects/spring-data-mongodb/issues[github.com/spring-projects/spring-data-mongodb/issues].
[[reporting-issues]]
== Reporting Issues
Spring Data uses Github as issue tracking system to record bugs and feature requests.
@@ -124,96 +117,19 @@ If you want to raise an issue, please follow the recommendations below:
* Please provide as much information as possible with the issue report, we like to know the version of Spring Data that you are using, the JVM version, Stacktrace, etc.
* If you need to paste code, or include a stack trace use https://guides.github.com/features/mastering-markdown/[Markdown] code fences +++```+++.
[[guides]]
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
[[examples]]
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
[[building-from-source]]
== Building from Source
You do not need to build from source to use Spring Data. Binaries are available in https://repo.spring.io[repo.spring.io]
and accessible from Maven using the Maven configuration noted <<maven-configuration,above>>.
NOTE: Configuration for Gradle is similar to Maven.
The best way to get started is by creating a Spring Boot project using MongoDB on https://start.spring.io[start.spring.io].
Follow this https://start.spring.io/#type=maven-project&language=java&platformVersion=3.0.0&packaging=jar&jvmVersion=17&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb[link]
to build an imperative application and this https://start.spring.io/#type=maven-project&language=java&platformVersion=3.0.0&packaging=jar&jvmVersion=17&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb-reactive[link]
to build a reactive one.
However, if you want to try out the latest and greatest, Spring Data MongoDB can be easily built with the https://github.com/takari/maven-wrapper[Maven wrapper]
and minimally, JDK 17 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
In order to build Spring Data MongoDB, you will need to https://www.mongodb.com/try/download/community[download]
and https://docs.mongodb.com/manual/installation/[install a MongoDB distribution].
Once you have installed MongoDB, you need to start a MongoDB server. It is convenient to set an environment variable to
your MongoDB installation directory (e.g. `MONGODB_HOME`).
To run the full test suite, a https://docs.mongodb.com/manual/tutorial/deploy-replica-set/[MongoDB Replica Set]
is required.
To run the MongoDB server enter the following command from a command-line:
[source,bash]
----
$ $MONGODB_HOME/bin/mongod --dbpath $MONGODB_HOME/runtime/data --ipv6 --port 27017 --replSet rs0
...
"msg":"Successfully connected to host"
----
Once the MongoDB server starts up, you should see the message (`msg`), "_Successfully connected to host_".
Notice the `--dbpath` option to the `mongod` command. You can set this to anything you like, but in this case, we set
the absolute path to a sub-directory (`runtime/data/`) under the MongoDB installation directory (in `$MONGODB_HOME`).
You need to initialize the MongoDB replica set only once on the first time the MongoDB server is started.
To initialize the replica set, start a mongo client:
[source,bash]
----
$ $MONGODB_HOME/bin/mongo
MongoDB server version: 6.0.0
...
----
Then enter the following command:
[source,bash]
----
mongo> rs.initiate({ _id: 'rs0', members: [ { _id: 0, host: '127.0.0.1:27017' } ] })
----
Finally, on UNIX-based system (for example, Linux or Mac OS X) you may need to adjust the `ulimit`.
In case you need to, you can adjust the `ulimit` with the following command (32768 is just a recommendation):
[source,bash]
----
$ ulimit -n 32768
----
You can use `ulimit -a` again to verify the `ulimit` for "_open files_" was set appropriately.
Now you are ready to build Spring Data MongoDB. Simply enter the following `mvnw` (Maven Wrapper) command:
You don't need to build from source to use Spring Data (binaries in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
You also need JDK 1.8.
[source,bash]
----
$ ./mvnw clean install
----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.8.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular, please sign
the https://cla.pivotal.io/sign/spring[Contributors Agreement] before your first non-trivial change._
=== Building reference documentation
@@ -226,7 +142,17 @@ Building the documentation also builds the project without running tests.
The generated documentation is available from `target/site/reference/html/index.html`.
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
[[license]]
== License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].

View File

@@ -1,22 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,24 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 5.0 release signing key
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv B00A0BD1E2C63C11 && \
# Needed when MongoDB creates a 5.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/5.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-5.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,24 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 6.0 release signing key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | apt-key add - && \
# Needed when MongoDB creates a 6.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-6.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.9 mongodb-org-server=4.0.9 mongodb-org-shell=4.0.9 mongodb-org-mongos=4.0.9 mongodb-org-tools=4.0.9
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 4B7C549A058F8B6B
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.1 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.1.list
RUN apt-get update
RUN apt-get install -y mongodb-org-unstable=4.1.13 mongodb-org-unstable-server=4.1.13 mongodb-org-unstable-shell=4.1.13 mongodb-org-unstable-mongos=4.1.13 mongodb-org-unstable-tools=4.1.13
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv e162f504a20cdf15827f718d4b7c549a058f8b6b
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.2 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.2.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.2.0 mongodb-org-server=4.2.0 mongodb-org-shell=4.2.0 mongodb-org-mongos=4.2.0 mongodb-org-tools=4.2.0
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -1,25 +0,0 @@
# Java versions
java.main.tag=17.0.6_10-jdk-focal
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
# Supported versions of MongoDB
docker.mongodb.4.4.version=4.4.18
docker.mongodb.5.0.version=5.0.14
docker.mongodb.6.0.version=6.0.4
# Supported versions of Redis
docker.redis.6.version=6.2.10
# Supported versions of Cassandra
docker.cassandra.3.version=3.11.14
# Docker environment settings
docker.java.inside.basic=-v $HOME:/tmp/jenkins-home
docker.java.inside.docker=-u root -v /var/run/docker.sock:/var/run/docker.sock -v /usr/bin/docker:/usr/bin/docker -v $HOME:/tmp/jenkins-home
# Credentials
docker.registry=
docker.credentials=hub.docker.com-springbuildmaster
artifactory.credentials=02bd1690-b54f-4c9f-819d-a77cb7a9822c

pom.xml
View File

@@ -5,17 +5,17 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-M2</version>
<version>2.2.14.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>https://spring.io/projects/spring-data-mongodb</url>
<url>https://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>3.1.0-M2</version>
<version>2.2.14.BUILD-SNAPSHOT</version>
</parent>
<modules>
@@ -26,9 +26,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>3.1.0-M2</springdata.commons>
<mongo>4.9.0</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<springdata.commons>2.2.14.BUILD-SNAPSHOT</springdata.commons>
<mongo>3.11.2</mongo>
<mongo.reactivestreams>1.12.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -112,17 +112,6 @@
</developer>
</developers>
<scm>
<connection>scm:git:https://github.com/spring-projects/spring-data-mongodb.git</connection>
<developerConnection>scm:git:git@github.com:spring-projects/spring-data-mongodb.git</developerConnection>
<url>https://github.com/spring-projects/spring-data-mongodb</url>
</scm>
<issueManagement>
<system>GitHub</system>
<url>https://github.com/spring-projects/spring-data-mongodb/issues</url>
</issueManagement>
<profiles>
<profile>
<id>benchmarks</id>
@@ -138,28 +127,15 @@
<!-- MongoDB -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-core</artifactId>
<artifactId>mongo-java-driver</artifactId>
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
</repositories>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-M2</version>
<version>2.2.14.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -322,7 +322,7 @@ public class AbstractMicrobenchmark {
try {
ResultsWriter.forUri(uri).write(results);
} catch (Exception e) {
System.err.println(String.format("Cannot save benchmark results to '%s'; Error was %s", uri, e));
System.err.println(String.format("Cannot save benchmark results to '%s'. Error was %s.", uri, e));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,7 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -15,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-M2</version>
<version>2.2.14.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -35,8 +34,7 @@
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}
</mongo-reactivestreams>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
@@ -45,15 +43,4 @@
</build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-plugins-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</pluginRepository>
</pluginRepositories>
</project>

View File

@@ -1,7 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -13,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.1.0-M2</version>
<version>2.2.14.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -67,12 +65,6 @@
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
@@ -89,22 +81,7 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
<version>3.0.2</version>
<optional>true</optional>
</dependency>
<!-- reactive -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-sync</artifactId>
<version>${mongo}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-reactivestreams</artifactId>
@@ -112,6 +89,23 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>${mongo}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>bson</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
@@ -125,14 +119,34 @@
</dependency>
<dependency>
<groupId>io.reactivex.rxjava3</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava3}</version>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jcdi_2.0_spec</artifactId>
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
@@ -142,48 +156,31 @@
</dependency>
<dependency>
<groupId>jakarta.enterprise</groupId>
<artifactId>jakarta.enterprise.cdi-api</artifactId>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>jakarta.annotation</groupId>
<artifactId>jakarta.annotation-api</artifactId>
<version>${jakarta-annotation-api}</version>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
<version>${javax-annotation-api}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-se</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-spi</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-impl</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<!-- JSR 303 Validation -->
<dependency>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation}</version>
<optional>true</optional>
</dependency>
@@ -196,37 +193,23 @@
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-observation</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.hibernate.validator</groupId>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>7.0.1.Final</version>
<version>5.2.4.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jakarta.el</groupId>
<artifactId>jakarta.el-api</artifactId>
<version>4.0.0</version>
<scope>provided</scope>
<optional>true</optional>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>jakarta.el</artifactId>
<version>4.0.2</version>
<scope>provided</scope>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
@@ -236,6 +219,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
@@ -264,16 +254,9 @@
</dependency>
<dependency>
<groupId>org.junit-pioneer</groupId>
<artifactId>junit-pioneer</artifactId>
<version>0.5.3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jakarta.transaction</groupId>
<artifactId>jakarta.transaction-api</artifactId>
<version>2.0.0</version>
<groupId>javax.transaction</groupId>
<artifactId>jta</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
@@ -304,43 +287,11 @@
<dependency>
<groupId>io.mockk</groupId>
<artifactId>mockk-jvm</artifactId>
<artifactId>mockk</artifactId>
<version>${mockk}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>com.github.tomakehurst</groupId>
<artifactId>wiremock-jre8-standalone</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing-integration-test</artifactId>
<scope>test</scope>
</dependency>
<!-- jMolecules -->
<dependency>
<groupId>org.jmolecules</groupId>
<artifactId>jmolecules-ddd</artifactId>
<version>${jmolecules}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
@@ -365,11 +316,8 @@
<goal>test-process</goal>
</goals>
<configuration>
<outputDirectory>target/generated-test-sources
</outputDirectory>
<processor>
org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
</processor>
<outputDirectory>target/generated-test-sources</outputDirectory>
<processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
</configuration>
</execution>
</executions>
@@ -389,11 +337,15 @@
<exclude>**/ReactivePerformanceTests.java</exclude>
</excludes>
<systemPropertyVariables>
<java.util.logging.config.file>
src/test/resources/logging.properties
</java.util.logging.config.file>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
<reactor.trace.cancel>true</reactor.trace.cancel>
</systemPropertyVariables>
<properties>
<property>
<name>listener</name>
<value>org.springframework.data.mongodb.test.util.CleanMongoDBJunitRunListener</value>
</property>
</properties>
</configuration>
</plugin>

View File

@@ -1,144 +0,0 @@
/*
* Copyright 2021-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.Arrays;
import org.bson.Document;
import org.bson.codecs.DocumentCodec;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* A {@link MongoExpression} using the {@link ParameterBindingDocumentCodec} for parsing a raw ({@literal json})
* expression. The expression will be wrapped within <code>{ ... }</code> if necessary. The actual parsing and parameter
* binding of placeholders like {@code ?0} is delayed upon first call on the the target {@link Document} via
* {@link #toDocument()}.
* <br />
*
* <pre class="code">
* $toUpper : $name -> { '$toUpper' : '$name' }
*
* { '$toUpper' : '$name' } -> { '$toUpper' : '$name' }
*
* { '$toUpper' : '?0' }, "$name" -> { '$toUpper' : '$name' }
* </pre>
*
* Some types might require a special {@link org.bson.codecs.Codec}. If so, make sure to provide a {@link CodecRegistry}
* containing the required {@link org.bson.codecs.Codec codec} via {@link #withCodecRegistry(CodecRegistry)}.
*
* @author Christoph Strobl
* @since 3.2
*/
public class BindableMongoExpression implements MongoExpression {
private final String expressionString;
private final @Nullable CodecRegistryProvider codecRegistryProvider;
private final @Nullable Object[] args;
private final Lazy<Document> target;
/**
* Create a new instance of {@link BindableMongoExpression}.
*
* @param expression must not be {@literal null}.
* @param args can be {@literal null}.
*/
public BindableMongoExpression(String expression, @Nullable Object[] args) {
this(expression, null, args);
}
/**
* Create a new instance of {@link BindableMongoExpression}.
*
* @param expression must not be {@literal null}.
* @param codecRegistryProvider can be {@literal null}.
* @param args can be {@literal null}.
*/
public BindableMongoExpression(String expression, @Nullable CodecRegistryProvider codecRegistryProvider,
@Nullable Object[] args) {
this.expressionString = expression;
this.codecRegistryProvider = codecRegistryProvider;
this.args = args;
this.target = Lazy.of(this::parse);
}
/**
* Provide the {@link CodecRegistry} used to convert expressions.
*
* @param codecRegistry must not be {@literal null}.
* @return new instance of {@link BindableMongoExpression}.
*/
public BindableMongoExpression withCodecRegistry(CodecRegistry codecRegistry) {
return new BindableMongoExpression(expressionString, () -> codecRegistry, args);
}
/**
* Provide the arguments to bind to the placeholders via their index.
*
* @param args must not be {@literal null}.
* @return new instance of {@link BindableMongoExpression}.
*/
public BindableMongoExpression bind(Object... args) {
return new BindableMongoExpression(expressionString, codecRegistryProvider, args);
}
@Override
public Document toDocument() {
return target.get();
}
@Override
public String toString() {
return "BindableMongoExpression{" + "expressionString='" + expressionString + '\'' + ", args="
+ Arrays.toString(args) + '}';
}
private Document parse() {
String expression = wrapJsonIfNecessary(expressionString);
if (ObjectUtils.isEmpty(args)) {
if (codecRegistryProvider == null) {
return Document.parse(expression);
}
return Document.parse(expression, codecRegistryProvider.getCodecFor(Document.class)
.orElseGet(() -> new DocumentCodec(codecRegistryProvider.getCodecRegistry())));
}
ParameterBindingDocumentCodec codec = codecRegistryProvider == null ? new ParameterBindingDocumentCodec()
: new ParameterBindingDocumentCodec(codecRegistryProvider.getCodecRegistry());
return codec.decode(expression, args);
}
private static String wrapJsonIfNecessary(String json) {
if (StringUtils.hasText(json) && (json.startsWith("{") && json.endsWith("}"))) {
return json;
}
return "{" + json + "}";
}
}
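As a side note, here is a minimal usage sketch of the class shown above (present only on the newer side of this comparison). The expression and argument come straight from its Javadoc example; the wrapping class and `main` method are added only to make the snippet self-contained, and parsing happens lazily on the first `toDocument()` call.

[source,java]
----
import org.bson.Document;

import org.springframework.data.mongodb.BindableMongoExpression;

class BindableMongoExpressionSketch {

	public static void main(String[] args) {

		// Placeholder ?0 is bound to "$name"; the raw JSON is only parsed when toDocument() is called.
		BindableMongoExpression expression = new BindableMongoExpression("{ '$toUpper' : '?0' }",
				new Object[] { "$name" });

		Document document = expression.toDocument();
		System.out.println(document.toJson()); // {"$toUpper": "$name"}
	}
}
----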

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,9 +19,9 @@ import java.util.List;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.bulk.BulkWriteError;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.BulkWriteError;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
@@ -38,12 +38,12 @@ public class BulkOperationException extends DataAccessException {
private final BulkWriteResult result;
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link MongoBulkWriteException}.
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/
public BulkOperationException(String message, MongoBulkWriteException source) {
public BulkOperationException(String message, BulkWriteException source) {
super(message, source);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2023 the original author or authors.
* Copyright 2017-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -62,7 +62,7 @@ public interface CodecRegistryProvider {
*/
default <T> Optional<Codec<T>> getCodecFor(Class<T> type) {
Assert.notNull(type, "Type must not be null");
Assert.notNull(type, "Type must not be null!");
try {
return Optional.of(getCodecRegistry().get(type));

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2023 the original author or authors.
* Copyright 2010-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2023 the original author or authors.
* Copyright 2013-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,8 +20,8 @@ import org.springframework.util.StringUtils;
/**
* Helper class featuring helper methods for working with MongoDb collections.
* <br />
* <br />
* <p/>
* <p/>
* Mainly intended for internal use within the framework.
*
* @author Thomas Risberg

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,10 +27,10 @@ import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
/**
* Helper class for managing a {@link MongoDatabase} instances via {@link MongoDatabaseFactory}. Used for obtaining
* Helper class for managing a {@link MongoDatabase} instances via {@link MongoDbFactory}. Used for obtaining
* {@link ClientSession session bound} resources, such as {@link MongoDatabase} and
* {@link com.mongodb.client.MongoCollection} suitable for transactional usage.
* <br />
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Christoph Strobl
@@ -41,95 +41,93 @@ import com.mongodb.client.MongoDatabase;
public class MongoDatabaseUtils {
/**
* Obtain the default {@link MongoDatabase database} form the given {@link MongoDatabaseFactory factory} using
* Obtain the default {@link MongoDatabase database} form the given {@link MongoDbFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br />
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link MongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(MongoDatabaseFactory factory) {
public static MongoDatabase getDatabase(MongoDbFactory factory) {
return doGetMongoDatabase(null, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the default {@link MongoDatabase database} form the given {@link MongoDatabaseFactory factory}.
* <br />
* Obtain the default {@link MongoDatabase database} form the given {@link MongoDbFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link MongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(MongoDatabaseFactory factory, SessionSynchronization sessionSynchronization) {
public static MongoDatabase getDatabase(MongoDbFactory factory, SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(null, factory, sessionSynchronization);
}
/**
* Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDatabaseFactory factory} using
* Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDbFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br />
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link MongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(@Nullable String dbName, MongoDatabaseFactory factory) {
public static MongoDatabase getDatabase(@Nullable String dbName, MongoDbFactory factory) {
return doGetMongoDatabase(dbName, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDatabaseFactory factory}.
* <br />
* Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDbFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link MongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(@Nullable String dbName, MongoDatabaseFactory factory,
public static MongoDatabase getDatabase(@Nullable String dbName, MongoDbFactory factory,
SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(dbName, factory, sessionSynchronization);
}
private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDatabaseFactory factory,
private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDbFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "Factory must not be null");
Assert.notNull(factory, "Factory must not be null!");
if (sessionSynchronization == SessionSynchronization.NEVER
|| !TransactionSynchronizationManager.isSynchronizationActive()) {
return StringUtils.hasText(dbName) ? factory.getMongoDatabase(dbName) : factory.getMongoDatabase();
if (!TransactionSynchronizationManager.isSynchronizationActive()) {
return StringUtils.hasText(dbName) ? factory.getDb(dbName) : factory.getDb();
}
ClientSession session = doGetSession(factory, sessionSynchronization);
if (session == null) {
return StringUtils.hasText(dbName) ? factory.getMongoDatabase(dbName) : factory.getMongoDatabase();
return StringUtils.hasText(dbName) ? factory.getDb(dbName) : factory.getDb();
}
MongoDatabaseFactory factoryToUse = factory.withSession(session);
return StringUtils.hasText(dbName) ? factoryToUse.getMongoDatabase(dbName) : factoryToUse.getMongoDatabase();
MongoDbFactory factoryToUse = factory.withSession(session);
return StringUtils.hasText(dbName) ? factoryToUse.getDb(dbName) : factoryToUse.getDb();
}
/**
* Check if the {@link MongoDatabaseFactory} is actually bound to a {@link ClientSession} that has an active
* transaction, or if a {@link TransactionSynchronization} has been registered for the {@link MongoDatabaseFactory
* resource} and if the associated {@link ClientSession} has an {@link ClientSession#hasActiveTransaction() active
* transaction}.
* Check if the {@link MongoDbFactory} is actually bound to a {@link ClientSession} that has an active transaction, or
* if a {@link TransactionSynchronization} has been registered for the {@link MongoDbFactory resource} and if the
* associated {@link ClientSession} has an {@link ClientSession#hasActiveTransaction() active transaction}.
*
* @param dbFactory the resource to check transactions for. Must not be {@literal null}.
* @return {@literal true} if the factory has an ongoing transaction.
* @since 2.1.3
*/
public static boolean isTransactionActive(MongoDatabaseFactory dbFactory) {
public static boolean isTransactionActive(MongoDbFactory dbFactory) {
if (dbFactory.isTransactionActive()) {
return true;
@@ -140,8 +138,7 @@ public class MongoDatabaseUtils {
}
@Nullable
private static ClientSession doGetSession(MongoDatabaseFactory dbFactory,
SessionSynchronization sessionSynchronization) {
private static ClientSession doGetSession(MongoDbFactory dbFactory, SessionSynchronization sessionSynchronization) {
MongoResourceHolder resourceHolder = (MongoResourceHolder) TransactionSynchronizationManager.getResource(dbFactory);
@@ -172,7 +169,7 @@ public class MongoDatabaseUtils {
return resourceHolder.getSession();
}
private static ClientSession createClientSession(MongoDatabaseFactory dbFactory) {
private static ClientSession createClientSession(MongoDbFactory dbFactory) {
return dbFactory.getSession(ClientSessionOptions.builder().causallyConsistent(true).build());
}
@@ -187,17 +184,25 @@ public class MongoDatabaseUtils {
private final MongoResourceHolder resourceHolder;
MongoSessionSynchronization(MongoResourceHolder resourceHolder, MongoDatabaseFactory dbFactory) {
MongoSessionSynchronization(MongoResourceHolder resourceHolder, MongoDbFactory dbFactory) {
super(resourceHolder, dbFactory);
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
@@ -206,6 +211,10 @@ public class MongoDatabaseUtils {
}
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override
public void afterCompletion(int status) {
@@ -216,6 +225,10 @@ public class MongoDatabaseUtils {
super.afterCompletion(status);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {

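Before the next file: a minimal sketch of the lookup this utility provides, written against the newer `MongoDatabaseFactory` signatures visible in this diff (on the 2.2.x side the parameter type is `MongoDbFactory` and the accessor is `getDb()`). The helper class name is hypothetical, and the factory instance is assumed to come from the surrounding application context.

[source,java]
----
import com.mongodb.client.MongoDatabase;

import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoDatabaseUtils;
import org.springframework.data.mongodb.SessionSynchronization;

class DatabaseLookupSketch {

	// Returns a MongoDatabase that participates in an ongoing transaction for the given factory,
	// if one is active; otherwise the plain default database is returned.
	static MongoDatabase transactionAware(MongoDatabaseFactory factory) {
		return MongoDatabaseUtils.getDatabase(factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
	}
}
----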
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.DB;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
@@ -30,26 +31,25 @@ import com.mongodb.client.MongoDatabase;
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
* @since 3.0
*/
public interface MongoDatabaseFactory extends CodecRegistryProvider, MongoSessionProvider {
public interface MongoDbFactory extends CodecRegistryProvider, MongoSessionProvider {
/**
* Obtain a {@link MongoDatabase} from the underlying factory.
* Creates a default {@link MongoDatabase} instance.
*
* @return never {@literal null}.
* @throws DataAccessException
*/
MongoDatabase getMongoDatabase() throws DataAccessException;
MongoDatabase getDb() throws DataAccessException;
/**
* Obtain a {@link MongoDatabase} instance to access the database with the given name.
* Creates a {@link DB} instance to access the database with the given name.
*
* @param dbName must not be {@literal null}.
* @param dbName must not be {@literal null} or empty.
* @return never {@literal null}.
* @throws DataAccessException
*/
MongoDatabase getMongoDatabase(String dbName) throws DataAccessException;
MongoDatabase getDb(String dbName) throws DataAccessException;
/**
* Exposes a shared {@link MongoExceptionTranslator}.
@@ -58,6 +58,16 @@ public interface MongoDatabaseFactory extends CodecRegistryProvider, MongoSessio
*/
PersistenceExceptionTranslator getExceptionTranslator();
/**
* Get the legacy database entry point. Please consider {@link #getDb()} instead.
*
* @return
* @deprecated since 2.1, use {@link #getDb()}. This method will be removed with a future version as it works only
* with the legacy MongoDB driver.
*/
@Deprecated
DB getLegacyDb();
/**
* Get the underlying {@link CodecRegistry} used by the MongoDB Java driver.
*
@@ -65,7 +75,7 @@ public interface MongoDatabaseFactory extends CodecRegistryProvider, MongoSessio
*/
@Override
default CodecRegistry getCodecRegistry() {
return getMongoDatabase().getCodecRegistry();
return getDb().getCodecRegistry();
}
/**
@@ -78,29 +88,29 @@ public interface MongoDatabaseFactory extends CodecRegistryProvider, MongoSessio
ClientSession getSession(ClientSessionOptions options);
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoDatabaseFactory} returning {@link MongoDatabase}
* instances that are aware and bound to a new session with given {@link ClientSessionOptions options}.
* Obtain a {@link ClientSession} bound instance of {@link MongoDbFactory} returning {@link MongoDatabase} instances
* that are aware and bound to a new session with given {@link ClientSessionOptions options}.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
default MongoDatabaseFactory withSession(ClientSessionOptions options) {
default MongoDbFactory withSession(ClientSessionOptions options) {
return withSession(getSession(options));
}
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoDatabaseFactory} returning {@link MongoDatabase}
* instances that are aware and bound to the given session.
* Obtain a {@link ClientSession} bound instance of {@link MongoDbFactory} returning {@link MongoDatabase} instances
* that are aware and bound to the given session.
*
* @param session must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
MongoDatabaseFactory withSession(ClientSession session);
MongoDbFactory withSession(ClientSession session);
/**
* Returns if the given {@link MongoDatabaseFactory} is bound to a {@link ClientSession} that has an
* Returns if the given {@link MongoDbFactory} is bound to a {@link ClientSession} that has an
* {@link ClientSession#hasActiveTransaction() active transaction}.
*
* @return {@literal true} if there's an active transaction, {@literal false} otherwise.

View File

@@ -1,73 +0,0 @@
/*
* Copyright 2021-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
/**
* Wrapper object for MongoDB expressions like {@code $toUpper : $name} that manifest as {@link org.bson.Document} when
* passed on to the driver.
* <br />
* A set of predefined {@link MongoExpression expressions}, including a
* {@link org.springframework.data.mongodb.core.aggregation.AggregationSpELExpression SpEL based variant} for method
* like expressions (eg. {@code toUpper(name)}) are available via the
* {@link org.springframework.data.mongodb.core.aggregation Aggregation API}.
*
* @author Christoph Strobl
* @since 3.2
* @see org.springframework.data.mongodb.core.aggregation.ArithmeticOperators
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators
* @see org.springframework.data.mongodb.core.aggregation.ComparisonOperators
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators
* @see org.springframework.data.mongodb.core.aggregation.ConvertOperators
* @see org.springframework.data.mongodb.core.aggregation.DateOperators
* @see org.springframework.data.mongodb.core.aggregation.ObjectOperators
* @see org.springframework.data.mongodb.core.aggregation.SetOperators
* @see org.springframework.data.mongodb.core.aggregation.StringOperators
*/
@FunctionalInterface
public interface MongoExpression {
/**
* Create a new {@link MongoExpression} from plain {@link String} (eg. {@code $toUpper : $name}). <br />
* The given expression will be wrapped with <code>{ ... }</code> to match an actual MongoDB {@link org.bson.Document}
* if necessary.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link MongoExpression}.
*/
static MongoExpression create(String expression) {
return new BindableMongoExpression(expression, null);
}
/**
* Create a new {@link MongoExpression} from plain {@link String} containing placeholders (eg. {@code $toUpper : ?0})
* that will be resolved on first call of {@link #toDocument()}. <br />
* The given expression will be wrapped with <code>{ ... }</code> to match an actual MongoDB {@link org.bson.Document}
* if necessary.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link MongoExpression}.
*/
static MongoExpression create(String expression, Object... args) {
return new BindableMongoExpression(expression, args);
}
/**
* Obtain the native {@link org.bson.Document} representation.
*
* @return never {@literal null}.
*/
org.bson.Document toDocument();
}
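As a quick illustration of the factory methods above (again, this interface only exists on the newer side of the comparison), the following sketch uses the expressions from the Javadoc; the wrapper class and printed output comments are illustrative only.

[source,java]
----
import org.springframework.data.mongodb.MongoExpression;

class MongoExpressionSketch {

	public static void main(String[] args) {

		// Shorthand form; the surrounding { ... } braces are added on demand.
		MongoExpression plain = MongoExpression.create("$toUpper : $name");
		System.out.println(plain.toDocument().toJson()); // {"$toUpper": "$name"}

		// Placeholder variant; ?0 is resolved against the given argument on toDocument().
		MongoExpression bound = MongoExpression.create("{ '$toUpper' : '?0' }", "$name");
		System.out.println(bound.toDocument().toJson()); // {"$toUpper": "$name"}
	}
}
----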

View File

@@ -1,81 +0,0 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.Arrays;
import java.util.function.Consumer;
import org.springframework.data.domain.ManagedTypes;
/**
* @author Christoph Strobl
* @since 4.0
*/
public final class MongoManagedTypes implements ManagedTypes {
private final ManagedTypes delegate;
private MongoManagedTypes(ManagedTypes types) {
this.delegate = types;
}
/**
* Wraps an existing {@link ManagedTypes} object with {@link MongoManagedTypes}.
*
* @param managedTypes
* @return
*/
public static MongoManagedTypes from(ManagedTypes managedTypes) {
return new MongoManagedTypes(managedTypes);
}
/**
* Factory method used to construct {@link MongoManagedTypes} from the given array of {@link Class types}.
*
* @param types array of {@link Class types} used to initialize the {@link ManagedTypes}; must not be {@literal null}.
* @return new instance of {@link MongoManagedTypes} initialized from {@link Class types}.
*/
public static MongoManagedTypes from(Class<?>... types) {
return fromIterable(Arrays.asList(types));
}
/**
* Factory method used to construct {@link MongoManagedTypes} from the given, required {@link Iterable} of
* {@link Class types}.
*
* @param types {@link Iterable} of {@link Class types} used to initialize the {@link ManagedTypes}; must not be
* {@literal null}.
* @return new instance of {@link MongoManagedTypes} initialized the given, required {@link Iterable} of {@link Class
* types}.
*/
public static MongoManagedTypes fromIterable(Iterable<? extends Class<?>> types) {
return from(ManagedTypes.fromIterable(types));
}
/**
* Factory method to return an empty {@link MongoManagedTypes} object.
*
* @return an empty {@link MongoManagedTypes} object.
*/
public static MongoManagedTypes empty() {
return from(ManagedTypes.empty());
}
@Override
public void forEach(Consumer<Class<?>> action) {
delegate.forEach(action);
}
}
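A short usage sketch of the factory methods above; `Person` and `Order` are hypothetical domain types used only to have something to register.

[source,java]
----
import org.springframework.data.mongodb.MongoManagedTypes;

class ManagedTypesSketch {

	// Hypothetical domain types, only here for illustration.
	record Person(String name) {}
	record Order(String id) {}

	public static void main(String[] args) {

		MongoManagedTypes managedTypes = MongoManagedTypes.from(Person.class, Order.class);
		managedTypes.forEach(type -> System.out.println("Managed type: " + type.getName()));
	}
}
----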

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,7 +24,7 @@ import com.mongodb.client.ClientSession;
/**
* MongoDB specific {@link ResourceHolderSupport resource holder}, wrapping a {@link ClientSession}.
* {@link MongoTransactionManager} binds instances of this class to the thread.
* <br />
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Christoph Strobl
@@ -36,15 +36,15 @@ import com.mongodb.client.ClientSession;
class MongoResourceHolder extends ResourceHolderSupport {
private @Nullable ClientSession session;
private MongoDatabaseFactory dbFactory;
private MongoDbFactory dbFactory;
/**
* Create a new {@link MongoResourceHolder} for a given {@link ClientSession session}.
*
* @param session the associated {@link ClientSession}. Can be {@literal null}.
* @param dbFactory the associated {@link MongoDatabaseFactory}. must not be {@literal null}.
* @param dbFactory the associated {@link MongoDbFactory}. must not be {@literal null}.
*/
MongoResourceHolder(@Nullable ClientSession session, MongoDatabaseFactory dbFactory) {
MongoResourceHolder(@Nullable ClientSession session, MongoDbFactory dbFactory) {
this.session = session;
this.dbFactory = dbFactory;
@@ -68,16 +68,16 @@ class MongoResourceHolder extends ResourceHolderSupport {
ClientSession session = getSession();
if (session == null) {
throw new IllegalStateException("No session available");
throw new IllegalStateException("No session available!");
}
return session;
}
/**
* @return the associated {@link MongoDatabaseFactory}.
* @return the associated {@link MongoDbFactory}.
*/
public MongoDatabaseFactory getDbFactory() {
public MongoDbFactory getDbFactory() {
return dbFactory;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -36,19 +36,18 @@ import com.mongodb.client.ClientSession;
/**
* A {@link org.springframework.transaction.PlatformTransactionManager} implementation that manages
* {@link ClientSession} based transactions for a single {@link MongoDatabaseFactory}.
* <br />
* Binds a {@link ClientSession} from the specified {@link MongoDatabaseFactory} to the thread.
* <br />
* {@link ClientSession} based transactions for a single {@link MongoDbFactory}.
* <p />
* Binds a {@link ClientSession} from the specified {@link MongoDbFactory} to the thread.
* <p />
* {@link TransactionDefinition#isReadOnly() Readonly} transactions operate on a {@link ClientSession} and enable causal
* consistency, and also {@link ClientSession#startTransaction() start}, {@link ClientSession#commitTransaction()
* commit} or {@link ClientSession#abortTransaction() abort} a transaction.
* <br />
* <p />
* Application code is required to retrieve the {@link com.mongodb.client.MongoDatabase} via
* {@link MongoDatabaseUtils#getDatabase(MongoDatabaseFactory)} instead of a standard
* {@link MongoDatabaseFactory#getMongoDatabase()} call. Spring classes such as
* {@link org.springframework.data.mongodb.core.MongoTemplate} use this strategy implicitly.
* <br />
* {@link MongoDatabaseUtils#getDatabase(MongoDbFactory)} instead of a standard {@link MongoDbFactory#getDb()} call.
* Spring classes such as {@link org.springframework.data.mongodb.core.MongoTemplate} use this strategy implicitly.
* <p />
* By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. One may override
* {@link #doCommit(MongoTransactionObject)} to implement the
* <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a>
@@ -59,53 +58,57 @@ import com.mongodb.client.ClientSession;
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
* @see <a href="https://www.mongodb.com/transactions">MongoDB Transaction Documentation</a>
* @see MongoDatabaseUtils#getDatabase(MongoDatabaseFactory, SessionSynchronization)
* @see MongoDatabaseUtils#getDatabase(MongoDbFactory, SessionSynchronization)
*/
public class MongoTransactionManager extends AbstractPlatformTransactionManager
implements ResourceTransactionManager, InitializingBean {
private @Nullable MongoDatabaseFactory dbFactory;
private @Nullable MongoDbFactory dbFactory;
private @Nullable TransactionOptions options;
/**
* Create a new {@link MongoTransactionManager} for bean-style usage.
* <br />
* <strong>Note:</strong>The {@link MongoDatabaseFactory db factory} has to be
* {@link #setDbFactory(MongoDatabaseFactory) set} before using the instance. Use this constructor to prepare a
* {@link MongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}.
* <br />
* <p />
* <strong>Note:</strong>The {@link MongoDbFactory db factory} has to be {@link #setDbFactory(MongoDbFactory) set}
* before using the instance. Use this constructor to prepare a {@link MongoTransactionManager} via a
* {@link org.springframework.beans.factory.BeanFactory}.
* <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
*
* @see #setDbFactory(MongoDatabaseFactory)
* @see #setDbFactory(MongoDbFactory)
* @see #setTransactionSynchronization(int)
*/
public MongoTransactionManager() {}
/**
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDatabaseFactory}.
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDbFactory}.
*
* @param dbFactory must not be {@literal null}.
*/
public MongoTransactionManager(MongoDatabaseFactory dbFactory) {
public MongoTransactionManager(MongoDbFactory dbFactory) {
this(dbFactory, null);
}
/**
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDatabaseFactory}
* applying the given {@link TransactionOptions options}, if present, when starting a new transaction.
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDbFactory} applying the
* given {@link TransactionOptions options}, if present, when starting a new transaction.
*
* @param dbFactory must not be {@literal null}.
* @param options can be {@literal null}.
*/
public MongoTransactionManager(MongoDatabaseFactory dbFactory, @Nullable TransactionOptions options) {
public MongoTransactionManager(MongoDbFactory dbFactory, @Nullable TransactionOptions options) {
Assert.notNull(dbFactory, "DbFactory must not be null");
Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory;
this.options = options;
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override
protected Object doGetTransaction() throws TransactionException {
@@ -114,11 +117,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return new MongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
@@ -148,6 +159,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override
protected Object doSuspend(Object transaction) throws TransactionException {
@@ -157,11 +172,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected final void doCommit(DefaultTransactionStatus status) throws TransactionException {
@@ -188,8 +211,8 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
* By default those labels are ignored, nevertheless one might check for
 * {@link MongoException#UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL transient commit error labels} and retry the
 * commit. <br />
* <pre>
* <code>
* <pre>
* int retries = 3;
* do {
* try {
@@ -202,8 +225,8 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
* }
* Thread.sleep(500);
* } while (--retries > 0);
* </pre>
* </code>
* </pre>
*
* @param transactionObject never {@literal null}.
* @throws Exception in case of transaction errors.
@@ -212,6 +235,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.commitTransaction();
}
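A fuller sketch of the retry approach outlined in the Javadoc above, assuming a hypothetical subclass (RetryingMongoTransactionManager is not part of this change set; it simply expands the loop from the documentation sample):

import com.mongodb.MongoException;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoTransactionManager;

class RetryingMongoTransactionManager extends MongoTransactionManager {

    RetryingMongoTransactionManager(MongoDatabaseFactory dbFactory) {
        super(dbFactory);
    }

    @Override
    protected void doCommit(MongoTransactionObject transactionObject) throws Exception {

        int retries = 3;
        do {
            try {
                transactionObject.commitTransaction();
                return; // committed successfully
            } catch (MongoException ex) {
                // only retry commits whose outcome is flagged as unknown
                if (!ex.hasErrorLabel(MongoException.UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL)) {
                    throw ex;
                }
            }
            Thread.sleep(500);
        } while (--retries > 0); // gives up once the retries are exhausted
    }
}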
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
@@ -231,6 +258,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
@@ -238,6 +269,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.getRequiredResourceHolder().setRollbackOnly();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override
protected void doCleanupAfterCompletion(Object transaction) {
@@ -260,13 +295,13 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
}
/**
* Set the {@link MongoDatabaseFactory} that this instance should manage transactions for.
* Set the {@link MongoDbFactory} that this instance should manage transactions for.
*
* @param dbFactory must not be {@literal null}.
*/
public void setDbFactory(MongoDatabaseFactory dbFactory) {
public void setDbFactory(MongoDbFactory dbFactory) {
Assert.notNull(dbFactory, "DbFactory must not be null");
Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory;
}
@@ -280,20 +315,28 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
}
/**
* Get the {@link MongoDatabaseFactory} that this instance manages transactions for.
* Get the {@link MongoDbFactory} that this instance manages transactions for.
*
* @return can be {@literal null}.
*/
@Nullable
public MongoDatabaseFactory getDbFactory() {
public MongoDbFactory getDbFactory() {
return dbFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override
public MongoDatabaseFactory getResourceFactory() {
public MongoDbFactory getResourceFactory() {
return getRequiredDbFactory();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDbFactory();
@@ -301,7 +344,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
private MongoResourceHolder newResourceHolder(TransactionDefinition definition, ClientSessionOptions options) {
MongoDatabaseFactory dbFactory = getResourceFactory();
MongoDbFactory dbFactory = getResourceFactory();
MongoResourceHolder resourceHolder = new MongoResourceHolder(dbFactory.getSession(options), dbFactory);
resourceHolder.setTimeoutIfNotDefaulted(determineTimeout(definition));
@@ -312,10 +355,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
/**
* @throws IllegalStateException if {@link #dbFactory} is {@literal null}.
*/
private MongoDatabaseFactory getRequiredDbFactory() {
private MongoDbFactory getRequiredDbFactory() {
Assert.state(dbFactory != null,
"MongoTransactionManager operates upon a MongoDbFactory; Did you forget to provide one; It's required");
"MongoTransactionManager operates upon a MongoDbFactory. Did you forget to provide one? It's required.");
return dbFactory;
}
@@ -353,7 +396,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %b, ", session.getServerSession().isClosed());
debugString += String.format("closed = %d, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a";
@@ -450,22 +493,30 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
private MongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present; o_O");
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present. o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null");
Assert.state(session != null, "A Session is required but it turned out to be null.");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
TransactionSynchronizationUtils.triggerFlush();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2023 the original author or authors.
* Copyright 2016-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,7 +31,6 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
*
* @author Mark Paluch
* @author Christoph Strobl
* @author Mathieu Ouellet
* @since 2.0
*/
public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
@@ -42,16 +41,16 @@ public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
* @return never {@literal null}.
* @throws DataAccessException
*/
Mono<MongoDatabase> getMongoDatabase() throws DataAccessException;
MongoDatabase getMongoDatabase() throws DataAccessException;
/**
* Obtain a {@link MongoDatabase} instance to access the database with the given name.
* Creates a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return never {@literal null}.
* @throws DataAccessException
*/
Mono<MongoDatabase> getMongoDatabase(String dbName) throws DataAccessException;
MongoDatabase getMongoDatabase(String dbName) throws DataAccessException;
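As a small, hedged illustration of the signature difference shown above (the "orders" database name and the helper class are made up for this sketch), resolving collection names against the newer Mono-returning factory contract might look like this:

import com.mongodb.reactivestreams.client.MongoDatabase;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class CollectionNamePrinter {

    // On the newer side of this diff the factory emits the database lazily as a Mono.
    static Flux<String> collectionNames(ReactiveMongoDatabaseFactory factory) {
        Mono<MongoDatabase> database = factory.getMongoDatabase("orders");
        return database.flatMapMany(MongoDatabase::listCollectionNames);
    }
}

On the 2.2.x side the same call returns the MongoDatabase directly, so no flatMapMany step is involved.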
/**
* Exposes a shared {@link MongoExceptionTranslator}.
@@ -65,7 +64,10 @@ public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
*
* @return never {@literal null}.
*/
CodecRegistry getCodecRegistry();
@Override
default CodecRegistry getCodecRegistry() {
return getMongoDatabase().getCodecRegistry();
}
/**
* Obtain a {@link Mono} emitting a {@link ClientSession} for given {@link ClientSessionOptions options}.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2023 the original author or authors.
* Copyright 2019-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -36,12 +36,11 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
* Helper class for managing reactive {@link MongoDatabase} instances via {@link ReactiveMongoDatabaseFactory}. Used for
* obtaining {@link ClientSession session bound} resources, such as {@link MongoDatabase} and {@link MongoCollection}
* suitable for transactional usage.
* <br />
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Mark Paluch
* @author Christoph Strobl
* @author Mathieu Ouellet
* @since 2.2
*/
public class ReactiveMongoDatabaseUtils {
@@ -75,7 +74,7 @@ public class ReactiveMongoDatabaseUtils {
/**
 * Obtain the default {@link MongoDatabase database} from the given {@link ReactiveMongoDatabaseFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br />
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
@@ -88,7 +87,7 @@ public class ReactiveMongoDatabaseUtils {
/**
 * Obtain the default {@link MongoDatabase database} from the given {@link ReactiveMongoDatabaseFactory factory}.
* <br />
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
@@ -104,7 +103,7 @@ public class ReactiveMongoDatabaseUtils {
/**
 * Obtain the {@link MongoDatabase database} with the given name from the given {@link ReactiveMongoDatabaseFactory
* factory} using {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br />
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
@@ -119,7 +118,7 @@ public class ReactiveMongoDatabaseUtils {
/**
 * Obtain the {@link MongoDatabase database} with the given name from the given {@link ReactiveMongoDatabaseFactory
* factory}.
* <br />
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
@@ -136,24 +135,21 @@ public class ReactiveMongoDatabaseUtils {
private static Mono<MongoDatabase> doGetMongoDatabase(@Nullable String dbName, ReactiveMongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "DatabaseFactory must not be null");
if (sessionSynchronization == SessionSynchronization.NEVER) {
return getMongoDatabaseOrDefault(dbName, factory);
}
Assert.notNull(factory, "DatabaseFactory must not be null!");
return TransactionSynchronizationManager.forCurrentTransaction()
.filter(TransactionSynchronizationManager::isSynchronizationActive) //
.flatMap(synchronizationManager -> {
return doGetSession(synchronizationManager, factory, sessionSynchronization) //
.flatMap(it -> getMongoDatabaseOrDefault(dbName, factory.withSession(it)));
}) //
.onErrorResume(NoTransactionException.class, e -> getMongoDatabaseOrDefault(dbName, factory))
.switchIfEmpty(getMongoDatabaseOrDefault(dbName, factory));
.map(it -> getMongoDatabaseOrDefault(dbName, factory.withSession(it)));
})
.onErrorResume(NoTransactionException.class,
e -> Mono.fromSupplier(() -> getMongoDatabaseOrDefault(dbName, factory)))
.defaultIfEmpty(getMongoDatabaseOrDefault(dbName, factory));
}
private static Mono<MongoDatabase> getMongoDatabaseOrDefault(@Nullable String dbName,
private static MongoDatabase getMongoDatabaseOrDefault(@Nullable String dbName,
ReactiveMongoDatabaseFactory factory) {
return StringUtils.hasText(dbName) ? factory.getMongoDatabase(dbName) : factory.getMongoDatabase();
}
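A short usage sketch of the public entry point documented above; the helper class is illustrative, and the factory is assumed to be configured elsewhere in the application:

import com.mongodb.reactivestreams.client.MongoDatabase;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoDatabaseUtils;
import org.springframework.data.mongodb.SessionSynchronization;
import reactor.core.publisher.Mono;

class DatabaseLookup {

    // Resolves the session-bound database when a transaction is active in the subscriber
    // context, and the plain database otherwise.
    static Mono<MongoDatabase> resolve(ReactiveMongoDatabaseFactory factory) {
        return ReactiveMongoDatabaseUtils.getDatabase(factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
    }
}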
@@ -214,11 +210,19 @@ public class ReactiveMongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) {
@@ -229,6 +233,10 @@ public class ReactiveMongoDatabaseUtils {
return Mono.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#afterCompletion(int)
*/
@Override
public Mono<Void> afterCompletion(int status) {
@@ -244,6 +252,10 @@ public class ReactiveMongoDatabaseUtils {
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2023 the original author or authors.
* Copyright 2019-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,7 +24,7 @@ import com.mongodb.reactivestreams.client.ClientSession;
/**
* MongoDB specific resource holder, wrapping a {@link ClientSession}. {@link ReactiveMongoTransactionManager} binds
* instances of this class to the subscriber context.
* <br />
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Mark Paluch
@@ -42,7 +42,7 @@ class ReactiveMongoResourceHolder extends ResourceHolderSupport {
* Create a new {@link ReactiveMongoResourceHolder} for a given {@link ClientSession session}.
*
* @param session the associated {@link ClientSession}. Can be {@literal null}.
* @param databaseFactory the associated {@link MongoDatabaseFactory}. must not be {@literal null}.
* @param databaseFactory the associated {@link MongoDbFactory}. must not be {@literal null}.
*/
ReactiveMongoResourceHolder(@Nullable ClientSession session, ReactiveMongoDatabaseFactory databaseFactory) {
@@ -99,7 +99,7 @@ class ReactiveMongoResourceHolder extends ResourceHolderSupport {
* If the {@link ReactiveMongoResourceHolder} is {@link #hasSession() not already associated} with a
* {@link ClientSession} the given value is {@link #setSession(ClientSession) set} and returned, otherwise the current
* bound session is returned.
*
*
* @param session
* @return
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2019-2023 the original author or authors.
* Copyright 2019-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,21 +38,21 @@ import com.mongodb.reactivestreams.client.ClientSession;
* A {@link org.springframework.transaction.ReactiveTransactionManager} implementation that manages
* {@link com.mongodb.reactivestreams.client.ClientSession} based transactions for a single
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory}.
* <br />
* <p />
* Binds a {@link ClientSession} from the specified
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory} to the subscriber
* {@link reactor.util.context.Context}.
* <br />
* <p />
* {@link org.springframework.transaction.TransactionDefinition#isReadOnly() Readonly} transactions operate on a
* {@link ClientSession} and enable causal consistency, and also {@link ClientSession#startTransaction() start},
* {@link com.mongodb.reactivestreams.client.ClientSession#commitTransaction() commit} or
* {@link ClientSession#abortTransaction() abort} a transaction.
* <br />
* <p />
* Application code is required to retrieve the {@link com.mongodb.reactivestreams.client.MongoDatabase} via
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory)} instead
* of a standard {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase()} call. Spring
* classes such as {@link org.springframework.data.mongodb.core.ReactiveMongoTemplate} use this strategy implicitly.
* <br />
* <p />
* By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. You can override
* {@link #doCommit(TransactionSynchronizationManager, ReactiveMongoTransactionObject)} to implement the
* <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a>
@@ -71,11 +71,11 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
/**
* Create a new {@link ReactiveMongoTransactionManager} for bean-style usage.
* <br />
* <p />
* <strong>Note:</strong>The {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory db factory} has to
* be {@link #setDatabaseFactory(ReactiveMongoDatabaseFactory)} set} before using the instance. Use this constructor
* to prepare a {@link ReactiveMongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}.
* <br />
* <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
*
@@ -104,12 +104,16 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
public ReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory,
@Nullable TransactionOptions options) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null");
Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
this.databaseFactory = databaseFactory;
this.options = options;
}
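For orientation, a hedged usage sketch pairing the reactive transaction manager with Spring's TransactionalOperator; the helper class, the Order type, and the method parameters are illustrative rather than part of this change set:

import java.util.List;

import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoTransactionManager;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.transaction.reactive.TransactionalOperator;
import reactor.core.publisher.Flux;

class TransactionalSave {

    static Flux<Order> saveAll(ReactiveMongoDatabaseFactory databaseFactory, ReactiveMongoTemplate template,
            List<Order> orders) {

        ReactiveMongoTransactionManager txManager = new ReactiveMongoTransactionManager(databaseFactory);
        TransactionalOperator rxtx = TransactionalOperator.create(txManager);

        // all inserts share one ClientSession bound to the subscriber context
        return rxtx.transactional(template.insertAll(orders));
    }

    static class Order {}
}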
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doGetTransaction(org.springframework.transaction.reactive.TransactionSynchronizationManager)
*/
@Override
protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager)
throws TransactionException {
@@ -119,11 +123,19 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return new ReactiveMongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doBegin(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction,
TransactionDefinition definition) throws TransactionException {
@@ -163,6 +175,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSuspend(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction)
throws TransactionException {
@@ -176,6 +192,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doResume(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> doResume(TransactionSynchronizationManager synchronizationManager, @Nullable Object transaction,
Object suspendedResources) {
@@ -183,6 +203,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
.fromRunnable(() -> synchronizationManager.bindResource(getRequiredDatabaseFactory(), suspendedResources));
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCommit(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected final Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -219,6 +243,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doRollback(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doRollback(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) {
@@ -240,6 +268,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSetRollbackOnly(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doSetRollbackOnly(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -250,6 +282,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCleanupAfterCompletion(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager synchronizationManager,
Object transaction) {
@@ -281,7 +317,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
*/
public void setDatabaseFactory(ReactiveMongoDatabaseFactory databaseFactory) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null");
Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
this.databaseFactory = databaseFactory;
}
@@ -304,6 +340,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return databaseFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDatabaseFactory();
@@ -323,7 +363,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
private ReactiveMongoDatabaseFactory getRequiredDatabaseFactory() {
Assert.state(databaseFactory != null,
"ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory; Did you forget to provide one; It's required");
"ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory. Did you forget to provide one? It's required.");
return databaseFactory;
}
@@ -361,7 +401,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %b, ", session.getServerSession().isClosed());
debugString += String.format("closed = %d, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a";
@@ -458,22 +498,30 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
private ReactiveMongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present; o_O");
Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present. o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null");
Assert.state(session != null, "A Session is required but it turned out to be null.");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
throw new UnsupportedOperationException("flush() not supported");

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -35,7 +35,7 @@ import com.mongodb.session.ClientSession;
/**
* {@link MethodInterceptor} implementation looking up and invoking an alternative target method having
* {@link ClientSession} as its first argument. This allows seamless integration with the existing code base.
* <br />
* <p />
 * The {@link MethodInterceptor} is aware of methods on {@code MongoCollection} that may return new instances of itself
 * (e.g. {@link com.mongodb.reactivestreams.client.MongoCollection#withWriteConcern(WriteConcern)}) and decorates them
 * if not already proxied.
@@ -76,13 +76,13 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
Class<D> databaseType, ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
ClientSessionOperator<C> collectionDecorator) {
Assert.notNull(session, "ClientSession must not be null");
Assert.notNull(target, "Target must not be null");
Assert.notNull(sessionType, "SessionType must not be null");
Assert.notNull(databaseType, "Database type must not be null");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null");
Assert.notNull(collectionType, "Collection type must not be null");
Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null");
Assert.notNull(session, "ClientSession must not be null!");
Assert.notNull(target, "Target must not be null!");
Assert.notNull(sessionType, "SessionType must not be null!");
Assert.notNull(databaseType, "Database type must not be null!");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null!");
Assert.notNull(collectionType, "Collection type must not be null!");
Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null!");
this.session = session;
this.target = target;
@@ -95,6 +95,10 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.sessionType = sessionType;
}
/*
* (non-Javadoc)
* @see org.aopalliance.intercept.MethodInterceptor(org.aopalliance.intercept.MethodInvocation)
*/
@Nullable
@Override
public Object invoke(MethodInvocation methodInvocation) throws Throwable {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,20 +15,13 @@
*/
package org.springframework.data.mongodb;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
/**
* {@link SessionSynchronization} is used along with {@code MongoTemplate} to define in which type of transactions to
* participate if any.
* {@link SessionSynchronization} is used along with {@link org.springframework.data.mongodb.core.MongoTemplate} to
* define in which type of transactions to participate if any.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see MongoTemplate#setSessionSynchronization(SessionSynchronization)
* @see MongoDatabaseUtils#getDatabase(MongoDatabaseFactory, SessionSynchronization)
* @see ReactiveMongoTemplate#setSessionSynchronization(SessionSynchronization)
* @see ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory, SessionSynchronization)
*/
public enum SessionSynchronization {
@@ -41,12 +34,5 @@ public enum SessionSynchronization {
/**
* Synchronize with native MongoDB transactions initiated via {@link MongoTransactionManager}.
*/
ON_ACTUAL_TRANSACTION,
/**
* Do not participate in ongoing transactions.
*
* @since 3.2.5
*/
NEVER;
ON_ACTUAL_TRANSACTION;
}
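A one-line illustration of the template setting this enum is used with (the helper class wrapping it is made up for the sketch):

import org.springframework.data.mongodb.SessionSynchronization;
import org.springframework.data.mongodb.core.MongoTemplate;

class TemplateTuning {

    // Restricts participation to native MongoDB transactions started via MongoTransactionManager.
    static void configure(MongoTemplate template) {
        template.setSessionSynchronization(SessionSynchronization.ON_ACTUAL_TRANSACTION);
    }
}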

View File

@@ -1,77 +0,0 @@
/*
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.util.Version;
import org.springframework.util.StringUtils;
import com.mongodb.MongoDriverInformation;
/**
* Class that exposes the SpringData MongoDB specific information like the current {@link Version} or
* {@link MongoDriverInformation driver information}.
*
* @author Christoph Strobl
* @since 3.0
*/
public class SpringDataMongoDB {
private static final Log LOGGER = LogFactory.getLog(SpringDataMongoDB.class);
private static final Version FALLBACK_VERSION = new Version(3);
private static final MongoDriverInformation DRIVER_INFORMATION = MongoDriverInformation
.builder(MongoDriverInformation.builder().build()).driverName("spring-data").build();
/**
* Obtain the SpringData MongoDB specific driver information.
*
* @return never {@literal null}.
*/
public static MongoDriverInformation driverInformation() {
return DRIVER_INFORMATION;
}
/**
* Fetches the "Implementation-Version" manifest attribute from the jar file.
* <br />
* Note that some ClassLoaders do not expose the package metadata, hence this class might not be able to determine the
* version in all environments. In this case the current Major version is returned as a fallback.
*
* @return never {@literal null}.
*/
public static Version version() {
Package pkg = SpringDataMongoDB.class.getPackage();
String versionString = (pkg != null ? pkg.getImplementationVersion() : null);
if (!StringUtils.hasText(versionString)) {
LOGGER.debug("Unable to find Spring Data MongoDB version.");
return FALLBACK_VERSION;
}
try {
return Version.parse(versionString);
} catch (Exception e) {
LOGGER.debug(String.format("Cannot read Spring Data MongoDB version '%s'.", versionString));
}
return FALLBACK_VERSION;
}
}
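Where this class is present (it is removed on the 2.2.x side of this comparison), a short sketch of how the driver information is typically passed along when creating a client, mirroring what createMongoClient(...) does further down in this change set; the helper class is illustrative:

import com.mongodb.MongoClientSettings;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import org.springframework.data.mongodb.SpringDataMongoDB;

class ClientBootstrap {

    // Tags the connection metadata with the "spring-data" driver name.
    static MongoClient create(MongoClientSettings settings) {
        return MongoClients.create(settings, SpringDataMongoDB.driverInformation());
    }
}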

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2023 the original author or authors.
* Copyright 2010-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,110 +0,0 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.TypeReference;
import org.springframework.core.ResolvableType;
import org.springframework.core.annotation.AnnotatedElementUtils;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.data.annotation.Reference;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.DocumentReference;
import org.springframework.data.util.TypeUtils;
/**
* @author Christoph Strobl
* @since 4.0
*/
public class LazyLoadingProxyAotProcessor {
private boolean generalLazyLoadingProxyContributed = false;
public void registerLazyLoadingProxyIfNeeded(Class<?> type, GenerationContext generationContext) {
Set<Field> refFields = getFieldsWithAnnotationPresent(type, Reference.class);
if (refFields.isEmpty()) {
return;
}
refFields.stream() //
.filter(LazyLoadingProxyAotProcessor::isLazyLoading) //
.forEach(field -> {
if (!generalLazyLoadingProxyContributed) {
generationContext.getRuntimeHints().proxies().registerJdkProxy(
TypeReference.of(org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class),
TypeReference.of(org.springframework.aop.SpringProxy.class),
TypeReference.of(org.springframework.aop.framework.Advised.class),
TypeReference.of(org.springframework.core.DecoratingProxy.class));
generalLazyLoadingProxyContributed = true;
}
if (field.getType().isInterface()) {
List<Class<?>> interfaces = new ArrayList<>(
TypeUtils.resolveTypesInSignature(ResolvableType.forField(field, type)));
interfaces.add(0, org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class);
interfaces.add(org.springframework.aop.SpringProxy.class);
interfaces.add(org.springframework.aop.framework.Advised.class);
interfaces.add(org.springframework.core.DecoratingProxy.class);
generationContext.getRuntimeHints().proxies().registerJdkProxy(interfaces.toArray(Class[]::new));
} else {
Class<?> proxyClass = LazyLoadingProxyFactory.resolveProxyType(field.getType(),
() -> LazyLoadingInterceptor.none());
// see: spring-projects/spring-framework/issues/29309
generationContext.getRuntimeHints().reflection().registerType(proxyClass,
MemberCategory.INVOKE_DECLARED_CONSTRUCTORS, MemberCategory.INVOKE_DECLARED_METHODS, MemberCategory.DECLARED_FIELDS);
}
});
}
private static boolean isLazyLoading(Field field) {
if (AnnotatedElementUtils.isAnnotated(field, DBRef.class)) {
return AnnotatedElementUtils.findMergedAnnotation(field, DBRef.class).lazy();
}
if (AnnotatedElementUtils.isAnnotated(field, DocumentReference.class)) {
return AnnotatedElementUtils.findMergedAnnotation(field, DocumentReference.class).lazy();
}
return false;
}
private static Set<Field> getFieldsWithAnnotationPresent(Class<?> type, Class<? extends Annotation> annotation) {
Set<Field> fields = new LinkedHashSet<>();
for (Field field : type.getDeclaredFields()) {
if (MergedAnnotations.from(field).get(annotation).isPresent()) {
fields.add(field);
}
}
return fields;
}
}

View File

@@ -1,45 +0,0 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.util.function.Predicate;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.util.ReactiveWrappers;
import org.springframework.data.util.ReactiveWrappers.ReactiveLibrary;
import org.springframework.data.util.TypeUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* @author Christoph Strobl
* @since 4.0
*/
public class MongoAotPredicates {
public static final Predicate<Class<?>> IS_SIMPLE_TYPE = (type) -> MongoSimpleTypes.HOLDER.isSimpleType(type) || TypeUtils.type(type).isPartOf("org.bson");
public static final Predicate<ReactiveLibrary> IS_REACTIVE_LIBARARY_AVAILABLE = (lib) -> ReactiveWrappers.isAvailable(lib);
public static final Predicate<ClassLoader> IS_SYNC_CLIENT_PRESENT = (classLoader) -> ClassUtils.isPresent("com.mongodb.client.MongoClient", classLoader);
public static boolean isReactorPresent() {
return IS_REACTIVE_LIBARARY_AVAILABLE.test(ReactiveWrappers.ReactiveLibrary.PROJECT_REACTOR);
}
public static boolean isSyncClientPresent(@Nullable ClassLoader classLoader) {
return IS_SYNC_CLIENT_PRESENT.test(classLoader);
}
}

View File

@@ -1,56 +0,0 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.core.ResolvableType;
import org.springframework.data.aot.ManagedTypesBeanRegistrationAotProcessor;
import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* @author Christoph Strobl
* @since 2022/06
*/
class MongoManagedTypesBeanRegistrationAotProcessor extends ManagedTypesBeanRegistrationAotProcessor {
private final LazyLoadingProxyAotProcessor lazyLoadingProxyAotProcessor = new LazyLoadingProxyAotProcessor();
public MongoManagedTypesBeanRegistrationAotProcessor() {
setModuleIdentifier("mongo");
}
@Override
protected boolean isMatch(@Nullable Class<?> beanType, @Nullable String beanName) {
return isMongoManagedTypes(beanType) || super.isMatch(beanType, beanName);
}
protected boolean isMongoManagedTypes(@Nullable Class<?> beanType) {
return beanType != null && ClassUtils.isAssignable(MongoManagedTypes.class, beanType);
}
@Override
protected void contributeType(ResolvableType type, GenerationContext generationContext) {
if (MongoAotPredicates.IS_SIMPLE_TYPE.test(type.toClass())) {
return;
}
super.contributeType(type, generationContext);
lazyLoadingProxyAotProcessor.registerLazyLoadingProxyIfNeeded(type.toClass(), generationContext);
}
}

View File

@@ -1,83 +0,0 @@
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import static org.springframework.data.mongodb.aot.MongoAotPredicates.*;
import java.util.Arrays;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.RuntimeHints;
import org.springframework.aot.hint.RuntimeHintsRegistrar;
import org.springframework.aot.hint.TypeReference;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeSaveCallback;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* {@link RuntimeHintsRegistrar} for repository types and entity callbacks.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 4.0
*/
class MongoRuntimeHints implements RuntimeHintsRegistrar {
@Override
public void registerHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
hints.reflection().registerTypes(
Arrays.asList(TypeReference.of(BeforeConvertCallback.class), TypeReference.of(BeforeSaveCallback.class),
TypeReference.of(AfterConvertCallback.class), TypeReference.of(AfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
registerTransactionProxyHints(hints, classLoader);
if (isReactorPresent()) {
hints.reflection()
.registerTypes(Arrays.asList(TypeReference.of(ReactiveBeforeConvertCallback.class),
TypeReference.of(ReactiveBeforeSaveCallback.class), TypeReference.of(ReactiveAfterConvertCallback.class),
TypeReference.of(ReactiveAfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
}
}
private static void registerTransactionProxyHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
if (MongoAotPredicates.isSyncClientPresent(classLoader)
&& ClassUtils.isPresent("org.springframework.aop.SpringProxy", classLoader)) {
hints.proxies().registerJdkProxy(TypeReference.of("com.mongodb.client.MongoDatabase"),
TypeReference.of("org.springframework.aop.SpringProxy"),
TypeReference.of("org.springframework.core.DecoratingProxy"));
hints.proxies().registerJdkProxy(TypeReference.of("com.mongodb.client.MongoCollection"),
TypeReference.of("org.springframework.aop.SpringProxy"),
TypeReference.of("org.springframework.core.DecoratingProxy"));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,20 +17,17 @@ package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.SpringDataMongoDB;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;
import org.springframework.data.mongodb.core.SimpleMongoClientDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.lang.Nullable;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig with {@link com.mongodb.client.MongoClient}.
@@ -38,74 +35,78 @@ import com.mongodb.client.MongoClients;
* @author Christoph Strobl
* @since 2.1
* @see MongoConfigurationSupport
* @see AbstractMongoConfiguration
*/
@Configuration(proxyBeanMethods = false)
@Configuration
public abstract class AbstractMongoClientConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}. <br />
* Override {@link #mongoClientSettings()} to configure connection details.
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return never {@literal null}.
* @see #mongoClientSettings()
* @see #configureClientSettings(Builder)
* @return
*/
public MongoClient mongoClient() {
return createMongoClient(mongoClientSettings());
}
public abstract MongoClient mongoClient();
/**
* Creates a {@link MongoTemplate}.
*
* @see #mongoDbFactory()
* @see #mappingMongoConverter(MongoDatabaseFactory, MongoCustomConversions, MongoMappingContext)
* @return
*/
@Bean
public MongoTemplate mongoTemplate(MongoDatabaseFactory databaseFactory, MappingMongoConverter converter) {
return new MongoTemplate(databaseFactory, converter);
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory} to be used by the
* {@link MongoTemplate}. Will use the {@link MongoClient} instance configured in {@link #mongoClient()}.
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate(MongoDatabaseFactory, MappingMongoConverter)
* @see #mongoTemplate()
* @return
*/
@Bean
public MongoDatabaseFactory mongoDbFactory() {
return new SimpleMongoClientDatabaseFactory(mongoClient(), getDatabaseName());
public MongoDbFactory mongoDbFactory() {
return new SimpleMongoClientDbFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoClientConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)}. Will get {@link #customConversions()} applied.
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter(MongoDatabaseFactory databaseFactory,
MongoCustomConversions customConversions, MongoMappingContext mappingContext) {
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(databaseFactory);
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);
converter.setCustomConversions(customConversions);
converter.setCodecRegistryProvider(databaseFactory);
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(mongoDbFactory());
return converter;
}
/**
* Create the Reactive Streams {@link com.mongodb.reactivestreams.client.MongoClient} instance with given
* {@link MongoClientSettings}.
*
* @return never {@literal null}.
* @since 3.0
*/
protected MongoClient createMongoClient(MongoClientSettings settings) {
return MongoClients.create(settings, SpringDataMongoDB.driverInformation());
}
}
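A hedged sketch of extending the configuration base class above; the subclass and the database name are illustrative, and which methods must be overridden depends on the side of this diff:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

@Configuration
class ApplicationMongoConfig extends AbstractMongoClientConfiguration {

    @Override
    protected String getDatabaseName() {
        return "orders"; // illustrative database name
    }

    // Required on the 2.2.x side of this diff where mongoClient() is abstract;
    // on the newer side overriding mongoClientSettings()/configureClientSettings(...) is preferred.
    @Override
    public MongoClient mongoClient() {
        return MongoClients.create("mongodb://localhost:27017");
    }
}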

View File

@@ -0,0 +1,121 @@
/*
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.lang.Nullable;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig with {@link com.mongodb.MongoClient}.
* <p />
* <strong>INFO:</strong>In case you want to use {@link com.mongodb.client.MongoClients} for configuration please refer
* to {@link AbstractMongoClientConfiguration}.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
* @author Mark Paluch
* @see MongoConfigurationSupport
* @see AbstractMongoClientConfiguration
* @deprecated since 2.2 in favor of {@link AbstractMongoClientConfiguration}.
*/
@Configuration
@Deprecated
public abstract class AbstractMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
*/
public abstract MongoClient mongoClient();
/**
* Creates a {@link MongoTemplate}.
*
* @return
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate()
* @return
*/
@Bean
public MongoDbFactory mongoDbFactory() {
return new SimpleMongoDbFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(mongoDbFactory());
return converter;
}
}
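For completeness, a hedged sketch of the deprecated variant above, which works against the legacy com.mongodb.MongoClient; the subclass, host, and database name are illustrative:

import com.mongodb.MongoClient;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

@Configuration
@SuppressWarnings("deprecation")
class LegacyMongoConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "orders"; // illustrative database name
    }

    @Override
    public MongoClient mongoClient() {
        // legacy driver entry point; prefer AbstractMongoClientConfiguration for new code
        return new MongoClient("localhost", 27017);
    }
}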

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2023 the original author or authors.
* Copyright 2016-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,19 +18,13 @@ package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.SpringDataMongoDB;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.NoOpDbRefResolver;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
/**
* Base class for reactive Spring Data MongoDB configuration using JavaConfig.
@@ -40,33 +34,25 @@ import com.mongodb.reactivestreams.client.MongoClients;
* @since 2.0
* @see MongoConfigurationSupport
*/
@Configuration(proxyBeanMethods = false)
@Configuration
public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the Reactive Streams {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want
* to expose a {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}. <br />
* Override {@link #mongoClientSettings()} to configure connection details.
* to expose a {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return never {@literal null}.
* @see #mongoClientSettings()
* @see #configureClientSettings(Builder)
*/
public MongoClient reactiveMongoClient() {
return createReactiveMongoClient(mongoClientSettings());
}
public abstract MongoClient reactiveMongoClient();
/**
* Creates {@link ReactiveMongoOperations}.
*
* @see #reactiveMongoDbFactory()
* @see #mappingMongoConverter(ReactiveMongoDatabaseFactory, MongoCustomConversions, MongoMappingContext)
* @return never {@literal null}.
*/
@Bean
public ReactiveMongoTemplate reactiveMongoTemplate(ReactiveMongoDatabaseFactory databaseFactory,
MappingMongoConverter mongoConverter) {
return new ReactiveMongoTemplate(databaseFactory, mongoConverter);
public ReactiveMongoOperations reactiveMongoTemplate() throws Exception {
return new ReactiveMongoTemplate(reactiveMongoDbFactory(), mappingMongoConverter());
}
/**
@@ -74,7 +60,7 @@ public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurat
* {@link MongoClient} instance configured in {@link #reactiveMongoClient()}.
*
* @see #reactiveMongoClient()
* @see #reactiveMongoTemplate(ReactiveMongoDatabaseFactory, MappingMongoConverter)
* @see #reactiveMongoTemplate()
* @return never {@literal null}.
*/
@Bean
@@ -84,31 +70,21 @@ public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurat
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #reactiveMongoDbFactory()} and
* {@link #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)}. Will get {@link #customConversions()} applied.
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)
* @see #mongoMappingContext()
* @see #reactiveMongoDbFactory()
* @return never {@literal null}.
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter(ReactiveMongoDatabaseFactory databaseFactory,
MongoCustomConversions customConversions, MongoMappingContext mappingContext) {
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
converter.setCustomConversions(customConversions);
converter.setCodecRegistryProvider(databaseFactory);
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(reactiveMongoDbFactory());
return converter;
}
/**
* Create the Reactive Streams {@link MongoClient} instance with given {@link MongoClientSettings}.
*
* @return never {@literal null}.
* @since 3.0
*/
protected MongoClient createReactiveMongoClient(MongoClientSettings settings) {
return MongoClients.create(settings, SpringDataMongoDB.driverInformation());
}
}
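
On the 2.2 side of this diff reactiveMongoClient() is abstract, so a subclass has to supply the Reactive Streams client itself. A hedged sketch follows; the connection string and database name are assumptions for the example.

package com.acme;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractReactiveMongoConfiguration;

@Configuration
public class ReactiveAppConfig extends AbstractReactiveMongoConfiguration {

    @Override
    public MongoClient reactiveMongoClient() {
        return MongoClients.create("mongodb://localhost:27017"); // assumed connection string
    }

    @Override
    protected String getDatabaseName() {
        return "acme"; // assumed database name
    }
}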

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,42 +0,0 @@
/*
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
import com.mongodb.ConnectionString;
/**
* Parse a {@link String} to a {@link com.mongodb.ConnectionString}.
*
* @author Christoph Strobl
* @since 3.0
*/
public class ConnectionStringPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(@Nullable String connectionString) {
if (!StringUtils.hasText(connectionString)) {
return;
}
setValue(new ConnectionString(connectionString));
}
}
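
This editor only exists on the newer (3.0+) side of the comparison, where it is registered by the XML namespace support. A small standalone sketch of what it does; the connection string is an assumption for the example.

import com.mongodb.ConnectionString;

import org.springframework.data.mongodb.config.ConnectionStringPropertyEditor;

class ConnectionStringEditorExample {

    public static void main(String[] args) {
        ConnectionStringPropertyEditor editor = new ConnectionStringPropertyEditor();

        // Blank text is ignored; anything else is parsed into a ConnectionString.
        editor.setAsText("mongodb://localhost:27017/acme");

        ConnectionString connectionString = (ConnectionString) editor.getValue();
        System.out.println(connectionString.getDatabase()); // prints "acme"
    }
}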

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2023 the original author or authors.
* Copyright 2013-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -61,8 +61,8 @@ public @interface EnableMongoAuditing {
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the timestamp to be used for setting
* creation and modification dates.
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
* used for setting creation and modification dates.
*
* @return empty {@link String} by default.
*/
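
The attribute documented above belongs to @EnableMongoAuditing. A hedged sketch of typical usage; the bean name and the fixed "system" principal are assumptions for the example.

import java.util.Optional;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.AuditorAware;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorProvider")
class AuditingConfig {

    @Bean
    AuditorAware<String> auditorProvider() {
        return () -> Optional.of("system"); // assumed principal
    }
}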

View File

@@ -1,70 +0,0 @@
/*
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.context.annotation.Import;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.domain.ReactiveAuditorAware;
/**
* Annotation to enable auditing in MongoDB using reactive infrastructure via annotation configuration.
*
* @author Mark Paluch
* @since 3.1
*/
@Inherited
@Documented
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(ReactiveMongoAuditingRegistrar.class)
public @interface EnableReactiveMongoAuditing {
/**
* Configures the {@link ReactiveAuditorAware} bean to be used to lookup the current principal.
*
* @return empty {@link String} by default.
*/
String auditorAwareRef() default "";
/**
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
*
* @return {@literal true} by default.
*/
boolean setDates() default true;
/**
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
*
* @return {@literal true} by default.
*/
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the timestamp to be used for setting
* creation and modification dates.
*
* @return empty {@link String} by default.
*/
String dateTimeProviderRef() default "";
}
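
The reactive counterpart above exists only on the 3.1+ side of the comparison and resolves the current principal through a ReactiveAuditorAware bean. A sketch; the bean name and the fixed "system" auditor are assumptions.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.ReactiveAuditorAware;
import org.springframework.data.mongodb.config.EnableReactiveMongoAuditing;

import reactor.core.publisher.Mono;

@Configuration
@EnableReactiveMongoAuditing(auditorAwareRef = "reactiveAuditorProvider")
class ReactiveAuditingConfig {

    @Bean
    ReactiveAuditorAware<String> reactiveAuditorProvider() {
        return () -> Mono.just("system"); // assumed principal
    }
}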

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2023 the original author or authors.
* Copyright 2013-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,10 @@ import org.w3c.dom.Element;
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -42,6 +46,10 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,7 @@ import java.util.List;
import java.util.Set;
import org.springframework.beans.BeanMetadataElement;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
@@ -63,7 +64,6 @@ import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
@@ -80,7 +80,7 @@ import org.w3c.dom.Element;
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("jakarta.validation.Validator",
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
/* (non-Javadoc)
@@ -96,9 +96,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
String autoIndexCreation = element.getAttribute("auto-index-creation");
boolean autoIndexCreationEnabled = StringUtils.hasText(autoIndexCreation) && Boolean.valueOf(autoIndexCreation);
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mapping Mongo Converter", element));
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
@@ -135,7 +132,9 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
new BeanComponentDefinition(indexOperationsProviderBuilder.getBeanDefinition(), "indexOperationsProvider"));
}
if (!registry.containsBeanDefinition(INDEX_HELPER_BEAN_NAME)) {
try {
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
} catch (NoSuchBeanDefinitionException ignored) {
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
@@ -149,7 +148,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
if (validatingMongoEventListener != null && !registry.containsBeanDefinition(VALIDATING_EVENT_LISTENER_BEAN_NAME)) {
if (validatingMongoEventListener != null) {
parserContext.registerBeanComponent(
new BeanComponentDefinition(validatingMongoEventListener, VALIDATING_EVENT_LISTENER_BEAN_NAME));
}
@@ -163,16 +162,15 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
String disableValidation = element.getAttribute("disable-validation");
boolean validationDisabled = StringUtils.hasText(disableValidation) && Boolean.parseBoolean(disableValidation);
boolean validationDisabled = StringUtils.hasText(disableValidation) && Boolean.valueOf(disableValidation);
if (!validationDisabled) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition();
RuntimeBeanReference validator = getValidator(element, parserContext);
RuntimeBeanReference validator = getValidator(builder, parserContext);
if (validator != null) {
builder.getRawBeanDefinition().setBeanClass(ValidatingMongoEventListener.class);
builder.getRawBeanDefinition().setSource(element);
builder.addConstructorArgValue(validator);
return builder.getBeanDefinition();
@@ -194,17 +192,13 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
validatorDef.setSource(source);
validatorDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE);
String validatorName = parserContext.getReaderContext().registerWithGeneratedName(validatorDef);
parserContext.registerBeanComponent(new BeanComponentDefinition(validatorDef, validatorName));
return new RuntimeBeanReference(validatorName);
}
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
@Nullable BeanDefinition conversionsDefinition, @Nullable String converterId) {
return potentiallyCreateMappingContext(element, parserContext, conversionsDefinition, converterId, false);
}
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
@Nullable BeanDefinition conversionsDefinition, @Nullable String converterId, boolean autoIndexCreation) {
String ctxRef = element.getAttribute("mapping-context-ref");
@@ -232,8 +226,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
mappingContextBuilder.addPropertyValue("autoIndexCreation", autoIndexCreation);
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
@@ -253,7 +245,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured",
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
element);
return;
}
@@ -374,6 +366,10 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
this.delegates = new HashSet<>(Arrays.asList(filters));
}
/*
* (non-Javadoc)
* @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
*/
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory)
throws IOException {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2023 the original author or authors.
* Copyright 2012-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -47,16 +47,28 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEntityCallback.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2023 the original author or authors.
* Copyright 2013-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,17 +17,25 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.Ordered;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
@@ -35,42 +43,71 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport implements Ordered {
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void postProcess(BeanDefinitionBuilder builder, AuditingConfiguration configuration,
BeanDefinitionRegistry registry) {
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
builder.setFactoryMethod("from").addConstructorArgReference("mongoMappingContext");
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null");
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
return configureDefaultAuditHandlerAttributes(configuration,
BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class));
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null");
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEntityCallback.class);
@@ -79,10 +116,68 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport impl
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEntityCallback.class.getName(), registry);
if (PROJECT_REACTOR_AVAILABLE) {
registerReactiveAuditingEntityCallback(registry, auditingHandlerDefinition.getSource());
}
}
@Override
public int getOrder() {
return Ordered.LOWEST_PRECEDENCE;
private void registerReactiveAuditingEntityCallback(BeanDefinitionRegistry registry, Object source) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(source);
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
/**
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
*/
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
private final MappingMongoConverter converter;
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -35,6 +35,10 @@ import org.w3c.dom.Element;
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
@@ -46,11 +50,10 @@ public class MongoClientParser implements BeanDefinitionParser {
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credential", "credential");
ParsingUtils.setPropertyValue(builder, element, "replica-set", "replicaSet");
ParsingUtils.setPropertyValue(builder, element, "connection-string", "connectionString");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientSettings(element, builder);
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
@@ -59,34 +62,22 @@ public class MongoClientParser implements BeanDefinitionParser {
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition connectionStringPropertyEditor = helper
.getComponent(MongoParsingUtils.getConnectionStringPropertyEditorBuilder());
parserContext.registerBeanComponent(connectionStringPropertyEditor);
BeanComponentDefinition serverAddressPropertyEditor = helper
.getComponent(MongoParsingUtils.getServerAddressPropertyEditorBuilder());
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper
.getComponent(MongoParsingUtils.getWriteConcernPropertyEditorBuilder());
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readConcernEditor = helper
.getComponent(MongoParsingUtils.getReadConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(readConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper
.getComponent(MongoParsingUtils.getReadPreferencePropertyEditorBuilder());
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper
.getComponent(MongoParsingUtils.getMongoCredentialPropertyEditor());
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
BeanComponentDefinition uuidRepresentationEditor = helper
.getComponent(MongoParsingUtils.getUUidRepresentationEditorBuilder());
parserContext.registerBeanComponent(uuidRepresentationEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2023 the original author or authors.
* Copyright 2016-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,27 +20,22 @@ import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.bson.UuidRepresentation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions.MongoConverterConfigurationAdapter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
/**
* Base class for Spring Data MongoDB to be extended for JavaConfiguration usage.
*
@@ -77,56 +72,30 @@ public abstract class MongoConfigurationSupport {
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext(MongoCustomConversions customConversions,
MongoManagedTypes mongoManagedTypes) {
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setManagedTypes(mongoManagedTypes);
mappingContext.setSimpleTypeHolder(customConversions.getSimpleTypeHolder());
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
return mappingContext;
}
/**
* @return new instance of {@link MongoManagedTypes}.
* @throws ClassNotFoundException
* @since 4.0
*/
@Bean
public MongoManagedTypes mongoManagedTypes() throws ClassNotFoundException {
return MongoManagedTypes.fromIterable(getInitialEntitySet());
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the
* {@link org.springframework.data.mongodb.core.convert.MappingMongoConverter} and {@link MongoMappingContext}.
* Returns an empty {@link MongoCustomConversions} instance by default.
* <p>
* <strong>NOTE:</strong> Use {@link #configureConverters(MongoConverterConfigurationAdapter)} to configure MongoDB
* native simple types and register custom {@link Converter converters}.
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link MongoMappingContext}. Returns an empty {@link MongoCustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public MongoCustomConversions customConversions() {
return MongoCustomConversions.create(this::configureConverters);
}
/**
* Configuration hook for {@link MongoCustomConversions} creation.
*
* @param converterConfigurationAdapter never {@literal null}.
* @since 2.3
* @see MongoConverterConfigurationAdapter#useNativeDriverJavaTimeCodecs()
* @see MongoConverterConfigurationAdapter#useSpringDataJavaTimeCodecs()
*/
protected void configureConverters(MongoConverterConfigurationAdapter converterConfigurationAdapter) {
public CustomConversions customConversions() {
return new MongoCustomConversions(Collections.emptyList());
}
/**
@@ -149,7 +118,8 @@ public abstract class MongoConfigurationSupport {
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document}.
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
@@ -169,6 +139,7 @@ public abstract class MongoConfigurationSupport {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
@@ -182,7 +153,8 @@ public abstract class MongoConfigurationSupport {
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created.
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
@@ -205,36 +177,11 @@ public abstract class MongoConfigurationSupport {
* Configure whether to automatically create indices for domain types by deriving the
* {@link org.springframework.data.mongodb.core.index.IndexDefinition} from the entity or not.
*
* @return {@literal false} by default. <br />
* <strong>INFO</strong>: As of 3.x the default is set to {@literal false}; In 2.x it was {@literal true}.
* @return {@literal true} by default. <br />
* <strong>INFO</strong>: As of 3.x the default will be set to {@literal false}.
* @since 2.2
*/
protected boolean autoIndexCreation() {
return false;
}
/**
* Return the {@link MongoClientSettings} used to create the actual {@literal MongoClient}. <br />
* Override either this method, or use {@link #configureClientSettings(Builder)} to alter the setup.
*
* @return never {@literal null}.
* @since 3.0
*/
protected MongoClientSettings mongoClientSettings() {
MongoClientSettings.Builder builder = MongoClientSettings.builder();
builder.uuidRepresentation(UuidRepresentation.JAVA_LEGACY);
configureClientSettings(builder);
return builder.build();
}
/**
* Configure {@link MongoClientSettings} via its {@link Builder} API.
*
* @param builder never {@literal null}.
* @since 3.0
*/
protected void configureClientSettings(MongoClientSettings.Builder builder) {
// customization hook
return true;
}
}
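
A sketch of the customization hooks MongoConfigurationSupport exposes on the 2.2 line, shown through the deprecated AbstractMongoConfiguration subclass from earlier. The converter, base package and connection details are assumptions for the example; the BigDecimal/String conversion mirrors the classic custom-conversions example rather than anything mandated here.

import java.math.BigDecimal;
import java.util.Collection;
import java.util.Collections;

import com.mongodb.MongoClient;

import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

@Configuration
class CustomizedMongoConfig extends AbstractMongoConfiguration {

    @Override
    public MongoClient mongoClient() {
        return new MongoClient("localhost", 27017); // assumed connection details
    }

    @Override
    protected String getDatabaseName() {
        return "acme"; // assumed database name
    }

    @Override
    protected Collection<String> getMappingBasePackages() {
        return Collections.singleton("com.acme.domain"); // scanned for @Document types
    }

    @Override
    public MongoCustomConversions customConversions() {
        return new MongoCustomConversions(Collections.singletonList(new StringToBigDecimalConverter()));
    }

    @Override
    protected boolean autoIndexCreation() {
        return false; // opt out of the 2.x default of automatic index creation
    }

    static class StringToBigDecimalConverter implements Converter<String, BigDecimal> {
        @Override
        public BigDecimal convert(String source) {
            return new BigDecimal(source);
        }
    }
}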

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -51,6 +51,10 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String OPTIONS_DELIMITER = "?";
private static final String OPTION_VALUE_DELIMITER = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String text) throws IllegalArgumentException {
@@ -117,7 +121,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'", authMechanism));
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
@@ -194,7 +198,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value", optionArgs[0]));
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
@@ -209,21 +213,21 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'");
"Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'");
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username");
throw new IllegalArgumentException("Credentials need to specify username!");
}
}
@@ -231,7 +235,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
try {
return URLDecoder.decode(it, "UTF-8");
} catch (UnsupportedEncodingException e) {
throw new IllegalArgumentException("o_O UTF-8 not supported", e);
throw new IllegalArgumentException("o_O UTF-8 not supported!", e);
}
}
}
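
The editor above turns a text value into MongoCredential objects. A standalone sketch of the expected format; the user, password and database names are invented, and the cast assumes the editor stores a List of credentials, as the parsing code suggests.

import java.util.List;

import com.mongodb.MongoCredential;

import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

class CredentialEditorExample {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();

        // Format: 'username:password@database'; options such as the auth mechanism
        // may follow after '?'.
        editor.setAsText("jon:warg@snow");

        List<MongoCredential> credentials = (List<MongoCredential>) editor.getValue();
        System.out.println(credentials.get(0).getUserName()); // prints "jon"
    }
}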

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,12 +32,14 @@ import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.ConnectionString;
import com.mongodb.Mongo;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
* {@link BeanDefinitionParser} to parse {@code db-factory} elements into {@link BeanDefinition}s.
@@ -62,6 +64,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -70,15 +76,18 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder
.genericBeanDefinition(SimpleMongoClientDatabaseFactory.class);
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
BeanDefinition mongoUri = getConnectionString(element, parserContext);
BeanDefinition mongoUri = getMongoUri(element, parserContext);
if (mongoUri != null) {
@@ -88,8 +97,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String mongoRef = element.getAttribute("mongo-client-ref");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
// Defaulting
@@ -111,8 +119,8 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
* Registers a default {@link BeanDefinition} of a {@link com.mongodb.client.MongoClient} instance and returns the
* name under which the {@link com.mongodb.client.MongoClient} instance was registered under.
* Registers a default {@link BeanDefinition} of a {@link Mongo} instance and returns the name under which the
* {@link Mongo} instance was registered under.
*
* @param element must not be {@literal null}.
* @param parserContext must not be {@literal null}.
@@ -128,7 +136,8 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
* Creates a {@link BeanDefinition} for a {@link ConnectionString} depending on configured attributes. <br />
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes. <br />
* Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
* {@literal write-concern} and/or {@literal id}.
*
@@ -137,19 +146,11 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
* @return {@literal null} in case no client-/uri defined.
*/
@Nullable
private BeanDefinition getConnectionString(Element element, ParserContext parserContext) {
private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
String type = null;
boolean hasClientUri = element.hasAttribute("client-uri");
if (element.hasAttribute("client-uri")) {
type = "client-uri";
} else if (element.hasAttribute("connection-string")) {
type = "connection-string";
} else if (element.hasAttribute("uri")) {
type = "uri";
}
if (!StringUtils.hasText(type)) {
if (!hasClientUri && !element.hasAttribute("uri")) {
return null;
}
@@ -163,12 +164,16 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error("Configure either MongoDB " + type + " or details individually",
parserContext.getReaderContext().error(
"Configure either " + (hasClientUri ? "Mongo Client URI" : "Mongo URI") + " or details individually!",
parserContext.extractSource(element));
}
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(ConnectionString.class);
builder.addConstructorArgValue(element.getAttribute(type));
Class<?> type = MongoClientURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
}
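
The hunk above swaps the URI abstraction between the two branches: the 3.x parser builds a ConnectionString bean, while the 2.2 parser builds a MongoClientURI bean. A tiny sketch of the two types side by side; the URI itself is an assumption, and MongoClientURI requires the legacy driver artifact on the classpath.

import com.mongodb.ConnectionString;
import com.mongodb.MongoClientURI;

class UriTypesExample {

    public static void main(String[] args) {
        String uri = "mongodb://localhost:27017/acme"; // assumed URI

        // 2.2 side: 'uri' / 'client-uri' attributes end up as a MongoClientURI bean.
        MongoClientURI clientUri = new MongoClientURI(uri);

        // 3.x side: the same value is parsed into a ConnectionString instead.
        ConnectionString connectionString = new ConnectionString(uri);

        System.out.println(clientUri.getDatabase() + " / " + connectionString.getDatabase());
    }
}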

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,10 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,12 +22,9 @@ import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionValidationException;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientSettingsFactoryBean;
import org.springframework.data.mongodb.core.MongoServerApiFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -40,97 +37,66 @@ import org.w3c.dom.Element;
* @author Christoph Strobl
* @author Mark Paluch
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
/**
* Parses the {@code mongo:client-settings} sub-element. Populates the given attribute factory with the proper
* Parses the mongo replica-set element.
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
* @return
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element
* @param mongoClientBuilder
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 3.0
* @since 1.7
*/
public static boolean parseMongoClientSettings(Element element, BeanDefinitionBuilder mongoClientBuilder) {
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element settingsElement = DomUtils.getChildElementByTagName(element, "client-settings");
if (settingsElement == null) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientSettingsFactoryBean.class);
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, settingsElement, "application-name", "applicationName");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "read-concern", "readConcern");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "retry-reads", "retryReads");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "retry-writes", "retryWrites");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "uuid-representation", "uUidRepresentation");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "encryption-settings-ref", "autoEncryptionSettings");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
// SocketSettings
setPropertyValue(clientOptionsDefBuilder, settingsElement, "socket-connect-timeout", "socketConnectTimeoutMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "socket-read-timeout", "socketReadTimeoutMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "socket-receive-buffer-size", "socketReceiveBufferSize");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "socket-send-buffer-size", "socketSendBufferSize");
// Server Settings
setPropertyValue(clientOptionsDefBuilder, settingsElement, "server-heartbeat-frequency",
"serverHeartbeatFrequencyMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "server-min-heartbeat-frequency",
"serverMinHeartbeatFrequencyMS");
// Cluster Settings
setPropertyValue(clientOptionsDefBuilder, settingsElement, "cluster-srv-host", "clusterSrvHost");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "cluster-hosts", "clusterHosts");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "cluster-connection-mode", "clusterConnectionMode");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "cluster-type", "custerRequiredClusterType");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "cluster-local-threshold", "clusterLocalThresholdMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "cluster-server-selection-timeout",
"clusterServerSelectionTimeoutMS");
// Connection Pool Settings
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-max-size", "poolMaxSize");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-min-size", "poolMinSize");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-max-wait-time", "poolMaxWaitTimeMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-max-connection-life-time",
"poolMaxConnectionLifeTimeMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-max-connection-idle-time",
"poolMaxConnectionIdleTimeMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-maintenance-initial-delay",
"poolMaintenanceInitialDelayMS");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "connection-pool-maintenance-frequency",
"poolMaintenanceFrequencyMS");
// SSL Settings
setPropertyValue(clientOptionsDefBuilder, settingsElement, "ssl-enabled", "sslEnabled");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "ssl-invalid-host-name-allowed",
"sslInvalidHostNameAllowed");
setPropertyValue(clientOptionsDefBuilder, settingsElement, "ssl-provider", "sslProvider");
// Field level encryption
setPropertyReference(clientOptionsDefBuilder, settingsElement, "encryption-settings-ref", "autoEncryptionSettings");
// ServerAPI
if (StringUtils.hasText(settingsElement.getAttribute("server-api-version"))) {
MongoServerApiFactoryBean serverApiFactoryBean = new MongoServerApiFactoryBean();
serverApiFactoryBean.setVersion(settingsElement.getAttribute("server-api-version"));
try {
clientOptionsDefBuilder.addPropertyValue("serverApi", serverApiFactoryBean.getObject());
} catch (Exception exception) {
throw new BeanDefinitionValidationException("Non parsable server-api.", exception);
}
} else {
setPropertyReference(clientOptionsDefBuilder, settingsElement, "server-api-ref", "serverApi");
}
// and the rest
mongoClientBuilder.addPropertyValue("mongoClientSettings", clientOptionsDefBuilder.getBeanDefinition());
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
@@ -152,24 +118,6 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadConcernPropertyEditor}.
*
* @return
* @since 3.0
*/
static BeanDefinitionBuilder getReadConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<>();
customEditors.put("com.mongodb.ReadConcern", ReadConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
@@ -177,7 +125,7 @@ abstract class MongoParsingUtils {
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<>();
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
@@ -195,7 +143,7 @@ abstract class MongoParsingUtils {
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<>();
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
@@ -221,41 +169,4 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ConnectionStringPropertyEditor}.
*
* @return
* @since 3.0
*/
static BeanDefinitionBuilder getConnectionStringPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<>();
customEditors.put("com.mongodb.ConnectionString", ConnectionStringPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ConnectionStringPropertyEditor}.
*
* @return
* @since 3.0
*/
static BeanDefinitionBuilder getUUidRepresentationEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<>();
customEditors.put("org.bson.UuidRepresentation", UUidRepresentationPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,10 @@ import org.w3c.dom.Element;
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -47,6 +51,10 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -1,53 +0,0 @@
/*
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
/**
* Simple helper to be able to wire the {@link PersistentEntities} from a {@link MappingMongoConverter} bean available
* in the application context.
*
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 3.1
*/
public class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEntities> {
private final MappingMongoConverter converter;
/**
* Creates a new {@link PersistentEntitiesFactoryBean} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public PersistentEntitiesFactoryBean(MappingMongoConverter converter) {
this.converter = converter;
}
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;
}
}
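A short usage sketch for the factory bean above, assuming a MappingMongoConverter bean named mappingMongoConverter is already available from the application context:

PersistentEntitiesFactoryBean factoryBean = new PersistentEntitiesFactoryBean(mappingMongoConverter); // assumed bean
PersistentEntities entities = factoryBean.getObject();
// equivalent to PersistentEntities.of(mappingMongoConverter.getMappingContext())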

View File

@@ -1,81 +0,0 @@
/*
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableReactiveMongoAuditing} annotation.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 3.1
*/
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class;
}
@Override
protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler";
}
@Override
protected void postProcess(BeanDefinitionBuilder builder, AuditingConfiguration configuration,
BeanDefinitionRegistry registry) {
builder.setFactoryMethod("from").addConstructorArgReference("mongoMappingContext");
}
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null");
return configureDefaultAuditHandlerAttributes(configuration,
BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class));
}
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(auditingHandlerDefinition.getSource());
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
}

View File

@@ -1,44 +0,0 @@
/*
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
import com.mongodb.ReadConcern;
import com.mongodb.ReadConcernLevel;
/**
* Parse a {@link String} into a {@link ReadConcern}, provided it is a well-known {@link String} as identified by
* {@link ReadConcernLevel#fromString(String)}.
*
* @author Christoph Strobl
* @since 3.0
*/
public class ReadConcernPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(@Nullable String readConcernString) {
if (!StringUtils.hasText(readConcernString)) {
return;
}
setValue(new ReadConcern(ReadConcernLevel.fromString(readConcernString)));
}
}
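A self-contained sketch of how this editor behaves; "majority" is one of the levels ReadConcernLevel.fromString(String) understands:

ReadConcernPropertyEditor editor = new ReadConcernPropertyEditor();
editor.setAsText("majority"); // resolved via ReadConcernLevel.fromString(..)
ReadConcern readConcern = (ReadConcern) editor.getValue();
editor.setAsText("");         // blank input is ignored, the previously set value is kept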

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,10 @@ import com.mongodb.ReadPreference;
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,8 +21,8 @@ import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -43,9 +43,13 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
* A port is a number without a leading 0 at the end of the address that is preceded by just a single :.
*/
private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address %s '%s'; Check your replica set configuration";
private static final Log LOG = LogFactory.getLog(ServerAddressPropertyEditor.class);
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address {} '{}'. Check your replica set configuration!";
private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String replicaSetString) {
@@ -68,7 +72,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
if (serverAddresses.isEmpty()) {
throw new IllegalArgumentException(
"Could not resolve at least one server of the replica set configuration; Validate your config");
"Could not resolve at least one server of the replica set configuration! Validate your config!");
}
setValue(serverAddresses.toArray(new ServerAddress[serverAddresses.size()]));
@@ -84,18 +88,14 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private ServerAddress parseServerAddress(String source) {
if (!StringUtils.hasText(source)) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
return null;
}
String[] hostAndPort = extractHostAddressAndPort(source.trim());
if (hostAndPort.length > 2) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
return null;
}
@@ -105,13 +105,9 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
return port == null ? new ServerAddress(hostAddress) : new ServerAddress(hostAddress, port);
} catch (UnknownHostException e) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]);
} catch (NumberFormatException e) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]);
}
return null;
@@ -125,7 +121,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
*/
private String[] extractHostAddressAndPort(String addressAndPortSource) {
Assert.notNull(addressAndPortSource, "Address and port source must not be null");
Assert.notNull(addressAndPortSource, "Address and port source must not be null!");
String[] hostAndPort = addressAndPortSource.split(HOST_PORT_SPLIT_PATTERN);
String hostAddress = hostAndPort[0];
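A hedged sketch of the replica-set string handling above; the host names and ports are illustrative only:

ServerAddressPropertyEditor editor = new ServerAddressPropertyEditor();
editor.setAsText("localhost:27017,localhost:27018"); // entries split on ',' and on the single ':' before the port
ServerAddress[] addresses = (ServerAddress[]) editor.getValue();
// unresolvable entries are logged as warnings and skipped; if none remain, an IllegalArgumentException is thrown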

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2023 the original author or authors.
* Copyright 2012-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,10 @@ import com.mongodb.WriteConcern;
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public WriteConcern convert(String source) {
WriteConcern writeConcern = WriteConcern.valueOf(source);

View File

@@ -1,41 +0,0 @@
/*
* Copyright 2020-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import org.bson.UuidRepresentation;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
/**
* Parse a {@link String} to a {@link UuidRepresentation}.
*
* @author Christoph Strobl
* @since 3.0
*/
public class UUidRepresentationPropertyEditor extends PropertyEditorSupport {
@Override
public void setAsText(@Nullable String value) {
if (!StringUtils.hasText(value)) {
return;
}
setValue(UuidRepresentation.valueOf(value));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,7 +34,7 @@ import com.mongodb.WriteConcern;
public class WriteConcernPropertyEditor extends PropertyEditorSupport {
/**
* Parse a string to a {@link WriteConcern}.
* Parse a string to a List<ServerAddress>
*/
@Override
public void setAsText(@Nullable String writeConcernString) {
@@ -51,5 +51,6 @@ public class WriteConcernPropertyEditor extends PropertyEditorSupport {
// pass on the string to the constructor
setValue(new WriteConcern(writeConcernString));
}
}
}
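A brief sketch of the two paths in setAsText above, a well-known named concern versus the plain constructor fallback; the custom name is hypothetical:

WriteConcernPropertyEditor editor = new WriteConcernPropertyEditor();
editor.setAsText("MAJORITY");     // resolved through WriteConcern.valueOf(..)
WriteConcern majority = (WriteConcern) editor.getValue();
editor.setAsText("myCustomName"); // unknown names are passed on to new WriteConcern(String)
WriteConcern custom = (WriteConcern) editor.getValue();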

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
@@ -22,16 +26,20 @@ import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions.DomainTypeMapping;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.CountOperation;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.Lazy;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
@@ -41,50 +49,34 @@ import org.springframework.lang.Nullable;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
Lazy<AggregationOperationContext> untypedMappingContext;
AggregationUtil(QueryMapper queryMapper,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the context itself if it
* is not {@literal null}, creating a {@link TypeBasedAggregationOperationContext} if the aggregation contains type
* information (is a {@link TypedAggregation}), or using the {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation,
@Nullable AggregationOperationContext context) {
this.queryMapper = queryMapper;
this.mappingContext = mappingContext;
this.untypedMappingContext = Lazy
.of(() -> new RelaxedTypeBasedAggregationOperationContext(Object.class, mappingContext, queryMapper));
}
AggregationOperationContext createAggregationContext(Aggregation aggregation, @Nullable Class<?> inputType) {
DomainTypeMapping domainTypeMapping = aggregation.getOptions().getDomainTypeMapping();
if (domainTypeMapping == DomainTypeMapping.NONE) {
return Aggregation.DEFAULT_CONTEXT;
if (context != null) {
return context;
}
if (!(aggregation instanceof TypedAggregation)) {
if(inputType == null) {
return untypedMappingContext.get();
}
if (domainTypeMapping == DomainTypeMapping.STRICT
&& !aggregation.getPipeline().containsUnionWith()) {
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
return new RelaxedTypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
inputType = ((TypedAggregation<?>) aggregation).getInputType();
if (domainTypeMapping == DomainTypeMapping.STRICT
&& !aggregation.getPipeline().containsUnionWith()) {
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
return new RelaxedTypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
return Aggregation.DEFAULT_CONTEXT;
}
/**
@@ -95,7 +87,12 @@ class AggregationUtil {
* @return
*/
List<Document> createPipeline(Aggregation aggregation, AggregationOperationContext context) {
return aggregation.toPipeline(context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toPipeline(context);
}
return mapAggregationPipeline(aggregation.toPipeline(context));
}
/**
@@ -106,7 +103,63 @@ class AggregationUtil {
* @return
*/
Document createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
return aggregation.toDocument(collection, context);
Document command = aggregation.toDocument(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
/**
* Create a {@code $count} aggregation for {@link Query} and optionally a {@link Class entity class}.
*
* @param query must not be {@literal null}.
* @param entityClass can be {@literal null} if the {@link Query} object is empty.
* @return the {@link Aggregation} pipeline definition to run a {@code $count} aggregation.
*/
Aggregation createCountAggregation(Query query, @Nullable Class<?> entityClass) {
List<AggregationOperation> pipeline = computeCountAggregationPipeline(query, entityClass);
Aggregation aggregation = entityClass != null ? Aggregation.newAggregation(entityClass, pipeline)
: Aggregation.newAggregation(pipeline);
aggregation.withOptions(AggregationOptions.builder().collation(query.getCollation().orElse(null)).build());
return aggregation;
}
private List<AggregationOperation> computeCountAggregationPipeline(Query query, @Nullable Class<?> entityType) {
CountOperation count = Aggregation.count().as("totalEntityCount");
if (query.getQueryObject().isEmpty()) {
return Collections.singletonList(count);
}
Assert.notNull(entityType, "Entity type must not be null!");
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(),
mappingContext.getPersistentEntity(entityType));
CriteriaDefinition criteria = new CriteriaDefinition() {
@Override
public Document getCriteriaObject() {
return mappedQuery;
}
@Nullable
@Override
public String getKey() {
return null;
}
};
return Arrays.asList(Aggregation.match(criteria), count);
}
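For a non-empty query, the helper above effectively produces a $match stage followed by a $count stage. A hedged sketch with a hypothetical lastname criteria:

// roughly: [ { "$match" : { "lastname" : "Doe" } }, { "$count" : "totalEntityCount" } ]
Aggregation countAggregation = Aggregation.newAggregation(
        Aggregation.match(Criteria.where("lastname").is("Doe")),
        Aggregation.count().as("totalEntityCount"));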
private List<Document> mapAggregationPipeline(List<Document> pipeline) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,14 +24,10 @@ import org.springframework.data.util.Pair;
import com.mongodb.bulk.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. Bulk operations are available since MongoDB 2.6 and
* make use of low level bulk commands on the protocol level. This interface defines a fluent API to add multiple single
* operations or list of similar operations in sequence which can then eventually be executed by calling
* Bulk operations for insert/update/remove actions on a collection. These bulks operation are available since MongoDB
* 2.6 and make use of low level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or list of similar operations in sequence which can then eventually be executed by calling
* {@link #execute()}.
* <p>
* Bulk operations are issued as one batch that pulls together all insert, update, and delete operations. Operations
* that require individual operation results such as optimistic locking (using {@code @Version}) are not supported and
* the version field remains not populated.
*
* @author Tobias Trelle
* @author Oliver Gierke
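A usage sketch for the interface documented above, assuming a MongoTemplate bean and a hypothetical Person document class:

BulkOperations bulkOps = mongoTemplate.bulkOps(BulkOperations.BulkMode.ORDERED, Person.class);
bulkOps.insert(new Person("Walter"));                             // hypothetical constructor
bulkOps.updateOne(Query.query(Criteria.where("name").is("Jesse")), Update.update("name", "Jesse Pinkman"));
BulkWriteResult result = bulkOps.execute();                       // issued as a single batch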

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;
@@ -25,7 +27,6 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.messaging.Message;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.OperationType;
@@ -36,29 +37,22 @@ import com.mongodb.client.model.changestream.OperationType;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Myroslav Kosinskyi
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamEvent<T> {
@SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_FULL_DOCUMENT_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "convertedFullDocument");
@SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "convertedFullDocumentBeforeChange");
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "converted");
private final @Nullable ChangeStreamDocument<Document> raw;
private final Class<T> targetType;
private final MongoConverter converter;
// accessed through CONVERTED_FULL_DOCUMENT_UPDATER.
private volatile @Nullable T convertedFullDocument;
// accessed through CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER.
private volatile @Nullable T convertedFullDocumentBeforeChange;
// accessed through CONVERTED_UPDATER.
private volatile @Nullable T converted;
/**
* @param raw can be {@literal null}.
@@ -155,43 +149,27 @@ public class ChangeStreamEvent<T> {
@Nullable
public T getBody() {
if (raw == null || raw.getFullDocument() == null) {
if (raw == null) {
return null;
}
return getConvertedFullDocument(raw.getFullDocument());
}
Document fullDocument = raw.getFullDocument();
/**
* Get the potentially converted {@link ChangeStreamDocument#getFullDocumentBeforeChange() document} before being changed.
*
* @return {@literal null} when {@link #getRaw()} or {@link ChangeStreamDocument#getFullDocumentBeforeChange()} is
* {@literal null}.
* @since 4.0
*/
@Nullable
public T getBodyBeforeChange() {
if (raw == null || raw.getFullDocumentBeforeChange() == null) {
return null;
if (fullDocument == null) {
return targetType.cast(fullDocument);
}
return getConvertedFullDocumentBeforeChange(raw.getFullDocumentBeforeChange());
return getConverted(fullDocument);
}
@SuppressWarnings("unchecked")
private T getConvertedFullDocumentBeforeChange(Document fullDocument) {
return (T) doGetConverted(fullDocument, CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER);
private T getConverted(Document fullDocument) {
return (T) doGetConverted(fullDocument);
}
@SuppressWarnings("unchecked")
private T getConvertedFullDocument(Document fullDocument) {
return (T) doGetConverted(fullDocument, CONVERTED_FULL_DOCUMENT_UPDATER);
}
private Object doGetConverted(Document fullDocument) {
private Object doGetConverted(Document fullDocument, AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> updater) {
Object result = updater.get(this);
Object result = CONVERTED_UPDATER.get(this);
if (result != null) {
return result;
@@ -200,44 +178,25 @@ public class ChangeStreamEvent<T> {
if (ClassUtils.isAssignable(Document.class, fullDocument.getClass())) {
result = converter.read(targetType, fullDocument);
return updater.compareAndSet(this, null, result) ? result : updater.get(this);
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
}
if (converter.getConversionService().canConvert(fullDocument.getClass(), targetType)) {
result = converter.getConversionService().convert(fullDocument, targetType);
return updater.compareAndSet(this, null, result) ? result : updater.get(this);
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
}
throw new IllegalArgumentException(
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
throw new IllegalArgumentException(String.format("No converter found capable of converting %s to %s",
fullDocument.getClass(), targetType));
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ChangeStreamEvent<?> that = (ChangeStreamEvent<?>) o;
if (!ObjectUtils.nullSafeEquals(this.raw, that.raw)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.targetType, that.targetType);
}
@Override
public int hashCode() {
int result = raw != null ? raw.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(targetType);
return result;
}
}
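The converted-body caching above relies on a compare-and-set idiom over a volatile field. A self-contained sketch of the same pattern in isolation (the Holder class and expensiveConversion() are hypothetical):

import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;

class Holder {

    private static final AtomicReferenceFieldUpdater<Holder, Object> CACHE = AtomicReferenceFieldUpdater
            .newUpdater(Holder.class, Object.class, "cached");

    private volatile Object cached; // written only through CACHE

    Object getOrCompute() {
        Object result = CACHE.get(this);
        if (result != null) {
            return result; // converted once, reused afterwards
        }
        result = expensiveConversion();
        // first writer wins; concurrent callers observe the stored value
        return CACHE.compareAndSet(this, null, result) ? result : CACHE.get(this);
    }

    private Object expensiveConversion() {
        return new Object();
    }
}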

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.Arrays;
import java.util.Optional;
@@ -23,6 +25,7 @@ import org.bson.BsonDocument;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
@@ -32,7 +35,6 @@ import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.model.changestream.FullDocumentBeforeChange;
/**
* Options applicable to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Streams</a>. Intended
@@ -41,15 +43,14 @@ import com.mongodb.client.model.changestream.FullDocumentBeforeChange;
*
* @author Christoph Strobl
* @author Mark Paluch
* @author Myroslav Kosinskyi
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamOptions {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable FullDocumentBeforeChange fullDocumentBeforeChangeLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED;
@@ -77,14 +78,6 @@ public class ChangeStreamOptions {
return Optional.ofNullable(fullDocumentLookup);
}
/**
* @return {@link Optional#empty()} if not set.
* @since 4.0
*/
public Optional<FullDocumentBeforeChange> getFullDocumentBeforeChangeLookup() {
return Optional.ofNullable(fullDocumentBeforeChangeLookup);
}
/**
* @return {@link Optional#empty()} if not set.
*/
@@ -159,52 +152,10 @@ public class ChangeStreamOptions {
}
throw new IllegalArgumentException(
"o_O that should actually not happen; The timestamp should be an Instant or a BsonTimestamp but was "
"o_O that should actually not happen. The timestamp should be an Instant or a BsonTimestamp but was "
+ ObjectUtils.nullSafeClassName(timestamp));
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ChangeStreamOptions that = (ChangeStreamOptions) o;
if (!ObjectUtils.nullSafeEquals(this.filter, that.filter)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.resumeToken, that.resumeToken)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.fullDocumentLookup, that.fullDocumentLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.fullDocumentBeforeChangeLookup, that.fullDocumentBeforeChangeLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.collation, that.collation)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.resumeTimestamp, that.resumeTimestamp)) {
return false;
}
return resume == that.resume;
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(filter);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeToken);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentBeforeChangeLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeTimestamp);
result = 31 * result + ObjectUtils.nullSafeHashCode(resume);
return result;
}
/**
* @author Christoph Strobl
* @since 2.2
@@ -235,7 +186,6 @@ public class ChangeStreamOptions {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable FullDocumentBeforeChange fullDocumentBeforeChangeLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED;
@@ -250,7 +200,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder collation(Collation collation) {
Assert.notNull(collation, "Collation must not be null nor empty");
Assert.notNull(collation, "Collation must not be null nor empty!");
this.collation = collation;
return this;
@@ -258,13 +208,13 @@ public class ChangeStreamOptions {
/**
* Set the filter to apply.
* <br />
* <p/>
* Fields on aggregation expression root level are prefixed to map to fields contained in
* {@link ChangeStreamDocument#getFullDocument() fullDocument}. However {@literal operationType}, {@literal ns},
* {@literal documentKey} and {@literal fullDocument} are reserved words that will be omitted, and therefore taken
* as given, during the mapping procedure. You may want to have a look at the
* <a href="https://docs.mongodb.com/manual/reference/change-events/">structure of Change Events</a>.
* <br />
* <p/>
* Use {@link org.springframework.data.mongodb.core.aggregation.TypedAggregation} to ensure filter expressions are
* mapped to domain type fields.
*
@@ -274,7 +224,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder filter(Aggregation filter) {
Assert.notNull(filter, "Filter must not be null");
Assert.notNull(filter, "Filter must not be null!");
this.filter = filter;
return this;
@@ -303,7 +253,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder resumeToken(BsonValue resumeToken) {
Assert.notNull(resumeToken, "ResumeToken must not be null");
Assert.notNull(resumeToken, "ResumeToken must not be null!");
this.resumeToken = resumeToken;
@@ -332,38 +282,12 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder fullDocumentLookup(FullDocument lookup) {
Assert.notNull(lookup, "Lookup must not be null");
Assert.notNull(lookup, "Lookup must not be null!");
this.fullDocumentLookup = lookup;
return this;
}
/**
* Set the {@link FullDocumentBeforeChange} lookup to use.
*
* @param lookup must not be {@literal null}.
* @return this.
* @since 4.0
*/
public ChangeStreamOptionsBuilder fullDocumentBeforeChangeLookup(FullDocumentBeforeChange lookup) {
Assert.notNull(lookup, "Lookup must not be null");
this.fullDocumentBeforeChangeLookup = lookup;
return this;
}
/**
* Return the full document before being changed if it is available.
*
* @return this.
* @since 4.0
* @see #fullDocumentBeforeChangeLookup(FullDocumentBeforeChange)
*/
public ChangeStreamOptionsBuilder returnFullDocumentBeforeChange() {
return fullDocumentBeforeChangeLookup(FullDocumentBeforeChange.WHEN_AVAILABLE);
}
/**
* Set the cluster time to resume from.
*
@@ -372,7 +296,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder resumeAt(Instant resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null");
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp;
return this;
@@ -387,7 +311,7 @@ public class ChangeStreamOptions {
*/
public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null");
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp;
return this;
@@ -433,7 +357,6 @@ public class ChangeStreamOptions {
options.filter = this.filter;
options.resumeToken = this.resumeToken;
options.fullDocumentLookup = this.fullDocumentLookup;
options.fullDocumentBeforeChangeLookup = this.fullDocumentBeforeChangeLookup;
options.collation = this.collation;
options.resumeTimestamp = this.resumeTimestamp;
options.resume = this.resume;
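A hedged usage sketch of the builder assembled above; ChangeStreamOptions.builder() as the entry point and the field values in the filter are assumptions based on the surrounding API:

ChangeStreamOptions options = ChangeStreamOptions.builder()
        .filter(Aggregation.newAggregation(Aggregation.match(Criteria.where("operationType").is("insert"))))
        .fullDocumentLookup(FullDocument.UPDATE_LOOKUP)
        .resumeAt(Instant.now())
        .build();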

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2023 the original author or authors.
* Copyright 2010-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2023 the original author or authors.
* Copyright 2010-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,18 +15,16 @@
*/
package org.springframework.data.mongodb.core;
import lombok.RequiredArgsConstructor;
import java.util.Optional;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.timeseries.GranularityDefinition;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.ValidationAction;
import com.mongodb.client.model.ValidationLevel;
@@ -46,20 +44,29 @@ public class CollectionOptions {
private @Nullable Boolean capped;
private @Nullable Collation collation;
private ValidationOptions validationOptions;
private @Nullable TimeSeriesOptions timeSeriesOptions;
private @Nullable CollectionChangeStreamOptions changeStreamOptions;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated. Can be {@literal null}.
* @param maxDocuments the maximum number of documents in the collection. Can be {@literal null}.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise. Can be {@literal null}.
* @deprecated since 2.0 please use {@link CollectionOptions#empty()} as entry point.
*/
@Deprecated
public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
this(size, maxDocuments, capped, null, ValidationOptions.none());
}
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
@Nullable Collation collation, ValidationOptions validationOptions, @Nullable TimeSeriesOptions timeSeriesOptions,
@Nullable CollectionChangeStreamOptions changeStreamOptions) {
@Nullable Collation collation, ValidationOptions validationOptions) {
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
this.collation = collation;
this.validationOptions = validationOptions;
this.timeSeriesOptions = timeSeriesOptions;
this.changeStreamOptions = changeStreamOptions;
}
/**
@@ -71,9 +78,9 @@ public class CollectionOptions {
*/
public static CollectionOptions just(Collation collation) {
Assert.notNull(collation, "Collation must not be null");
Assert.notNull(collation, "Collation must not be null!");
return new CollectionOptions(null, null, null, collation, ValidationOptions.none(), null, null);
return new CollectionOptions(null, null, null, collation, ValidationOptions.none());
}
/**
@@ -83,33 +90,7 @@ public class CollectionOptions {
* @since 2.0
*/
public static CollectionOptions empty() {
return new CollectionOptions(null, null, null, null, ValidationOptions.none(), null, null);
}
/**
* Quick way to set up {@link CollectionOptions} for a Time Series collection. For more advanced settings use
* {@link #timeSeries(TimeSeriesOptions)}.
*
* @param timeField The name of the property which contains the date in each time series document. Must not be
* {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @see #timeSeries(TimeSeriesOptions)
* @since 3.3
*/
public static CollectionOptions timeSeries(String timeField) {
return empty().timeSeries(TimeSeriesOptions.timeSeries(timeField));
}
/**
* Quick way to set up {@link CollectionOptions} for emitting (pre & post) change events.
*
* @return new instance of {@link CollectionOptions}.
* @see #changeStream(CollectionChangeStreamOptions)
* @see CollectionChangeStreamOptions#preAndPostImages(boolean)
* @since 4.0
*/
public static CollectionOptions emitChangedRevisions() {
return empty().changeStream(CollectionChangeStreamOptions.preAndPostImages(true));
return new CollectionOptions(null, null, null, null, ValidationOptions.none());
}
/**
@@ -120,8 +101,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions capped() {
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions);
}
/**
@@ -132,8 +112,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions maxDocuments(long maxDocuments) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -144,8 +123,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions size(long size) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -156,8 +134,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions collation(@Nullable Collation collation) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -249,7 +226,7 @@ public class CollectionOptions {
*/
public CollectionOptions schemaValidationLevel(ValidationLevel validationLevel) {
Assert.notNull(validationLevel, "ValidationLevel must not be null");
Assert.notNull(validationLevel, "ValidationLevel must not be null!");
return validation(validationOptions.validationLevel(validationLevel));
}
@@ -263,7 +240,7 @@ public class CollectionOptions {
*/
public CollectionOptions schemaValidationAction(ValidationAction validationAction) {
Assert.notNull(validationAction, "ValidationAction must not be null");
Assert.notNull(validationAction, "ValidationAction must not be null!");
return validation(validationOptions.validationAction(validationAction));
}
@@ -276,37 +253,8 @@ public class CollectionOptions {
*/
public CollectionOptions validation(ValidationOptions validationOptions) {
Assert.notNull(validationOptions, "ValidationOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
* Create new {@link CollectionOptions} with the given {@link TimeSeriesOptions}.
*
* @param timeSeriesOptions must not be {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @since 3.3
*/
public CollectionOptions timeSeries(TimeSeriesOptions timeSeriesOptions) {
Assert.notNull(timeSeriesOptions, "TimeSeriesOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
}
/**
* Create new {@link CollectionOptions} with the given {@link TimeSeriesOptions}.
*
* @param changeStreamOptions must not be {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @since 3.3
*/
public CollectionOptions changeStream(CollectionChangeStreamOptions changeStreamOptions) {
Assert.notNull(changeStreamOptions, "ChangeStreamOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions,
changeStreamOptions);
Assert.notNull(validationOptions, "ValidationOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -357,80 +305,6 @@ public class CollectionOptions {
return validationOptions.isEmpty() ? Optional.empty() : Optional.of(validationOptions);
}
/**
* Get the {@link TimeSeriesOptions} if available.
*
* @return {@link Optional#empty()} if not specified.
* @since 3.3
*/
public Optional<TimeSeriesOptions> getTimeSeriesOptions() {
return Optional.ofNullable(timeSeriesOptions);
}
/**
* Get the {@link CollectionChangeStreamOptions} if available.
*
* @return {@link Optional#empty()} if not specified.
* @since 4.0
*/
public Optional<CollectionChangeStreamOptions> getChangeStreamOptions() {
return Optional.ofNullable(changeStreamOptions);
}
@Override
public String toString() {
return "CollectionOptions{" + "maxDocuments=" + maxDocuments + ", size=" + size + ", capped=" + capped
+ ", collation=" + collation + ", validationOptions=" + validationOptions + ", timeSeriesOptions="
+ timeSeriesOptions + ", changeStreamOptions=" + changeStreamOptions + ", disableValidation="
+ disableValidation() + ", strictValidation=" + strictValidation() + ", moderateValidation="
+ moderateValidation() + ", warnOnValidationError=" + warnOnValidationError() + ", failOnValidationError="
+ failOnValidationError() + '}';
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
CollectionOptions that = (CollectionOptions) o;
if (!ObjectUtils.nullSafeEquals(maxDocuments, that.maxDocuments)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(size, that.size)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(capped, that.capped)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(collation, that.collation)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(validationOptions, that.validationOptions)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(timeSeriesOptions, that.timeSeriesOptions)) {
return false;
}
return ObjectUtils.nullSafeEquals(changeStreamOptions, that.changeStreamOptions);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(maxDocuments);
result = 31 * result + ObjectUtils.nullSafeHashCode(size);
result = 31 * result + ObjectUtils.nullSafeHashCode(capped);
result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(validationOptions);
result = 31 * result + ObjectUtils.nullSafeHashCode(timeSeriesOptions);
result = 31 * result + ObjectUtils.nullSafeHashCode(changeStreamOptions);
return result;
}
/**
* Encapsulation of ValidationOptions options.
*
@@ -438,6 +312,7 @@ public class CollectionOptions {
* @author Andreas Zink
* @since 2.1
*/
@RequiredArgsConstructor
public static class ValidationOptions {
private static final ValidationOptions NONE = new ValidationOptions(null, null, null);
@@ -446,13 +321,6 @@ public class CollectionOptions {
private final @Nullable ValidationLevel validationLevel;
private final @Nullable ValidationAction validationAction;
public ValidationOptions(Validator validator, ValidationLevel validationLevel, ValidationAction validationAction) {
this.validator = validator;
this.validationLevel = validationLevel;
this.validationAction = validationAction;
}
/**
* Create an empty {@link ValidationOptions}.
*
@@ -513,7 +381,7 @@ public class CollectionOptions {
/**
* Get the {@code validationAction} to perform.
*
* @return {@link Optional#empty()} if not set.
* @return @return {@link Optional#empty()} if not set.
*/
public Optional<ValidationAction> getValidationAction() {
return Optional.ofNullable(validationAction);
@@ -525,211 +393,5 @@ public class CollectionOptions {
boolean isEmpty() {
return !Optionals.isAnyPresent(getValidator(), getValidationAction(), getValidationLevel());
}
@Override
public String toString() {
return "ValidationOptions{" + "validator=" + validator + ", validationLevel=" + validationLevel
+ ", validationAction=" + validationAction + '}';
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
ValidationOptions that = (ValidationOptions) o;
if (!ObjectUtils.nullSafeEquals(validator, that.validator)) {
return false;
}
if (validationLevel != that.validationLevel)
return false;
return validationAction == that.validationAction;
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(validator);
result = 31 * result + ObjectUtils.nullSafeHashCode(validationLevel);
result = 31 * result + ObjectUtils.nullSafeHashCode(validationAction);
return result;
}
}
/**
* Encapsulation of options applied to define a collection's change stream behaviour.
*
* @author Christoph Strobl
* @since 4.0
*/
public static class CollectionChangeStreamOptions {
private final boolean preAndPostImages;
private CollectionChangeStreamOptions(boolean emitChangedRevisions) {
this.preAndPostImages = emitChangedRevisions;
}
/**
* Output the version of a document before and after changes (the document pre- and post-images).
*
* @return new instance of {@link CollectionChangeStreamOptions}.
*/
public static CollectionChangeStreamOptions preAndPostImages(boolean emitChangedRevisions) {
return new CollectionChangeStreamOptions(emitChangedRevisions);
}
public boolean getPreAndPostImages() {
return preAndPostImages;
}
@Override
public String toString() {
return "CollectionChangeStreamOptions{" + "preAndPostImages=" + preAndPostImages + '}';
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
CollectionChangeStreamOptions that = (CollectionChangeStreamOptions) o;
return preAndPostImages == that.preAndPostImages;
}
@Override
public int hashCode() {
return (preAndPostImages ? 1 : 0);
}
}
/**
* Options applicable to Time Series collections.
*
* @author Christoph Strobl
* @since 3.3
* @see <a href=
* "https://docs.mongodb.com/manual/core/timeseries-collections">https://docs.mongodb.com/manual/core/timeseries-collections</a>
*/
public static class TimeSeriesOptions {
private final String timeField;
private @Nullable final String metaField;
private final GranularityDefinition granularity;
private TimeSeriesOptions(String timeField, @Nullable String metaField, GranularityDefinition granularity) {
Assert.hasText(timeField, "Time field must not be empty or null");
this.timeField = timeField;
this.metaField = metaField;
this.granularity = granularity;
}
/**
* Create a new instance of {@link TimeSeriesOptions} using the given field as its {@literal timeField}, the one
* that contains the date in each time series document. <br />
* {@link Field#name() Annotated fieldnames} will be considered during the mapping process.
*
* @param timeField must not be {@literal null}.
* @return new instance of {@link TimeSeriesOptions}.
*/
public static TimeSeriesOptions timeSeries(String timeField) {
return new TimeSeriesOptions(timeField, null, Granularity.DEFAULT);
}
/**
* Set the name of the field which contains metadata in each time series document. Should not be the {@literal id}
* nor the {@link TimeSeriesOptions#timeSeries(String) timeField} nor point to an {@literal array} or
* {@link java.util.Collection}. <br />
* {@link Field#name() Annotated fieldnames} will be considered during the mapping process.
*
* @param metaField must not be {@literal null}.
* @return new instance of {@link TimeSeriesOptions}.
*/
public TimeSeriesOptions metaField(String metaField) {
return new TimeSeriesOptions(timeField, metaField, granularity);
}
/**
* Select the {@link GranularityDefinition} parameter to define how data in the time series collection is organized.
* Select one that is closest to the time span between incoming measurements.
*
* @return new instance of {@link TimeSeriesOptions}.
* @see Granularity
*/
public TimeSeriesOptions granularity(GranularityDefinition granularity) {
return new TimeSeriesOptions(timeField, metaField, granularity);
}
/**
* @return never {@literal null}.
*/
public String getTimeField() {
return timeField;
}
/**
* @return can be {@literal null}. Might be an {@literal empty} {@link String} as well, so maybe check via
* {@link org.springframework.util.StringUtils#hasText(String)}.
*/
@Nullable
public String getMetaField() {
return metaField;
}
/**
* @return never {@literal null}.
*/
public GranularityDefinition getGranularity() {
return granularity;
}
@Override
public String toString() {
return "TimeSeriesOptions{" + "timeField='" + timeField + '\'' + ", metaField='" + metaField + '\''
+ ", granularity=" + granularity + '}';
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
TimeSeriesOptions that = (TimeSeriesOptions) o;
if (!ObjectUtils.nullSafeEquals(timeField, that.timeField)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(metaField, that.metaField)) {
return false;
}
return ObjectUtils.nullSafeEquals(granularity, that.granularity);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(timeField);
result = 31 * result + ObjectUtils.nullSafeHashCode(metaField);
result = 31 * result + ObjectUtils.nullSafeHashCode(granularity);
return result;
}
}
}
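Two usage sketches for the fluent API above, one capped collection and one time-series collection; the sizes, field names, and granularity constant are illustrative assumptions:

CollectionOptions capped = CollectionOptions.empty()
        .capped()
        .size(1024 * 1024)   // preallocated size in bytes
        .maxDocuments(1000); // FIFO eviction once the limit is reached

CollectionOptions measurements = CollectionOptions.empty()
        .timeSeries(CollectionOptions.TimeSeriesOptions.timeSeries("timestamp")
                .metaField("sensorId")
                .granularity(Granularity.MINUTES));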

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.client.MongoCollection;
/**
* Interface for functional preparation of a {@link MongoCollection}.
*
* @author Mark Paluch
* @since 4.1
*/
public interface CollectionPreparer<T> {
/**
* Returns a preparer that always returns its input collection.
*
* @return a preparer that always returns its input collection.
*/
static <T> CollectionPreparer<T> identity() {
return it -> it;
}
/**
* Prepare the {@code collection}.
*
* @param collection the collection to prepare.
* @return the prepared collection.
*/
T prepare(T collection);
/**
* Returns a composed {@code CollectionPreparer} that first applies this preparer to the collection, and then applies
* the {@code after} preparer to the result. If evaluation of either function throws an exception, it is relayed to
* the caller of the composed function.
*
* @param after the collection preparer to apply after this function is applied.
* @return a composed {@code CollectionPreparer} that first applies this preparer and then applies the {@code after}
* preparer.
*/
default CollectionPreparer<T> andThen(CollectionPreparer<T> after) {
Assert.notNull(after, "After CollectionPreparer must not be null");
return c -> after.prepare(prepare(c));
}
}
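A small sketch of composing preparers via identity() and andThen(); the read settings used here are only examples, and the collection variable is assumed to come from the driver:

CollectionPreparer<MongoCollection<Document>> preparer = CollectionPreparer
        .<MongoCollection<Document>> identity()
        .andThen(it -> it.withReadPreference(ReadPreference.secondaryPreferred()))
        .andThen(it -> it.withReadConcern(ReadConcern.MAJORITY));

MongoCollection<Document> prepared = preparer.prepare(collection);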

View File

@@ -1,182 +0,0 @@
/*
* Copyright 2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import java.util.function.BiFunction;
import java.util.function.Function;
import org.bson.Document;
import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
import com.mongodb.client.MongoCollection;
/**
* Support class for delegate implementations to apply {@link ReadConcern} and {@link ReadPreference} settings upon
* {@link CollectionPreparer preparing a collection}.
*
* @author Mark Paluch
* @since 4.1
*/
class CollectionPreparerSupport implements ReadConcernAware, ReadPreferenceAware {
private final List<Object> sources;
private CollectionPreparerSupport(List<Object> sources) {
this.sources = sources;
}
<T> T doPrepare(T collection, Function<T, ReadConcern> concernAccessor, BiFunction<T, ReadConcern, T> concernFunction,
Function<T, ReadPreference> preferenceAccessor, BiFunction<T, ReadPreference, T> preferenceFunction) {
T collectionToUse = collection;
for (Object source : sources) {
if (source instanceof ReadConcernAware rca && rca.hasReadConcern()) {
ReadConcern concern = rca.getReadConcern();
if (concernAccessor.apply(collectionToUse) != concern) {
collectionToUse = concernFunction.apply(collectionToUse, concern);
}
break;
}
}
for (Object source : sources) {
if (source instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
ReadPreference preference = rpa.getReadPreference();
if (preferenceAccessor.apply(collectionToUse) != preference) {
collectionToUse = preferenceFunction.apply(collectionToUse, preference);
}
break;
}
}
return collectionToUse;
}
@Override
public boolean hasReadConcern() {
for (Object aware : sources) {
if (aware instanceof ReadConcernAware rca && rca.hasReadConcern()) {
return true;
}
}
return false;
}
@Override
public ReadConcern getReadConcern() {
for (Object aware : sources) {
if (aware instanceof ReadConcernAware rca && rca.hasReadConcern()) {
return rca.getReadConcern();
}
}
return null;
}
@Override
public boolean hasReadPreference() {
for (Object aware : sources) {
if (aware instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
return true;
}
}
return false;
}
@Override
public ReadPreference getReadPreference() {
for (Object aware : sources) {
if (aware instanceof ReadPreferenceAware rpa && rpa.hasReadPreference()) {
return rpa.getReadPreference();
}
}
return null;
}
static class CollectionPreparerDelegate extends CollectionPreparerSupport
implements CollectionPreparer<MongoCollection<Document>> {
private CollectionPreparerDelegate(List<Object> sources) {
super(sources);
}
public static CollectionPreparerDelegate of(ReadPreferenceAware... awares) {
return of((Object[]) awares);
}
public static CollectionPreparerDelegate of(Object... mixedAwares) {
if (mixedAwares.length == 1 && mixedAwares[0] instanceof CollectionPreparerDelegate) {
return (CollectionPreparerDelegate) mixedAwares[0];
}
return new CollectionPreparerDelegate(Arrays.asList(mixedAwares));
}
@Override
public MongoCollection<Document> prepare(MongoCollection<Document> collection) {
return doPrepare(collection, MongoCollection::getReadConcern, MongoCollection::withReadConcern,
MongoCollection::getReadPreference, MongoCollection::withReadPreference);
}
}
static class ReactiveCollectionPreparerDelegate extends CollectionPreparerSupport
implements CollectionPreparer<com.mongodb.reactivestreams.client.MongoCollection<Document>> {
private ReactiveCollectionPreparerDelegate(List<Object> sources) {
super(sources);
}
public static ReactiveCollectionPreparerDelegate of(ReadPreferenceAware... awares) {
return of((Object[]) awares);
}
public static ReactiveCollectionPreparerDelegate of(Object... mixedAwares) {
if (mixedAwares.length == 1 && mixedAwares[0] instanceof CollectionPreparerDelegate) {
return (ReactiveCollectionPreparerDelegate) mixedAwares[0];
}
return new ReactiveCollectionPreparerDelegate(Arrays.asList(mixedAwares));
}
@Override
public com.mongodb.reactivestreams.client.MongoCollection<Document> prepare(
com.mongodb.reactivestreams.client.MongoCollection<Document> collection) {
return doPrepare(collection, //
com.mongodb.reactivestreams.client.MongoCollection::getReadConcern,
com.mongodb.reactivestreams.client.MongoCollection::withReadConcern,
com.mongodb.reactivestreams.client.MongoCollection::getReadPreference,
com.mongodb.reactivestreams.client.MongoCollection::withReadPreference);
}
}
}
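For illustration only (the delegate is a package-private framework internal, so this sketch assumes it lives in the same package): a single ReadPreferenceAware source turned into a preparer via the delegate; prepare() only swaps the collection when the requested preference differs from the current one.

package org.springframework.data.mongodb.core;

import org.bson.Document;

import com.mongodb.ReadPreference;
import com.mongodb.client.MongoCollection;

class CollectionPreparerDelegateExample {

	static MongoCollection<Document> readFromSecondaries(MongoCollection<Document> collection) {

		// Any ReadPreferenceAware works as a source; this anonymous one is purely illustrative.
		ReadPreferenceAware secondaryReads = new ReadPreferenceAware() {

			@Override
			public boolean hasReadPreference() {
				return true;
			}

			@Override
			public ReadPreference getReadPreference() {
				return ReadPreference.secondaryPreferred();
			}
		};

		return CollectionPreparerSupport.CollectionPreparerDelegate.of(secondaryReads).prepare(collection);
	}
}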

View File

@@ -1,263 +0,0 @@
/*
* Copyright 2019-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.bson.Document;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.MetricConversion;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
/**
* Value object representing a count query. Count queries using {@code $near} or {@code $nearSphere} require a rewrite
* to {@code $geoWithin}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.0
*/
class CountQuery {
private final Document source;
private CountQuery(Document source) {
this.source = source;
}
public static CountQuery of(Document source) {
return new CountQuery(source);
}
/**
* Returns the query {@link Document} that can be used with {@code countDocuments()}. Potentially rewrites the query
* to be usable with {@code countDocuments()}.
*
* @return the query {@link Document} that can be used with {@code countDocuments()}.
*/
public Document toQueryDocument() {
if (!requiresRewrite(source)) {
return source;
}
Document target = new Document();
for (Map.Entry<String, Object> entry : source.entrySet()) {
if (entry.getValue() instanceof Document && requiresRewrite(entry.getValue())) {
Document theValue = (Document) entry.getValue();
target.putAll(createGeoWithin(entry.getKey(), theValue, source.get("$and")));
continue;
}
if (entry.getValue() instanceof Collection && requiresRewrite(entry.getValue())) {
Collection<?> source = (Collection<?>) entry.getValue();
target.put(entry.getKey(), rewriteCollection(source));
continue;
}
if ("$and".equals(entry.getKey()) && target.containsKey("$and")) {
// Expect $and to be processed with Document and createGeoWithin.
continue;
}
target.put(entry.getKey(), entry.getValue());
}
return target;
}
/**
* @param valueToInspect
* @return {@code true} if the enclosing element needs to be rewritten.
*/
private boolean requiresRewrite(Object valueToInspect) {
if (valueToInspect instanceof Document) {
return requiresRewrite((Document) valueToInspect);
}
if (valueToInspect instanceof Collection) {
return requiresRewrite((Collection<?>) valueToInspect);
}
return false;
}
private boolean requiresRewrite(Collection<?> collection) {
for (Object o : collection) {
if (o instanceof Document && requiresRewrite((Document) o)) {
return true;
}
}
return false;
}
private boolean requiresRewrite(Document document) {
if (containsNear(document)) {
return true;
}
for (Object entry : document.values()) {
if (requiresRewrite(entry)) {
return true;
}
}
return false;
}
private Collection<Object> rewriteCollection(Collection<?> source) {
Collection<Object> rewrittenCollection = new ArrayList<>(source.size());
for (Object item : source) {
if (item instanceof Document && requiresRewrite(item)) {
rewrittenCollection.add(CountQuery.of((Document) item).toQueryDocument());
} else {
rewrittenCollection.add(item);
}
}
return rewrittenCollection;
}
/**
* Rewrite the near query for field {@code key} to {@code $geoWithin}.
*
* @param key the queried field.
* @param source source {@link Document}.
* @param $and potentially existing {@code $and} condition.
* @return the rewritten query {@link Document}.
*/
@SuppressWarnings("unchecked")
private static Document createGeoWithin(String key, Document source, @Nullable Object $and) {
boolean spheric = source.containsKey("$nearSphere");
Object $near = spheric ? source.get("$nearSphere") : source.get("$near");
Number maxDistance = getMaxDistance(source, $near, spheric);
List<Object> $centerMax = Arrays.asList(toCenterCoordinates($near), maxDistance);
Document $geoWithinMax = new Document("$geoWithin",
new Document(spheric ? "$centerSphere" : "$center", $centerMax));
if (!containsNearWithMinDistance(source)) {
return new Document(key, $geoWithinMax);
}
Number minDistance = (Number) source.get("$minDistance");
List<Object> $centerMin = Arrays.asList(toCenterCoordinates($near), minDistance);
Document $geoWithinMin = new Document("$geoWithin",
new Document(spheric ? "$centerSphere" : "$center", $centerMin));
List<Document> criteria;
if ($and != null) {
if ($and instanceof Collection) {
Collection<Document> andElements = (Collection<Document>) $and;
criteria = new ArrayList<>(andElements.size() + 2);
criteria.addAll(andElements);
} else {
throw new IllegalArgumentException(
"Cannot rewrite query as it contains an '$and' element that is not a Collection: Offending element: "
+ $and);
}
} else {
criteria = new ArrayList<>(2);
}
criteria.add(new Document("$nor", Collections.singletonList(new Document(key, $geoWithinMin))));
criteria.add(new Document(key, $geoWithinMax));
return new Document("$and", criteria);
}
private static Number getMaxDistance(Document source, Object $near, boolean spheric) {
Number maxDistance = Double.MAX_VALUE;
if (source.containsKey("$maxDistance")) { // legacy coordinate pair
return (Number) source.get("$maxDistance");
}
if ($near instanceof Document nearDoc) {
if (nearDoc.containsKey("$maxDistance")) {
maxDistance = (Number) nearDoc.get("$maxDistance");
// geojson is in Meters but we need radians x/(6378.1*1000)
if (spheric && nearDoc.containsKey("$geometry")) {
maxDistance = MetricConversion.metersToRadians(maxDistance.doubleValue());
}
}
}
return maxDistance;
}
private static boolean containsNear(Document source) {
return source.containsKey("$near") || source.containsKey("$nearSphere");
}
private static boolean containsNearWithMinDistance(Document source) {
if (!containsNear(source)) {
return false;
}
return source.containsKey("$minDistance");
}
private static Object toCenterCoordinates(Object value) {
if (ObjectUtils.isArray(value)) {
return value;
}
if (value instanceof Point) {
return Arrays.asList(((Point) value).getX(), ((Point) value).getY());
}
if (value instanceof Document document) {
if (document.containsKey("x")) {
return Arrays.asList(document.get("x"), document.get("y"));
}
if (document.containsKey("$geometry")) {
Document geoJsonPoint = document.get("$geometry", Document.class);
return geoJsonPoint.get("coordinates");
}
}
return value;
}
}
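The effect of the rewrite is easiest to see on a concrete document. A sketch using the package-private class above (coordinates are illustrative):

package org.springframework.data.mongodb.core;

import java.util.Arrays;

import org.bson.Document;

class CountQueryExample {

	static Document rewriteForCount() {

		// countDocuments() rejects $near/$nearSphere, hence the rewrite to $geoWithin.
		Document nearQuery = new Document("location",
				new Document("$near", Arrays.asList(-73.9667, 40.78)).append("$maxDistance", 0.01));

		// -> { "location" : { "$geoWithin" : { "$center" : [ [ -73.9667, 40.78 ], 0.01 ] } } }
		return CountQuery.of(nearQuery).toQueryDocument();
	}
}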

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2023 the original author or authors.
* Copyright 2002-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -61,8 +61,8 @@ public interface CursorPreparer extends ReadPreferenceAware {
default FindIterable<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindIterable<Document>> find) {
Assert.notNull(collection, "Collection must not be null");
Assert.notNull(find, "Find function must not be null");
Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null!");
if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2023 the original author or authors.
* Copyright 2010-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@@ -24,13 +27,11 @@ import java.util.stream.Collectors;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mapping.callback.EntityCallbacks;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
@@ -45,9 +46,7 @@ import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
@@ -63,8 +62,6 @@ import com.mongodb.client.model.*;
* @author Minsu Kim
* @author Jens Schauder
* @author Michail Nikolaev
* @author Roman Puchkovskiy
* @author Jacob Botuck
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
@@ -74,6 +71,7 @@ class DefaultBulkOperations implements BulkOperations {
private final BulkOperationContext bulkOperationContext;
private final List<SourceAwareWriteModelHolder> models = new ArrayList<>();
private PersistenceExceptionTranslator exceptionTranslator;
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
@@ -90,16 +88,26 @@ class DefaultBulkOperations implements BulkOperations {
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "CollectionName must not be null nor empty");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null");
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(@Nullable PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
@@ -109,10 +117,14 @@ class DefaultBulkOperations implements BulkOperations {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null");
Assert.notNull(document, "Document must not be null!");
maybeEmitEvent(new BeforeConvertEvent<>(document, collectionName));
Object source = maybeInvokeBeforeConvertCallback(document);
@@ -121,30 +133,42 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null");
Assert.notNull(documents, "Documents must not be null!");
documents.forEach(this::insert);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null");
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
@@ -153,20 +177,28 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null");
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
@@ -175,11 +207,19 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
@@ -190,10 +230,14 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(query, "Query must not be null!");
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
@@ -203,10 +247,14 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null");
Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) {
remove(query);
@@ -215,12 +263,16 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#replaceOne(org.springframework.data.mongodb.core.query.Query, java.lang.Object, org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(replacement, "Replacement must not be null");
Assert.notNull(options, "Options must not be null");
Assert.notNull(query, "Query must not be null!");
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(options, "Options must not be null!");
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(options.isUpsert());
@@ -234,6 +286,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public com.mongodb.bulk.BulkWriteResult execute() {
@@ -241,10 +297,9 @@ class DefaultBulkOperations implements BulkOperations {
com.mongodb.bulk.BulkWriteResult result = mongoOperations.execute(collectionName, this::bulkWriteTo);
Assert.state(result != null, "Result must not be null");
Assert.state(result != null, "Result must not be null.");
models.forEach(this::maybeEmitAfterSaveEvent);
models.forEach(this::maybeInvokeAfterSaveCallback);
return result;
} finally {
@@ -258,26 +313,11 @@ class DefaultBulkOperations implements BulkOperations {
collection = collection.withWriteConcern(defaultWriteConcern);
}
try {
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
} catch (RuntimeException ex) {
if (ex instanceof MongoBulkWriteException) {
MongoBulkWriteException mongoBulkWriteException = (MongoBulkWriteException) ex;
if (mongoBulkWriteException.getWriteConcernError() != null) {
throw new DataIntegrityViolationException(ex.getMessage(), ex);
}
throw new BulkOperationException(ex.getMessage(), mongoBulkWriteException);
}
throw ex;
}
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
}
private WriteModel<Document> extractAndMapWriteModel(SourceAwareWriteModelHolder it) {
@@ -308,8 +348,8 @@ class DefaultBulkOperations implements BulkOperations {
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null");
Assert.notNull(update, "Update must not be null");
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
UpdateOptions options = computeUpdateOptions(query, update, upsert);
@@ -381,52 +421,38 @@ class DefaultBulkOperations implements BulkOperations {
models.add(new SourceAwareWriteModelHolder(source, model));
}
private void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder holder) {
private void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder it) {
if (holder.getModel() instanceof InsertOneModel) {
if (it.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(holder.getSource(), target, collectionName));
} else if (holder.getModel() instanceof ReplaceOneModel) {
Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(it.getSource(), target, collectionName));
} else if (it.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
maybeEmitEvent(new BeforeSaveEvent<>(holder.getSource(), target, collectionName));
Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
maybeEmitEvent(new BeforeSaveEvent<>(it.getSource(), target, collectionName));
}
}
private void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder holder) {
private void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder it) {
if (holder.getModel() instanceof InsertOneModel) {
if (it.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
maybeEmitEvent(new AfterSaveEvent<>(holder.getSource(), target, collectionName));
} else if (holder.getModel() instanceof ReplaceOneModel) {
Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
maybeEmitEvent(new AfterSaveEvent<>(it.getSource(), target, collectionName));
} else if (it.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
maybeEmitEvent(new AfterSaveEvent<>(holder.getSource(), target, collectionName));
}
}
private void maybeInvokeAfterSaveCallback(SourceAwareWriteModelHolder holder) {
if (holder.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) holder.getModel()).getDocument();
maybeInvokeAfterSaveCallback(holder.getSource(), target);
} else if (holder.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) holder.getModel()).getReplacement();
maybeInvokeAfterSaveCallback(holder.getSource(), target);
Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
maybeEmitEvent(new AfterSaveEvent<>(it.getSource(), target, collectionName));
}
}
private <E extends MongoMappingEvent<T>, T> E maybeEmitEvent(E event) {
if (bulkOperationContext.getEventPublisher() == null) {
return event;
if (null != bulkOperationContext.getEventPublisher()) {
bulkOperationContext.getEventPublisher().publishEvent(event);
}
bulkOperationContext.getEventPublisher().publishEvent(event);
return event;
}
@@ -449,16 +475,6 @@ class DefaultBulkOperations implements BulkOperations {
collectionName);
}
private Object maybeInvokeAfterSaveCallback(Object value, Document mappedDocument) {
if (bulkOperationContext.getEntityCallbacks() == null) {
return value;
}
return bulkOperationContext.getEntityCallbacks().callback(AfterSaveCallback.class, value, mappedDocument,
collectionName);
}
private static BulkWriteOptions getBulkWriteOptions(BulkMode bulkMode) {
BulkWriteOptions options = new BulkWriteOptions();
@@ -470,7 +486,7 @@ class DefaultBulkOperations implements BulkOperations {
return options.ordered(false);
}
throw new IllegalStateException("BulkMode was null");
throw new IllegalStateException("BulkMode was null!");
}
/**
@@ -504,93 +520,15 @@ class DefaultBulkOperations implements BulkOperations {
* @author Christoph Strobl
* @since 2.0
*/
static final class BulkOperationContext {
@Value
static class BulkOperationContext {
private final BulkMode bulkMode;
private final Optional<? extends MongoPersistentEntity<?>> entity;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private final ApplicationEventPublisher eventPublisher;
private final EntityCallbacks entityCallbacks;
BulkOperationContext(BulkOperations.BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, ApplicationEventPublisher eventPublisher,
EntityCallbacks entityCallbacks) {
this.bulkMode = bulkMode;
this.entity = entity;
this.queryMapper = queryMapper;
this.updateMapper = updateMapper;
this.eventPublisher = eventPublisher;
this.entityCallbacks = entityCallbacks;
}
public BulkMode getBulkMode() {
return this.bulkMode;
}
public Optional<? extends MongoPersistentEntity<?>> getEntity() {
return this.entity;
}
public QueryMapper getQueryMapper() {
return this.queryMapper;
}
public UpdateMapper getUpdateMapper() {
return this.updateMapper;
}
public ApplicationEventPublisher getEventPublisher() {
return this.eventPublisher;
}
public EntityCallbacks getEntityCallbacks() {
return this.entityCallbacks;
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
BulkOperationContext that = (BulkOperationContext) o;
if (bulkMode != that.bulkMode)
return false;
if (!ObjectUtils.nullSafeEquals(this.entity, that.entity)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.queryMapper, that.queryMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.updateMapper, that.updateMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.eventPublisher, that.eventPublisher)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.entityCallbacks, that.entityCallbacks);
}
@Override
public int hashCode() {
int result = bulkMode != null ? bulkMode.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(entity);
result = 31 * result + ObjectUtils.nullSafeHashCode(queryMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(updateMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(eventPublisher);
result = 31 * result + ObjectUtils.nullSafeHashCode(entityCallbacks);
return result;
}
public String toString() {
return "DefaultBulkOperations.BulkOperationContext(bulkMode=" + this.getBulkMode() + ", entity="
+ this.getEntity() + ", queryMapper=" + this.getQueryMapper() + ", updateMapper=" + this.getUpdateMapper()
+ ", eventPublisher=" + this.getEventPublisher() + ", entityCallbacks=" + this.getEntityCallbacks() + ")";
}
@NonNull BulkMode bulkMode;
@NonNull Optional<? extends MongoPersistentEntity<?>> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
ApplicationEventPublisher eventPublisher;
EntityCallbacks entityCallbacks;
}
/**
@@ -599,50 +537,10 @@ class DefaultBulkOperations implements BulkOperations {
* @since 2.2
* @author Christoph Strobl
*/
private static final class SourceAwareWriteModelHolder {
@Value
private static class SourceAwareWriteModelHolder {
private final Object source;
private final WriteModel<Document> model;
SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
this.source = source;
this.model = model;
}
public Object getSource() {
return this.source;
}
public WriteModel<Document> getModel() {
return this.model;
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
SourceAwareWriteModelHolder that = (SourceAwareWriteModelHolder) o;
if (!ObjectUtils.nullSafeEquals(this.source, that.source)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.model, that.model);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(model);
result = 31 * result + ObjectUtils.nullSafeHashCode(source);
return result;
}
public String toString() {
return "DefaultBulkOperations.SourceAwareWriteModelHolder(source=" + this.getSource() + ", model="
+ this.getModel() + ")";
}
Object source;
WriteModel<Document> model;
}
}
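For context on how DefaultBulkOperations is reached from user code, a hedged sketch via MongoOperations#bulkOps; the "people" collection and field names are illustrative:

import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.bulk.BulkWriteResult;

class BulkOperationsExample {

	// Queues an insert and an update in one unordered batch and sends both in a single bulkWrite call.
	static BulkWriteResult insertAndRename(MongoOperations operations, Object newPerson) {

		return operations.bulkOps(BulkMode.UNORDERED, "people")
				.insert(newPerson)
				.updateOne(Query.query(Criteria.where("lastName").is("Matthews")), Update.update("firstName", "Dave"))
				.execute();
	}
}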

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2023 the original author or authors.
* Copyright 2011-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,7 @@ import java.util.List;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
@@ -64,7 +64,7 @@ public class DefaultIndexOperations implements IndexOperations {
* {@link DefaultIndexOperations#DefaultIndexOperations(MongoOperations, String, Class)}.
*/
@Deprecated
public DefaultIndexOperations(MongoDatabaseFactory mongoDbFactory, String collectionName, QueryMapper queryMapper) {
public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper) {
this(mongoDbFactory, collectionName, queryMapper, null);
}
@@ -80,12 +80,12 @@ public class DefaultIndexOperations implements IndexOperations {
* {@link DefaultIndexOperations#DefaultIndexOperations(MongoOperations, String, Class)}.
*/
@Deprecated
public DefaultIndexOperations(MongoDatabaseFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
@Nullable Class<?> type) {
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null");
Assert.notNull(collectionName, "Collection name can not be null");
Assert.notNull(queryMapper, "QueryMapper must not be null");
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.collectionName = collectionName;
this.mapper = queryMapper;
@@ -103,8 +103,8 @@ public class DefaultIndexOperations implements IndexOperations {
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, @Nullable Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.hasText(collectionName, "Collection name must not be null or empty");
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
this.mapper = new QueryMapper(mongoOperations.getConverter());
@@ -112,6 +112,10 @@ public class DefaultIndexOperations implements IndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public String ensureIndex(final IndexDefinition indexDefinition) {
return execute(collection -> {
@@ -146,6 +150,10 @@ public class DefaultIndexOperations implements IndexOperations {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
execute(collection -> {
@@ -155,10 +163,18 @@ public class DefaultIndexOperations implements IndexOperations {
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return execute(new CollectionCallback<List<IndexInfo>>() {
@@ -172,8 +188,7 @@ public class DefaultIndexOperations implements IndexOperations {
private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) {
int available = cursor.available();
List<IndexInfo> indexInfoList = available > 0 ? new ArrayList<>(available) : new ArrayList<>();
List<IndexInfo> indexInfoList = new ArrayList<>();
while (cursor.hasNext()) {
@@ -190,7 +205,7 @@ public class DefaultIndexOperations implements IndexOperations {
@Nullable
public <T> T execute(CollectionCallback<T> callback) {
Assert.notNull(callback, "CollectionCallback must not be null");
Assert.notNull(callback, "CollectionCallback must not be null!");
if (type != null) {
return mongoOperations.execute(type, callback);
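A short usage sketch for the blocking index operations shown above; the "users" collection and "email" field are illustrative:

import java.util.List;

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexInfo;

class IndexOperationsExample {

	// Ensures a unique ascending index on "email", then reads back the collection's index metadata.
	static List<IndexInfo> ensureEmailIndex(MongoOperations operations) {

		operations.indexOps("users").ensureIndex(new Index().on("email", Sort.Direction.ASC).unique());
		return operations.indexOps("users").getIndexInfo();
	}
}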

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2023 the original author or authors.
* Copyright 2016-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,13 +15,13 @@
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
/**
* {@link IndexOperationsProvider} to obtain {@link IndexOperations} from a given {@link MongoDatabaseFactory}.
* {@link IndexOperationsProvider} to obtain {@link IndexOperations} from a given {@link MongoDbFactory}.
*
* @author Mark Paluch
* @author Christoph Strobl
@@ -29,21 +29,25 @@ import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
*/
class DefaultIndexOperationsProvider implements IndexOperationsProvider {
private final MongoDatabaseFactory mongoDbFactory;
private final MongoDbFactory mongoDbFactory;
private final QueryMapper mapper;
/**
* @param mongoDbFactory must not be {@literal null}.
* @param mapper must not be {@literal null}.
*/
DefaultIndexOperationsProvider(MongoDatabaseFactory mongoDbFactory, QueryMapper mapper) {
DefaultIndexOperationsProvider(MongoDbFactory mongoDbFactory, QueryMapper mapper) {
this.mongoDbFactory = mongoDbFactory;
this.mapper = mapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperationsProvider#reactiveIndexOps(java.lang.String)
*/
@Override
public IndexOperations indexOps(String collectionName, Class<?> type) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper, type);
public IndexOperations indexOps(String collectionName) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2023 the original author or authors.
* Copyright 2016-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -76,9 +76,9 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
private DefaultReactiveIndexOperations(ReactiveMongoOperations mongoOperations, String collectionName,
QueryMapper queryMapper, Optional<Class<?>> type) {
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null");
Assert.notNull(collectionName, "Collection must not be null");
Assert.notNull(queryMapper, "QueryMapper must not be null");
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null!");
Assert.notNull(collectionName, "Collection must not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
@@ -86,6 +86,10 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, collection -> {
@@ -115,14 +119,26 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
.orElse(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) //
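The reactive counterpart is used the same way, except that nothing happens until the returned Mono is subscribed to. A hedged sketch with illustrative names:

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.index.Index;

import reactor.core.publisher.Mono;

class ReactiveIndexOperationsExample {

	// Emits the created index name once the server acknowledges the command.
	static Mono<String> ensureEmailIndex(ReactiveMongoOperations operations) {
		return operations.indexOps("users").ensureIndex(new Index().on("email", Sort.Direction.ASC).unique());
	}
}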

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2023 the original author or authors.
* Copyright 2014-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
@@ -64,29 +65,41 @@ class DefaultScriptOperations implements ScriptOperations {
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null");
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null");
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null");
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@@ -102,10 +115,14 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty");
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@@ -118,14 +135,22 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty");
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
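A usage sketch for the script operations above (the script name and body are illustrative). Note that server-side JavaScript evaluation was removed in MongoDB 4.2, so this API only works against older servers:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptOperationsExample {

	// Registers a named server-side function and invokes it by name.
	static Object echo(MongoOperations operations) {

		ScriptOperations scripts = operations.scriptOps();
		scripts.register(new NamedMongoScript("echo", "function(x) { return x; }"));
		return scripts.call("echo", "hello");
	}
}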

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2023 the original author or authors.
* Copyright 2015-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2023 the original author or authors.
* Copyright 2010-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,29 +0,0 @@
/*
* Copyright 2021-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Encryption algorithms supported by MongoDB Client Side Field Level Encryption.
*
* @author Christoph Strobl
* @since 3.3
*/
public final class EncryptionAlgorithms {
public static final String AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic";
public static final String AEAD_AES_256_CBC_HMAC_SHA_512_Random = "AEAD_AES_256_CBC_HMAC_SHA_512-Random";
}
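These constants are meant to be referenced from client-side field level encryption mapping metadata. A hedged sketch, assuming the @Encrypted annotation from org.springframework.data.mongodb.core.mapping introduced in the same 3.3 line; the Patient type and its fields are illustrative:

import org.springframework.data.mongodb.core.EncryptionAlgorithms;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Encrypted;

@Document
class Patient {

	// Deterministic encryption keeps equality queries on the encrypted value possible.
	@Encrypted(algorithm = EncryptionAlgorithms.AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic)
	String ssn;

	// Random encryption is stronger but rules out querying by this field.
	@Encrypted(algorithm = EncryptionAlgorithms.AEAD_AES_256_CBC_HMAC_SHA_512_Random)
	String medicalHistory;
}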

Some files were not shown because too many files have changed in this diff.