Compare commits

..

236 Commits

Author SHA1 Message Date
Jens Schauder
3e97d47248 Release version 3.1.11 (2020.0.11).
See #3681
2021-07-16 10:19:52 +02:00
Jens Schauder
c02840ca30 Prepare 3.1.11 (2020.0.11).
See #3681
2021-07-16 10:18:57 +02:00
Jens Schauder
975768c0d6 Updated changelog.
See #3681
2021-07-16 10:18:49 +02:00
Christoph Strobl
323ec3f1d6 Polishing.
Simplify KeyMapper current property/index setup.

Original Pull Request: #3689
2021-07-06 12:32:08 +02:00
David Julia
e480ceb2b7 Fix Regression in generating queries with nested maps with numeric keys.
Maps with numeric keys work when a query contains only one such map, but queries containing multiple maps with numeric keys fail.

Take the following example of a map called outer with numeric keys, holding a reference to another object with a map called inner that also has numeric keys: updates that are meant to generate {"$set": {"outerMap.1234.inner.5678": "hello"}} instead generate {"$set": {"outerMap.1234.inner.inner": "hello"}}, repeating the latter map's property name instead of using the integer key value.

This commit adds unit tests both for the UpdateMapper and QueryMapper, which check multiple consecutive maps with numeric keys, and adds a fix in the KeyMapper. Because we cannot easily change the path parsing to somehow parse path parts corresponding to map keys differently, we address the issue in the KeyMapper. We keep track of the partial path corresponding to the current property and use it to skip adding the duplicated property name for the map to the query, and instead add the key.

This is a bit redundant in that we now have both an iterator and an index-based way of accessing the path parts, but it gets the tests passing and fixes the issue without making a large change to the current approach.

Fixes: #3688
Original Pull Request: #3689
2021-07-06 12:12:46 +02:00
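
For illustration, a minimal sketch (not part of the commit) of the kind of update the fix addresses, assuming hypothetical domain types Outer (with Map<String, Inner> outerMap) and Inner (with Map<String, String> inner):

    // Java sketch; Outer/Inner and the template variable are illustrative.
    Update update = new Update().set("outerMap.1234.inner.5678", "hello");
    template.updateFirst(new Query(), update, Outer.class);
    // expected mapping: {"$set": {"outerMap.1234.inner.5678": "hello"}}
    // before the fix:   {"$set": {"outerMap.1234.inner.inner": "hello"}}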
Mark Paluch
9fedd8d8c3 Updated changelog.
See #3650
2021-06-22 16:07:27 +02:00
Mark Paluch
baddae25da After release cleanups.
See #3649
2021-06-22 15:28:51 +02:00
Mark Paluch
5064cf3e9a Prepare next development iteration.
See #3649
2021-06-22 15:28:47 +02:00
Mark Paluch
81271a4f2f Release version 3.1.10 (2020.0.10).
See #3649
2021-06-22 15:18:22 +02:00
Mark Paluch
c36f9988c7 Prepare 3.1.10 (2020.0.10).
See #3649
2021-06-22 15:17:54 +02:00
Mark Paluch
b49beb08b6 Updated changelog.
See #3649
2021-06-22 15:17:49 +02:00
Mark Paluch
18b0946879 Update reference docs to use correct MongoClient.
Closes #3666
2021-06-22 14:36:58 +02:00
larsw
cf3681f7c2 Add closing quote to GeoJson javadoc.
Closes #3677
2021-06-21 13:58:44 +02:00
Christoph Strobl
49ef3fbc74 Polishing.
Fix typo in class name and make sure MongoTestTemplate uses the configured simple types.

See: #3659
Original pull request: #3661.
2021-06-18 14:22:08 +02:00
Christoph Strobl
01141502a0 Fix query mapper path resolution for types considered simple ones.
spring-projects/spring-data-commons#2293 changed how PersistentProperty paths get resolved and now considers potentially registered converters for them, which made path resolution fail during the query mapping process.
This commit makes sure to catch the corresponding exception and continue with the given user input.

Fixes: #3659
Original pull request: #3661.
2021-06-18 14:14:45 +02:00
Christoph Strobl
5d8f3d5c8b Fix $or / $nor keyword mapping in query mapper.
This commit fixes an issue with the pattern used for detecting $or / $nor, which also matched other keywords such as $floor.

Closes: #3635
Original pull request: #3637.
2021-06-18 13:55:46 +02:00
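
For context, a short illustrative sketch (editorial, not from the commit) of a Criteria that maps to the $or keyword the detection pattern needs to recognize:

    // Java sketch; field names are illustrative.
    Query query = new Query(new Criteria().orOperator(
            Criteria.where("status").is("ACTIVE"),
            Criteria.where("status").is("PENDING")));
    // maps to: { "$or" : [ { "status" : "ACTIVE" }, { "status" : "PENDING" } ] }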
Mark Paluch
5cf801ff8e Polishing.
Add nullability annotation. Return early on null value conversion.

See #3633
Original pull request: #3643.
2021-06-09 14:20:46 +02:00
Christoph Strobl
2f3fb4aea9 Fix NPE in QueryMapper when trying to apply target type on null value.
Closes #3633
Original pull request: #3643.
2021-06-09 14:20:45 +02:00
Mark Paluch
82e05e7e8e Updated changelog.
See #3629
2021-05-14 12:36:39 +02:00
Mark Paluch
5be2e3eea2 After release cleanups.
See #3628
2021-05-14 12:05:20 +02:00
Mark Paluch
54f55e04de Prepare next development iteration.
See #3628
2021-05-14 12:05:16 +02:00
Mark Paluch
9ef1386784 Release version 3.1.9 (2020.0.9).
See #3628
2021-05-14 11:52:23 +02:00
Mark Paluch
696fd725c3 Prepare 3.1.9 (2020.0.9).
See #3628
2021-05-14 11:51:53 +02:00
Mark Paluch
e6fda2ccdd Updated changelog.
See #3628
2021-05-14 11:51:46 +02:00
Greg L. Turnquist
7a24bab9a2 Authenticate with artifactory.
See #3616.
2021-04-22 15:04:01 -05:00
Mark Paluch
38b7fb7105 Updated changelog.
See #3616
2021-04-14 14:40:02 +02:00
Mark Paluch
d42d06e058 After release cleanups.
See #3617
2021-04-14 11:42:08 +02:00
Mark Paluch
e2709abfe0 Prepare next development iteration.
See #3617
2021-04-14 11:42:04 +02:00
Mark Paluch
12b4aab834 Release version 3.1.8 (2020.0.8).
See #3617
2021-04-14 11:33:25 +02:00
Mark Paluch
db06756c8f Prepare 3.1.8 (2020.0.8).
See #3617
2021-04-14 11:32:47 +02:00
Mark Paluch
b319b8a589 Updated changelog.
See #3617
2021-04-14 11:32:42 +02:00
Mark Paluch
a516795759 Updated changelog.
See #3597
2021-04-14 11:17:41 +02:00
Mark Paluch
bab08502a5 Polishing.
Fix nullability annotations for isEqual(…) parameters. Fix generics. Reformat code.

Add tests.

See #3414
Original pull request: #3615.
2021-04-13 09:40:00 +02:00
Clement Petit
3e1f95bc94 Handle nested Pattern and Document in Criteria.equals(…).
Closes #3414
Original pull request: #3615.
2021-04-13 09:39:59 +02:00
Mark Paluch
5c153dc76e Polishing.
Use ObjectUtils for empty check.

See #3623
Original pull request: #3625.
2021-04-13 09:09:42 +02:00
Christoph Strobl
8f4e207d97 Fix NPE in declarative aggregation execution.
This commit fixes an issue where using a simple return type leads to NPE when the actual aggregation result does not contain any values.

Closes: #3623
Original pull request: #3625.
2021-04-13 09:09:42 +02:00
Christoph Strobl
5000a40d72 Fix query mapping resolution of properties using underscore within field name.
Closes: #3601
Original pull request: #3607.
2021-04-09 12:27:08 +02:00
Mark Paluch
fb59f49dae After release cleanups.
See #3598
2021-03-31 18:29:41 +02:00
Mark Paluch
f3c1e014e9 Prepare next development iteration.
See #3598
2021-03-31 18:29:38 +02:00
Mark Paluch
f52cc3be1f Release version 3.1.7 (2020.0.7).
See #3598
2021-03-31 18:19:52 +02:00
Mark Paluch
1bda93858c Prepare 3.1.7 (2020.0.7).
See #3598
2021-03-31 18:19:19 +02:00
Mark Paluch
1808970daf Updated changelog.
See #3598
2021-03-31 18:19:13 +02:00
Mark Paluch
558fc28cce Updated changelog.
See #3595
2021-03-31 17:26:09 +02:00
Mark Paluch
16bef54f11 Use StringUtils.replace(…) instead of String.replaceAll(…) for mapKeyDotReplacement.
We now use StringUtils.replace(…) to replace the map key dot in MappingMongoConverter. StringUtils.replace(…) performs a plain search instead of using a regex, which improves overall performance.

Closes #3613
2021-03-30 14:29:50 +02:00
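
For reference, a minimal sketch (editorial) of where mapKeyDotReplacement comes into play; the converter wiring is the usual MappingMongoConverter setup and the replacement character is illustrative:

    // Java sketch; databaseFactory is an existing MongoDatabaseFactory.
    MappingMongoConverter converter = new MappingMongoConverter(
            new DefaultDbRefResolver(databaseFactory), new MongoMappingContext());
    converter.setMapKeyDotReplacement("#"); // '.' in map keys is swapped on write and restored on read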
Mark Paluch
d68a812e1b Polishing.
Omit StreamUtils usage if input is a collection. Remove superfluous Flux.from(…). Simplify test and migrate test to JUnit 5.

See #3609.
Original pull request: #3611.
2021-03-29 11:02:34 +02:00
Clément Petit
ccb9f111d9 Return saved entity reference instead of original reference.
Make SimpleReactiveMongoRepository#saveAll(Publisher<S>) return the saved entity references instead of the original references.

Closes #3609
Original pull request: #3611.
2021-03-29 10:55:36 +02:00
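
A brief sketch (editorial) of the changed behavior, assuming a hypothetical PersonRepository extending ReactiveMongoRepository<Person, String>:

    // Java sketch; Person and the repository are illustrative.
    Flux<Person> saved = repository.saveAll(Flux.just(new Person("Ada"), new Person("Linus")));
    // The emitted elements are now the saved entities (e.g. carrying generated ids),
    // not the original references passed in via the Publisher.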
Mark Paluch
f64b177c8f Updated changelog.
See #3558
2021-03-17 11:31:32 +01:00
Mark Paluch
c0c7ba767f After release cleanups.
See #3561
2021-03-17 11:02:15 +01:00
Mark Paluch
7639701f3f Prepare next development iteration.
See #3561
2021-03-17 11:02:13 +01:00
Mark Paluch
b39b2591b6 Release version 3.1.6 (2020.0.6).
See #3561
2021-03-17 10:54:12 +01:00
Mark Paluch
65c8317e38 Prepare 3.1.6 (2020.0.6).
See #3561
2021-03-17 10:53:43 +01:00
Mark Paluch
9d0f7bac6a Updated changelog.
See #3561
2021-03-17 10:53:41 +01:00
Mark Paluch
6f50747d21 Updated changelog.
See #3556
2021-03-17 10:35:15 +01:00
Mark Paluch
5cf1578ad3 Polishing.
Move hasValue(…) from DocumentAccessor to BsonUtils. Fix typo in tests.

See: #3590
Original pull request: #3591.
2021-03-15 14:03:33 +01:00
Christoph Strobl
78a59c45ca Fix ShardKey lookup for nested paths.
This commit fixes the lookup of shard key values for nested paths using the dot (.) notation.

Closes: #3590
Original pull request: #3591.
2021-03-15 14:02:49 +01:00
Christoph Strobl
dccdfc8b4d Upgrade MongoDB drivers to 4.1.2
Closes #3589
2021-03-15 09:17:52 +01:00
Mark Paluch
e48239eb8f Remove @Persistent from entity-scan include filters.
We now only scan for entities annotated with `@Document` to avoid inclusion of non-MongoDB entities. Previously, types annotated (or meta-annotated) with `@Persistent` were included as MongoDB entities, which could lead to mapping rule violations.

Closes #3592
2021-03-11 15:08:05 +01:00
Christoph Strobl
c3b4f61d29 Preserve class keyword as Map key during update mapping.
This commit makes sure to skip the class property of Object when mapping maps and their keys inside an Update.

Closes #3566
Original pull request: #3577.
2021-03-02 11:38:20 +01:00
Mark Paluch
22ed860b4a Polishing.
Reformat code. Reduce method visibility in JUnit 5 tests. Add Nullable annotations to address warnings.

See #3568
Original pull request: #3569.
2021-03-02 11:30:26 +01:00
Brice Vandeputte
bf642ad3f7 Translate MongoSocketException subclasses to DataAccessResourceFailureException.
Closes #3568
Original pull request: #3569.
2021-03-02 11:30:26 +01:00
Christoph Strobl
fcd48539ea Remove duplicate JSON Schema section from reference documentation.
Closes: #3573
Original pull request: #3574.
2021-03-01 14:42:33 +01:00
Mark Paluch
bf10f72a57 Polishing.
Simplify assertions.

See #3552.
Original pull request: #3565.
2021-02-22 09:57:03 +01:00
Christoph Strobl
1c652cce1c Preserve numeric keys that exceed Long range when mapping Updates.
This commit makes sure we preserve map keys no matter what.

Closes #3552.
Original pull request: #3565.
2021-02-22 09:57:02 +01:00
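
Illustrative sketch (not from the commit) of an update whose numeric map key exceeds the Long range and must be preserved verbatim:

    // Java sketch; the key below does not fit into a Long and is kept as given.
    Update update = new Update().set("outerMap.18446744073709551616", "value");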
Mark Paluch
dc2de878bc Polishing.
Reformat code. Add since tags.

See #3395
Original pull request: #3554.
2021-02-18 15:08:49 +01:00
Christoph Strobl
00cacc02ac Fix case-insensitive derived In queries on String properties.
We now consider the IgnoreCase part of a derived query when used along with In. Strings are quoted to prevent malicious input from being handed to the server as a regular expression to evaluate.

See #3395
Original pull request: #3554.
2021-02-18 15:08:49 +01:00
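
A sketch (editorial) of a derived query combining In and IgnoreCase, using a hypothetical PersonRepository:

    // Java sketch; Person and lastname are illustrative.
    interface PersonRepository extends MongoRepository<Person, String> {
        // Each element is quoted before being sent to the server, so user input is not evaluated as a regex.
        List<Person> findByLastnameInIgnoreCase(Collection<String> lastnames);
    }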
Christoph Strobl
811c2e5d7b Updated changelog.
See #3560
2021-02-18 11:37:47 +01:00
Christoph Strobl
200f3006bd After release cleanups.
See #3557
2021-02-18 11:12:46 +01:00
Christoph Strobl
1d6bea51ec Prepare next development iteration.
See #3557
2021-02-18 11:12:44 +01:00
Christoph Strobl
7779ded45c Release version 3.1.5 (2020.0.5).
See #3557
2021-02-18 10:59:16 +01:00
Christoph Strobl
918bf7c138 Prepare 3.1.5 (2020.0.5).
See #3557
2021-02-18 10:58:49 +01:00
Christoph Strobl
abe3b9f6d7 Updated changelog.
See #3557
2021-02-18 10:58:42 +01:00
Christoph Strobl
41c453cc83 Updated changelog.
See #3537
2021-02-17 14:20:39 +01:00
Christoph Strobl
77784d88c7 After release cleanups.
See #3536
2021-02-17 13:41:54 +01:00
Christoph Strobl
263c62c880 Prepare next development iteration.
See #3536
2021-02-17 13:41:53 +01:00
Christoph Strobl
24ab8f67bb Release version 3.1.4 (2020.0.4).
See #3536
2021-02-17 12:00:24 +01:00
Christoph Strobl
572ceb867e Prepare 3.1.4 (2020.0.4).
See #3536
2021-02-17 11:59:37 +01:00
Christoph Strobl
b7caea8602 Updated changelog.
See #3536
2021-02-17 11:59:32 +01:00
Christoph Strobl
3696f2144f Updated changelog.
See #3520
2021-02-17 11:34:25 +01:00
Christoph Strobl
b25c8acca6 Updated changelog.
See #3519
2021-02-17 10:58:24 +01:00
Christoph Strobl
00d6271468 Fix DocumentToStringConverter UUID representation when calling toJson.
This commit makes sure to use an Encoder having UuidRepresentation set when calling org.bson.Document#toJson, preventing CodecConfigurationException from being raised.

Future versions will make sure the UUID string representation matches the Java default one.

Closes #3546.
Original pull request: #3551.
2021-02-17 07:47:49 +01:00
Christoph Strobl
bb603ba7b9 Updated reference documentation regarding GeoJsonModule.
Original pull request: #3539.
Closes #3517
2021-02-02 14:50:54 +01:00
Christoph Strobl
02eaa4cbd2 Allow access to mongoDatabaseFactory used in ReactiveMongoTemplate.
By offering a getter method for the ReactiveMongoDatabaseFactory, users subclassing ReactiveMongoTemplate can evaluate the current transaction state via ReactiveMongoDatabaseUtils.isTransactionActive(getDatabaseFactory()).
This change also aligns the reactive and imperative template implementation in that regard.

Closes #3540
Original pull request: #3541.
2021-02-02 14:22:15 +01:00
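
A minimal sketch (editorial) of the use case described above; the getter name is taken from the commit message:

    // Java sketch, inside a ReactiveMongoTemplate subclass (constructors omitted).
    Mono<Boolean> txActive = ReactiveMongoDatabaseUtils.isTransactionActive(getDatabaseFactory());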
Christoph Strobl
7429503c63 Update count vs. estimatedCount documentation.
Closes #3055
Original pull request: #3541.
2021-02-02 14:22:15 +01:00
Christoph Strobl
82f4e2276b Fix Criteria chaining for Criteria.alike().
This commit fixes an issue where an Example probe would not be added to the criteria chain.

Closes #3544
Original pull request: #3549.
2021-02-01 09:14:52 +01:00
Mark Paluch
e1bce7d942 Polishing.
Align type variable naming with imperative extensions (I, O). Add extension without accepting KClass. Update since tags and tests.

See #3508.
Original pull request: #893.
2021-01-27 09:57:39 +01:00
wonwoo
8bf3d395be Add missing ReactiveMongoOperations.aggregate Kotlin extension.
See #3508.
Original pull request: #893.
2021-01-27 09:57:39 +01:00
Christoph Strobl
d3c00a93c0 Update QBE Documentation section.
This commit adds a note explaining scenarios suitable for an UntypedExampleMatcher.

Closes: #3474
Original pull request: #3538.
2021-01-26 15:04:10 +01:00
Christoph Strobl
0aa805e1a2 Fix method names in full text query documentation.
Closes #3525
2021-01-20 08:29:52 +01:00
Christoph Strobl
9dc1df3deb Updated changelog.
See #3521
2021-01-13 15:49:50 +01:00
Christoph Strobl
92a73a5cc0 After release cleanups.
See #3477
2021-01-13 15:01:50 +01:00
Christoph Strobl
910d66afb0 Prepare next development iteration.
See #3477
2021-01-13 15:01:46 +01:00
Christoph Strobl
1b60eae771 Release version 3.1.3 (2020.0.3).
See #3477
2021-01-13 14:18:34 +01:00
Christoph Strobl
60dfdd7de6 Prepare 3.1.3 (2020.0.3).
See #3477
2021-01-13 14:17:43 +01:00
Christoph Strobl
4ec48c376d Updated changelog.
See #3477
2021-01-13 14:17:35 +01:00
Christoph Strobl
dee2ba15a4 Update issue tracker references after GitHub issues migration.
See: #3529
2021-01-12 13:57:14 +01:00
Mark Paluch
c3596a503c Update copyright year to 2021.
Closes #3534
2021-01-12 11:50:08 +01:00
Mark Paluch
7dca5a2218 DATAMONGO-2671 - Polishing.
Fix copyright header.

Original pull request: #897.
2021-01-11 12:15:15 +01:00
Mark Paluch
f493006ec9 DATAMONGO-2671 - Polishing.
Fix copyright header. Add since tags.

Original pull request: #897.
2021-01-11 12:15:15 +01:00
Christoph Strobl
a0d5c2bded DATAMONGO-2671 - Fix dateFromParts millisecond field name.
Use millisecond instead of milliseconds field name.

Related to: https://jira.mongodb.org/browse/DOCS-10652
Original pull request: #897.
2021-01-11 12:14:37 +01:00
Greg L. Turnquist
c697fe57c9 DATAMONGO-2665 - Use Docker hub credentials for all CI jobs. 2020-12-15 15:22:42 -06:00
Mark Paluch
8a23452fdc DATAMONGO-2653 - After release cleanups. 2020-12-09 16:46:53 +01:00
Mark Paluch
bec7023a43 DATAMONGO-2653 - Prepare next development iteration. 2020-12-09 16:46:46 +01:00
Mark Paluch
89a7309707 DATAMONGO-2653 - Release version 3.1.2 (2020.0.2). 2020-12-09 16:01:24 +01:00
Mark Paluch
55ec9c5e5b DATAMONGO-2653 - Prepare 3.1.2 (2020.0.2). 2020-12-09 16:00:53 +01:00
Mark Paluch
cbb4dcb025 DATAMONGO-2653 - Updated changelog. 2020-12-09 16:00:49 +01:00
Mark Paluch
a0bad13505 DATAMONGO-2649 - Updated changelog. 2020-12-09 15:33:27 +01:00
Mark Paluch
d13521ae5d DATAMONGO-2647 - Updated changelog. 2020-12-09 12:42:29 +01:00
Mark Paluch
51f1526214 DATAMONGO-2646 - Updated changelog. 2020-12-09 09:59:14 +01:00
Mark Paluch
d049d96952 DATAMONGO-2663 - Document Spring Data to MongoDB compatibility.
Original Pull Request: #895
2020-12-07 14:39:32 +01:00
Mark Paluch
885bf0eae9 DATAMONGO-2661 - Polishing.
Add ticket reference.

Original pull request: #894.
2020-11-26 11:46:31 +01:00
Yoann de Martino
2a8ffd53d8 DATAMONGO-2661 - Handle nullable types for KPropertyPath.
Original pull request: #894.
2020-11-26 11:46:24 +01:00
Mark Paluch
d54ed61581 DATAMONGO-2648 - After release cleanups. 2020-11-11 12:14:53 +01:00
Mark Paluch
9e76da91a9 DATAMONGO-2648 - Prepare next development iteration. 2020-11-11 12:14:50 +01:00
Mark Paluch
595fde7b04 DATAMONGO-2648 - Release version 3.1.1 (2020.0.1). 2020-11-11 11:58:58 +01:00
Mark Paluch
01f4e73b48 DATAMONGO-2648 - Prepare 3.1.1 (2020.0.1). 2020-11-11 11:58:35 +01:00
Mark Paluch
2934c4886b DATAMONGO-2648 - Updated changelog. 2020-11-11 11:58:18 +01:00
Christoph Strobl
080c798721 DATAMONGO-2644 - ProjectOperation no longer errors on inclusion of default _id field.
Original pull request: #890.
2020-11-10 09:40:49 +01:00
Christoph Strobl
7cfb68e6be DATAMONGO-2635 - Enforce aggregation pipeline mapping.
Avoid using Aggregation.DEFAULT_CONTEXT, which does not map contained values to the corresponding MongoDB representation. We now use a relaxed aggregation context, preserving given field names where possible.

Original pull request: #890.
2020-11-10 09:40:49 +01:00
Mark Paluch
1e24abe8e5 DATAMONGO-2639 - Enable maintenance branch build. 2020-10-29 09:43:58 +01:00
Mark Paluch
a316d156dc DATAMONGO-2639 - After release cleanups. 2020-10-28 16:10:54 +01:00
Mark Paluch
6563b125eb DATAMONGO-2639 - Prepare next development iteration. 2020-10-28 16:10:50 +01:00
Mark Paluch
c9251b1b29 DATAMONGO-2639 - Release version 3.1 GA (2020.0.0). 2020-10-28 15:46:54 +01:00
Mark Paluch
373f07e176 DATAMONGO-2639 - Prepare 3.1 GA (2020.0.0). 2020-10-28 15:46:31 +01:00
Mark Paluch
f5e2bdc7ef DATAMONGO-2639 - Updated changelog. 2020-10-28 15:46:17 +01:00
Mark Paluch
30e63fffe2 DATAMONGO-2625 - Updated changelog. 2020-10-28 15:03:01 +01:00
Mark Paluch
83136b4e60 DATAMONGO-2624 - Updated changelog. 2020-10-28 12:15:04 +01:00
Mark Paluch
56697545a3 DATAMONGO-2641 - Updated changelog. 2020-10-28 11:32:27 +01:00
Robin Dupret
76eecc443e DATAMONGO-2638 - Fix list item rendering in reference documentation.
Original Pull Request: #885
2020-10-27 13:31:45 +01:00
LiangYong
1f81806809 DATAMONGO-2638 - Fix aggregation input parameter syntax in reference documentation.
Original Pull Request: #881
2020-10-27 13:31:35 +01:00
Greg L. Turnquist
2d348be5b2 DATAMONGO-2629 - Use JDK 15 for next CI jobs. 2020-10-26 13:26:11 -05:00
Christoph Strobl
bbbe369093 DATAMONGO-2642 - Upgrade MongoDB drivers to 4.1.1. 2020-10-26 12:46:07 +01:00
Christoph Strobl
5aa29fc7b8 DATAMONGO-2626 - After release cleanups. 2020-10-14 14:48:47 +02:00
Christoph Strobl
05fc6546ff DATAMONGO-2626 - Prepare next development iteration. 2020-10-14 14:48:45 +02:00
Christoph Strobl
2c6e645a3d DATAMONGO-2626 - Release version 3.1 RC2 (2020.0.0). 2020-10-14 14:28:55 +02:00
Christoph Strobl
20f702512b DATAMONGO-2626 - Prepare 3.1 RC2 (2020.0.0). 2020-10-14 14:27:37 +02:00
Christoph Strobl
ad77f23364 DATAMONGO-2626 - Updated changelog. 2020-10-14 14:27:30 +02:00
Mark Paluch
9af8a73290 DATAMONGO-2616 - Polishing.
Reformat code. Merge if-statements.

Original pull request: #889.
2020-10-07 11:35:47 +02:00
Christoph Strobl
aaa4557887 DATAMONGO-2616 - Short circuit id value assignment in MongoConverter.
Original pull request: #889.
2020-10-07 11:35:40 +02:00
Mark Paluch
217be64a77 DATAMONGO-2623 - Polishing.
Avoid nullable method arguments and add assertions. Introduce build() method to AccumulatorFinalizeBuilder to build Accumulator without specifying a finalize function.

Original pull request: #887.
2020-10-07 09:51:08 +02:00
Christoph Strobl
0ef852a8fc DATAMONGO-2623 - Add support for $function and $accumulator aggregation operators.
Original pull request: #887.
2020-10-07 09:50:51 +02:00
Mark Paluch
26f0a1c7f9 DATAMONGO-2622 - Polishing.
Rename AggregationPipeline.requiresRelaxedChecking() to containsUnionWith() to avoid the concept of field validation leaking into AggregationPipeline.

Refactor AggregationOperation implementations to consistently check their type and fall back to the operator check to allow for consistent checks when using custom AggregationOperations.

Original pull request: #886.
2020-10-06 12:09:18 +02:00
Christoph Strobl
230c32041a DATAMONGO-2622 - Add support for $unionWith aggregation stage.
We now support the $unionWith aggregation stage via the UnionWithOperation that performs a union of two collections by combining pipeline results, potentially containing duplicates, into a single result set that is handed over to the next stage.
In order to remove duplicates it is possible to append a GroupOperation right after UnionWithOperation.
If the UnionWithOperation uses a pipeline to process documents, field names within the pipeline will be treated as is. In order to map domain type property names to actual field names (considering potential org.springframework.data.mongodb.core.mapping.Field annotations) make sure the enclosing aggregation is a TypedAggregation and provide the target type for the $unionWith stage via mapFieldsTo(Class).

Original pull request: #886.
2020-10-06 12:09:12 +02:00
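
A short sketch (editorial) of the new stage; the Sale type and collection names are illustrative:

    // Java sketch: union the "sales2020" input collection with "sales2019", then group.
    TypedAggregation<Sale> aggregation = Aggregation.newAggregation(Sale.class,
            UnionWithOperation.unionWith("sales2019").mapFieldsTo(Sale.class),
            Aggregation.group("region").sum("amount").as("total"));
    AggregationResults<Document> results = template.aggregate(aggregation, "sales2020", Document.class);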
Mark Paluch
4548d07826 DATAMONGO-2596 - Polishing.
Refactor KPropertyPath.toString() into KProperty.asPath() extension function to allow rendering simple properties and property paths into a String property path.

Original pull request: #880.
2020-10-06 10:18:52 +02:00
Yoann de Martino
b879ec8c0f DATAMONGO-2596 - Introduce extension to render KProperty/KPropertyPath as property path.
Original pull request: #880.
2020-10-06 10:18:19 +02:00
Mark Paluch
c0581c4943 DATAMONGO-2294 - Polishing.
Reorganize imports after Delomboking. Use for-loop instead of Stream.forEach(…). Add Javadoc to methods. Add since tags.

Simplify tests.

Original pull request: #761.
2020-10-05 17:00:58 +02:00
owen-q
85022d24f3 DATAMONGO-2294 - Support query projections with collection types.
Query include/exclude now accepts a vararg array of fields to specify multiple fields at once.

Original pull request: #761.
2020-10-05 17:00:37 +02:00
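
A brief sketch (editorial) of the extended projection API; field names are illustrative:

    // Java sketch: several fields can now be included or excluded in a single call.
    Query query = new Query(Criteria.where("age").gt(30));
    query.fields().include("firstname", "lastname").exclude("_id");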
Christoph Strobl
b2927ab419 DATAMONGO-2633 - Fix json parsing of nested arrays in ParameterBindingDocumentCodec.
Original pull request: #888.
2020-10-05 15:34:50 +02:00
Mark Paluch
91c39e2825 DATAMONGO-2630 - Add support for suspend repository query methods returning List<T>. 2020-09-22 15:01:09 +02:00
Greg L. Turnquist
965a34efd3 DATAMONGO-2629 - Only test other versions for local changes on main branch. 2020-09-18 11:08:38 -05:00
Mark Paluch
046cbb52a1 DATAMONGO-2608 - After release cleanups. 2020-09-16 14:05:28 +02:00
Mark Paluch
edfd07a3d0 DATAMONGO-2608 - Prepare next development iteration. 2020-09-16 14:05:24 +02:00
Mark Paluch
b4befc36c0 DATAMONGO-2608 - Release version 3.1 RC1 (2020.0.0). 2020-09-16 13:57:41 +02:00
Mark Paluch
6034fc1cbd DATAMONGO-2608 - Prepare 3.1 RC1 (2020.0.0). 2020-09-16 13:57:08 +02:00
Mark Paluch
61f4770b4a DATAMONGO-2608 - Updated changelog. 2020-09-16 13:56:57 +02:00
Mark Paluch
c9cfe7acd6 DATAMONGO-2609 - Updated changelog. 2020-09-16 12:16:30 +02:00
Mark Paluch
415ceeef63 DATAMONGO-2593 - Updated changelog. 2020-09-16 11:20:07 +02:00
Mark Paluch
1bdcb88430 DATAMONGO-2592 - Updated changelog. 2020-09-16 10:38:57 +02:00
Christoph Strobl
1a134aa444 DATAMONGO-2618 - Fix visibility of ReplaceRootDocumentOperation. 2020-09-14 13:43:56 +02:00
Mark Paluch
c1da95f5dc DATAMONGO-2621 - Adapt to changed array assertions in AssertJ. 2020-09-09 15:55:48 +02:00
Christoph Strobl
c9c005400c DATAMONGO-2613 - Polishing.
Use the opportunity to remove public modifiers from touched test class.

Original Pull Request: #883
2020-08-20 09:00:21 +02:00
Michal Kurcius
b388659c3f DATAMONGO-2613 - Fix single element ArrayJsonSchemaObject to document mapping.
Now toDocument calls toDocument on items correctly.

Original Pull Request: #883
2020-08-20 08:59:46 +02:00
Mark Paluch
90aa7b8f89 DATAMONGO-2594 - Updated changelog. 2020-08-12 13:25:47 +02:00
Mark Paluch
542de64711 DATAMONGO-2579 - After release cleanups. 2020-08-12 12:00:22 +02:00
Mark Paluch
88b1f9fcb3 DATAMONGO-2579 - Prepare next development iteration. 2020-08-12 12:00:19 +02:00
Mark Paluch
450365992a DATAMONGO-2579 - Release version 3.1 M2 (2020.0.0). 2020-08-12 11:52:05 +02:00
Mark Paluch
fd25f39236 DATAMONGO-2579 - Prepare 3.1 M2 (2020.0.0). 2020-08-12 11:51:40 +02:00
Mark Paluch
a7e3ed2e37 DATAMONGO-2579 - Updated changelog. 2020-08-12 11:51:28 +02:00
Mark Paluch
5795a507bd DATAMONGO-1836 - Polishing.
Revert constructor change of AggregationOptions to not break existing code. Update since tags. Reformat code.

Align visibility of AggregationOptionsTests with JUnit 5 rules. Update documentation.

Original pull request: #878.
2020-08-06 11:25:41 +02:00
Yadhukrishna S Pai
22bd3e64be DATAMONGO-1836 - Add support for hint in aggregation options.
Original pull request: #878.
2020-08-06 11:25:36 +02:00
Mark Paluch
6e47d5c76e DATAMONGO-2603 - Polishing.
Add missing Deprecated annotation.
2020-08-04 13:35:26 +02:00
Mark Paluch
bfab233d2f DATAMONGO-2603 - Adapt to Reactor 3.4 changes.
Align with ContextView and changes in other operators.
2020-08-04 13:35:26 +02:00
Christoph Strobl
c6f12ef0e2 DATAMONGO-2602 - Upgrade MongoDB drivers to 4.1.0 2020-08-03 17:14:24 +02:00
Mark Paluch
707ad8e232 DATAMONGO-1894 - Polishing.
Preinitialize EvaluationContextProvider with ReactiveQueryMethodEvaluationContextProvider to not require setting properties on vanilla ReactiveMongoRepositoryFactory objects.
2020-07-31 11:44:07 +02:00
Mark Paluch
b1f5717d63 DATAMONGO-2601 - Suppress results for suspended query methods returning kotlin.Unit.
We now discard results for suspended query methods if the return type is kotlin.Unit.

Related ticket: DATACMNS-1779
2020-07-31 11:44:07 +02:00
Mark Paluch
95c9789f43 DATAMONGO-2599 - Eagerly consider enum types as simple types.
MongoSimpleTypes now eagerly checks if a type is a simple one to avoid PersistentEntity registration for ChronoUnit.
2020-07-30 16:19:10 +02:00
Mark Paluch
8e84d397e2 DATAMONGO-2564 - Fix link to code of conduct. 2020-07-28 15:40:26 +02:00
Mark Paluch
2ea3ceda2d DATAMONGO-2598 - Polishing.
Original pull request: #872.
2020-07-28 15:21:05 +02:00
Jay Bryant
6a43f28466 DATAMONGO-2598 - Wording changes.
Removed the language of oppression and violence
and replaced it with more neutral language.

Note that problematic words in the code have
to remain in the docs until the code changes.

Original pull request: #872.
2020-07-28 15:20:55 +02:00
Mark Paluch
a44a0034b7 DATAMONGO-2557 - Polishing.
Inline methods.

Original pull request: #879.
2020-07-27 09:02:05 +02:00
Christoph Strobl
0085c8063a DATAMONGO-2557 - Use configured CodecRegistry when parsing String based queries instead of default one.
Original pull request: #879.
2020-07-27 09:01:58 +02:00
Christoph Strobl
873fffa202 DATAMONGO-1894 - Polishing.
Remove superfluous Optional wrappers and unify SpEL dependency resolution.

Original Pull Request: #874
2020-07-22 14:02:12 +02:00
Mark Paluch
41607b10d0 DATAMONGO-1894 - Introduce cached ExpressionParser.
Original Pull Request: #874
2020-07-22 14:01:47 +02:00
Mark Paluch
66fae82798 DATAMONGO-1894 - Use reactive SpEL extensions for SpEL evaluation in query execution.
Original Pull Request: #874
2020-07-22 14:01:20 +02:00
Christoph Strobl
00aaf2145b DATAMONGO-2591 - Upgrade MongoDB drivers to 4.1.0-rc0. 2020-07-22 13:27:51 +02:00
Mark Paluch
430c166a2b DATAMONGO-2568 - Updated changelog. 2020-07-22 10:37:57 +02:00
Mark Paluch
79c647a4d8 DATAMONGO-2567 - Updated changelog. 2020-07-22 10:08:43 +02:00
Mark Paluch
1b5a22730b DATAMONGO-2566 - Updated changelog. 2020-07-22 09:44:28 +02:00
Christoph Strobl
a8a364c2de DATAMONGO-2586 - Polishing.
Add tests to ensure no reactive auditing callback is registered when using imperative configuration and vice versa.
Update wording and minor code style tweaks.

Original Pull Request: #877
2020-07-17 11:09:35 +02:00
Mark Paluch
6bafcea539 DATAMONGO-2586 - Add support for reactive auditing.
We now provide a fully reactive variant for auditing with EnableReactiveMongoAuditing.

Original Pull Request: #877
2020-07-17 10:42:41 +02:00
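
A minimal configuration sketch (editorial); the auditor value is illustrative:

    // Java sketch using the newly introduced annotation.
    @Configuration
    @EnableReactiveMongoAuditing
    class AuditingConfig {

        @Bean
        ReactiveAuditorAware<String> auditorAware() {
            return () -> Mono.just("system");
        }
    }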
Mark Paluch
2c1a3cf03e DATAMONGO-2536 - Polishing.
Encapsulate skipResults in AggregationOptions. Reformat code. Add override Javadoc.

Original pull request: #876.
2020-07-16 09:42:42 +02:00
Christoph Strobl
6cb89d7452 DATAMONGO-2536 - Add option to skip reading aggregation result.
Introduce dedicated AggregationPipeline to encapsulate pipeline stages.

Original pull request: #876.
2020-07-16 09:42:26 +02:00
Mark Paluch
2026f8729e DATAMONGO-2571 - Polishing.
Reduce test method visibility for JUnit 5.

Original pull request: #873.
2020-07-15 15:33:38 +02:00
Christoph Strobl
bf89400182 DATAMONGO-2571 - Fix regular expression parameter binding for String-based queries.
Original pull request: #873.
2020-07-15 15:33:30 +02:00
Mark Paluch
6c8cb9eb85 DATAMONGO-2490 - Polishing.
Remove unnecessary code. Reuse session-associated collection when logging to avoid unqualified calls to MongoDbFactory.getMongoDatabase(). Create collection before transaction in test for compatibility with older MongoDB servers.

Original pull request: #875.
2020-07-15 15:13:58 +02:00
Christoph Strobl
966504dfa6 DATAMONGO-2490 - Fix dbref fetching during ongoing transaction.
Original pull request: #875.
2020-07-15 15:13:50 +02:00
Mark Paluch
b266bd6feb DATAMONGO-2544 - After release cleanups. 2020-06-25 11:58:22 +02:00
Mark Paluch
a6a4a0b3b6 DATAMONGO-2544 - Prepare next development iteration. 2020-06-25 11:58:19 +02:00
Mark Paluch
2a66cadaa6 DATAMONGO-2544 - Release version 3.1 M1 (2020.0.0). 2020-06-25 11:48:49 +02:00
Mark Paluch
a70629697b DATAMONGO-2544 - Prepare 3.1 M1 (2020.0.0). 2020-06-25 11:48:19 +02:00
Mark Paluch
d52785533d DATAMONGO-2544 - Updated changelog. 2020-06-25 11:48:10 +02:00
Christoph Strobl
cee1d976de DATAMONGO-2285 - Polishing.
Preserve existing logic translating com.mongodb.MongoBulkWriteException that might be thrown by calling MongoOperations.insertAll(Collection), and move bulk operation translation to DefaultBulkOperations.
Along the way, remove the no-longer-used PersistenceExceptionTranslator from DefaultBulkOperations.

Original Pull Request: #862
2020-06-23 13:22:03 +02:00
Jacob Botuck
f907dbc559 DATAMONGO-2285 - Throw BulkOperationException instead of translated one when running bulk operations.
Return BulkOperationException unless it is a writeConcernError, in which case the existing behavior is used.

Original Pull Request: #862
2020-06-23 13:21:24 +02:00
Christoph Strobl
613d085bb7 DATAMONGO-1569 - Polishing.
Update Javadoc and avoid unrelated index creation during tests due to class path scanning.

Original Pull Request: #738
2020-06-22 13:32:03 +02:00
Christoph Strobl
94a64a156f DATAMONGO-1569 - Add partialFilter to Indexed annotation.
Original Pull Request: #738
2020-06-22 13:32:03 +02:00
Dave Perryman
41e60e5c25 DATAMONGO-1569 - Add support for partial filter expressions in compound index annotations
Original Pull Request: #738
2020-06-22 13:32:03 +02:00
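
A sketch (editorial) of the partialFilter attribute added by the two commits above; the entity and filter expression are illustrative:

    // Java sketch; per the commit above, the compound index annotation gains an equivalent attribute.
    @Document
    class Account {
        @Indexed(partialFilter = "{ 'active': true }")
        String email;
        boolean active;
    }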
Christoph Strobl
a254576a6e DATAMONGO-2574 - Polishing.
Fix issue in reactive API and add unit tests.

Original Pull Request: #868
2020-06-22 13:32:03 +02:00
konradend
390b5e7b7e DATAMONGO-2574 - Do not overwrite contentType for GridFsFile.
Original Pull Request: #868
2020-06-22 13:32:03 +02:00
Mark Paluch
66dcb8f662 DATAMONGO-2576 - Upgrade to Hibernate Validator 5.4.3.
Also add javax.el to avoid failures in early expression language initialization.
2020-06-22 13:21:32 +02:00
Christoph Strobl
2eaa2d38af DATAMONGO-2556 - Polishing.
Original pull request: #867.
2020-06-22 13:02:14 +02:00
Christoph Strobl
1290898c2b DATAMONGO-2556 - Add estimatedCount for collections.
The newly introduced methods delegate to the driver's MongoCollection.estimatedDocumentCount.

Original pull request: #867.
2020-06-22 13:02:06 +02:00
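
A one-line sketch (editorial) of the new API; the collection name is illustrative:

    // Java sketch: delegates to the driver's MongoCollection.estimatedDocumentCount().
    long estimate = template.estimatedCount("orders");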
Mark Paluch
c4ae269b14 DATAMONGO-2570 - Polishing.
Add assertions. Use method references where possible.

Original pull request: #871.
2020-06-22 10:39:44 +02:00
Mehran Behnam
a4ef46d641 DATAMONGO-2570 - Change type of count variable to primitive.
This avoids unintentional unboxing.

Original pull request: #871.
2020-06-22 10:39:25 +02:00
Mark Paluch
a27939808b DATAMONGO-2558 - Polishing.
Fixed issue number in tests.

Original pull request: #866.
2020-06-21 20:04:09 +02:00
Mark Paluch
da7fc927fa DATAMONGO-2558 - Add integration test for RxJava 3 repositories.
Update Javadoc and reference documentation.

Original pull request: #866.
2020-06-21 20:03:30 +02:00
Christoph Strobl
f768906684 DATAMONGO-2572 - Remove trailing whitespaces from xsd files.
Original Pull Request: #870
2020-06-17 13:07:00 +02:00
Mark Paluch
61a228f8ac DATAMONGO-2572 - Remove usage of Oppressive Language.
Replaced blacklist with denylist and introduce meta keyword SECONDARY_READS as we no longer use MongoDB API with the initial replication concept.

Original Pull Request: #870
2020-06-17 13:05:45 +02:00
Mark Paluch
837b333d7a DATAMONGO-2543 - Updated changelog. 2020-06-10 14:30:55 +02:00
Mark Paluch
ffd4166b65 DATAMONGO-2533 - Updated changelog. 2020-06-10 12:29:55 +02:00
Mark Paluch
083050fc3c DATAMONGO-2532 - Updated changelog. 2020-06-10 11:48:52 +02:00
Mark Paluch
efbf9c8305 DATAMONGO-2565 - Polishing.
Add unit test to verify behavior. Cleanup code.

Original pull request: #869.
2020-06-10 10:25:25 +02:00
BraveLeeLee
4fdf171f7d DATAMONGO-2565 - Evaluate correct expression when obtaining collation from MongoPersistentEntity.
Original pull request: #869.
2020-06-10 10:25:25 +02:00
Mark Paluch
76990f4915 DATAMONGO-2564 - Use standard Spring code of conduct.
Using https://github.com/spring-projects/.github/blob/master/CODE_OF_CONDUCT.md.
2020-06-09 16:10:27 +02:00
Mark Paluch
4defad57c9 DATAMONGO-2562 - Polishing.
Fix typo in exception message.
2020-06-09 11:18:25 +02:00
Mark Paluch
431f7e6b78 DATAMONGO-2562 - Fix return type detection for suspended Kotlin methods.
See DATACMNS-1738 for further reference.
2020-06-09 11:18:25 +02:00
Mark Paluch
e5157f1c17 DATAMONGO-2560 - Upgrade MongoDB drivers to 4.0.4. 2020-06-08 15:54:53 +02:00
Mark Paluch
af091bf6be DATAMONGO-2542 - Polishing.
Fix nullable annotation.

Original pull request: #863.
2020-05-26 10:32:03 +02:00
Christoph Strobl
84ac0e8a85 DATAMONGO-2542 - Shortcut PersistentPropertyPath resolution during query mapping.
By shortcutting the path resolution we avoid checking keywords like $in against a potential path expression.

Original pull request: #863.
2020-05-26 10:31:57 +02:00
Mark Paluch
2a150adc40 DATAMONGO-2545 - Polishing.
Fix warnings and typos.

Original pull request: #864.
2020-05-26 10:14:18 +02:00
Christoph Strobl
feb3018d19 DATAMONGO-2545 - Fix full Query Document binding resulting from SpEL.
We reenabled annotated queries using a SpEL expression resulting in the actual query document.

Original pull request: #864.
2020-05-26 10:14:14 +02:00
Christoph Strobl
3af7269dbb DATAMONGO-2545 - Fix regression in String query SpEL parameter binding.
We reenabled parameter binding within SpEL using query parameter placeholders ?0, ?1,... instead of their array index [0],[1],...

Original pull request: #864.
2020-05-26 10:14:08 +02:00
Christoph Strobl
af39b422b6 DATAMONGO-2547 - Use target class ClassLoader instead of default CL when creating proxy instances.
Original pull request: #865.
2020-05-26 08:50:01 +02:00
Mark Paluch
8b50778350 DATAMONGO-2553 - Fix reference documentation links.
Remove links to removed documentations sections.
2020-05-19 11:36:37 +02:00
Mark Paluch
ce2f8a7bf5 DATAMONGO-2538 - Polishing.
Reformat code. Convert utility classes to abstract ones.

Original pull request: #861.
2020-05-14 11:02:38 +02:00
Christoph Strobl
26fa0c8285 DATAMONGO-2538 - Delombok source files.
Original pull request: #861.
2020-05-14 11:02:20 +02:00
Mark Paluch
3fda554f81 DATAMONGO-2534 - After release cleanups. 2020-05-12 12:40:30 +02:00
Mark Paluch
c6a86226e0 DATAMONGO-2534 - Prepare next development iteration. 2020-05-12 12:40:28 +02:00
176 changed files with 6098 additions and 1706 deletions


@@ -1,27 +0,0 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing other's private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the https://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at https://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

Jenkinsfile

@@ -3,7 +3,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/2.3.x", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/2.4.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -46,16 +46,16 @@ pipeline {
}
}
}
stage('Publish JDK 14 + MongoDB 4.2') {
stage('Publish JDK 15 + MongoDB 4.2') {
when {
changeset "ci/openjdk14-mongodb-4.2/**"
changeset "ci/openjdk15-mongodb-4.2/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-openjdk14-with-mongodb-4.2.0", "ci/openjdk14-mongodb-4.2/")
def image = docker.build("springci/spring-data-openjdk15-with-mongodb-4.2.0", "ci/openjdk15-mongodb-4.2/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
@@ -68,7 +68,7 @@ pipeline {
stage("test: baseline (jdk8)") {
when {
anyOf {
branch '3.0.x'
branch '3.1.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -76,6 +76,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -85,7 +88,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -94,8 +97,8 @@ pipeline {
stage("Test other configurations") {
when {
anyOf {
branch '3.0.x'
allOf {
branch '3.1.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -105,6 +108,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -114,7 +120,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -126,6 +132,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -135,18 +144,21 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
}
stage("test: baseline (jdk14)") {
stage("test: baseline (jdk15)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -156,7 +168,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -168,7 +180,7 @@ pipeline {
stage('Release to artifactory') {
when {
anyOf {
branch '3.0.x'
branch '3.1.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -185,7 +197,7 @@ pipeline {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
@@ -201,7 +213,7 @@ pipeline {
stage('Publish documentation') {
when {
branch '3.0.x'
branch '3.1.x'
}
agent {
label 'data'
@@ -216,7 +228,7 @@ pipeline {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,distribute ' +
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +

LICENSE.txt

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
https://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -10,7 +10,7 @@ Key functional areas of Spring Data MongoDB are a POJO centric model for interac
== Code of Conduct
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
This project is governed by the https://github.com/spring-projects/.github/blob/e3cc2ff230d8f1dca06535aa6b5a4a23815861d4/CODE_OF_CONDUCT.md[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
== Getting Started


@@ -1,4 +1,4 @@
FROM adoptopenjdk/openjdk14:latest
FROM adoptopenjdk/openjdk15:latest
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.0.7.RELEASE</version>
<version>3.1.11</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.3.7.RELEASE</version>
<version>2.4.11</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.3.7.RELEASE</springdata.commons>
<mongo>4.0.5</mongo>
<springdata.commons>2.4.11</springdata.commons>
<mongo>4.1.2</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -158,11 +158,6 @@
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</pluginRepository>
<pluginRepository>
<id>bintray-plugins</id>
<name>bintray-plugins</name>
<url>https://jcenter.bintray.com</url>
</pluginRepository>
</pluginRepositories>
</project>

settings.xml

@@ -0,0 +1,29 @@
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
https://maven.apache.org/xsd/settings-1.0.0.xsd">
<servers>
<server>
<id>spring-plugins-release</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-snapshot</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-milestone</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-release</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
</servers>
</settings>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.0.7.RELEASE</version>
<version>3.1.11</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.0.7.RELEASE</version>
<version>3.1.11</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.0.7.RELEASE</version>
<version>3.1.11</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -136,6 +136,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava3</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava3}</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
@@ -192,7 +199,14 @@
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>5.2.4.Final</version>
<version>5.4.3.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b11</version>
<scope>test</scope>
</dependency>


@@ -61,8 +61,8 @@ public @interface EnableMongoAuditing {
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
* used for setting creation and modification dates.
* Configures a {@link DateTimeProvider} bean name that allows customizing the timestamp to be used for setting
* creation and modification dates.
*
* @return empty {@link String} by default.
*/
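
For context, a minimal sketch of how the reworded attribute is typically used: the bean name passed to dateTimeProviderRef points at a DateTimeProvider that supplies the timestamp for creation and modification dates. The configuration class and bean name below are illustrative, not part of this change.

import java.time.Instant;
import java.util.Optional;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing(dateTimeProviderRef = "auditingDateTimeProvider")
class AuditingConfig {

	// Supplies the timestamp used for @CreatedDate / @LastModifiedDate fields
	@Bean
	DateTimeProvider auditingDateTimeProvider() {
		return () -> Optional.of(Instant.now());
	}
}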


@@ -0,0 +1,70 @@
/*
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.context.annotation.Import;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.domain.ReactiveAuditorAware;
/**
* Annotation to enable auditing in MongoDB using reactive infrastructure via annotation configuration.
*
* @author Mark Paluch
* @since 3.1
*/
@Inherited
@Documented
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(ReactiveMongoAuditingRegistrar.class)
public @interface EnableReactiveMongoAuditing {
/**
* Configures the {@link ReactiveAuditorAware} bean to be used to lookup the current principal.
*
* @return empty {@link String} by default.
*/
String auditorAwareRef() default "";
/**
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
*
* @return {@literal true} by default.
*/
boolean setDates() default true;
/**
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
*
* @return {@literal true} by default.
*/
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the timestamp to be used for setting
* creation and modification dates.
*
* @return empty {@link String} by default.
*/
String dateTimeProviderRef() default "";
}
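
A hedged usage sketch for the new annotation, assuming a Spring configuration class and an illustrative auditor bean name (neither is part of this change):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.ReactiveAuditorAware;
import org.springframework.data.mongodb.config.EnableReactiveMongoAuditing;

import reactor.core.publisher.Mono;

@Configuration
@EnableReactiveMongoAuditing(auditorAwareRef = "currentAuditor")
class ReactiveAuditingConfig {

	// Resolves the current principal reactively; a fixed value here for brevity
	@Bean
	ReactiveAuditorAware<String> currentAuditor() {
		return () -> Mono.just("system");
	}
}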


@@ -17,7 +17,6 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
@@ -28,14 +27,8 @@ import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
@@ -46,9 +39,6 @@ import org.springframework.util.ClassUtils;
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
@@ -91,7 +81,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
@@ -116,68 +106,6 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEntityCallback.class.getName(), registry);
if (PROJECT_REACTOR_AVAILABLE) {
registerReactiveAuditingEntityCallback(registry, auditingHandlerDefinition.getSource());
}
}
private void registerReactiveAuditingEntityCallback(BeanDefinitionRegistry registry, Object source) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(source);
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
/**
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
*/
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
private final MappingMongoConverter converter;
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
}
}
}


@@ -26,7 +26,6 @@ import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
@@ -140,8 +139,7 @@ public abstract class MongoConfigurationSupport {
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document}.
*
* @param basePackage must not be {@literal null}.
* @return
@@ -161,7 +159,6 @@ public abstract class MongoConfigurationSupport {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {

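To illustrate the narrowed scan: only types carrying @Document (directly or as a meta-annotation) are picked up as initial entities; a type annotated solely with @Persistent no longer qualifies. The entity below is a hypothetical example, not part of the change.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Found by the base-package entity scan because of @Document
@Document("people")
class Person {
	@Id String id;
	String name;
}
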

@@ -0,0 +1,61 @@
/*
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
/**
* Simple helper to be able to wire the {@link PersistentEntities} from a {@link MappingMongoConverter} bean available
* in the application context.
*
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 3.1
*/
class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEntities> {
private final MappingMongoConverter converter;
/**
* Creates a new {@link PersistentEntitiesFactoryBean} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public PersistentEntitiesFactoryBean(MappingMongoConverter converter) {
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;
}
}
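
The factory bean above boils down to the following manual wiring; a hedged sketch only, since the class itself is package-private infrastructure and is normally registered by the auditing registrars.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

@Configuration
class EntitiesConfig {

	// Same result as PersistentEntitiesFactoryBean#getObject()
	@Bean
	PersistentEntities persistentEntities(MappingMongoConverter converter) {
		return PersistentEntities.of(converter.getMappingContext());
	}
}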


@@ -0,0 +1,97 @@
/*
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableReactiveMongoAuditing} annotation.
*
* @author Mark Paluch
* @since 3.1
*/
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(auditingHandlerDefinition.getSource());
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
}


@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
@@ -30,6 +28,7 @@ import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.CountOperation;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
@@ -49,12 +48,18 @@ import org.springframework.util.ObjectUtils;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
AggregationUtil(QueryMapper queryMapper,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.queryMapper = queryMapper;
this.mappingContext = mappingContext;
}
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the context itself if it
* is not {@literal null}, create a {@link TypeBasedAggregationOperationContext} if the aggregation contains type
@@ -71,12 +76,17 @@ class AggregationUtil {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
if (!(aggregation instanceof TypedAggregation)) {
return new RelaxedTypeBasedAggregationOperationContext(Object.class, mappingContext, queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
Class<?> inputType = ((TypedAggregation) aggregation).getInputType();
if (aggregation.getPipeline().containsUnionWith()) {
return new RelaxedTypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
/**
@@ -88,7 +98,7 @@ class AggregationUtil {
*/
List<Document> createPipeline(Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
if (ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toPipeline(context);
}

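In user-facing terms, the reworked branching means a TypedAggregation whose pipeline contains a $unionWith stage is now mapped with the relaxed type-based context (fields from the unioned collection are tolerated), while other typed aggregations keep strict field mapping. A hedged sketch, assuming a MongoOperations instance and a Person document class that are not part of this diff:

import org.bson.Document;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.aggregation.UnionWithOperation;
import org.springframework.data.mongodb.core.query.Criteria;

class UnionWithExample {

	// Person is assumed to be a @Document class with an "active" flag
	AggregationResults<Document> activeAndArchived(MongoOperations operations) {
		TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class,
				Aggregation.match(Criteria.where("active").is(true)),
				UnionWithOperation.unionWith("people_archive"));
		// The $unionWith stage triggers the relaxed context selected above
		return operations.aggregate(aggregation, Document.class);
	}
}
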

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;
@@ -27,6 +25,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.messaging.Message;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.OperationType;
@@ -39,7 +38,6 @@ import com.mongodb.client.model.changestream.OperationType;
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamEvent<T> {
@SuppressWarnings("rawtypes") //
@@ -187,8 +185,8 @@ public class ChangeStreamEvent<T> {
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
}
throw new IllegalArgumentException(String.format("No converter found capable of converting %s to %s",
fullDocument.getClass(), targetType));
throw new IllegalArgumentException(
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
}
/*
@@ -199,4 +197,27 @@ public class ChangeStreamEvent<T> {
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ChangeStreamEvent<?> that = (ChangeStreamEvent<?>) o;
if (!ObjectUtils.nullSafeEquals(this.raw, that.raw)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.targetType, that.targetType);
}
@Override
public int hashCode() {
int result = raw != null ? raw.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(targetType);
return result;
}
}


@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.Arrays;
import java.util.Optional;
@@ -25,7 +23,6 @@ import org.bson.BsonDocument;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
@@ -45,7 +42,6 @@ import com.mongodb.client.model.changestream.FullDocument;
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamOptions {
private @Nullable Object filter;
@@ -156,6 +152,44 @@ public class ChangeStreamOptions {
+ ObjectUtils.nullSafeClassName(timestamp));
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ChangeStreamOptions that = (ChangeStreamOptions) o;
if (!ObjectUtils.nullSafeEquals(this.filter, that.filter)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.resumeToken, that.resumeToken)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.fullDocumentLookup, that.fullDocumentLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.collation, that.collation)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.resumeTimestamp, that.resumeTimestamp)) {
return false;
}
return resume == that.resume;
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(filter);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeToken);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeTimestamp);
result = 31 * result + ObjectUtils.nullSafeHashCode(resume);
return result;
}
/**
* @author Christoph Strobl
* @since 2.2


@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.RequiredArgsConstructor;
import java.util.Optional;
import org.springframework.data.mongodb.core.query.Collation;
@@ -312,7 +310,6 @@ public class CollectionOptions {
* @author Andreas Zink
* @since 2.1
*/
@RequiredArgsConstructor
public static class ValidationOptions {
private static final ValidationOptions NONE = new ValidationOptions(null, null, null);
@@ -321,6 +318,13 @@ public class CollectionOptions {
private final @Nullable ValidationLevel validationLevel;
private final @Nullable ValidationAction validationAction;
public ValidationOptions(Validator validator, ValidationLevel validationLevel, ValidationAction validationAction) {
this.validator = validator;
this.validationLevel = validationLevel;
this.validationAction = validationAction;
}
/**
* Create an empty {@link ValidationOptions}.
*


@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@@ -27,8 +24,9 @@ import java.util.stream.Collectors;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mapping.callback.EntityCallbacks;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
@@ -47,7 +45,9 @@ import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
@@ -64,6 +64,7 @@ import com.mongodb.client.model.*;
* @author Jens Schauder
* @author Michail Nikolaev
* @author Roman Puchkovskiy
* @author Jacob Botuck
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
@@ -73,7 +74,6 @@ class DefaultBulkOperations implements BulkOperations {
private final BulkOperationContext bulkOperationContext;
private final List<SourceAwareWriteModelHolder> models = new ArrayList<>();
private PersistenceExceptionTranslator exceptionTranslator;
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
@@ -97,19 +97,9 @@ class DefaultBulkOperations implements BulkOperations {
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(@Nullable PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
@@ -316,11 +306,26 @@ class DefaultBulkOperations implements BulkOperations {
collection = collection.withWriteConcern(defaultWriteConcern);
}
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
try {
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
} catch (RuntimeException ex) {
if (ex instanceof MongoBulkWriteException) {
MongoBulkWriteException mongoBulkWriteException = (MongoBulkWriteException) ex;
if (mongoBulkWriteException.getWriteConcernError() != null) {
throw new DataIntegrityViolationException(ex.getMessage(), ex);
}
throw new BulkOperationException(ex.getMessage(), mongoBulkWriteException);
}
throw ex;
}
}
private WriteModel<Document> extractAndMapWriteModel(SourceAwareWriteModelHolder it) {
@@ -547,15 +552,93 @@ class DefaultBulkOperations implements BulkOperations {
* @author Christoph Strobl
* @since 2.0
*/
@Value
static class BulkOperationContext {
static final class BulkOperationContext {
@NonNull BulkMode bulkMode;
@NonNull Optional<? extends MongoPersistentEntity<?>> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
ApplicationEventPublisher eventPublisher;
EntityCallbacks entityCallbacks;
private final BulkMode bulkMode;
private final Optional<? extends MongoPersistentEntity<?>> entity;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private final ApplicationEventPublisher eventPublisher;
private final EntityCallbacks entityCallbacks;
BulkOperationContext(BulkOperations.BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, ApplicationEventPublisher eventPublisher,
EntityCallbacks entityCallbacks) {
this.bulkMode = bulkMode;
this.entity = entity;
this.queryMapper = queryMapper;
this.updateMapper = updateMapper;
this.eventPublisher = eventPublisher;
this.entityCallbacks = entityCallbacks;
}
public BulkMode getBulkMode() {
return this.bulkMode;
}
public Optional<? extends MongoPersistentEntity<?>> getEntity() {
return this.entity;
}
public QueryMapper getQueryMapper() {
return this.queryMapper;
}
public UpdateMapper getUpdateMapper() {
return this.updateMapper;
}
public ApplicationEventPublisher getEventPublisher() {
return this.eventPublisher;
}
public EntityCallbacks getEntityCallbacks() {
return this.entityCallbacks;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
BulkOperationContext that = (BulkOperationContext) o;
if (bulkMode != that.bulkMode)
return false;
if (!ObjectUtils.nullSafeEquals(this.entity, that.entity)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.queryMapper, that.queryMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.updateMapper, that.updateMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.eventPublisher, that.eventPublisher)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.entityCallbacks, that.entityCallbacks);
}
@Override
public int hashCode() {
int result = bulkMode != null ? bulkMode.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(entity);
result = 31 * result + ObjectUtils.nullSafeHashCode(queryMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(updateMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(eventPublisher);
result = 31 * result + ObjectUtils.nullSafeHashCode(entityCallbacks);
return result;
}
public String toString() {
return "DefaultBulkOperations.BulkOperationContext(bulkMode=" + this.getBulkMode() + ", entity="
+ this.getEntity() + ", queryMapper=" + this.getQueryMapper() + ", updateMapper=" + this.getUpdateMapper()
+ ", eventPublisher=" + this.getEventPublisher() + ", entityCallbacks=" + this.getEntityCallbacks() + ")";
}
}
/**
@@ -564,10 +647,50 @@ class DefaultBulkOperations implements BulkOperations {
* @since 2.2
* @author Christoph Strobl
*/
@Value
private static class SourceAwareWriteModelHolder {
private static final class SourceAwareWriteModelHolder {
Object source;
WriteModel<Document> model;
private final Object source;
private final WriteModel<Document> model;
SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
this.source = source;
this.model = model;
}
public Object getSource() {
return this.source;
}
public WriteModel<Document> getModel() {
return this.model;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
SourceAwareWriteModelHolder that = (SourceAwareWriteModelHolder) o;
if (!ObjectUtils.nullSafeEquals(this.source, that.source)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.model, that.model);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(model);
result = 31 * result + ObjectUtils.nullSafeHashCode(source);
return result;
}
public String toString() {
return "DefaultBulkOperations.SourceAwareWriteModelHolder(source=" + this.getSource() + ", model="
+ this.getModel() + ")";
}
}
}
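
From the caller's perspective, the new try/catch above translates bulk failures as follows; a hedged sketch assuming a MongoOperations instance and a Person document class (both outside this diff):

import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;

class BulkWriteExample {

	void insertAll(MongoOperations operations, Iterable<Person> people) {
		BulkOperations bulkOps = operations.bulkOps(BulkMode.UNORDERED, Person.class);
		people.forEach(bulkOps::insert);
		try {
			bulkOps.execute();
		} catch (DataIntegrityViolationException ex) {
			// Raised when the underlying MongoBulkWriteException carried a write concern error
		} catch (BulkOperationException ex) {
			// Raised for other bulk write failures; individual errors remain accessible on the exception
		}
	}
}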


@@ -15,10 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.Map;
import java.util.Optional;
@@ -55,12 +51,15 @@ import org.springframework.util.MultiValueMap;
* @see MongoTemplate
* @see ReactiveMongoTemplate
*/
@RequiredArgsConstructor
class EntityOperations {
private static final String ID_FIELD = "_id";
private final @NonNull MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
EntityOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this.context = context;
}
/**
* Creates a new {@link Entity} for the given bean.
@@ -69,7 +68,7 @@ class EntityOperations {
* @return new instance of {@link Entity}.
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
public <T> Entity<T> forEntity(T entity) {
<T> Entity<T> forEntity(T entity) {
Assert.notNull(entity, "Bean must not be null!");
@@ -92,7 +91,7 @@ class EntityOperations {
* @return new instance of {@link AdaptibleEntity}.
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
public <T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
<T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
Assert.notNull(entity, "Bean must not be null!");
Assert.notNull(conversionService, "ConversionService must not be null!");
@@ -346,11 +345,14 @@ class EntityOperations {
Number getVersion();
}
@RequiredArgsConstructor
private static class UnmappedEntity<T extends Map<String, Object>> implements AdaptibleEntity<T> {
private final T map;
protected UnmappedEntity(T map) {
this.map = map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
@@ -460,7 +462,7 @@ class EntityOperations {
private static class SimpleMappedEntity<T extends Map<String, Object>> extends UnmappedEntity<T> {
SimpleMappedEntity(T map) {
protected SimpleMappedEntity(T map) {
super(map);
}
@@ -483,12 +485,19 @@ class EntityOperations {
}
}
@RequiredArgsConstructor(access = AccessLevel.PROTECTED)
private static class MappedEntity<T> implements Entity<T> {
private final @NonNull MongoPersistentEntity<?> entity;
private final @NonNull IdentifierAccessor idAccessor;
private final @NonNull PersistentPropertyAccessor<T> propertyAccessor;
private final MongoPersistentEntity<?> entity;
private final IdentifierAccessor idAccessor;
private final PersistentPropertyAccessor<T> propertyAccessor;
protected MappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor idAccessor,
PersistentPropertyAccessor<T> propertyAccessor) {
this.entity = entity;
this.idAccessor = idAccessor;
this.propertyAccessor = propertyAccessor;
}
private static <T> MappedEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
@@ -759,11 +768,12 @@ class EntityOperations {
* {@link TypedOperations} for generic entities that are not represented with {@link PersistentEntity} (e.g. custom
* conversions).
*/
@RequiredArgsConstructor
enum UntypedOperations implements TypedOperations<Object> {
INSTANCE;
UntypedOperations() {}
@SuppressWarnings({ "unchecked", "rawtypes" })
public static <T> TypedOperations<T> instance() {
return (TypedOperations) INSTANCE;
@@ -798,10 +808,13 @@ class EntityOperations {
*
* @param <T>
*/
@RequiredArgsConstructor
static class TypedEntityOperations<T> implements TypedOperations<T> {
private final @NonNull MongoPersistentEntity<T> entity;
private final MongoPersistentEntity<T> entity;
protected TypedEntityOperations(MongoPersistentEntity<T> entity) {
this.entity = entity;
}
/*
* (non-Javadoc)


@@ -15,16 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -35,10 +29,13 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableAggregationOperationSupport implements ExecutableAggregationOperation {
private final @NonNull MongoTemplate template;
private final MongoTemplate template;
ExecutableAggregationOperationSupport(MongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -56,15 +53,21 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableAggregationSupport<T>
implements AggregationWithAggregation<T>, ExecutableAggregation<T>, TerminatingAggregation<T> {
@NonNull MongoTemplate template;
@NonNull Class<T> domainType;
@Nullable Aggregation aggregation;
@Nullable String collection;
private final MongoTemplate template;
private final Class<T> domainType;
private final Aggregation aggregation;
private final String collection;
public ExecutableAggregationSupport(MongoTemplate template, Class<T> domainType, Aggregation aggregation,
String collection) {
this.template = template;
this.domainType = domainType;
this.aggregation = aggregation;
this.collection = collection;
}
/*
* (non-Javadoc)


@@ -125,6 +125,11 @@ public interface ExecutableFindOperation {
/**
* Get the number of matching elements.
* <p />
* This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation
* execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard,
session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link MongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return total number of matching elements.
*/
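
In practice the trade-off described in this javadoc looks like the following; a hedged sketch assuming a MongoOperations instance and a Person document class:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class CountExample {

	void counts(MongoOperations operations) {
		// Exact, filter-aware count; runs countDocuments even for an empty query
		long active = operations.query(Person.class)
				.matching(Query.query(Criteria.where("active").is(true)))
				.count();

		// Statistics-based estimate; cheap, but takes no filter and gives no transactional guarantees
		long roughTotal = operations.estimatedCount(Person.class);
	}
}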


@@ -15,12 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.ReadPreference;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import java.util.List;
import java.util.Optional;
import java.util.stream.Stream;
@@ -37,6 +31,7 @@ import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.ReadPreference;
import com.mongodb.client.FindIterable;
/**
@@ -46,12 +41,15 @@ import com.mongodb.client.FindIterable;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableFindOperationSupport implements ExecutableFindOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate template;
private final MongoTemplate template;
ExecutableFindOperationSupport(MongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -70,16 +68,23 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableFindSupport<T>
implements ExecutableFind<T>, FindWithCollection<T>, FindWithProjection<T>, FindWithQuery<T> {
@NonNull MongoTemplate template;
@NonNull Class<?> domainType;
Class<T> returnType;
@Nullable String collection;
Query query;
private final MongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
@Nullable private final String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
this.collection = collection;
this.query = query;
}
/*
* (non-Javadoc)


@@ -15,11 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import java.util.ArrayList;
import java.util.Collection;
@@ -37,10 +32,13 @@ import com.mongodb.bulk.BulkWriteResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
private final @NonNull MongoTemplate template;
private final MongoTemplate template;
ExecutableInsertOperationSupport(MongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -58,14 +56,20 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableInsertSupport<T> implements ExecutableInsert<T> {
@NonNull MongoTemplate template;
@NonNull Class<T> domainType;
@Nullable String collection;
@Nullable BulkMode bulkMode;
private final MongoTemplate template;
private final Class<T> domainType;
@Nullable private final String collection;
@Nullable private final BulkMode bulkMode;
ExecutableInsertSupport(MongoTemplate template, Class<T> domainType, String collection, BulkMode bulkMode) {
this.template = template;
this.domainType = domainType;
this.collection = collection;
this.bulkMode = bulkMode;
}
/*
* (non-Javadoc)


@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.List;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -32,12 +29,17 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate template;
private final MongoTemplate template;
ExecutableMapReduceOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
this.template = template;
}
/*
* (non-Javadoc)


@@ -15,11 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
@@ -36,12 +31,15 @@ import com.mongodb.client.result.DeleteResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate tempate;
private final MongoTemplate tempate;
public ExecutableRemoveOperationSupport(MongoTemplate tempate) {
this.tempate = tempate;
}
/*
* (non-Javadoc)
@@ -59,14 +57,19 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableRemoveSupport<T> implements ExecutableRemove<T>, RemoveWithCollection<T> {
@NonNull MongoTemplate template;
@NonNull Class<T> domainType;
Query query;
@Nullable String collection;
private final MongoTemplate template;
private final Class<T> domainType;
private final Query query;
@Nullable private final String collection;
public ExecutableRemoveSupport(MongoTemplate template, Class<T> domainType, Query query, String collection) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.collection = collection;
}
/*
* (non-Javadoc)


@@ -15,11 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
@@ -35,12 +30,15 @@ import com.mongodb.client.result.UpdateResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate template;
private final MongoTemplate template;
ExecutableUpdateOperationSupport(MongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -58,21 +56,34 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableUpdateSupport<T>
implements ExecutableUpdate<T>, UpdateWithCollection<T>, UpdateWithQuery<T>, TerminatingUpdate<T>,
FindAndReplaceWithOptions<T>, TerminatingFindAndReplace<T>, FindAndReplaceWithProjection<T> {
@NonNull MongoTemplate template;
@NonNull Class domainType;
Query query;
@Nullable UpdateDefinition update;
@Nullable String collection;
@Nullable FindAndModifyOptions findAndModifyOptions;
@Nullable FindAndReplaceOptions findAndReplaceOptions;
@Nullable Object replacement;
@NonNull Class<T> targetType;
private final MongoTemplate template;
private final Class domainType;
private final Query query;
@Nullable private final UpdateDefinition update;
@Nullable private final String collection;
@Nullable private final FindAndModifyOptions findAndModifyOptions;
@Nullable private final FindAndReplaceOptions findAndReplaceOptions;
@Nullable private final Object replacement;
private final Class<T> targetType;
ExecutableUpdateSupport(MongoTemplate template, Class domainType, Query query, UpdateDefinition update,
String collection, FindAndModifyOptions findAndModifyOptions, FindAndReplaceOptions findAndReplaceOptions,
Object replacement, Class<T> targetType) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.update = update;
this.collection = collection;
this.findAndModifyOptions = findAndModifyOptions;
this.findAndReplaceOptions = findAndReplaceOptions;
this.replacement = replacement;
this.targetType = targetType;
}
/*
* (non-Javadoc)


@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.List;
@@ -27,8 +24,6 @@ import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.StreamUtils;
import com.mongodb.client.model.Filters;
/**
* A MongoDB document in its mapped state. I.e. after a source document has been mapped using mapping information of the
* entity the source document was supposed to represent.
@@ -36,13 +31,20 @@ import com.mongodb.client.model.Filters;
* @author Oliver Gierke
* @since 2.1
*/
@RequiredArgsConstructor(staticName = "of")
public class MappedDocument {
private static final String ID_FIELD = "_id";
private static final Document ID_ONLY_PROJECTION = new Document(ID_FIELD, 1);
private final @Getter Document document;
private final Document document;
private MappedDocument(Document document) {
this.document = document;
}
public static MappedDocument of(Document document) {
return new MappedDocument(document);
}
public static Document getIdOnlyProjection() {
return ID_ONLY_PROJECTION;
@@ -91,6 +93,10 @@ public class MappedDocument {
return new MappedUpdate(Update.fromDocument(document, ID_FIELD));
}
public Document getDocument() {
return this.document;
}
/**
* An {@link UpdateDefinition} that indicates that the {@link #getUpdateObject() update object} has already been
* mapped to the specific domain type.


@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Value;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
@@ -24,6 +22,7 @@ import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.SessionAwareMethodInterceptor;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.WriteConcern;
@@ -171,11 +170,15 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
* @author Christoph Strobl
* @since 2.1
*/
@Value
static class ClientSessionBoundMongoDbFactory implements MongoDatabaseFactory {
static final class ClientSessionBoundMongoDbFactory implements MongoDatabaseFactory {
ClientSession session;
MongoDatabaseFactory delegate;
private final ClientSession session;
private final MongoDatabaseFactory delegate;
public ClientSessionBoundMongoDbFactory(ClientSession session, MongoDatabaseFactory delegate) {
this.session = session;
this.delegate = delegate;
}
/*
* (non-Javadoc)
@@ -256,5 +259,40 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
return targetType.cast(factory.getProxy(target.getClass().getClassLoader()));
}
public ClientSession getSession() {
return this.session;
}
public MongoDatabaseFactory getDelegate() {
return this.delegate;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ClientSessionBoundMongoDbFactory that = (ClientSessionBoundMongoDbFactory) o;
if (!ObjectUtils.nullSafeEquals(this.session, that.session)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(this.session);
result = 31 * result + ObjectUtils.nullSafeHashCode(this.delegate);
return result;
}
public String toString() {
return "MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session=" + this.getSession() + ", delegate="
+ this.getDelegate() + ")";
}
}
}


@@ -21,6 +21,7 @@ import java.util.HashSet;
import java.util.Set;
import org.bson.BsonInvalidOperationException;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
@@ -39,6 +40,7 @@ import org.springframework.util.ClassUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.MongoException;
import com.mongodb.MongoServerException;
import com.mongodb.MongoSocketException;
import com.mongodb.bulk.BulkWriteError;
/**
@@ -49,6 +51,7 @@ import com.mongodb.bulk.BulkWriteError;
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
* @author Brice Vandeputte
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
@@ -78,6 +81,10 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
}
if (ex instanceof MongoSocketException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DUPLICATE_KEY_EXCEPTIONS.contains(exception)) {

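A quick illustration of the added branch; a hedged sketch that constructs a MongoSocketException directly, which application code normally never does:

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

import com.mongodb.MongoSocketException;
import com.mongodb.ServerAddress;

class TranslationExample {

	DataAccessException translate() {
		MongoExceptionTranslator translator = new MongoExceptionTranslator();
		// With the new branch this yields a DataAccessResourceFailureException
		return translator.translateExceptionIfPossible(
				new MongoSocketException("connection reset", new ServerAddress()));
	}
}
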

@@ -1160,6 +1160,12 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -1176,6 +1182,12 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
@@ -1184,6 +1196,35 @@ public interface MongoOperations extends FluentMongoOperations {
*/
long count(Query query, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
default long estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
long estimatedCount(String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
@@ -1191,6 +1232,12 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.

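The count versus estimatedCount trade-off documented here, shown against a named collection; the collection name "orders" is illustrative:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class CollectionCountExample {

	void counts(MongoOperations operations) {
		Query byStatus = Query.query(Criteria.where("status").is("ACTIVE"));

		long matching = operations.count(byStatus, "orders");   // exact, uses countDocuments
		long estimate = operations.estimatedCount("orders");    // estimate from collection statistics
	}
}
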

@@ -17,11 +17,6 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.io.IOException;
import java.math.BigDecimal;
import java.math.RoundingMode;
@@ -33,7 +28,6 @@ import org.bson.Document;
import org.bson.conversions.Bson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -160,6 +154,7 @@ import com.mongodb.client.result.UpdateResult;
* @author Cimon Lucas
* @author Michael J. Simons
* @author Roman Puchkovskiy
* @author Yadhukrishna S Pai
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware, IndexOperationsProvider {
@@ -763,7 +758,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
new BulkOperationContext(mode, Optional.ofNullable(getPersistentEntity(entityType)), queryMapper, updateMapper,
eventPublisher, entityCallbacks));
operations.setExceptionTranslator(exceptionTranslator);
operations.setDefaultWriteConcern(writeConcern);
return operations;
@@ -1140,6 +1134,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
collection -> collection.countDocuments(CountQuery.of(filter).toQueryDocument(), options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#estimatedCount(java.lang.String)
*/
@Override
public long estimatedCount(String collectionName) {
return doEstimatedCount(collectionName, new EstimatedDocumentCountOptions());
}
protected long doEstimatedCount(String collectionName, EstimatedDocumentCountOptions options) {
return execute(collectionName, collection -> collection.estimatedDocumentCount(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#insert(java.lang.Object)
@@ -1969,9 +1976,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
AggregationOperationContext context = new TypeBasedAggregationOperationContext(aggregation.getInputType(),
mappingContext, queryMapper);
return aggregate(aggregation, inputCollectionName, outputType, context);
return aggregate(aggregation, inputCollectionName, outputType, null);
}
/* (non-Javadoc)
@@ -2141,6 +2146,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
options.getComment().ifPresent(aggregateIterable::comment);
options.getHint().ifPresent(aggregateIterable::hint);
if (options.hasExecutionTimeLimit()) {
aggregateIterable = aggregateIterable.maxTime(options.getMaxTime().toMillis(), TimeUnit.MILLISECONDS);
@@ -2199,6 +2205,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
options.getComment().ifPresent(cursor::comment);
options.getHint().ifPresent(cursor::hint);
Class<?> domainType = aggregation instanceof TypedAggregation ? ((TypedAggregation) aggregation).getInputType()
: null;
@@ -2961,12 +2968,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
private class ExistsCallback implements CollectionCallback<Boolean> {
private final Document mappedQuery;
private final com.mongodb.client.model.Collation collation;
ExistsCallback(Document mappedQuery, com.mongodb.client.model.Collation collation) {
this.mappedQuery = mappedQuery;
this.collation = collation;
}
@Override
public Boolean doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
@@ -2988,7 +3000,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Document sort;
private final Optional<Collation> collation;
public FindAndRemoveCallback(Document query, Document fields, Document sort, @Nullable Collation collation) {
FindAndRemoveCallback(Document query, Document fields, Document sort, @Nullable Collation collation) {
this.query = query;
this.fields = fields;
@@ -3014,8 +3026,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
public FindAndModifyCallback(Document query, Document fields, Document sort, Object update,
List<Document> arrayFilters, FindAndModifyOptions options) {
FindAndModifyCallback(Document query, Document fields, Document sort, Object update, List<Document> arrayFilters,
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
@@ -3124,13 +3137,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Christoph Strobl
* @author Roman Puchkovskiy
*/
@RequiredArgsConstructor
private class ReadDocumentCallback<T> implements DocumentCallback<T> {
private final @NonNull EntityReader<? super T, Bson> reader;
private final @NonNull Class<T> type;
private final EntityReader<? super T, Bson> reader;
private final Class<T> type;
private final String collectionName;
ReadDocumentCallback(EntityReader<? super T, Bson> reader, Class<T> type, String collectionName) {
this.reader = reader;
this.type = type;
this.collectionName = collectionName;
}
@Nullable
public T doWith(@Nullable Document document) {
@@ -3158,13 +3177,21 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param <T>
* @since 2.0
*/
@RequiredArgsConstructor
private class ProjectingReadCallback<S, T> implements DocumentCallback<T> {
private final @NonNull EntityReader<Object, Bson> reader;
private final @NonNull Class<S> entityType;
private final @NonNull Class<T> targetType;
private final @NonNull String collectionName;
private final EntityReader<Object, Bson> reader;
private final Class<S> entityType;
private final Class<T> targetType;
private final String collectionName;
ProjectingReadCallback(EntityReader<Object, Bson> reader, Class<S> entityType, Class<T> targetType,
String collectionName) {
this.reader = reader;
this.entityType = entityType;
this.targetType = targetType;
this.collectionName = collectionName;
}
/*
* (non-Javadoc)
@@ -3200,7 +3227,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Query query;
private final @Nullable Class<?> type;
public QueryCursorPreparer(Query query, @Nullable Class<?> type) {
QueryCursorPreparer(Query query, @Nullable Class<?> type) {
this.query = query;
this.type = type;
@@ -3344,7 +3371,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Thomas Darimont
* @since 1.7
*/
@AllArgsConstructor(access = AccessLevel.PACKAGE)
static class CloseableIterableCursorAdapter<T> implements CloseableIterator<T> {
private volatile @Nullable MongoCursor<Document> cursor;
@@ -3358,14 +3384,22 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param exceptionTranslator
* @param objectReadCallback
*/
public CloseableIterableCursorAdapter(MongoIterable<Document> cursor,
PersistenceExceptionTranslator exceptionTranslator, DocumentCallback<T> objectReadCallback) {
CloseableIterableCursorAdapter(MongoIterable<Document> cursor, PersistenceExceptionTranslator exceptionTranslator,
DocumentCallback<T> objectReadCallback) {
this.cursor = cursor.iterator();
this.exceptionTranslator = exceptionTranslator;
this.objectReadCallback = objectReadCallback;
}
CloseableIterableCursorAdapter(MongoCursor<Document> cursor, PersistenceExceptionTranslator exceptionTranslator,
DocumentCallback<T> objectReadCallback) {
this.cursor = cursor;
this.exceptionTranslator = exceptionTranslator;
this.objectReadCallback = objectReadCallback;
}
@Override
public boolean hasNext() {
@@ -3419,7 +3453,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
}
/**
* @deprecated since 3.1.4. Use {@link #getMongoDatabaseFactory()} instead.
* @return the {@link MongoDatabaseFactory} in use.
*/
@Deprecated
public MongoDatabaseFactory getMongoDbFactory() {
return getMongoDatabaseFactory();
}
/**
* @return the {@link MongoDatabaseFactory} in use.
* @since 3.1.4
*/
public MongoDatabaseFactory getMongoDatabaseFactory() {
return mongoDbFactory;
}

View File

@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mapping.SimplePropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
@@ -33,11 +30,14 @@ import org.springframework.util.ClassUtils;
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
class PropertyOperations {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
PropertyOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.mappingContext = mappingContext;
}
/**
* For cases where {@code fields} is {@link Document#isEmpty() empty} include only fields that are required for
* creating the projection (target) type if the {@code targetType} is a {@literal DTO projection} or a

View File

@@ -658,7 +658,8 @@ class QueryOperations {
: mappedDocument != null ? mappedDocument.getDocument() : getMappedUpdate(domainType);
Document filterWithShardKey = new Document(filter);
getMappedShardKeyFields(domainType).forEach(key -> filterWithShardKey.putIfAbsent(key, shardKeySource.get(key)));
getMappedShardKeyFields(domainType)
.forEach(key -> filterWithShardKey.putIfAbsent(key, BsonUtils.resolveValue(shardKeySource, key)));
return filterWithShardKey;
}
@@ -707,10 +708,9 @@ class QueryOperations {
*/
List<Document> getUpdatePipeline(@Nullable Class<?> domainType) {
AggregationOperationContext context = domainType != null
? new RelaxedTypeBasedAggregationOperationContext(domainType, mappingContext, queryMapper)
: Aggregation.DEFAULT_CONTEXT;
Class<?> type = domainType != null ? domainType : Object.class;
AggregationOperationContext context = new RelaxedTypeBasedAggregationOperationContext(type, mappingContext, queryMapper);
return aggregationUtil.createPipeline((AggregationUpdate) update, context);
}
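For orientation, a hedged sketch of an aggregation-pipeline update that ends up in getUpdatePipeline (the Order domain type, field names and the static Query/Criteria imports are illustrative assumptions):

    // A $set stage expressed as an aggregation update; property names are mapped through the domain type.
    AggregationUpdate update = AggregationUpdate.update().set("status").toValue("ACTIVE");
    template.updateFirst(query(where("orderId").is(orderId)), update, Order.class);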

View File

@@ -15,10 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -62,15 +58,22 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return new ReactiveAggregationSupport<>(template, domainType, null, null);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveAggregationSupport<T>
implements AggregationOperationWithAggregation<T>, ReactiveAggregation<T>, TerminatingAggregationOperation<T> {
@NonNull ReactiveMongoTemplate template;
@NonNull Class<T> domainType;
Aggregation aggregation;
String collection;
private final ReactiveMongoTemplate template;
private final Class<T> domainType;
private final Aggregation aggregation;
private final String collection;
ReactiveAggregationSupport(ReactiveMongoTemplate template, Class<T> domainType, Aggregation aggregation,
String collection) {
this.template = template;
this.domainType = domainType;
this.aggregation = aggregation;
this.collection = collection;
}
/*
* (non-Javadoc)

View File

@@ -106,6 +106,12 @@ public interface ReactiveFindOperation {
/**
* Get the number of matching elements.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but
* guarantees shard, session and transaction compliance. In case an inaccurate count satisfies the application's
* needs, use {@link ReactiveMongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return {@link Mono} emitting total number of matching elements. Never {@literal null}.
*/
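A hedged sketch of the fluent reactive count described above (template, the Person type and the static Query/Criteria imports are assumptions):

    // Issues countDocuments under the hood, even for an empty query.
    Mono<Long> matching = template.query(Person.class)
        .matching(query(where("firstname").is("luke")))
        .count();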

View File

@@ -15,15 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -39,12 +34,15 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveFindOperationSupport implements ReactiveFindOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull ReactiveMongoTemplate template;
private final ReactiveMongoTemplate template;
ReactiveFindOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -64,16 +62,24 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveFindSupport<T>
implements ReactiveFind<T>, FindWithCollection<T>, FindWithProjection<T>, FindWithQuery<T> {
@NonNull ReactiveMongoTemplate template;
@NonNull Class<?> domainType;
Class<T> returnType;
String collection;
Query query;
private final ReactiveMongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
private final String collection;
private final Query query;
ReactiveFindSupport(ReactiveMongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
this.collection = collection;
this.query = query;
}
/*
* (non-Javadoc)

View File

@@ -15,10 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -34,10 +30,13 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
private final @NonNull ReactiveMongoTemplate template;
private final ReactiveMongoTemplate template;
ReactiveInsertOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -51,13 +50,18 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
return new ReactiveInsertSupport<>(template, domainType, null);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveInsertSupport<T> implements ReactiveInsert<T> {
@NonNull ReactiveMongoTemplate template;
@NonNull Class<T> domainType;
String collection;
private final ReactiveMongoTemplate template;
private final Class<T> domainType;
private final String collection;
ReactiveInsertSupport(ReactiveMongoTemplate template, Class<T> domainType, String collection) {
this.template = template;
this.domainType = domainType;
this.collection = collection;
}
/*
* (non-Javadoc)

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -31,12 +29,15 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull ReactiveMongoTemplate template;
private final ReactiveMongoTemplate template;
ReactiveMapReduceOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)

View File

@@ -15,11 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import org.reactivestreams.Publisher;
import org.springframework.util.Assert;
import reactor.core.publisher.Mono;
import reactor.util.context.Context;
import java.util.function.Function;
import org.reactivestreams.Publisher;
import org.springframework.util.Assert;
import com.mongodb.reactivestreams.client.ClientSession;
/**
@@ -29,7 +33,7 @@ import com.mongodb.reactivestreams.client.ClientSession;
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see Mono#subscriberContext()
* @see Mono#deferContextual(Function)
* @see Context
*/
public class ReactiveMongoContext {
@@ -46,8 +50,14 @@ public class ReactiveMongoContext {
*/
public static Mono<ClientSession> getSession() {
return Mono.subscriberContext().filter(ctx -> ctx.hasKey(SESSION_KEY))
.flatMap(ctx -> ctx.<Mono<ClientSession>> get(SESSION_KEY));
return Mono.deferContextual(ctx -> {
if (ctx.hasKey(SESSION_KEY)) {
return ctx.<Mono<ClientSession>> get(SESSION_KEY);
}
return Mono.empty();
});
}
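A minimal sketch of how the session travels through the Reactor context (the clientSession instance and the upstream publisher are assumed to exist):

    // Bind a ClientSession for downstream operators ...
    Flux<Document> inSession = Flux.from(publisher)
        .contextWrite(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(clientSession)));

    // ... and resolve it again where needed; the Mono completes empty if no session is bound.
    Mono<ClientSession> session = ReactiveMongoContext.getSession();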
/**

View File

@@ -940,6 +940,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -956,6 +962,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
@@ -971,6 +983,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -980,6 +998,35 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
Mono<Long> count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
default Mono<Long> estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
Mono<Long> estimatedCount(String collectionName);
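A minimal reactive sketch (a ReactiveMongoTemplate named template and a mapped Person entity are assumed):

    // Approximate total from collection statistics; subscribe or compose as usual.
    Mono<Long> estimate = template.estimatedCount(Person.class);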
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>

View File

@@ -17,9 +17,7 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;
@@ -120,16 +118,7 @@ import com.mongodb.CursorType;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndReplaceOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.ReturnDocument;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.ValidationOptions;
import com.mongodb.client.model.*;
import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.InsertOneResult;
@@ -158,6 +147,7 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
* @author Christoph Strobl
* @author Roman Puchkovskiy
* @author Mathieu Ouellet
* @author Yadhukrishna S Pai
* @since 2.0
*/
public class ReactiveMongoTemplate implements ReactiveMongoOperations, ApplicationContextAware {
@@ -584,7 +574,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
ReactiveMongoTemplate.this);
return Flux.from(action.doInSession(operations)) //
.subscriberContext(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(session)));
.contextWrite(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(session)));
}
/*
@@ -1035,6 +1025,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
options.getComment().ifPresent(cursor::comment);
options.getHint().ifPresent(cursor::hint);
Optionals.firstNonEmpty(options::getCollation, () -> operations.forType(inputType).getCollation()) //
.map(Collation::toMongoCollation) //
@@ -1258,6 +1249,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#estimatedCount(java.lang.String)
*/
@Override
public Mono<Long> estimatedCount(String collectionName) {
return doEstimatedCount(collectionName, new EstimatedDocumentCountOptions());
}
/**
* Run the actual count operation against the collection with given name.
*
@@ -1272,6 +1272,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
collection -> collection.countDocuments(CountQuery.of(filter).toQueryDocument(), options));
}
protected Mono<Long> doEstimatedCount(String collectionName, EstimatedDocumentCountOptions options) {
return createMono(collectionName, collection -> collection.estimatedDocumentCount(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(reactor.core.publisher.Mono)
@@ -2108,7 +2113,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
AggregationOperationContext context = agg instanceof TypedAggregation
? new TypeBasedAggregationOperationContext(((TypedAggregation<?>) agg).getInputType(),
getConverter().getMappingContext(), queryMapper)
: Aggregation.DEFAULT_CONTEXT;
: new RelaxedTypeBasedAggregationOperationContext(Object.class, mappingContext, queryMapper);
return agg.toPipeline(new PrefixingDelegatingAggregationOperationContext(context, "fullDocument",
Arrays.asList("operationType", "fullDocument", "documentKey", "updateDescription", "ns")));
@@ -2418,8 +2423,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @param preparer allows for customization of the {@link com.mongodb.client.FindIterable} used when iterating over the result set, (apply
* limits, skips and so on).
* @param preparer allows for customization of the {@link com.mongodb.client.FindIterable} used when iterating over
* the result set, (apply limits, skips and so on).
* @return the {@link List} of converted objects.
*/
protected <T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<T> entityClass,
@@ -2725,6 +2730,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return potentiallyForceAcknowledgedWrite(wc);
}
/**
* @return the {@link MongoDatabaseFactory} in use.
* @since 3.1.4
*/
public ReactiveMongoDatabaseFactory getMongoDatabaseFactory() {
return mongoDatabaseFactory;
}
@Nullable
private WriteConcern potentiallyForceAcknowledgedWrite(@Nullable WriteConcern wc) {
@@ -2893,7 +2906,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*
* @author Mark Paluch
*/
@RequiredArgsConstructor
private static class FindCallback implements ReactiveCollectionQueryCallback<Document> {
private final @Nullable Document query;
@@ -2903,6 +2915,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
this(query, null);
}
FindCallback(Document query, Document fields) {
this.query = query;
this.fields = fields;
}
@Override
public FindPublisher<Document> doInCollection(MongoCollection<Document> collection) {
@@ -2956,7 +2974,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
/**
* @author Mark Paluch
*/
@RequiredArgsConstructor
private static class FindAndModifyCallback implements ReactiveCollectionCallback<Document> {
private final Document query;
@@ -2966,6 +2983,17 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
FindAndModifyCallback(Document query, Document fields, Document sort, Object update, List<Document> arrayFilters,
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.arrayFilters = arrayFilters;
this.options = options;
}
@Override
public Publisher<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
@@ -3021,7 +3049,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
private static class FindAndReplaceCallback implements ReactiveCollectionCallback<Document> {
private final Document query;
@@ -3031,6 +3058,17 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final @Nullable com.mongodb.client.model.Collation collation;
private final FindAndReplaceOptions options;
FindAndReplaceCallback(Document query, Document fields, Document sort, Document update,
com.mongodb.client.model.Collation collation, FindAndReplaceOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.collation = collation;
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveCollectionCallback#doInCollection(com.mongodb.reactivestreams.client.MongoCollection)
@@ -3147,13 +3185,20 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @author Roman Puchkovskiy
* @since 2.0
*/
@RequiredArgsConstructor
private class ProjectingReadCallback<S, T> implements DocumentCallback<T> {
private final @NonNull EntityReader<Object, Bson> reader;
private final @NonNull Class<S> entityType;
private final @NonNull Class<T> targetType;
private final @NonNull String collectionName;
private final EntityReader<Object, Bson> reader;
private final Class<S> entityType;
private final Class<T> targetType;
private final String collectionName;
ProjectingReadCallback(EntityReader<Object, Bson> reader, Class<S> entityType, Class<T> targetType,
String collectionName) {
this.reader = reader;
this.entityType = entityType;
this.targetType = targetType;
this.collectionName = collectionName;
}
@SuppressWarnings("unchecked")
public Mono<T> doWith(Document document) {
@@ -3374,11 +3419,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
}
@RequiredArgsConstructor
class IndexCreatorEventListener implements ApplicationListener<MappingContextEvent<?, ?>> {
final Consumer<Throwable> subscriptionExceptionHandler;
public IndexCreatorEventListener(Consumer<Throwable> subscriptionExceptionHandler) {
this.subscriptionExceptionHandler = subscriptionExceptionHandler;
}
@Override
public void onApplicationEvent(MappingContextEvent<?, ?> event) {

View File

@@ -15,10 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -35,12 +31,15 @@ import com.mongodb.client.result.DeleteResult;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull ReactiveMongoTemplate tempate;
private final ReactiveMongoTemplate tempate;
ReactiveRemoveOperationSupport(ReactiveMongoTemplate tempate) {
this.tempate = tempate;
}
/*
* (non-Javadoc)
@@ -54,14 +53,20 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return new ReactiveRemoveSupport<>(tempate, domainType, ALL_QUERY, null);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveRemoveSupport<T> implements ReactiveRemove<T>, RemoveWithCollection<T> {
@NonNull ReactiveMongoTemplate template;
@NonNull Class<T> domainType;
Query query;
String collection;
private final ReactiveMongoTemplate template;
private final Class<T> domainType;
private final Query query;
private final String collection;
ReactiveRemoveSupport(ReactiveMongoTemplate template, Class<T> domainType, Query query, String collection) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.collection = collection;
}
/*
* (non-Javadoc)

View File

@@ -15,13 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Mono;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -35,12 +32,15 @@ import com.mongodb.client.result.UpdateResult;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull ReactiveMongoTemplate template;
private final ReactiveMongoTemplate template;
ReactiveUpdateOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
/*
* (non-Javadoc)
@@ -54,21 +54,34 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
return new ReactiveUpdateSupport<>(template, domainType, ALL_QUERY, null, null, null, null, null, domainType);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveUpdateSupport<T>
implements ReactiveUpdate<T>, UpdateWithCollection<T>, UpdateWithQuery<T>, TerminatingUpdate<T>,
FindAndReplaceWithOptions<T>, FindAndReplaceWithProjection<T>, TerminatingFindAndReplace<T> {
@NonNull ReactiveMongoTemplate template;
@NonNull Class<?> domainType;
Query query;
org.springframework.data.mongodb.core.query.UpdateDefinition update;
@Nullable String collection;
@Nullable FindAndModifyOptions findAndModifyOptions;
@Nullable FindAndReplaceOptions findAndReplaceOptions;
@Nullable Object replacement;
@NonNull Class<T> targetType;
private final ReactiveMongoTemplate template;
private final Class<?> domainType;
private final Query query;
private final org.springframework.data.mongodb.core.query.UpdateDefinition update;
@Nullable private final String collection;
@Nullable private final FindAndModifyOptions findAndModifyOptions;
@Nullable private final FindAndReplaceOptions findAndReplaceOptions;
@Nullable private final Object replacement;
private final Class<T> targetType;
ReactiveUpdateSupport(ReactiveMongoTemplate template, Class<?> domainType, Query query, UpdateDefinition update,
String collection, FindAndModifyOptions findAndModifyOptions, FindAndReplaceOptions findAndReplaceOptions,
Object replacement, Class<T> targetType) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.update = update;
this.collection = collection;
this.findAndModifyOptions = findAndModifyOptions;
this.findAndReplaceOptions = findAndReplaceOptions;
this.replacement = replacement;
this.targetType = targetType;
}
/*
* (non-Javadoc)
@@ -123,7 +136,9 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
String collectionName = getCollectionName();
return template.findAndModify(query, update, findAndModifyOptions != null ? findAndModifyOptions : FindAndModifyOptions.none(), targetType, collectionName);
return template.findAndModify(query, update,
findAndModifyOptions != null ? findAndModifyOptions : FindAndModifyOptions.none(), targetType,
collectionName);
}
/*

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Value;
import reactor.core.publisher.Mono;
import org.bson.codecs.configuration.CodecRegistry;
@@ -27,6 +26,7 @@ import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.SessionAwareMethodInterceptor;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.ConnectionString;
@@ -175,11 +175,16 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
* @author Christoph Strobl
* @since 2.1
*/
@Value
static class ClientSessionBoundMongoDbFactory implements ReactiveMongoDatabaseFactory {
static final class ClientSessionBoundMongoDbFactory implements ReactiveMongoDatabaseFactory {
ClientSession session;
ReactiveMongoDatabaseFactory delegate;
private final ClientSession session;
private final ReactiveMongoDatabaseFactory delegate;
ClientSessionBoundMongoDbFactory(ClientSession session, ReactiveMongoDatabaseFactory delegate) {
this.session = session;
this.delegate = delegate;
}
/*
* (non-Javadoc)
@@ -268,5 +273,40 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
return targetType.cast(factory.getProxy(target.getClass().getClassLoader()));
}
public ClientSession getSession() {
return this.session;
}
public ReactiveMongoDatabaseFactory getDelegate() {
return this.delegate;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ClientSessionBoundMongoDbFactory that = (ClientSessionBoundMongoDbFactory) o;
if (!ObjectUtils.nullSafeEquals(this.session, that.session)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(this.session);
result = 31 * result + ObjectUtils.nullSafeHashCode(this.delegate);
return result;
}
public String toString() {
return "SimpleReactiveMongoDatabaseFactory.ClientSessionBoundMongoDbFactory(session=" + this.getSession()
+ ", delegate=" + this.getDelegate() + ")";
}
}
}

View File

@@ -29,8 +29,11 @@ import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Support class for {@link AggregationExpression} implementations.
*
* @author Christoph Strobl
* @author Matt Morrissette
* @author Mark Paluch
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
@@ -49,7 +52,6 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return toDocument(this.value, context);
}
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
return new Document(getMongoMethod(), unpack(value, context));
}
@@ -101,17 +103,19 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return value;
}
@SuppressWarnings("unchecked")
protected List<Object> append(Object value, Expand expandList) {
if (this.value instanceof List) {
List<Object> clone = new ArrayList<Object>((List) this.value);
List<Object> clone = new ArrayList<>((List<Object>) this.value);
if (value instanceof Collection && Expand.EXPAND_VALUES.equals(expandList)) {
clone.addAll((Collection<?>) value);
} else {
clone.add(value);
}
return clone;
}
@@ -129,25 +133,72 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return append(value, Expand.EXPAND_VALUES);
}
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Map<String, Object> append(String key, Object value) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
java.util.Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.put(key, value);
return clone;
}
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Map<String, Object> remove(String key) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.remove(key);
return clone;
}
/**
* Append the given key/value pair at the given position in the underlying {@link LinkedHashMap}, preserving the
* order of the remaining entries.
*
* @param index the zero-based position at which to insert the entry.
* @param key the key to insert.
* @param value the value associated with the key.
* @return a new {@link Map} containing the existing entries plus the given one.
* @since 3.1
*/
@SuppressWarnings({ "unchecked" })
protected Map<String, Object> appendAt(int index, String key, Object value) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>();
int i = 0;
for (Map.Entry<String, Object> entry : ((Map<String, Object>) this.value).entrySet()) {
if (i == index) {
clone.put(key, value);
}
if (!entry.getKey().equals(key)) {
clone.put(entry.getKey(), entry.getValue());
}
i++;
}
if (i <= index) {
clone.put(key, value);
}
return clone;
}
@SuppressWarnings({ "rawtypes" })
protected List<Object> values() {
if (value instanceof List) {
return new ArrayList<Object>((List) value);
}
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<>(Collections.singletonList(value));
}
@@ -177,7 +228,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return (T) ((java.util.Map<String, Object>) this.value).get(key);
return (T) ((Map<String, Object>) this.value).get(key);
}
/**
@@ -187,11 +238,11 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
* @return
*/
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> argumentMap() {
protected Map<String, Object> argumentMap() {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return Collections.unmodifiableMap((java.util.Map) value);
return Collections.unmodifiableMap((java.util.Map<String, Object>) value);
}
/**
@@ -208,7 +259,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return false;
}
return ((java.util.Map<String, Object>) this.value).containsKey(key);
return ((Map<String, Object>) this.value).containsKey(key);
}
protected abstract String getMongoMethod();

View File

@@ -32,6 +32,7 @@ import org.springframework.util.Assert;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @author Yadhukrishna S Pai
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
@@ -45,12 +46,14 @@ public class AggregationOptions {
private static final String COLLATION = "collation";
private static final String COMMENT = "comment";
private static final String MAX_TIME = "maxTimeMS";
private static final String HINT = "hint";
private final boolean allowDiskUse;
private final boolean explain;
private final Optional<Document> cursor;
private final Optional<Collation> collation;
private final Optional<String> comment;
private final Optional<Document> hint;
private Duration maxTime = Duration.ZERO;
private ResultOptions resultOptions = ResultOptions.READ;
@@ -71,13 +74,13 @@ public class AggregationOptions {
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options (such as {@code batchSize}) to the
* aggregation.
* aggregation.
* @param collation collation for string comparison. Can be {@literal null}.
* @since 2.0
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
@Nullable Collation collation) {
this(allowDiskUse, explain, cursor, collation, null);
this(allowDiskUse, explain, cursor, collation, null, null);
}
/**
@@ -86,19 +89,37 @@ public class AggregationOptions {
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options (such as {@code batchSize}) to the
* aggregation.
* aggregation.
* @param collation collation for string comparison. Can be {@literal null}.
* @param comment execution comment. Can be {@literal null}.
* @since 2.2
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
@Nullable Collation collation, @Nullable String comment) {
this(allowDiskUse, explain, cursor, collation, comment, null);
}
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options (such as {@code batchSize}) to the
* aggregation.
* @param collation collation for string comparison. Can be {@literal null}.
* @param comment execution comment. Can be {@literal null}.
* @param hint can be {@literal null}, used to provide an index that is forcibly used by the query optimizer.
* @since 3.1
*/
private AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
@Nullable Collation collation, @Nullable String comment, @Nullable Document hint) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = Optional.ofNullable(cursor);
this.collation = Optional.ofNullable(collation);
this.comment = Optional.ofNullable(comment);
this.hint = Optional.ofNullable(hint);
}
/**
@@ -130,8 +151,9 @@ public class AggregationOptions {
Collation collation = document.containsKey(COLLATION) ? Collation.from(document.get(COLLATION, Document.class))
: null;
String comment = document.getString(COMMENT);
Document hint = document.get(HINT, Document.class);
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment);
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment, hint);
if (document.containsKey(MAX_TIME)) {
options.maxTime = Duration.ofMillis(document.getLong(MAX_TIME));
}
@@ -212,6 +234,16 @@ public class AggregationOptions {
return comment;
}
/**
* Get the hint used to fulfill the aggregation.
*
* @return never {@literal null}.
* @since 3.1
*/
public Optional<Document> getHint() {
return hint;
}
/**
* @return the time limit for processing. {@link Duration#ZERO} is used for the default unbounded behavior.
* @since 3.0
@@ -248,6 +280,10 @@ public class AggregationOptions {
result.put(EXPLAIN, explain);
}
if (result.containsKey(HINT)) {
hint.ifPresent(val -> result.append(HINT, val));
}
if (!result.containsKey(CURSOR)) {
cursor.ifPresent(val -> result.put(CURSOR, val));
}
@@ -277,6 +313,7 @@ public class AggregationOptions {
cursor.ifPresent(val -> document.put(CURSOR, val));
collation.ifPresent(val -> document.append(COLLATION, val.toDocument()));
comment.ifPresent(val -> document.append(COMMENT, val));
hint.ifPresent(val -> document.append(HINT, val));
if (hasExecutionTimeLimit()) {
document.append(MAX_TIME, maxTime.toMillis());
@@ -318,6 +355,7 @@ public class AggregationOptions {
private @Nullable Document cursor;
private @Nullable Collation collation;
private @Nullable String comment;
private @Nullable Document hint;
private @Nullable Duration maxTime;
private @Nullable ResultOptions resultOptions;
@@ -396,11 +434,24 @@ public class AggregationOptions {
return this;
}
/**
* Define a hint that is used by the query optimizer to fulfill the aggregation.
*
* @param hint can be {@literal null}.
* @return this.
* @since 3.1
*/
public Builder hint(@Nullable Document hint) {
this.hint = hint;
return this;
}
/**
* Set the time limit for processing.
*
* @param maxTime {@link Duration#ZERO} is used for the default unbounded behavior. {@link Duration#isNegative()
* Negative} values will be ignored.
* Negative} values will be ignored.
* @return this.
* @since 3.0
*/
@@ -431,7 +482,7 @@ public class AggregationOptions {
*/
public AggregationOptions build() {
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment);
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment, hint);
if (maxTime != null) {
options.maxTime = maxTime;
}
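A hedged sketch of the new hint option (the { age: 1 } index, the Person type and the pipeline content are illustrative):

    AggregationOptions options = AggregationOptions.builder()
        .allowDiskUse(true)
        .hint(new Document("age", 1)) // ask the planner to use the { age: 1 } index
        .build();

    Aggregation aggregation = Aggregation
        .newAggregation(Person.class, Aggregation.match(Criteria.where("age").gt(18)))
        .withOptions(options);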

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.Predicate;
import org.bson.Document;
import org.springframework.util.Assert;
@@ -26,6 +27,7 @@ import org.springframework.util.Assert;
* The {@link AggregationPipeline} holds the collection of {@link AggregationOperation aggregation stages}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.0.2
*/
public class AggregationPipeline {
@@ -45,6 +47,8 @@ public class AggregationPipeline {
* @param aggregationOperations must not be {@literal null}.
*/
public AggregationPipeline(List<AggregationOperation> aggregationOperations) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
pipeline = new ArrayList<>(aggregationOperations);
}
@@ -82,30 +86,77 @@ public class AggregationPipeline {
*/
public boolean isOutOrMerge() {
if (pipeline.isEmpty()) {
if (isEmpty()) {
return false;
}
String operator = pipeline.get(pipeline.size() - 1).getOperator();
return operator.equals("$out") || operator.equals("$merge");
AggregationOperation operation = pipeline.get(pipeline.size() - 1);
return isOut(operation) || isMerge(operation);
}
void verify() {
// check $out/$merge is the last operation if it exists
for (AggregationOperation aggregationOperation : pipeline) {
for (AggregationOperation operation : pipeline) {
if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation)) {
if (isOut(operation) && !isLast(operation)) {
throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
}
if (aggregationOperation instanceof MergeOperation && !isLast(aggregationOperation)) {
if (isMerge(operation) && !isLast(operation)) {
throw new IllegalArgumentException("The $merge operator must be the last stage in the pipeline.");
}
}
}
/**
* Return whether this aggregation pipeline defines a {@code $unionWith} stage that may contribute documents from
* other collections. Checking for presence of union stages is useful when attempting to determine the aggregation
* element type for mapping metadata computation.
*
* @return {@literal true} if the aggregation pipeline makes use of {@code $unionWith}.
* @since 3.1
*/
public boolean containsUnionWith() {
return containsOperation(AggregationPipeline::isUnionWith);
}
/**
* @return {@literal true} if the pipeline does not contain any stages.
* @since 3.1
*/
public boolean isEmpty() {
return pipeline.isEmpty();
}
private boolean containsOperation(Predicate<AggregationOperation> predicate) {
if (isEmpty()) {
return false;
}
for (AggregationOperation element : pipeline) {
if (predicate.test(element)) {
return true;
}
}
return false;
}
private boolean isLast(AggregationOperation aggregationOperation) {
return pipeline.indexOf(aggregationOperation) == pipeline.size() - 1;
}
private static boolean isUnionWith(AggregationOperation operator) {
return operator instanceof UnionWithOperation || operator.getOperator().equals("$unionWith");
}
private static boolean isMerge(AggregationOperation operator) {
return operator instanceof MergeOperation || operator.getOperator().equals("$merge");
}
private static boolean isOut(AggregationOperation operator) {
return operator instanceof OutOperation || operator.getOperator().equals("$out");
}
}
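A hedged sketch of the new pipeline checks (stage content and the archive collection name are illustrative):

    AggregationPipeline pipeline = new AggregationPipeline(Arrays.asList(
        Aggregation.match(Criteria.where("status").is("ACTIVE")),
        UnionWithOperation.unionWith("orders_archive")));

    pipeline.containsUnionWith(); // true: a $unionWith stage may contribute documents from another collection
    pipeline.isOutOrMerge();      // false: the last stage is neither $out nor $merge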

View File

@@ -104,7 +104,7 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
return new Document(getOperator(), options);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/

View File

@@ -1835,7 +1835,7 @@ public class DateOperators {
* @param millisecond must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal millisecond} is {@literal null}
* @deprecated since 3.0.7, use {@link #millisecond(Object)} instead.
* @deprecated since 3.1.3, use {@link #millisecond(Object)} instead.
*/
@Deprecated
default T milliseconds(Object millisecond) {
@@ -1849,7 +1849,7 @@ public class DateOperators {
* @param millisecond must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal millisecond} is {@literal null}
* @since 3.0.7
* @since 3.1.3
*/
T millisecond(Object millisecond);
@@ -1859,7 +1859,7 @@ public class DateOperators {
* @param fieldReference must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal fieldReference} is {@literal null}.
* @deprecated since 3.0.7,use {@link #millisecondOf(String)} instead.
* @deprecated since 3.1.3, use {@link #millisecondOf(String)} instead.
*/
@Deprecated
default T millisecondsOf(String fieldReference) {
@@ -1872,7 +1872,7 @@ public class DateOperators {
* @param fieldReference must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal fieldReference} is {@literal null}.
* @since 3.0.7
* @since 3.1.3
*/
default T millisecondOf(String fieldReference) {
return milliseconds(Fields.field(fieldReference));
@@ -1884,7 +1884,7 @@ public class DateOperators {
* @param expression must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal expression} is {@literal null}.
* @deprecated since 3.0.7, use {@link #millisecondOf(AggregationExpression)} instead.
* @deprecated since 3.1.3, use {@link #millisecondOf(AggregationExpression)} instead.
*/
@Deprecated
default T millisecondsOf(AggregationExpression expression) {
@@ -1897,7 +1897,7 @@ public class DateOperators {
* @param expression must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal expression} is {@literal null}.
* @since 3.0.7
* @since 3.1.3
*/
default T millisecondOf(AggregationExpression expression) {
return milliseconds(expression);

View File

@@ -149,4 +149,12 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
}
return null;
}
/**
* @return the root context used to resolve references.
* @since 3.1
*/
AggregationOperationContext getRootContext() {
return rootContext;
}
}

View File

@@ -52,7 +52,7 @@ public class LimitOperation implements AggregationOperation {
return new Document(getOperator(), Long.valueOf(maxElements));
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/

View File

@@ -0,0 +1,587 @@
/*
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator.AccumulatorBuilder;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator.AccumulatorInitBuilder;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
/**
* Gateway to {@literal $function} and {@literal $accumulator} aggregation operations.
* <p />
* Using {@link ScriptOperators} as part of the {@link Aggregation} requires MongoDB server to have
* <a href="https://docs.mongodb.com/master/core/server-side-javascript/">server-side JavaScript</a> execution
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.1
*/
public class ScriptOperators {
/**
* Create a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">$function</a> in JavaScript.
*
* @param body The function definition. Must not be {@literal null}.
* @return new instance of {@link Function}.
*/
public static Function function(String body) {
return Function.function(body);
}
/**
* Create a custom <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">$accumulator
* operator</a> in JavaScript.
*
* @return new instance of {@link AccumulatorInitBuilder}.
*/
public static AccumulatorInitBuilder accumulatorBuilder() {
return new AccumulatorBuilder();
}
/**
* {@link Function} defines a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">$function</a> in JavaScript.
* <p />
* <code class="java">
* {
* $function: {
* body: ...,
* args: ...,
* lang: "js"
* }
* }
* </code>
* <p />
* {@link Function} cannot be used as part of {@link org.springframework.data.mongodb.core.schema.MongoJsonSchema
* schema} validation query expression. <br />
* <b>NOTE:</b> <a href="https://docs.mongodb.com/master/core/server-side-javascript/">Server-Side JavaScript</a>
* execution must be
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>
*
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">MongoDB Documentation:
* $function</a>
*/
public static class Function extends AbstractAggregationExpression {
private Function(Map<String, Object> values) {
super(values);
}
/**
* Create a new {@link Function} with the given function definition.
*
* @param body must not be {@literal null}.
* @return new instance of {@link Function}.
*/
public static Function function(String body) {
Assert.notNull(body, "Function body must not be null!");
Map<String, Object> function = new LinkedHashMap<>(2);
function.put(Fields.BODY.toString(), body);
function.put(Fields.ARGS.toString(), Collections.emptyList());
function.put(Fields.LANG.toString(), "js");
return new Function(function);
}
/**
* Set the arguments passed to the function body.
*
* @param args the arguments passed to the function body. Leave empty if the function does not take any arguments.
* @return new instance of {@link Function}.
*/
public Function args(Object... args) {
return args(Arrays.asList(args));
}
/**
* Set the arguments passed to the function body.
*
* @param args the arguments passed to the function body. Leave empty if the function does not take any arguments.
* @return new instance of {@link Function}.
*/
public Function args(List<Object> args) {
Assert.notNull(args, "Args must not be null! Use an empty list instead.");
return new Function(appendAt(1, Fields.ARGS.toString(), args));
}
/**
* The language used in the body.
*
* @param lang must not be {@literal null} nor empty.
* @return new instance of {@link Function}.
*/
public Function lang(String lang) {
Assert.hasText(lang, "Lang must not be null nor empty! The default would be 'js'.");
return new Function(appendAt(2, Fields.LANG.toString(), lang));
}
@Nullable
List<Object> getArgs() {
return get(Fields.ARGS.toString());
}
String getBody() {
return get(Fields.BODY.toString());
}
String getLang() {
return get(Fields.LANG.toString());
}
@Override
protected String getMongoMethod() {
return "$function";
}
enum Fields {
BODY, ARGS, LANG;
@Override
public String toString() {
return name().toLowerCase();
}
}
}
/**
* {@link Accumulator} defines a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">$accumulator operator</a>,
* one that maintains its state (e.g. totals, maximums, minimums, and related data) as documents progress through the
* pipeline, in JavaScript.
* <p />
* <code class="java">
* {
* $accumulator: {
* init: ...,
* intArgs: ...,
* accumulate: ...,
* accumulateArgs: ...,
* merge: ...,
* finalize: ...,
* lang: "js"
* }
* }
* </code>
* <p />
* {@link Accumulator} can be used as part of {@link GroupOperation $group}, {@link BucketOperation $bucket} and
* {@link BucketAutoOperation $bucketAuto} pipeline stages. <br />
* <b>NOTE:</b> <a href="https://docs.mongodb.com/master/core/server-side-javascript/">Server-Side JavaScript</a>
* execution must be
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>
*
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">MongoDB Documentation:
* $accumulator</a>
*/
public static class Accumulator extends AbstractAggregationExpression {
private Accumulator(Map<String, Object> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$accumulator";
}
enum Fields {
ACCUMULATE("accumulate"), //
ACCUMULATE_ARGS("accumulateArgs"), //
FINALIZE("finalize"), //
INIT("init"), //
INIT_ARGS("initArgs"), //
LANG("lang"), //
MERGE("merge"); //
private String field;
Fields(String field) {
this.field = field;
}
@Override
public String toString() {
return field;
}
}
public interface AccumulatorInitBuilder {
/**
 * Define the {@code init} {@link Function} for the {@link Accumulator accumulator's} initial state. The function
* receives its arguments from the {@link Function#args(Object...) initArgs} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
default AccumulatorAccumulateBuilder init(Function function) {
return init(function.getBody()).initArgs(function.getArgs());
}
/**
 * Define the {@code init} function for the {@link Accumulator accumulator's} initial state. The function receives
* its arguments from the {@link AccumulatorInitArgsBuilder#initArgs(Object...)} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorInitArgsBuilder init(String function);
/**
* The language used in the {@code $accumulator} code.
*
* @param lang must not be {@literal null}. Default is {@literal js}.
* @return this.
*/
AccumulatorInitBuilder lang(String lang);
}
public interface AccumulatorInitArgsBuilder extends AccumulatorAccumulateBuilder {
/**
* Define the optional {@code initArgs} for the {@link AccumulatorInitBuilder#init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
default AccumulatorAccumulateBuilder initArgs(Object... args) {
return initArgs(Arrays.asList(args));
}
/**
* Define the optional {@code initArgs} for the {@link AccumulatorInitBuilder#init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
AccumulatorAccumulateBuilder initArgs(List<Object> args);
}
public interface AccumulatorAccumulateBuilder {
/**
 * Set the {@code accumulate} {@link Function} that updates the state for each document. The function's first
 * argument is the current {@code state}; additional arguments can be defined via {@link Function#args(Object...)
* accumulateArgs}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
default AccumulatorMergeBuilder accumulate(Function function) {
return accumulate(function.getBody()).accumulateArgs(function.getArgs());
}
/**
 * Set the {@code accumulate} function that updates the state for each document. The function's first argument is
 * the current {@code state}; additional arguments can be defined via
* {@link AccumulatorAccumulateArgsBuilder#accumulateArgs(Object...)}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorAccumulateArgsBuilder accumulate(String function);
}
public interface AccumulatorAccumulateArgsBuilder extends AccumulatorMergeBuilder {
/**
* Define additional {@code accumulateArgs} for the {@link AccumulatorAccumulateBuilder#accumulate(String)}
* function.
*
* @param args must not be {@literal null}.
* @return this.
*/
default AccumulatorMergeBuilder accumulateArgs(Object... args) {
return accumulateArgs(Arrays.asList(args));
}
/**
* Define additional {@code accumulateArgs} for the {@link AccumulatorAccumulateBuilder#accumulate(String)}
* function.
*
* @param args must not be {@literal null}.
* @return this.
*/
AccumulatorMergeBuilder accumulateArgs(List<Object> args);
}
public interface AccumulatorMergeBuilder {
/**
* Set the {@code merge} function used to merge two internal states. <br />
 * This might be required when the operation runs on a sharded cluster or when the operator exceeds its
* memory limit.
* <p />
* <code class="java">
* function(state1, state2) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorFinalizeBuilder merge(String function);
}
public interface AccumulatorFinalizeBuilder {
/**
* Set the {@code finalize} function used to update the result of the accumulation when all documents have been
* processed.
* <p />
* <code class="java">
* function(state) {
* ...
* return finalState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return new instance of {@link Accumulator}.
*/
Accumulator finalize(String function);
/**
* Build the {@link Accumulator} object without specifying a {@link #finalize(String) finalize function}.
*
* @return new instance of {@link Accumulator}.
*/
Accumulator build();
}
static class AccumulatorBuilder
implements AccumulatorInitBuilder, AccumulatorInitArgsBuilder, AccumulatorAccumulateBuilder,
AccumulatorAccumulateArgsBuilder, AccumulatorMergeBuilder, AccumulatorFinalizeBuilder {
private List<Object> initArgs;
private String initFunction;
private List<Object> accumulateArgs;
private String accumulateFunction;
private String mergeFunction;
private String finalizeFunction;
private String lang = "js";
/**
 * Define the {@code init} function for the {@link Accumulator accumulator's} initial state. The function receives
* its arguments from the {@link #initArgs(Object...)} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder init(String function) {
this.initFunction = function;
return this;
}
/**
* Define the optional {@code initArgs} for the {@link #init(String)} function.
*
 * @param args must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder initArgs(List<Object> args) {
Assert.notNull(args, "Args must not be null");
this.initArgs = new ArrayList<>(args);
return this;
}
/**
 * Set the {@code accumulate} function that updates the state for each document. The function's first argument is
 * the current {@code state}; additional arguments can be defined via {@link #accumulateArgs(Object...)}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder accumulate(String function) {
Assert.notNull(function, "Accumulate function must not be null");
this.accumulateFunction = function;
return this;
}
/**
* Define additional {@code accumulateArgs} for the {@link #accumulate(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder accumulateArgs(List<Object> args) {
Assert.notNull(args, "Args must not be null");
this.accumulateArgs = new ArrayList<>(args);
return this;
}
/**
* Set the {@code merge} function used to merge two internal states. <br />
 * This might be required when the operation runs on a sharded cluster or when the operator exceeds its
* memory limit.
* <p />
* <code class="java">
* function(state1, state2) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder merge(String function) {
Assert.notNull(function, "Merge function must not be null");
this.mergeFunction = function;
return this;
}
/**
* The language used in the {@code $accumulator} code.
*
* @param lang must not be {@literal null}. Default is {@literal js}.
* @return this.
*/
public AccumulatorBuilder lang(String lang) {
Assert.hasText(lang, "Lang must not be null nor empty! The default would be 'js'.");
this.lang = lang;
return this;
}
/**
* Set the {@code finalize} function used to update the result of the accumulation when all documents have been
* processed.
* <p />
* <code class="java">
* function(state) {
* ...
* return finalState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return new instance of {@link Accumulator}.
*/
@Override
public Accumulator finalize(String function) {
Assert.notNull(function, "Finalize function must not be null");
this.finalizeFunction = function;
Map<String, Object> args = createArgumentMap();
args.put(Fields.FINALIZE.toString(), finalizeFunction);
return new Accumulator(args);
}
@Override
public Accumulator build() {
return new Accumulator(createArgumentMap());
}
private Map<String, Object> createArgumentMap() {
Map<String, Object> args = new LinkedHashMap<>();
args.put(Fields.INIT.toString(), initFunction);
if (!CollectionUtils.isEmpty(initArgs)) {
args.put(Fields.INIT_ARGS.toString(), initArgs);
}
args.put(Fields.ACCUMULATE.toString(), accumulateFunction);
if (!CollectionUtils.isEmpty(accumulateArgs)) {
args.put(Fields.ACCUMULATE_ARGS.toString(), accumulateArgs);
}
args.put(Fields.MERGE.toString(), mergeFunction);
args.put(Fields.LANG.toString(), lang);
return args;
}
}
}
}
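For orientation, a minimal usage sketch of the two entry points defined above. The JavaScript bodies and the referenced document fields (firstName, lastName, price) are illustrative only and not part of the commit:

// build a $function expression projecting a concatenated name
ScriptOperators.Function fullName = ScriptOperators
        .function("function(first, last) { return first + ' ' + last }")
        .args("$firstName", "$lastName");

// build a $accumulator expression computing an average price;
// the builder chain is init -> accumulate (+ accumulateArgs) -> merge -> finalize
ScriptOperators.Accumulator avgPrice = ScriptOperators.accumulatorBuilder()
        .init("function() { return { count: 0, sum: 0 } }")
        .accumulate("function(state, price) { return { count: state.count + 1, sum: state.sum + price } }")
        .accumulateArgs("$price")
        .merge("function(s1, s2) { return { count: s1.count + s2.count, sum: s1.sum + s2.sum } }")
        .finalize("function(state) { return state.sum / state.count }");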

View File

@@ -78,7 +78,7 @@ public class SortByCountOperation implements AggregationOperation {
: groupByExpression.toDocument(context));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/

View File

@@ -21,6 +21,7 @@ import java.util.ArrayList;
import java.util.List;
import org.bson.Document;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
@@ -29,6 +30,7 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldRefe
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -46,6 +48,7 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
private final Class<?> type;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
private final Lazy<MongoPersistentEntity<?>> entity;
/**
* Creates a new {@link TypeBasedAggregationOperationContext} for the given type, {@link MappingContext} and
@@ -65,6 +68,7 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
this.type = type;
this.mappingContext = mappingContext;
this.mapper = mapper;
this.entity = Lazy.of(() -> mappingContext.getPersistentEntity(type));
}
/*
@@ -133,15 +137,32 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
*/
@Override
public AggregationOperationContext continueOnMissingFieldReference() {
return continueOnMissingFieldReference(type);
}
/**
* This toggle allows the {@link AggregationOperationContext context} to use any given field name without checking for
 * its existence. Typically, the {@link AggregationOperationContext} fails throughout the pipeline when referencing
 * unknown fields, i.e. those not present in one of the previous stages or the input source.
*
* @param type The domain type to map fields to.
* @return a more relaxed {@link AggregationOperationContext}.
* @since 3.1
*/
public AggregationOperationContext continueOnMissingFieldReference(Class<?> type) {
return new RelaxedTypeBasedAggregationOperationContext(type, mappingContext, mapper);
}
protected FieldReference getReferenceFor(Field field) {
if(entity.getNullable() == null) {
return new DirectFieldReference(new ExposedField(field, true));
}
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext
.getPersistentPropertyPath(field.getTarget(), type);
Field mappedField = field(field.getName(),
propertyPath.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE));
return new DirectFieldReference(new ExposedField(mappedField, true));
}

View File

@@ -0,0 +1,164 @@
/*
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* The <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/">$unionWith</a> aggregation
* stage (available since MongoDB 4.4) performs a union of two collections by combining pipeline results, potentially
* containing duplicates, into a single result set that is handed over to the next stage. <br />
* In order to remove duplicates it is possible to append a {@link GroupOperation} right after
* {@link UnionWithOperation}.
* <p />
* If the {@link UnionWithOperation} uses a
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/#unionwith-pipeline">pipeline</a>
* to process documents, field names within the pipeline will be treated as is. In order to map domain type property
* names to actual field names (considering potential {@link org.springframework.data.mongodb.core.mapping.Field}
* annotations) make sure the enclosing aggregation is a {@link TypedAggregation} and provide the target type for the
* {@code $unionWith} stage via {@link #mapFieldsTo(Class)}.
*
* @author Christoph Strobl
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/">Aggregation Pipeline Stage:
* $unionWith</a>
* @since 3.1
*/
public class UnionWithOperation implements AggregationOperation {
private final String collection;
private final @Nullable AggregationPipeline pipeline;
private final @Nullable Class<?> domainType;
public UnionWithOperation(String collection, @Nullable AggregationPipeline pipeline, @Nullable Class<?> domainType) {
Assert.notNull(collection, "Collection must not be null!");
this.collection = collection;
this.pipeline = pipeline;
this.domainType = domainType;
}
/**
* Set the name of the collection from which pipeline results should be included in the result set.<br />
* The collection name is used to set the {@code coll} parameter of {@code $unionWith}.
*
* @param collection the MongoDB collection name. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public static UnionWithOperation unionWith(String collection) {
return new UnionWithOperation(collection, null, null);
}
/**
* Set the {@link AggregationPipeline} to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param pipeline the {@link AggregationPipeline} that computes the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(AggregationPipeline pipeline) {
return new UnionWithOperation(collection, pipeline, domainType);
}
/**
* Set the aggregation pipeline stages to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param aggregationStages the aggregation pipeline stages that compute the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(List<AggregationOperation> aggregationStages) {
return new UnionWithOperation(collection, new AggregationPipeline(aggregationStages), domainType);
}
/**
* Set the aggregation pipeline stages to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param aggregationStages the aggregation pipeline stages that compute the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(AggregationOperation... aggregationStages) {
return new UnionWithOperation(collection, new AggregationPipeline(Arrays.asList(aggregationStages)), domainType);
}
/**
 * Set the domain type used for field name mapping of property references used by the {@link AggregationPipeline}.
* Remember to also use a {@link TypedAggregation} in the outer pipeline.<br />
* If not set, field names used within {@link AggregationOperation pipeline operations} are taken as is.
*
* @param domainType the domain type to map field names used in pipeline operations to. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation mapFieldsTo(Class<?> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
return new UnionWithOperation(collection, pipeline, domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
Document $unionWith = new Document("coll", collection);
if (pipeline == null || pipeline.isEmpty()) {
return new Document(getOperator(), $unionWith);
}
$unionWith.append("pipeline", pipeline.toDocuments(computeContext(context)));
return new Document(getOperator(), $unionWith);
}
private AggregationOperationContext computeContext(AggregationOperationContext source) {
if (source instanceof TypeBasedAggregationOperationContext) {
return ((TypeBasedAggregationOperationContext) source).continueOnMissingFieldReference(domainType != null ? domainType : Object.class);
}
if (source instanceof ExposedFieldsAggregationOperationContext) {
return computeContext(((ExposedFieldsAggregationOperationContext) source).getRootContext());
}
return source;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$unionWith";
}
}
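A sketch of how the stage might be combined with a typed aggregation; the Order and ArchivedOrder domain types and the archivedOrders collection name are hypothetical:

// union the open orders with matching documents from an archive collection,
// mapping pipeline field names against the archive domain type
TypedAggregation<Order> aggregation = Aggregation.newAggregation(Order.class,
        Aggregation.match(Criteria.where("status").is("OPEN")),
        UnionWithOperation.unionWith("archivedOrders")
                .pipeline(Aggregation.match(Criteria.where("status").is("OPEN")))
                .mapFieldsTo(ArchivedOrder.class));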

View File

@@ -108,7 +108,7 @@ public class UnwindOperation
return new Document(getOperator(), unwindArgs);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/

View File

@@ -17,17 +17,16 @@ package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.Iterator;
import java.util.Map;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
@@ -110,28 +109,7 @@ class DocumentAccessor {
*/
@Nullable
public Object get(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
Map<String, Object> map = BsonUtils.asMap(document);
if (!fieldName.contains(".")) {
return map.get(fieldName);
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
Map<String, Object> source = map;
Object result = null;
while (source != null && parts.hasNext()) {
result = source.get(parts.next());
if (parts.hasNext()) {
source = getAsMap(result);
}
}
return result;
return BsonUtils.resolveValue(document, property.getFieldName());
}
/**
@@ -157,71 +135,7 @@ class DocumentAccessor {
Assert.notNull(property, "Property must not be null!");
String fieldName = property.getFieldName();
if (this.document instanceof Document) {
if (((Document) this.document).containsKey(fieldName)) {
return true;
}
} else if (this.document instanceof DBObject) {
if (((DBObject) this.document).containsField(fieldName)) {
return true;
}
}
if (!fieldName.contains(".")) {
return false;
}
String[] parts = fieldName.split("\\.");
Map<String, Object> source;
if (this.document instanceof Document) {
source = ((Document) this.document);
} else {
source = ((DBObject) this.document).toMap();
}
Object result = null;
for (int i = 1; i < parts.length; i++) {
result = source.get(parts[i - 1]);
source = getAsMap(result);
if (source == null) {
return false;
}
}
return source.containsKey(parts[parts.length - 1]);
}
/**
* Returns the given source object as map, i.e. {@link Document}s and maps as is or {@literal null} otherwise.
*
* @param source can be {@literal null}.
* @return can be {@literal null}.
*/
@Nullable
@SuppressWarnings("unchecked")
private static Map<String, Object> getAsMap(Object source) {
if (source instanceof Document) {
return (Document) source;
}
if (source instanceof BasicDBObject) {
return (BasicDBObject) source;
}
if (source instanceof Map) {
return (Map<String, Object>) source;
}
return null;
return BsonUtils.hasValue(document, property.getFieldName());
}
/**

View File

@@ -75,6 +75,7 @@ import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -182,6 +183,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* any translation but rather reject a {@link Map} with keys containing dots causing the conversion for the entire
* object to fail. If further customization of the translation is needed, have a look at
* {@link #potentiallyEscapeMapKey(String)} as well as {@link #potentiallyUnescapeMapKey(String)}.
* <p>
* {@code mapKeyDotReplacement} is used as-is during replacement operations without further processing (i.e. regex or
* normalization).
*
* @param mapKeyDotReplacement the mapKeyDotReplacement to set. Can be {@literal null}.
*/
@@ -900,7 +904,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
source));
}
return source.replaceAll("\\.", mapKeyDotReplacement);
return StringUtils.replace(source, ".", mapKeyDotReplacement);
}
/**
@@ -928,7 +932,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @return
*/
protected String potentiallyUnescapeMapKey(String source) {
return mapKeyDotReplacement == null ? source : source.replaceAll(mapKeyDotReplacement, "\\.");
return mapKeyDotReplacement == null ? source : StringUtils.replace(source, mapKeyDotReplacement, ".");
}
/**

View File

@@ -32,6 +32,9 @@ import java.util.concurrent.atomic.AtomicLong;
import org.bson.BsonTimestamp;
import org.bson.Document;
import org.bson.UuidRepresentation;
import org.bson.codecs.Codec;
import org.bson.internal.CodecRegistryHelper;
import org.bson.types.Binary;
import org.bson.types.Code;
import org.bson.types.Decimal128;
@@ -45,11 +48,12 @@ import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoClientSettings;
/**
* Wrapper class to contain useful converters for the usage with Mongo.
*
@@ -236,9 +240,13 @@ abstract class MongoConverters {
INSTANCE;
private final Codec<Document> codec = CodecRegistryHelper
.createRegistry(MongoClientSettings.getDefaultCodecRegistry(), UuidRepresentation.JAVA_LEGACY)
.get(Document.class);
@Override
public String convert(Document source) {
return source.toJson();
return source.toJson(codec);
}
}

View File

@@ -19,11 +19,14 @@ import java.util.*;
import java.util.Map.Entry;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Example;
@@ -63,9 +66,12 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author David Julia
*/
public class QueryMapper {
protected static final Logger LOGGER = LoggerFactory.getLogger(QueryMapper.class);
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final Document META_TEXT_SCORE = new Document("$meta", "textScore");
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
@@ -378,6 +384,10 @@ public class QueryMapper {
}
}
if (value == null) {
return null;
}
if (isNestedKeyword(value)) {
return getMappedKeyword(new Keyword((Bson) value), documentField.getPropertyEntity());
}
@@ -635,7 +645,7 @@ public class QueryMapper {
* @param candidate
* @return
*/
protected boolean isNestedKeyword(Object candidate) {
protected boolean isNestedKeyword(@Nullable Object candidate) {
if (!(candidate instanceof Document)) {
return false;
@@ -680,12 +690,13 @@ public class QueryMapper {
* converted one by one.
*
* @param documentField the field and its meta data
* @param value the actual value
* @param value the actual value. Can be {@literal null}.
* @return the potentially converted target value.
*/
private Object applyFieldTargetTypeHintToValue(Field documentField, Object value) {
@Nullable
private Object applyFieldTargetTypeHintToValue(Field documentField, @Nullable Object value) {
if (documentField.getProperty() == null || !documentField.getProperty().hasExplicitWriteTarget()) {
if (value == null || documentField.getProperty() == null || !documentField.getProperty().hasExplicitWriteTarget()) {
return value;
}
@@ -716,7 +727,6 @@ public class QueryMapper {
*/
static class Keyword {
private static final String N_OR_PATTERN = "\\$.*or";
private static final Set<String> NON_DBREF_CONVERTING_KEYWORDS = new HashSet<>(
Arrays.asList("$", "$size", "$slice", "$gt", "$lt"));
@@ -747,7 +757,7 @@ public class QueryMapper {
}
public boolean isOrOrNor() {
return key.matches(N_OR_PATTERN);
return key.equalsIgnoreCase("$or") || key.equalsIgnoreCase("$nor");
}
/**
@@ -1086,8 +1096,8 @@ public class QueryMapper {
removePlaceholders(DOT_POSITIONAL_PATTERN, pathExpression));
if (sourceProperty != null && sourceProperty.getOwner().equals(entity)) {
return mappingContext
.getPersistentPropertyPath(PropertyPath.from(sourceProperty.getName(), entity.getTypeInformation()));
return mappingContext.getPersistentPropertyPath(
PropertyPath.from(Pattern.quote(sourceProperty.getName()), entity.getTypeInformation()));
}
PropertyPath path = forName(rawPath);
@@ -1095,29 +1105,47 @@ public class QueryMapper {
return null;
}
try {
PersistentPropertyPath<MongoPersistentProperty> propertyPath = tryToResolvePersistentPropertyPath(path);
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
if (propertyPath == null) {
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
boolean associationDetected = false;
if (QueryMapper.LOGGER.isInfoEnabled()) {
while (iterator.hasNext()) {
String types = StringUtils.collectionToDelimitedString(
path.stream().map(it -> it.getType().getSimpleName()).collect(Collectors.toList()), " -> ");
QueryMapper.LOGGER.info(
"Could not map '{}'. Maybe a fragment in '{}' is considered a simple type. Mapper continues with {}.",
path, types, pathExpression);
}
return null;
}
MongoPersistentProperty property = iterator.next();
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
boolean associationDetected = false;
if (property.isAssociation()) {
associationDetected = true;
continue;
}
while (iterator.hasNext()) {
if (associationDetected && !property.isIdProperty()) {
throw new MappingException(String.format(INVALID_ASSOCIATION_REFERENCE, pathExpression));
}
MongoPersistentProperty property = iterator.next();
if (property.isAssociation()) {
associationDetected = true;
continue;
}
return propertyPath;
} catch (InvalidPersistentPropertyPath e) {
if (associationDetected && !property.isIdProperty()) {
throw new MappingException(String.format(INVALID_ASSOCIATION_REFERENCE, pathExpression));
}
}
return propertyPath;
}
@Nullable
private PersistentPropertyPath<MongoPersistentProperty> tryToResolvePersistentPropertyPath(PropertyPath path) {
try {
return mappingContext.getPersistentPropertyPath(path);
} catch (MappingException e) {
return null;
}
}
@@ -1146,13 +1174,21 @@ public class QueryMapper {
return forName(path.substring(0, path.length() - 3) + "id");
}
// Ok give it another try quoting
try {
return PropertyPath.from(Pattern.quote(path), entity.getTypeInformation());
} catch (PropertyReferenceException | InvalidPersistentPropertyPath ex) {
}
return null;
}
}
private boolean isPathToJavaLangClassProperty(PropertyPath path) {
if (path.getType().equals(Class.class) && path.getLeafProperty().getOwningType().getType().equals(Class.class)) {
if ((path.getType() == Class.class || path.getType().equals(Object.class))
&& path.getLeafProperty().getType() == Class.class) {
return true;
}
return false;
@@ -1238,12 +1274,17 @@ public class QueryMapper {
static class KeyMapper {
private final Iterator<String> iterator;
private int currentIndex;
private String currentPropertyRoot;
private final List<String> pathParts;
public KeyMapper(String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.iterator = Arrays.asList(key.split("\\.")).iterator();
this.iterator.next();
this.pathParts = Arrays.asList(key.split("\\."));
this.iterator = pathParts.iterator();
this.currentPropertyRoot = iterator.next();
this.currentIndex = 0;
}
/**
@@ -1255,21 +1296,31 @@ public class QueryMapper {
protected String mapPropertyName(MongoPersistentProperty property) {
StringBuilder mappedName = new StringBuilder(PropertyToFieldNameConverter.INSTANCE.convert(property));
boolean inspect = iterator.hasNext();
while (inspect) {
String partial = iterator.next();
currentIndex++;
boolean isPositional = (isPositionalParameter(partial) && (property.isMap() || property.isCollectionLike()));
boolean isPositional = isPositionalParameter(partial) && property.isCollectionLike() ;
if(property.isMap() && currentPropertyRoot.equals(partial) && iterator.hasNext()){
partial = iterator.next();
currentIndex++;
}
if (isPositional) {
if (isPositional || property.isMap() && !currentPropertyRoot.equals(partial)) {
mappedName.append(".").append(partial);
}
inspect = isPositional && iterator.hasNext();
}
if(currentIndex + 1 < pathParts.size()) {
currentIndex++;
currentPropertyRoot = pathParts.get(currentIndex);
}
return mappedName.toString();
}

View File

@@ -16,7 +16,7 @@
package org.springframework.data.mongodb.core.geo;
/**
* Interface definition for structures defined in <a href="https://geojson.org/>GeoJSON</a> format.
* Interface definition for structures defined in <a href="https://geojson.org/">GeoJSON</a> format.
*
* @author Christoph Strobl
* @since 1.7

View File

@@ -46,6 +46,7 @@ import java.lang.annotation.Target;
* @author Philipp Schneider
* @author Johno Crawford
* @author Christoph Strobl
* @author Dave Perryman
*/
@Target({ ElementType.TYPE })
@Documented
@@ -95,7 +96,8 @@ public @interface CompoundIndex {
boolean unique() default false;
/**
* If set to true index will skip over any document that is missing the indexed field.
* If set to true index will skip over any document that is missing the indexed field. <br />
* Must not be used with {@link #partialFilter()}.
*
* @return {@literal false} by default.
* @see <a href=
@@ -170,4 +172,14 @@ public @interface CompoundIndex {
*/
boolean background() default false;
/**
* Only index the documents in a collection that meet a specified {@link IndexFilter filter expression}. <br />
* Must not be used with {@link #sparse() sparse = true}.
*
* @return empty by default.
* @see <a href=
* "https://docs.mongodb.com/manual/core/index-partial/">https://docs.mongodb.com/manual/core/index-partial/</a>
* @since 3.1
*/
String partialFilter() default "";
}

View File

@@ -53,7 +53,8 @@ public @interface Indexed {
IndexDirection direction() default IndexDirection.ASCENDING;
/**
* If set to true index will skip over any document that is missing the indexed field.
* If set to true index will skip over any document that is missing the indexed field. <br />
* Must not be used with {@link #partialFilter()}.
*
* @return {@literal false} by default.
* @see <a href=
@@ -170,4 +171,15 @@ public @interface Indexed {
* @since 2.2
*/
String expireAfter() default "";
/**
* Only index the documents in a collection that meet a specified {@link IndexFilter filter expression}. <br />
* Must not be used with {@link #sparse() sparse = true}.
*
* @return empty by default.
* @see <a href=
* "https://docs.mongodb.com/manual/core/index-partial/">https://docs.mongodb.com/manual/core/index-partial/</a>
* @since 3.1
*/
String partialFilter() default "";
}
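To illustrate the new partialFilter attribute on both annotations, a hedged sketch with a hypothetical Order domain type; the filter documents follow the partial index documentation linked above:

@Document
@CompoundIndex(def = "{ 'customerId' : 1, 'createdAt' : -1 }", partialFilter = "{ 'archived' : false }")
class Order {

    // single-field partial index: only documents with quantity >= 10 are indexed
    @Indexed(partialFilter = "{ 'quantity' : { '$gte' : 10 } }")
    private Integer quantity;
}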

View File

@@ -15,10 +15,6 @@
*/
package org.springframework.data.mongodb.core.index;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Arrays;
@@ -33,7 +29,6 @@ import java.util.stream.Collectors;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.Association;
@@ -51,6 +46,7 @@ import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.EvaluationContext;
@@ -74,6 +70,7 @@ import org.springframework.util.StringUtils;
* @author Thomas Darimont
* @author Martin Macko
* @author Mark Paluch
* @author Dave Perryman
* @since 1.5
*/
public class MongoPersistentEntityIndexResolver implements IndexResolver {
@@ -385,6 +382,10 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexDefinition.background();
}
if (StringUtils.hasText(index.partialFilter())) {
indexDefinition.partial(evaluatePartialFilter(index.partialFilter(), entity));
}
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
@@ -474,9 +475,25 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
}
if (StringUtils.hasText(index.partialFilter())) {
indexDefinition.partial(evaluatePartialFilter(index.partialFilter(), persistentProperty.getOwner()));
}
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
private PartialIndexFilter evaluatePartialFilter(String filterExpression, PersistentEntity<?,?> entity) {
Object result = evaluate(filterExpression, getEvaluationContextForProperty(entity));
if (result instanceof org.bson.Document) {
return PartialIndexFilter.of((org.bson.Document) result);
}
return PartialIndexFilter.of(BsonUtils.parse(filterExpression, null));
}
/**
* Creates {@link HashedIndex} wrapped in {@link IndexDefinitionHolder} out of {@link HashIndexed} for a given
* {@link MongoPersistentProperty}.
@@ -733,8 +750,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
* @author Christoph Strobl
* @author Mark Paluch
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
static class Path {
private static final Path EMPTY = new Path(Collections.emptyList(), false);
@@ -742,6 +757,11 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
private final List<PersistentProperty<?>> elements;
private final boolean cycle;
private Path(List<PersistentProperty<?>> elements, boolean cycle) {
this.elements = elements;
this.cycle = cycle;
}
/**
* @return an empty {@link Path}.
* @since 1.10.8
@@ -843,6 +863,28 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return builder.toString();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Path that = (Path) o;
if (this.cycle != that.cycle) {
return false;
}
return ObjectUtils.nullSafeEquals(this.elements, that.elements);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(elements);
result = 31 * result + (cycle ? 1 : 0);
return result;
}
}
}

View File

@@ -16,14 +16,8 @@
package org.springframework.data.mongodb.core.index;
import org.bson.Document;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import com.mongodb.DBObject;
import org.springframework.util.Assert;
/**
* {@link IndexFilter} implementation for usage with plain {@link Document} as well as {@link CriteriaDefinition} filter
@@ -32,10 +26,16 @@ import com.mongodb.DBObject;
* @author Christoph Strobl
* @since 1.10
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public class PartialIndexFilter implements IndexFilter {
private final @NonNull Object filterExpression;
private final Object filterExpression;
private PartialIndexFilter(Object filterExpression) {
Assert.notNull(filterExpression, "FilterExpression must not be null!");
this.filterExpression = filterExpression;
}
/**
* Create new {@link PartialIndexFilter} for given {@link Document filter expression}.

View File

@@ -15,13 +15,12 @@
*/
package org.springframework.data.mongodb.core.mapping.event;
import reactor.core.publisher.Mono;
import org.reactivestreams.Publisher;
import org.springframework.beans.factory.ObjectFactory;
import org.springframework.core.Ordered;
import org.springframework.data.auditing.AuditingHandler;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.mapping.callback.EntityCallback;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.util.Assert;
@@ -34,7 +33,7 @@ import org.springframework.util.Assert;
*/
public class ReactiveAuditingEntityCallback implements ReactiveBeforeConvertCallback<Object>, Ordered {
private final ObjectFactory<IsNewAwareAuditingHandler> auditingHandlerFactory;
private final ObjectFactory<ReactiveIsNewAwareAuditingHandler> auditingHandlerFactory;
/**
* Creates a new {@link ReactiveAuditingEntityCallback} using the given {@link MappingContext} and
@@ -42,7 +41,7 @@ public class ReactiveAuditingEntityCallback implements ReactiveBeforeConvertCall
*
* @param auditingHandlerFactory must not be {@literal null}.
*/
public ReactiveAuditingEntityCallback(ObjectFactory<IsNewAwareAuditingHandler> auditingHandlerFactory) {
public ReactiveAuditingEntityCallback(ObjectFactory<ReactiveIsNewAwareAuditingHandler> auditingHandlerFactory) {
Assert.notNull(auditingHandlerFactory, "IsNewAwareAuditingHandler must not be null!");
this.auditingHandlerFactory = auditingHandlerFactory;
@@ -54,7 +53,7 @@ public class ReactiveAuditingEntityCallback implements ReactiveBeforeConvertCall
*/
@Override
public Publisher<Object> onBeforeConvert(Object entity, String collection) {
return Mono.just(auditingHandlerFactory.getObject().markAudited(entity));
return auditingHandlerFactory.getObject().markAudited(entity);
}
/*

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.AllArgsConstructor;
import java.time.Instant;
import java.util.Arrays;
import java.util.Collections;
@@ -217,12 +215,17 @@ class ChangeStreamTask extends CursorReadingTask<ChangeStreamDocument<Document>,
*
* @since 2.1
*/
@AllArgsConstructor
static class ChangeStreamEventMessage<T> implements Message<ChangeStreamDocument<Document>, T> {
private final ChangeStreamEvent<T> delegate;
private final MessageProperties messageProperties;
ChangeStreamEventMessage(ChangeStreamEvent<T> delegate, MessageProperties messageProperties) {
this.delegate = delegate;
this.messageProperties = messageProperties;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.messaging.Message#getRaw()

View File

@@ -15,10 +15,6 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import java.time.Duration;
import java.util.LinkedHashMap;
import java.util.Map;
@@ -34,6 +30,7 @@ import org.springframework.data.mongodb.core.messaging.SubscriptionRequest.Reque
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ErrorHandler;
import org.springframework.util.ObjectUtils;
/**
* Simple {@link Executor} based {@link MessageListenerContainer} implementation for running {@link Task tasks} like
@@ -259,7 +256,6 @@ public class DefaultMessageListenerContainer implements MessageListenerContainer
* @author Christoph Strobl
* @since 2.1
*/
@EqualsAndHashCode
static class TaskSubscription implements Subscription {
private final Task task;
@@ -286,19 +282,39 @@ public class DefaultMessageListenerContainer implements MessageListenerContainer
public void cancel() throws DataAccessResourceFailureException {
task.cancel();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
TaskSubscription that = (TaskSubscription) o;
return ObjectUtils.nullSafeEquals(this.task, that.task);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(task);
}
}
/**
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
private static class DecoratingLoggingErrorHandler implements ErrorHandler {
private final Log logger = LogFactory.getLog(DecoratingLoggingErrorHandler.class);
private final ErrorHandler delegate;
DecoratingLoggingErrorHandler(ErrorHandler delegate) {
this.delegate = delegate;
}
@Override
public void handleError(Throwable t) {

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.ToString;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.util.ClassUtils;
@@ -26,7 +24,6 @@ import org.springframework.util.ClassUtils;
* @author Mark Paluch
* @since 2.1
*/
@ToString(of = { "delegate", "targetType" })
class LazyMappingDelegatingMessage<S, T> implements Message<S, T> {
private final Message<S, ?> delegate;
@@ -82,4 +79,8 @@ class LazyMappingDelegatingMessage<S, T> implements Message<S, T> {
public MessageProperties getProperties() {
return delegate.getProperties();
}
public String toString() {
return "LazyMappingDelegatingMessage(delegate=" + this.delegate + ", targetType=" + this.targetType + ")";
}
}

View File

@@ -15,11 +15,9 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.EqualsAndHashCode;
import lombok.ToString;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* General message abstraction for any type of Event / Message published by MongoDB server to the client. This might be
@@ -65,8 +63,6 @@ public interface Message<S, T> {
* @author Christoph Strobl
* @since 2.1
*/
@ToString
@EqualsAndHashCode
class MessageProperties {
private static final MessageProperties EMPTY = new MessageProperties();
@@ -111,6 +107,34 @@ public interface Message<S, T> {
return new MessagePropertiesBuilder();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
MessageProperties that = (MessageProperties) o;
if (!ObjectUtils.nullSafeEquals(this.databaseName, that.databaseName)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.collectionName, that.collectionName);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(databaseName);
result = 31 * result + ObjectUtils.nullSafeHashCode(collectionName);
return result;
}
public String toString() {
return "Message.MessageProperties(databaseName=" + this.getDatabaseName() + ", collectionName="
+ this.getCollectionName() + ")";
}
/**
* Builder for {@link MessageProperties}.
*

View File

@@ -15,11 +15,9 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.EqualsAndHashCode;
import lombok.ToString;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Trivial {@link Message} implementation.
@@ -27,8 +25,6 @@ import org.springframework.util.Assert;
* @author Christoph Strobl
* @since 2.1
*/
@EqualsAndHashCode
@ToString
class SimpleMessage<S, T> implements Message<S, T> {
private @Nullable final S raw;
@@ -75,4 +71,35 @@ class SimpleMessage<S, T> implements Message<S, T> {
public MessageProperties getProperties() {
return properties;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
SimpleMessage<?, ?> that = (SimpleMessage<?, ?>) o;
if (!ObjectUtils.nullSafeEquals(this.raw, that.raw)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.body, that.body)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.properties, that.properties);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(raw);
result = 31 * result + ObjectUtils.nullSafeHashCode(body);
result = 31 * result + ObjectUtils.nullSafeHashCode(properties);
return result;
}
public String toString() {
return "SimpleMessage(raw=" + this.getRaw() + ", body=" + this.getBody() + ", properties=" + this.getProperties()
+ ")";
}
}

View File

@@ -67,6 +67,7 @@ public class BasicUpdate extends Update {
}
@Override
@Deprecated
public Update pushAll(String key, Object[] values) {
Document keyValue = new Document();
keyValue.put(key, values);

View File

@@ -15,11 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Locale;
import java.util.Optional;
@@ -515,8 +510,6 @@ public class Collation {
*
* @since 2.0
*/
@AllArgsConstructor(access = AccessLevel.PACKAGE)
@Getter
static class ICUComparisonLevel implements ComparisonLevel {
private final int level;
@@ -526,6 +519,24 @@ public class Collation {
ICUComparisonLevel(int level) {
this(level, Optional.empty(), Optional.empty());
}
ICUComparisonLevel(int level, Optional<CaseFirst> caseFirst, Optional<Boolean> caseLevel) {
this.level = level;
this.caseFirst = caseFirst;
this.caseLevel = caseLevel;
}
public int getLevel() {
return this.level;
}
public Optional<CaseFirst> getCaseFirst() {
return this.caseFirst;
}
public Optional<Boolean> getCaseLevel() {
return this.caseLevel;
}
}
/**
@@ -650,7 +661,6 @@ public class Collation {
/**
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public static class CaseFirst {
private static final CaseFirst UPPER = new CaseFirst("upper");
@@ -659,6 +669,10 @@ public class Collation {
private final String state;
private CaseFirst(String state) {
this.state = state;
}
/**
* Sort uppercase before lowercase.
*
@@ -690,7 +704,6 @@ public class Collation {
/**
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
public static class Alternate {
private static final Alternate NON_IGNORABLE = new Alternate("non-ignorable", Optional.empty());
@@ -698,6 +711,11 @@ public class Collation {
final String alternate;
final Optional<String> maxVariable;
Alternate(String alternate, Optional<String> maxVariable) {
this.alternate = alternate;
this.maxVariable = maxVariable;
}
/**
* Consider Whitespace and punctuation as base characters.
*
@@ -761,12 +779,17 @@ public class Collation {
* @see <a href="http://site.icu-project.org">ICU - International Components for Unicode</a>
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public static class CollationLocale {
private final String language;
private final Optional<String> variant;
private CollationLocale(String language, Optional<String> variant) {
this.language = language;
this.variant = variant;
}
/**
* Create new {@link CollationLocale} for given language.
*

View File

@@ -20,8 +20,10 @@ import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
@@ -58,6 +60,7 @@ import com.mongodb.BasicDBList;
* @author Christoph Strobl
* @author Mark Paluch
* @author Andreas Zink
* @author Clément Petit
*/
public class Criteria implements CriteriaDefinition {
@@ -895,15 +898,15 @@ public class Criteria implements CriteriaDefinition {
* @param right
* @return
*/
private boolean isEqual(Object left, Object right) {
private boolean isEqual(@Nullable Object left, @Nullable Object right) {
if (left == null) {
return right == null;
}
if (Pattern.class.isInstance(left)) {
if (left instanceof Pattern) {
if (!Pattern.class.isInstance(right)) {
if (!(right instanceof Pattern)) {
return false;
}
@@ -914,6 +917,52 @@ public class Criteria implements CriteriaDefinition {
&& leftPattern.flags() == rightPattern.flags();
}
if (left instanceof Document) {
if (!(right instanceof Document)) {
return false;
}
Document leftDocument = (Document) left;
Document rightDocument = (Document) right;
Iterator<Entry<String, Object>> leftIterator = leftDocument.entrySet().iterator();
Iterator<Entry<String, Object>> rightIterator = rightDocument.entrySet().iterator();
while (leftIterator.hasNext() && rightIterator.hasNext()) {
Map.Entry<String, Object> leftEntry = leftIterator.next();
Map.Entry<String, Object> rightEntry = rightIterator.next();
if (!isEqual(leftEntry.getKey(), rightEntry.getKey())
|| !isEqual(leftEntry.getValue(), rightEntry.getValue())) {
return false;
}
}
return !leftIterator.hasNext() && !rightIterator.hasNext();
}
if (Collection.class.isAssignableFrom(left.getClass())) {
if (!Collection.class.isAssignableFrom(right.getClass())) {
return false;
}
Collection<?> leftCollection = (Collection<?>) left;
Collection<?> rightCollection = (Collection<?>) right;
Iterator<?> leftIterator = leftCollection.iterator();
Iterator<?> rightIterator = rightCollection.iterator();
while (leftIterator.hasNext() && rightIterator.hasNext()) {
if (!isEqual(leftIterator.next(), rightIterator.next())) {
return false;
}
}
return !leftIterator.hasNext() && !rightIterator.hasNext();
}
return ObjectUtils.nullSafeEquals(left, right);
}

View File

@@ -15,54 +15,134 @@
*/
package org.springframework.data.mongodb.core.query;
import lombok.EqualsAndHashCode;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Field projection.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Patryk Wasik
* @author Christoph Strobl
* @author Mark Paluch
* @author Owen Q
*/
@EqualsAndHashCode
public class Field {
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
private final Map<String, Integer> criteria = new HashMap<>();
private final Map<String, Object> slices = new HashMap<>();
private final Map<String, Criteria> elemMatchs = new HashMap<>();
private @Nullable String positionKey;
private int positionValue;
public Field include(String key) {
criteria.put(key, Integer.valueOf(1));
/**
* Include a single {@code field} to be returned by the query operation.
*
* @param field the document field name to be included.
* @return {@code this} field projection instance.
*/
public Field include(String field) {
Assert.notNull(field, "Key must not be null!");
criteria.put(field, 1);
return this;
}
public Field exclude(String key) {
criteria.put(key, Integer.valueOf(0));
/**
* Include one or more {@code fields} to be returned by the query operation.
*
* @param fields the document field names to be included.
* @return {@code this} field projection instance.
* @since 3.1
*/
public Field include(String... fields) {
Assert.notNull(fields, "Keys must not be null!");
for (String key : fields) {
criteria.put(key, 1);
}
return this;
}
public Field slice(String key, int size) {
slices.put(key, Integer.valueOf(size));
/**
* Exclude a single {@code field} from being returned by the query operation.
*
* @param field the document field name to be excluded.
* @return {@code this} field projection instance.
*/
public Field exclude(String field) {
Assert.notNull(field, "Key must not be null!");
criteria.put(field, 0);
return this;
}
public Field slice(String key, int offset, int size) {
slices.put(key, new Integer[] { Integer.valueOf(offset), Integer.valueOf(size) });
/**
* Exclude one or more {@code fields} from being returned by the query operation.
*
* @param fields the document field names to be excluded.
* @return {@code this} field projection instance.
* @since 3.1
*/
public Field exclude(String... fields) {
Assert.notNull(fields, "Keys must not be null!");
for (String key : fields) {
criteria.put(key, 0);
}
return this;
}
public Field elemMatch(String key, Criteria elemMatchCriteria) {
elemMatchs.put(key, elemMatchCriteria);
/**
* Project a {@code $slice} of the array {@code field} using the first {@code size} elements.
*
* @param field the document field name to project, must be an array field.
* @param size the number of elements to include.
* @return {@code this} field projection instance.
*/
public Field slice(String field, int size) {
Assert.notNull(field, "Key must not be null!");
slices.put(field, size);
return this;
}
/**
* Project a {@code $slice} of the array {@code field} using the first {@code size} elements starting at
* {@code offset}.
*
* @param field the document field name to project, must be an array field.
* @param offset the offset to start at.
* @param size the number of elements to include.
* @return {@code this} field projection instance.
*/
public Field slice(String field, int offset, int size) {
slices.put(field, new Integer[] { offset, size });
return this;
}
public Field elemMatch(String field, Criteria elemMatchCriteria) {
elemMatchs.put(field, elemMatchCriteria);
return this;
}
@@ -72,7 +152,7 @@ public class Field {
*
* @param field query array field, must not be {@literal null} or empty.
* @param value
* @return
* @return {@code this} field projection instance.
*/
public Field position(String field, int value) {
@@ -103,4 +183,40 @@ public class Field {
return document;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Field field = (Field) o;
if (positionValue != field.positionValue) {
return false;
}
if (!ObjectUtils.nullSafeEquals(criteria, field.criteria)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(slices, field.slices)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(elemMatchs, field.elemMatchs)) {
return false;
}
return ObjectUtils.nullSafeEquals(positionKey, field.positionKey);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(criteria);
result = 31 * result + ObjectUtils.nullSafeHashCode(slices);
result = 31 * result + ObjectUtils.nullSafeHashCode(elemMatchs);
result = 31 * result + ObjectUtils.nullSafeHashCode(positionKey);
result = 31 * result + positionValue;
return result;
}
}
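Illustrative usage sketch of the expanded Field projection API (not part of this diff; field names are placeholders):
import org.springframework.data.mongodb.core.query.Field;
import org.springframework.data.mongodb.core.query.Query;

class FieldProjectionSample {

	static Query projected() {

		Query query = new Query();
		Field fields = query.fields();

		fields.include("firstname", "lastname") // new String... overload, since 3.1
				.exclude("_id")
				.slice("comments", 5); // first five array elements only

		return query;
	}
}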

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.query;
import java.util.regex.Pattern;
import org.bson.BsonRegularExpression;
import org.springframework.lang.Nullable;
/**
@@ -102,6 +103,15 @@ public enum MongoRegexCreator {
}
}
/**
* @param source the value to convert; {@link String} values are quoted and turned into a case-insensitive regex.
* @return a case-insensitive {@link BsonRegularExpression} for {@link String} input, the given {@literal source} otherwise.
* @since 2.2.14
*/
public Object toCaseInsensitiveMatch(Object source) {
return source instanceof String ? new BsonRegularExpression(Pattern.quote((String) source), "i") : source;
}
private String prepareAndEscapeStringBeforeApplyingLikeRegex(String source, MatchMode matcherType) {
if (MatchMode.REGEX == matcherType) {

View File

@@ -43,7 +43,7 @@ import org.springframework.util.ObjectUtils;
* <p />
* In other words: <br />
* Assume you've got 5 Documents like the ones below <br />
*
*
* <pre>
* <code>
* {
@@ -73,10 +73,10 @@ import org.springframework.util.ObjectUtils;
* }
* </code>
* </pre>
*
*
* Fetching all Documents within a 400 Meter radius from {@code [-73.99171, 40.738868] } would look like this using
* {@literal GeoJSON}:
*
*
* <pre>
* <code>
* {
@@ -92,9 +92,9 @@ import org.springframework.util.ObjectUtils;
*
* </code>
* </pre>
*
*
* resulting in the following 3 Documents.
*
*
* <pre>
* <code>
* {
@@ -117,11 +117,11 @@ import org.springframework.util.ObjectUtils;
* }
* </code>
* </pre>
*
*
* Using legacy coordinate pairs one operates upon radians as discussed before. Assume we use {@link Metrics#KILOMETERS}
* when constructing the geoNear command. The {@link Metric} will make sure the distance multiplier is set correctly, so
* the command is rendered like
*
*
* <pre>
* <code>
* {
@@ -137,12 +137,12 @@ import org.springframework.util.ObjectUtils;
* }
* </code>
* </pre>
*
*
* Please note the calculated distance now uses {@literal Kilometers} instead of {@literal Meters} as unit of measure,
* so we need to multiply it by 1000 to match up to {@literal Meters} as in the {@literal GeoJSON} variant. <br />
* Still as we've been requesting the {@link Distance} in {@link Metrics#KILOMETERS} the {@link Distance#getValue()}
* reflects exactly this.
*
*
* <pre>
* <code>
* {
@@ -558,7 +558,7 @@ public final class NearQuery {
/**
* Get the {@link Collation} to use along with the {@link #query(Query)}.
*
*
* @return the {@link Collation} if set. {@literal null} otherwise.
* @since 2.2
*/

View File

@@ -15,14 +15,10 @@
*/
package org.springframework.data.mongodb.core.query;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.Set;
import org.springframework.data.domain.ExampleMatcher;
import org.springframework.util.ObjectUtils;
/**
* {@link ExampleMatcher} implementation for query by example (QBE). Unlike plain {@link ExampleMatcher} this untyped
@@ -33,11 +29,13 @@ import org.springframework.data.domain.ExampleMatcher;
* @author Mark Paluch
* @since 2.0
*/
@EqualsAndHashCode
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public class UntypedExampleMatcher implements ExampleMatcher {
private final @NonNull ExampleMatcher delegate;
private final ExampleMatcher delegate;
private UntypedExampleMatcher(ExampleMatcher delegate) {
this.delegate = delegate;
}
/*
* (non-Javadoc)
@@ -224,4 +222,21 @@ public class UntypedExampleMatcher implements ExampleMatcher {
return delegate.getMatchMode();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
UntypedExampleMatcher that = (UntypedExampleMatcher) o;
return ObjectUtils.nullSafeEquals(delegate, that.delegate);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(delegate);
}
}
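Illustrative usage sketch (not part of this diff): UntypedExampleMatcher is typically handed to Example.of(...) for untyped query-by-example; with the handwritten equals/hashCode above, identically configured matchers still compare equal after dropping Lombok.
import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.core.query.UntypedExampleMatcher;

class UntypedExampleSample {

	// probe is any plain probe object
	static <T> Example<T> byExample(T probe) {
		return Example.of(probe, UntypedExampleMatcher.matching().withIgnoreCase());
	}
}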

View File

@@ -15,10 +15,8 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import org.bson.Document;
import org.springframework.util.Assert;
/**
* Value object representing a MongoDB-specific JSON schema which is the default {@link MongoJsonSchema} implementation.
@@ -27,10 +25,15 @@ import org.bson.Document;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class DefaultMongoJsonSchema implements MongoJsonSchema {
private final @NonNull JsonSchemaObject root;
private final JsonSchemaObject root;
DefaultMongoJsonSchema(JsonSchemaObject root) {
Assert.notNull(root, "Root must not be null!");
this.root = root;
}
/*
* (non-Javadoc)

View File

@@ -15,10 +15,8 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import org.bson.Document;
import org.springframework.util.Assert;
/**
* JSON schema backed by a {@link org.bson.Document} object.
@@ -26,10 +24,15 @@ import org.bson.Document;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class DocumentJsonSchema implements MongoJsonSchema {
private final @NonNull Document document;
private final Document document;
DocumentJsonSchema(Document document) {
Assert.notNull(document, "Document must not be null!");
this.document = document;
}
/*
* (non-Javadoc)

View File

@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import java.math.BigDecimal;
import java.util.Arrays;
import java.util.Collection;
@@ -44,6 +41,7 @@ import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.String
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.TimestampJsonSchemaObject;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
/**
* Interface that can be implemented by objects that know how to serialize themselves to JSON schema using
@@ -500,12 +498,14 @@ public interface JsonSchemaObject {
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
@EqualsAndHashCode
class JsonType implements Type {
private final String name;
public JsonType(String name) {
this.name = name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type#representation()
@@ -523,18 +523,37 @@ public interface JsonSchemaObject {
public String value() {
return name;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
JsonType jsonType = (JsonType) o;
return ObjectUtils.nullSafeEquals(name, jsonType.name);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(name);
}
}
/**
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
@EqualsAndHashCode
class BsonType implements Type {
private final String name;
BsonType(String name) {
this.name = name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type#representation()
@@ -552,6 +571,24 @@ public interface JsonSchemaObject {
public String value() {
return name;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
BsonType bsonType = (BsonType) o;
return ObjectUtils.nullSafeEquals(name, bsonType.name);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(name);
}
}
}
}
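Illustrative sketch (not part of this diff; assumes the existing Type.jsonTypeOf(String) factory of this interface): the explicit equals/hashCode keeps JsonType and BsonType value-comparable after removing @EqualsAndHashCode.
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;

class JsonTypeEqualitySample {

	static boolean sameType() {
		// true: compares the wrapped type name, not object identity
		return Type.jsonTypeOf("string").equals(Type.jsonTypeOf("string"));
	}
}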

View File

@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.NumericJsonSchemaObject;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.ObjectJsonSchemaObject;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.*;
@@ -239,11 +236,14 @@ public interface JsonSchemaProperty extends JsonSchemaObject {
/**
* Builder for {@link IdentifiableJsonSchemaProperty}.
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
class JsonSchemaPropertyBuilder {
private final String identifier;
JsonSchemaPropertyBuilder(String identifier) {
this.identifier = identifier;
}
/**
* Configure a {@link Type} for the property.
*

View File

@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
@@ -182,7 +179,6 @@ public class UntypedJsonSchemaObject implements JsonSchemaObject {
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
static class Restrictions {
private final Collection<? extends Object> possibleValues;
@@ -191,6 +187,16 @@ public class UntypedJsonSchemaObject implements JsonSchemaObject {
private final Collection<JsonSchemaObject> oneOf;
private final @Nullable JsonSchemaObject notMatch;
Restrictions(Collection<? extends Object> possibleValues, Collection<JsonSchemaObject> allOf,
Collection<JsonSchemaObject> anyOf, Collection<JsonSchemaObject> oneOf, JsonSchemaObject notMatch) {
this.possibleValues = possibleValues;
this.allOf = allOf;
this.anyOf = anyOf;
this.oneOf = oneOf;
this.notMatch = notMatch;
}
/**
* @return new empty {@link Restrictions}.
*/

View File

@@ -15,15 +15,12 @@
*/
package org.springframework.data.mongodb.core.validation;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link Validator} implementation based on {@link CriteriaDefinition query expressions}.
@@ -34,12 +31,14 @@ import org.springframework.util.Assert;
* @see Criteria
* @see <a href="https://docs.mongodb.com/manual/core/schema-validation/#query-expressions">Schema Validation</a>
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
class CriteriaValidator implements Validator {
private final CriteriaDefinition criteria;
private CriteriaValidator(CriteriaDefinition criteria) {
this.criteria = criteria;
}
/**
* Creates a new {@link Validator} object, which is basically setup of query operators, based on a
* {@link CriteriaDefinition} instance.
@@ -72,4 +71,21 @@ class CriteriaValidator implements Validator {
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDocument());
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
CriteriaValidator that = (CriteriaValidator) o;
return ObjectUtils.nullSafeEquals(criteria, that.criteria);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(criteria);
}
}
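Illustrative usage sketch (not part of this diff; names taken from the public Validator/CollectionOptions API): CriteriaValidator backs Validator.criteria(...), typically applied when creating a collection.
import org.springframework.data.mongodb.core.CollectionOptions;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.validation.Validator;

class CriteriaValidationSample {

	static CollectionOptions withValidation() {
		return CollectionOptions.empty()
				.validator(Validator.criteria(Criteria.where("age").gte(0)));
	}
}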

View File

@@ -15,13 +15,10 @@
*/
package org.springframework.data.mongodb.core.validation;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Most trivial {@link Validator} implementation using plain {@link Document} to describe the desired document structure
@@ -32,12 +29,14 @@ import org.springframework.util.Assert;
* @since 2.1
* @see <a href="https://docs.mongodb.com/manual/core/schema-validation/">Schema Validation</a>
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
class DocumentValidator implements Validator {
private final Document validatorObject;
private DocumentValidator(Document validatorObject) {
this.validatorObject = validatorObject;
}
/**
* Create new {@link DocumentValidator} defining validation rules via a plain {@link Document}.
*
@@ -68,4 +67,22 @@ class DocumentValidator implements Validator {
public String toString() {
return SerializationUtils.serializeToJsonSafely(validatorObject);
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
DocumentValidator that = (DocumentValidator) o;
return ObjectUtils.nullSafeEquals(validatorObject, that.validatorObject);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(validatorObject);
}
}

View File

@@ -15,14 +15,11 @@
*/
package org.springframework.data.mongodb.core.validation;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link Validator} implementation based on {@link MongoJsonSchema JSON Schema}.
@@ -32,12 +29,14 @@ import org.springframework.util.Assert;
* @since 2.1
* @see <a href="https://docs.mongodb.com/manual/core/schema-validation/#json-schema">Schema Validation</a>
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
class JsonSchemaValidator implements Validator {
private final MongoJsonSchema schema;
private JsonSchemaValidator(MongoJsonSchema schema) {
this.schema = schema;
}
/**
* Create new {@link JsonSchemaValidator} defining validation rules via {@link MongoJsonSchema}.
*
@@ -68,4 +67,22 @@ class JsonSchemaValidator implements Validator {
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDocument());
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
JsonSchemaValidator that = (JsonSchemaValidator) o;
return ObjectUtils.nullSafeEquals(schema, that.schema);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(schema);
}
}

View File

@@ -129,7 +129,7 @@ public interface GridFsOperations extends ResourcePatternResolver {
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param contentType can be {@literal null}. If not empty, may override content type within {@literal metadata}.
* @param metadata can be {@literal null}.
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
@@ -140,12 +140,12 @@ public interface GridFsOperations extends ResourcePatternResolver {
if (StringUtils.hasText(filename)) {
uploadBuilder.filename(filename);
}
if (StringUtils.hasText(contentType)) {
uploadBuilder.contentType(contentType);
}
if (!ObjectUtils.isEmpty(metadata)) {
uploadBuilder.metadata(metadata);
}
if (StringUtils.hasText(contentType)) {
uploadBuilder.contentType(contentType);
}
return store(uploadBuilder.build());
}
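Illustrative usage sketch (not part of this diff; gridFs is any GridFsOperations instance): after the reordering above the explicit contentType argument is applied last and therefore takes precedence over a content type carried in the metadata document.
import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.bson.Document;
import org.bson.types.ObjectId;
import org.springframework.data.mongodb.gridfs.GridFsOperations;

class GridFsStoreSample {

	static ObjectId store(GridFsOperations gridFs) {

		InputStream content = new ByteArrayInputStream("hello".getBytes());
		Document metadata = new Document("purpose", "demo");

		// "application/pdf" wins even if the metadata carries its own content type
		return gridFs.store(content, "report.pdf", "application/pdf", metadata);
	}
}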

View File

@@ -135,7 +135,7 @@ public interface ReactiveGridFsOperations {
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param contentType can be {@literal null}. If not empty, may override content type within {@literal metadata}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
@@ -148,12 +148,12 @@ public interface ReactiveGridFsOperations {
if (StringUtils.hasText(filename)) {
uploadBuilder.filename(filename);
}
if (StringUtils.hasText(contentType)) {
uploadBuilder.contentType(contentType);
}
if (!ObjectUtils.isEmpty(metadata)) {
uploadBuilder.metadata(metadata);
}
if (StringUtils.hasText(contentType)) {
uploadBuilder.contentType(contentType);
}
return store(uploadBuilder.build());
}

View File

@@ -16,6 +16,8 @@
package org.springframework.data.mongodb.repository.query;
import org.bson.Document;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.ExecutableFindOperation.ExecutableFind;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind;
@@ -30,10 +32,14 @@ import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.ExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.MongoDatabase;
/**
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
@@ -47,7 +53,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private final MongoQueryMethod method;
private final MongoOperations operations;
private final ExecutableFind<?> executableFind;
private final SpelExpressionParser expressionParser;
private final ExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
/**
@@ -58,7 +64,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* @param expressionParser must not be {@literal null}.
* @param evaluationContextProvider must not be {@literal null}.
*/
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations operations, SpelExpressionParser expressionParser,
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations operations, ExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
Assert.notNull(operations, "MongoOperations must not be null!");
@@ -208,6 +214,29 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return applyQueryMetaAttributesWhenPresent(createQuery(accessor));
}
/**
* Obtain a {@link SpELExpressionEvaluator} suitable to evaluate expressions backed by the given dependencies.
*
* @param dependencies must not be {@literal null}.
* @param accessor must not be {@literal null}.
* @return the {@link SpELExpressionEvaluator}.
* @since 2.4
*/
protected SpELExpressionEvaluator getSpELExpressionEvaluatorFor(ExpressionDependencies dependencies,
ConvertingParameterAccessor accessor) {
return new DefaultSpELExpressionEvaluator(expressionParser, evaluationContextProvider
.getEvaluationContext(getQueryMethod().getParameters(), accessor.getValues(), dependencies));
}
/**
* @return the {@link CodecRegistry} used.
* @since 2.4
*/
protected CodecRegistry getCodecRegistry() {
return operations.execute(MongoDatabase::getCodecRegistry);
}
/**
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*

View File

@@ -15,13 +15,15 @@
*/
package org.springframework.data.mongodb.repository.query;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.bson.codecs.configuration.CodecRegistry;
import org.reactivestreams.Publisher;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.model.EntityInstantiators;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery;
@@ -33,14 +35,17 @@ import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecu
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.ResultProcessingConverter;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.ResultProcessingExecution;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.ExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.MongoClientSettings;
/**
* Base class for reactive {@link RepositoryQuery} implementations for MongoDB.
*
@@ -54,8 +59,8 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
private final ReactiveMongoOperations operations;
private final EntityInstantiators instantiators;
private final FindWithProjection<?> findOperationWithProjection;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final ExpressionParser expressionParser;
private final ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider;
/**
* Creates a new {@link AbstractReactiveMongoQuery} from the given {@link MongoQueryMethod} and
@@ -67,12 +72,12 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public AbstractReactiveMongoQuery(ReactiveMongoQueryMethod method, ReactiveMongoOperations operations,
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
ExpressionParser expressionParser, ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider) {
Assert.notNull(method, "MongoQueryMethod must not be null!");
Assert.notNull(operations, "ReactiveMongoOperations must not be null!");
Assert.notNull(expressionParser, "SpelExpressionParser must not be null!");
Assert.notNull(evaluationContextProvider, "QueryMethodEvaluationContextProvider must not be null!");
Assert.notNull(evaluationContextProvider, "ReactiveEvaluationContextExtension must not be null!");
this.method = method;
this.operations = operations;
@@ -98,25 +103,21 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* (non-Javadoc)
* @see org.springframework.data.repository.query.RepositoryQuery#execute(java.lang.Object[])
*/
public Object execute(Object[] parameters) {
public Publisher<Object> execute(Object[] parameters) {
return method.hasReactiveWrapperParameter() ? executeDeferred(parameters)
: execute(new MongoParametersParameterAccessor(method, parameters));
}
@SuppressWarnings("unchecked")
private Object executeDeferred(Object[] parameters) {
private Publisher<Object> executeDeferred(Object[] parameters) {
ReactiveMongoParameterAccessor parameterAccessor = new ReactiveMongoParameterAccessor(method, parameters);
if (getQueryMethod().isCollectionQuery()) {
return Flux.defer(() -> (Publisher<Object>) execute(parameterAccessor));
}
return Mono.defer(() -> (Mono<Object>) execute(parameterAccessor));
return execute(parameterAccessor);
}
private Object execute(MongoParameterAccessor parameterAccessor) {
private Publisher<Object> execute(MongoParameterAccessor parameterAccessor) {
ConvertingParameterAccessor accessor = new ConvertingParameterAccessor(operations.getConverter(),
parameterAccessor);
@@ -141,24 +142,26 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param accessor for providing invocation arguments. Never {@literal null}.
* @param typeToRead the desired component target type. Can be {@literal null}.
*/
protected Object doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
protected Publisher<Object> doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
ConvertingParameterAccessor accessor, @Nullable Class<?> typeToRead) {
Query query = createQuery(accessor);
return createQuery(accessor).flatMapMany(it -> {
applyQueryMetaAttributesWhenPresent(query);
query = applyAnnotatedDefaultSortIfPresent(query);
query = applyAnnotatedCollationIfPresent(query, accessor);
Query query = it;
applyQueryMetaAttributesWhenPresent(query);
query = applyAnnotatedDefaultSortIfPresent(query);
query = applyAnnotatedCollationIfPresent(query, accessor);
FindWithQuery<?> find = typeToRead == null //
? findOperationWithProjection //
: findOperationWithProjection.as(typeToRead);
FindWithQuery<?> find = typeToRead == null //
? findOperationWithProjection //
: findOperationWithProjection.as(typeToRead);
String collection = method.getEntityInformation().getCollectionName();
String collection = method.getEntityInformation().getCollectionName();
ReactiveMongoQueryExecution execution = getExecution(accessor,
new ResultProcessingConverter(processor, operations, instantiators), find);
return execution.execute(query, processor.getReturnedType().getDomainType(), collection);
ReactiveMongoQueryExecution execution = getExecution(accessor,
new ResultProcessingConverter(processor, operations, instantiators), find);
return execution.execute(query, processor.getReturnedType().getDomainType(), collection);
});
}
/**
@@ -254,8 +257,37 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param accessor must not be {@literal null}.
* @return
*/
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return applyQueryMetaAttributesWhenPresent(createQuery(accessor));
protected Mono<Query> createCountQuery(ConvertingParameterAccessor accessor) {
return createQuery(accessor).map(this::applyQueryMetaAttributesWhenPresent);
}
/**
* Obtain a {@link Mono publisher} emitting the {@link SpELExpressionEvaluator} suitable to evaluate expressions
* backed by the given dependencies.
*
* @param dependencies must not be {@literal null}.
* @param accessor must not be {@literal null}.
* @return a {@link Mono} emitting the {@link SpELExpressionEvaluator} when ready.
* @since 2.4
*/
protected Mono<SpELExpressionEvaluator> getSpelEvaluatorFor(ExpressionDependencies dependencies,
ConvertingParameterAccessor accessor) {
return evaluationContextProvider
.getEvaluationContextLater(getQueryMethod().getParameters(), accessor.getValues(), dependencies)
.map(evaluationContext -> (SpELExpressionEvaluator) new DefaultSpELExpressionEvaluator(expressionParser,
evaluationContext))
.defaultIfEmpty(DefaultSpELExpressionEvaluator.unsupported());
}
/**
* @return a {@link Mono} emitting the {@link CodecRegistry} when ready.
* @since 2.4
*/
protected Mono<CodecRegistry> getCodecRegistry() {
return Mono.from(operations.execute(db -> Mono.just(db.getCodecRegistry())))
.defaultIfEmpty(MongoClientSettings.getDefaultCodecRegistry());
}
/**
@@ -264,7 +296,7 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
protected abstract Mono<Query> createQuery(ConvertingParameterAccessor accessor);
/**
* Returns whether the query should get a count projection applied.

View File

@@ -15,10 +15,7 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.experimental.UtilityClass;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
@@ -27,18 +24,16 @@ import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.ExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
@@ -49,10 +44,9 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
* @since 2.2
*/
@UtilityClass
class AggregationUtils {
abstract class AggregationUtils {
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
private AggregationUtils() {}
/**
* Apply a collation extracted from the given {@literal collationExpression} to the given
@@ -64,12 +58,12 @@ class AggregationUtils {
* @param accessor must not be {@literal null}.
* @return the {@link AggregationOptions.Builder} with the {@link Collation} applied when present.
* @see AggregationOptions#getCollation()
* @see CollationUtils#computeCollation(String, ConvertingParameterAccessor, MongoParameters, SpelExpressionParser,
* @see CollationUtils#computeCollation(String, ConvertingParameterAccessor, MongoParameters, ExpressionParser,
* QueryMethodEvaluationContextProvider)
*/
static AggregationOptions.Builder applyCollation(AggregationOptions.Builder builder,
@Nullable String collationExpression, ConvertingParameterAccessor accessor, MongoParameters parameters,
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
Collation collation = CollationUtils.computeCollation(collationExpression, accessor, parameters, expressionParser,
evaluationContextProvider);
@@ -105,32 +99,6 @@ class AggregationUtils {
return builder;
}
/**
* Compute the {@link AggregationOperation aggregation} pipeline for the given {@link MongoQueryMethod}. The raw
* {@link org.springframework.data.mongodb.repository.Aggregation#pipeline()} is parsed with a
* {@link ParameterBindingDocumentCodec} to obtain the MongoDB native {@link Document} representation returned by
* {@link AggregationOperation#toDocument(AggregationOperationContext)} that is mapped against the domain type
* properties.
*
* @param method
* @param accessor
* @param expressionParser
* @param evaluationContextProvider
* @return
*/
static List<AggregationOperation> computePipeline(MongoQueryMethod method, ConvertingParameterAccessor accessor,
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue), expressionParser,
() -> evaluationContextProvider.getEvaluationContext(method.getParameters(), accessor.getValues()));
List<AggregationOperation> target = new ArrayList<>(method.getAnnotatedAggregation().length);
for (String source : method.getAnnotatedAggregation()) {
target.add(ctx -> ctx.getMappedObject(CODEC.decode(source, bindingContext), method.getDomainClass()));
}
return target;
}
/**
* Append {@code $sort} aggregation stage if {@link ConvertingParameterAccessor#getSort()} is present.
*
@@ -196,9 +164,9 @@ class AggregationUtils {
* @throws IllegalArgumentException when none of the above rules is met.
*/
@Nullable
static <T> T extractSimpleTypeResult(Document source, Class<T> targetType, MongoConverter converter) {
static <T> T extractSimpleTypeResult(@Nullable Document source, Class<T> targetType, MongoConverter converter) {
if (source.isEmpty()) {
if (ObjectUtils.isEmpty(source)) {
return null;
}

View File

@@ -15,16 +15,17 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.experimental.UtilityClass;
/**
* Utility class containing methods to interact with boolean values.
*
* @author Mark Paluch
* @since 2.0.9
*/
@UtilityClass
class BooleanUtil {
final class BooleanUtil {
private BooleanUtil() {
throw new UnsupportedOperationException("This is a utility class and cannot be instantiated");
}
/**
* Count the number of {@literal true} values.

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.experimental.UtilityClass;
import java.util.Locale;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
@@ -26,6 +24,7 @@ import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.NumberUtils;
@@ -39,12 +38,14 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.2
*/
@UtilityClass
class CollationUtils {
abstract class CollationUtils {
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
private static final Pattern PARAMETER_BINDING_PATTERN = Pattern.compile("\\?(\\d+)");
private CollationUtils() {
}
/**
* Compute the {@link Collation} by inspecting the {@link ConvertingParameterAccessor#getCollation() parameter
* accessor} or parsing a potentially given {@literal collationExpression}.
@@ -59,7 +60,7 @@ class CollationUtils {
*/
@Nullable
static Collation computeCollation(@Nullable String collationExpression, ConvertingParameterAccessor accessor,
MongoParameters parameters, SpelExpressionParser expressionParser,
MongoParameters parameters, ExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
if (accessor.getCollation() != null) {
@@ -72,8 +73,9 @@ class CollationUtils {
if (StringUtils.trimLeadingWhitespace(collationExpression).startsWith("{")) {
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue),
expressionParser, () -> evaluationContextProvider.getEvaluationContext(parameters, accessor.getValues()));
ParameterBindingContext bindingContext = ParameterBindingContext.forExpressions(accessor::getBindableValue,
expressionParser, dependencies -> evaluationContextProvider.getEvaluationContext(parameters,
accessor.getValues(), dependencies));
return Collation.from(CODEC.decode(collationExpression, bindingContext));
}

View File

@@ -0,0 +1,73 @@
/*
* Copyright 2020-2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.ExpressionParser;
/**
* Simple {@link SpELExpressionEvaluator} implementation using {@link ExpressionParser} and {@link EvaluationContext}.
*
* @author Mark Paluch
* @since 3.1
*/
class DefaultSpELExpressionEvaluator implements SpELExpressionEvaluator {
private final ExpressionParser parser;
private final EvaluationContext context;
DefaultSpELExpressionEvaluator(ExpressionParser parser, EvaluationContext context) {
this.parser = parser;
this.context = context;
}
/**
* Return a {@link SpELExpressionEvaluator} that does not support expression evaluation.
*
* @return a {@link SpELExpressionEvaluator} that does not support expression evaluation.
* @since 3.1
*/
public static SpELExpressionEvaluator unsupported() {
return NoOpExpressionEvaluator.INSTANCE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.SpELExpressionEvaluator#evaluate(java.lang.String)
*/
@Override
@SuppressWarnings("unchecked")
public <T> T evaluate(String expression) {
return (T) parser.parseExpression(expression).getValue(context, Object.class);
}
/**
* {@link SpELExpressionEvaluator} that does not support SpEL evaluation.
*
* @author Mark Paluch
* @since 3.1
*/
enum NoOpExpressionEvaluator implements SpELExpressionEvaluator {
INSTANCE;
@Override
public <T> T evaluate(String expression) {
throw new UnsupportedOperationException("Expression evaluation not supported");
}
}
}
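Illustrative sketch of how the evaluator is used (not part of this diff; the class is package-private, so such code would live in the same package, e.g. a test):
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;

class SpelEvaluatorSample {

	static String evaluate() {

		SpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(
				new SpelExpressionParser(), new StandardEvaluationContext("WORLD"));

		// parses the expression and resolves it against the context root: "hello world"
		return evaluator.evaluate("'hello ' + #root.toLowerCase()");
	}
}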

View File

@@ -25,7 +25,6 @@ import java.util.regex.Pattern;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Range;
import org.springframework.data.domain.Range.Bound;
import org.springframework.data.domain.Sort;
@@ -51,8 +50,10 @@ import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.IgnoreCaseType;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.data.util.Streamable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
/**
* Custom query creator to create Mongo criteria.
@@ -196,9 +197,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case IS_NULL:
return criteria.is(null);
case NOT_IN:
return criteria.nin(nextAsArray(parameters));
return criteria.nin(nextAsList(parameters, part));
case IN:
return criteria.in(nextAsArray(parameters));
return criteria.in(nextAsList(parameters, part));
case LIKE:
case STARTING_WITH:
case ENDING_WITH:
@@ -337,7 +338,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
Iterator<Object> parameters) {
if (property.isCollectionLike()) {
return criteria.in(nextAsArray(parameters));
return criteria.in(nextAsList(parameters, part));
}
return addAppropriateLikeRegexTo(criteria, part, parameters.next());
@@ -400,17 +401,24 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
String.format("Expected parameter type of %s but got %s!", type, parameter.getClass()));
}
private Object[] nextAsArray(Iterator<Object> iterator) {
private java.util.List<?> nextAsList(Iterator<Object> iterator, Part part) {
Object next = iterator.next();
if (next instanceof Collection) {
return ((Collection<?>) next).toArray();
} else if (next != null && next.getClass().isArray()) {
return (Object[]) next;
Streamable<?> streamable = asStreamable(iterator.next());
if (!isSimpleComparisionPossible(part)) {
streamable = streamable.map(MongoRegexCreator.INSTANCE::toCaseInsensitiveMatch);
}
return new Object[] { next };
return streamable.toList();
}
private Streamable<?> asStreamable(Object value) {
if (value instanceof Collection) {
return Streamable.of((Collection<?>) value);
} else if (ObjectUtils.isArray(value)) {
return Streamable.of((Object[]) value);
}
return Streamable.of(value);
}
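// Illustrative note (not part of this diff): nextAsList(...) normalises the bound value through
// Streamable into a List before handing it to Criteria.in/nin, e.g.
//   Streamable.of("spring").toList()                    -> [spring]
//   Streamable.of(new Object[] { "a", "b" }).toList()   -> [a, b]
//   Streamable.of(Arrays.asList("x", "y")).toList()     -> [x, y]
// and, for ignore-case parts, maps each String element via MongoRegexCreator.INSTANCE::toCaseInsensitiveMatch.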
private String toLikeRegex(String source, Part part) {

View File

@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.List;
import org.springframework.data.domain.Page;
@@ -30,6 +27,7 @@ import org.springframework.data.geo.GeoPage;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.ExecutableFindOperation;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind;
import org.springframework.data.mongodb.core.MongoOperations;
@@ -37,6 +35,7 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.support.PageableExecutionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.client.result.DeleteResult;
@@ -62,11 +61,19 @@ interface MongoQueryExecution {
* @author Christoph Strobl
* @since 1.5
*/
@RequiredArgsConstructor
final class SlicedExecution implements MongoQueryExecution {
private final @NonNull FindWithQuery<?> find;
private final @NonNull Pageable pageable;
private final FindWithQuery<?> find;
private final Pageable pageable;
public SlicedExecution(ExecutableFindOperation.FindWithQuery<?> find, Pageable pageable) {
Assert.notNull(find, "Find must not be null!");
Assert.notNull(pageable, "Pageable must not be null!");
this.find = find;
this.pageable = pageable;
}
/*
* (non-Javadoc)
@@ -95,11 +102,19 @@ interface MongoQueryExecution {
* @author Mark Paluch
* @author Christoph Strobl
*/
@RequiredArgsConstructor
final class PagedExecution implements MongoQueryExecution {
private final @NonNull FindWithQuery<?> operation;
private final @NonNull Pageable pageable;
private final FindWithQuery<?> operation;
private final Pageable pageable;
public PagedExecution(ExecutableFindOperation.FindWithQuery<?> operation, Pageable pageable) {
Assert.notNull(operation, "Operation must not be null!");
Assert.notNull(pageable, "Pageable must not be null!");
this.operation = operation;
this.pageable = pageable;
}
/*
* (non-Javadoc)
@@ -133,12 +148,23 @@ interface MongoQueryExecution {
*
* @author Oliver Gierke
*/
@RequiredArgsConstructor
class GeoNearExecution implements MongoQueryExecution {
private final @NonNull FindWithQuery<?> operation;
private final @NonNull MongoQueryMethod method;
private final @NonNull MongoParameterAccessor accessor;
private final FindWithQuery<?> operation;
private final MongoQueryMethod method;
private final MongoParameterAccessor accessor;
public GeoNearExecution(ExecutableFindOperation.FindWithQuery<?> operation, MongoQueryMethod method,
MongoParameterAccessor accessor) {
Assert.notNull(operation, "Operation must not be null!");
Assert.notNull(method, "Method must not be null!");
Assert.notNull(accessor, "Accessor must not be null!");
this.operation = operation;
this.method = method;
this.accessor = accessor;
}
/*
* (non-Javadoc)
@@ -236,11 +262,19 @@ interface MongoQueryExecution {
* @author Christoph Strobl
* @since 1.5
*/
@RequiredArgsConstructor
final class DeleteExecution implements MongoQueryExecution {
private final @NonNull MongoOperations operations;
private final @NonNull MongoQueryMethod method;
private final MongoOperations operations;
private final MongoQueryMethod method;
public DeleteExecution(MongoOperations operations, MongoQueryMethod method) {
Assert.notNull(operations, "Operations must not be null!");
Assert.notNull(method, "Method must not be null!");
this.operations = operations;
this.method = method;
}
/*
* (non-Javadoc)

View File

@@ -32,7 +32,7 @@ import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.repository.query.ReturnedType;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.ExpressionParser;
import org.springframework.util.StringUtils;
/**
@@ -59,7 +59,7 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public PartTreeMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations,
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations, expressionParser, evaluationContextProvider);

Some files were not shown because too many files have changed in this diff.