Compare commits

..

263 Commits

Author SHA1 Message Date
Mark Paluch
f0c0a86118 Release version 4.0 M2 (2022.0.0).
See #3937
2022-03-21 16:35:08 +01:00
Mark Paluch
3d82e12e6b Prepare 4.0 M2 (2022.0.0).
See #3937
2022-03-21 16:34:37 +01:00
Mark Paluch
8672808222 Polishing.
Reformat code. Tweak documentation wording.

See #3596
Original pull request: #3982.
2022-03-21 09:20:47 +01:00
Christoph Strobl
29fb085d8b Add support for PropertyValueConverters.
Closes: #3596
Original pull request: #3982.
2022-03-21 09:20:34 +01:00
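
For illustration, a minimal sketch of a declarative property converter; the MongoValueConverter interface and @ValueConverter annotation are assumed from this change, and all names are illustrative.

class TrimmingConverter implements MongoValueConverter<String, String> {

	@Override
	public String read(String value, MongoConversionContext context) {
		return value; // return the stored value as-is
	}

	@Override
	public String write(String value, MongoConversionContext context) {
		return value == null ? null : value.trim(); // normalize before writing
	}
}

class Person {

	@ValueConverter(TrimmingConverter.class) // applies only to this property
	String nickname;
}
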
Mark Paluch
15cac49f9c Polishing.
Refine API naming towards merge/property instead of combine/specify. Tweak documentation. Introduce Resolution.ofValue(…) for easier creation.

See #3870
Original pull request: #3986.
2022-03-18 14:11:35 +01:00
Christoph Strobl
946deac48c Support generating JsonSchema for Polymorphic fields.
This commit introduces MergedJsonSchema and MergedJsonSchemaProperty that can be used to merge the properties of multiple objects into one schema, as long as the additions do not conflict with one another (e.g. due to usage of different types).
To resolve such conflicts it is required to provide a ConflictResolutionFunction.

Closes #3870
Original pull request: #3986.
2022-03-18 14:11:35 +01:00
Mark Paluch
02229f291c Polishing.
Reformat code.

See #3998
2022-03-16 16:44:06 +01:00
Mark Paluch
1009491920 Create a new conversion context for projection properties.
We now create a new conversion context to ensure that we use the correct property type to avoid type retention when mapping complex objects within a projection.

Closes #3998
2022-03-16 16:29:14 +01:00
Mark Paluch
05730ded1b Fix CI trigger.
Use correct Spring Data Commons version as build trigger.

See #3973
2022-03-15 14:38:54 +01:00
Mark Paluch
612845f59c Polishing.
Extract CreateCollectionOptions conversion to EntityOperations to unify collection creation. Adopt tests.

See #3984
Original pull request: #3990.
2022-03-11 15:21:05 +01:00
Mark Paluch
1f06954952 Polishing.
Add missing Override annotations to template API methods.

See #3984
2022-03-11 15:20:20 +01:00
Christoph Strobl
7bcf0322d2 Propagate time series options correctly.
This commit fixes an issue when creating a collection via MongoTemplate without passing on type information. In this case potential time series information was lost.

Closes #3984
Original pull request: #3990.
2022-03-11 15:18:58 +01:00
Mark Paluch
7dd2f350eb Polishing.
Remove duplicate dependency declaration.

See: #3522
2022-03-11 14:10:49 +01:00
Mark Paluch
e433375cac Polishing.
Reorder methods. Add links to Javadoc. Tweak wording.

See: #3522
Original pull request: #3951.
2022-03-11 14:09:58 +01:00
Christoph Strobl
d16013aa6b Allow to estimate document count.
This commit introduces an option that allows users to opt in to using estimatedDocumentCount instead of countDocuments when the filter query is empty.
To still be able to retrieve the exact number of matching documents we also introduced MongoTemplate#exactCount.

Closes: #3522
Original pull request: #3951.
2022-03-11 14:06:30 +01:00
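
A rough usage sketch of the distinction; estimatedCount is assumed to be the opt-in entry point backed by estimatedDocumentCount, while exactCount is the method named above.

// metadata-based count, fast but approximate (estimatedDocumentCount)
long approx = template.estimatedCount(Person.class);

// exact count of matching documents (countDocuments), even for an empty query
long exact = template.exactCount(new Query(), Person.class);
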
Christoph Strobl
dab5473740 Modify visibility of methods in TypedJsonSchemaObject.
Change visibility to public as it should have been in the first place.

Closes: #3989
2022-03-10 09:23:25 +01:00
sangyongchoi
e6fce75dfd Remove duplicate condition in GeoConverters.
Closes: #3981
2022-03-03 13:29:24 +01:00
Mark Paluch
e75f022844 Update CI properties.
See #3937
2022-02-23 14:33:13 -06:00
Christoph Strobl
611ece049b Serialize values for debug output safely in AbstractMongoEventListener.
We now make sure that codec configuration will not cause an exception when debug logging is turned on.

Resolves: #3968
Original Pull Request: #3970
2022-02-18 10:15:28 +01:00
Christoph Strobl
be2286edf7 Update copyright year to 2022.
See: #3966
2022-02-17 10:49:44 +01:00
Christoph Strobl
b99648672b Introduce Update annotation.
Switch update execution to an annotation based model that allows usage of both the classic update as well as the aggregation pipeline variant. Add the reactive variant of it.
Make sure to allow parameter binding for update expressions and verify method return types.
Update Javadoc and reference documentation.

See: #2107
Original Pull Request: #284
2022-02-17 10:31:46 +01:00
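
A minimal sketch of the annotated update model; repository, field, and method names are illustrative.

interface PersonRepository extends CrudRepository<Person, String> {

	@Update("{ '$inc' : { 'visits' : ?1 } }")
	long findAndIncrementVisitsByLastname(String lastname, int increment); // filter derived from the method name, update taken from the annotation
}
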
Thomas Darimont
28708ce24e Add support for modifying documents via repository method.
We now support findAndModify operations on derived query methods.

Closes: #2107
Original Pull Request: #284
2022-02-17 10:29:37 +01:00
Christoph Strobl
1c6c703640 Deprecate mapReduce.
Closes: #3945
2022-02-16 14:45:01 +01:00
blu10ph
67b1fe5fbc Avoid obtaining mapped sort multiple times for mapReduce.
Apply already mapped sort for map reduce instead of running the source document through the mapping layer again.

Closes: #3960
2022-02-16 14:42:57 +01:00
Christoph Strobl
4f6501f140 Update GeoJson section in reference documentation.
Mention the relation of Point/GeoJsonPoint x/y coordinates to longitude/latitude.

Original Pull Request: #3956
2022-02-16 14:42:19 +01:00
sangyongchoi
2a3f746cb6 Update GeoJsonPoint Javadoc.
Mention x -> longitude, y -> latitude relation.

Closes: #3956
2022-02-16 14:41:33 +01:00
Mark Paluch
90b8ba7246 Polishing.
Extract docker credentials into properties file.
Use tabs for indentation instead of spaces.

See #3949
2022-02-16 13:36:39 +01:00
Oliver Drotbohm
f12648af4c Adapt to API changes regarding object creation metadata. 2022-02-15 17:37:58 +01:00
Greg L. Turnquist
e812f89b47 Update CI properties.
See #3937
2022-02-15 09:00:21 -06:00
Christoph Strobl
f96d700d8d Favor Base64Utils over bson internal Base64 type.
org.bson.internal.Base64 is no longer available in MongoDB driver 4.5.0.

Related to: #3962
2022-02-14 11:09:26 +01:00
Christoph Strobl
32e7f2032d Upgrade to MongoDB driver 4.5.0
Closes: #3962
2022-02-14 11:09:14 +01:00
Mark Paluch
43ac1984ab Adapt repository to List-based interface variants.
Closes #3964
2022-02-14 11:07:27 +01:00
Greg L. Turnquist
28f262309c Use Harbor Proxy for containers.
Leverage internal infrastructure for pulling Docker container images. Reduces pressure on Docker Hub and reduces risk of hitting rate limits.

See #3954.
Related https://github.com/spring-projects/spring-data-build/issues/1630.
2022-02-07 10:59:50 -06:00
Mark Paluch
eefe6b3b21 Update CI properties.
See #3937
2022-02-07 09:32:15 +01:00
Mark Paluch
d0f2ca9efc Polishing.
Refine build script.

See #3949
2022-02-04 08:49:49 +01:00
Mark Paluch
e6c8ee037a Polishing.
Extract docker credentials into properties file.
Use tabs for indentation instead of spaces.

See #3949
2022-02-04 08:46:05 +01:00
Christoph Strobl
e63013deac Remove previously deprecated API.
This commit removes and moves off deprecated API.
Additionally, some blocks were deprecated due to changes in the MongoDB server API.

Resolves: #3952
2022-02-03 16:44:03 +01:00
Mark Paluch
2367379b6d Use Java 8 Stream as return type for Template operations returning a stream.
We now use Stream instead of CloseableIterator for easier stream creation.

Closes: #3944
Original Pull Request: #3946
2022-02-03 08:30:27 +01:00
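
Since the returned Stream is backed by a MongoDB cursor, it should still be closed by the caller; a usage sketch (template and Person are illustrative):

try (Stream<Person> people = template.stream(new Query(), Person.class)) {
	people.map(Person::getLastname).forEach(System.out::println); // cursor is released when the stream is closed
}
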
Mark Paluch
a1c483f2e1 After release cleanups.
See #3927
2022-02-03 08:08:50 +01:00
Mark Paluch
64b8b500ae Prepare next development iteration.
See #3927
2022-02-03 08:08:50 +01:00
Mark Paluch
2d15e37bc7 Release version 4.0 M1 (2022.0.0).
See #3927
2022-02-03 08:08:50 +01:00
Mark Paluch
54655b88c0 Prepare 4.0 M1 (2022.0.0).
See #3927
2022-02-03 08:08:49 +01:00
Jens Schauder
cd395e3324 Remove Eclipse Non-Javadoc comments.
Closes #3924
2022-02-03 08:08:00 +01:00
Mark Paluch
33bdbbe851 Polishing. 2022-02-03 08:08:00 +01:00
Christoph Strobl
9f1448df44 Drop support for RxJava 1 and 2.
Closes: #3839
2022-02-03 08:08:00 +01:00
Christoph Strobl
e3a4bada63 Move to Jakarta EE9
Closes: #3830
2022-02-03 08:08:00 +01:00
Christoph Strobl
dcdf3a2365 Prepare Spring Data MongoDB 4.x branch.
Upgrade to data-commons 3.0 and Java 17 (still source level 16 due to asm).
Remove support for threeten, joda-time.
Transition to PersistentEntitiesFactoryBean from data-commons.
Update build to MongoDB 4.4 and 5 with Java17. Remove Java8 setup.
Fix javadoc tooling error on cdi 1 vs. 2 version mix.
Disabled internal package cycle analysis as this requires transition to ArchUnit.
2022-02-03 08:07:59 +01:00
Mihail Cornescu
423e10b7bc Add IgnoreCase to repository queries documentation.
Update reference documentation and add the missing IgnoreCase keyword.

Closes: #3916
Original Pull Request: #3950
2022-02-02 13:13:10 +01:00
Greg L. Turnquist
f62feac421 Externalize build properties.
By reading a properties file from an external location, it is possible to inject a consistent set of properties from Spring Data Build. This also supports repeatable builds.

Closes #3949.
2022-01-31 16:33:45 -06:00
Christoph Strobl
bcbefa9264 Support aggregation operators $first and $last via expression method reference.
This commit registers the first(...) and last(...) methods for transformation via SpEL.
Also update reference and java documentation and add issue reference to tests.

Original Pull Request: #3866
2022-01-24 09:22:33 +01:00
Divya Srivastava
a2243536b2 Add support for $first & $last aggregation operators.
Closes: #3694
Original Pull Request: #3866
2022-01-24 09:22:27 +01:00
Christoph Strobl
a36e292158 Update aggregation StringOperators documentation.
Update reference and java documentation.
Add issue reference to tests.
Align method names and format code.

Original Pull Request: #3861
2022-01-21 10:42:19 +01:00
Divya Srivastava
494c22b192 Add support for $replaceOne & $replaceAll aggregation operators.
Closes: #3695
Original Pull Request: #3861
2022-01-21 10:40:54 +01:00
Christoph Strobl
030f12023c Support aggregation operators $acos and $acosh via expression method reference.
This commit registers the acos(...) and acosh(...) methods for transformation via SpEL.

Original Pull Request: #3858
2022-01-20 16:18:23 +01:00
Christoph Strobl
31f640a398 Polishing.
Update java doc and add issue references to tests.

Original Pull Request: #3858
2022-01-20 16:18:10 +01:00
Divya Srivastava
54f098a906 Add support for $acos & $acosh aggregation operators.
Resolves: #3707
Original Pull Request: #3858
2022-01-20 16:17:55 +01:00
Christoph Strobl
885d05965b Revert query modification in json parsing tests.
Add tests and move json string treatment into the ParameterBindingDocumentCodec.
Finally add issue references and format code.

Original Pull Request: #3907
2022-01-20 14:49:39 +01:00
rolag-it
a8a0fb5dba Fix expression defining entire query in annotated repository methods.
This fix enables defining an entire JSON-based query in Query and Aggregate annotations using a single parameter or SpEL Expression.

Resolves: #3871
Original Pull Request: #3907
2022-01-20 13:43:32 +01:00
Christoph Strobl
67edae8602 After release cleanups.
See #3936
2022-01-18 09:21:33 +01:00
Christoph Strobl
249e7746d5 Prepare next development iteration.
See #3936
2022-01-18 09:21:29 +01:00
Christoph Strobl
6a979088b5 Release version 3.4 M2 (2021.2.0).
See #3936
2022-01-18 09:09:36 +01:00
Christoph Strobl
fca629c117 Prepare 3.4 M2 (2021.2.0).
See #3936
2022-01-18 09:09:00 +01:00
Mark Paluch
dfbd1bded5 Polishing.
Run mvnw -version command in Artifactory build to display Java version.

See #3882
2022-01-17 13:55:53 +01:00
Christoph Strobl
f9e98669b9 After release cleanups.
See #3882
2022-01-14 11:08:02 +01:00
Christoph Strobl
96d4abdf24 Prepare next development iteration.
See #3882
2022-01-14 11:07:59 +01:00
Christoph Strobl
23442ef639 Release version 3.4 M1 (2021.2.0).
See #3882
2022-01-14 10:58:16 +01:00
Christoph Strobl
01b571dec9 Prepare 3.4 M1 (2021.2.0).
See #3882
2022-01-14 10:57:50 +01:00
Christoph Strobl
04ec49eb9e Avoid schema keyId uuid representation errors.
To avoid driver configuration specific UUID representation format errors (binary subtype 3 vs. subtype 4) we now directly convert the given key into its subtype 4 format.

Resolves: #3929
Original pull request: #3931.
2022-01-13 15:26:53 +01:00
Mark Paluch
d61cf8f57e Polishing.
Simplify assertions, reformat code.

See #3921
Original pull request: #3930.
2022-01-13 11:01:48 +01:00
Christoph Strobl
50a12121f2 Use index instead of iterator to map position and map keys for updates.
This commit removes usage of the iterator and replaces map key and positional parameter mappings with an index based token lookup.

Closes #3921
Original pull request: #3930.
2022-01-13 11:01:35 +01:00
Mark Paluch
998bd1f9bb Polishing.
Reformat code. Tweak documentation wording and callout syntax.

See #3914, see #3901
Original pull request: #3915.
2022-01-12 15:59:14 +01:00
Christoph Strobl
e0a57fa19b Avoid creating invalid index definitions for Map-like properties.
This commit makes sure to exclude Map-like structures from index inspection unless they are annotated with WildcardIndexed.

Closes #3914, closes #3901
Original pull request: #3915.
2022-01-12 15:57:19 +01:00
Mark Paluch
9c78802c47 Polishing.
Add author tags, extend copyright license years, simplify tests.

See #3892
2022-01-12 15:30:36 +01:00
rolag-it
a958ffb5c8 Fix pagination with reactive fluent Querydsl query definition.
The Pageable object was not passed on to the Query, so fetchPage erroneously retrieved the whole dataset as the Page content.

Closes #3892
2022-01-12 15:30:31 +01:00
Hett
c31872d979 Avoid double call of fetch method in DefaultReferenceResolver.
This commit fixes an issue where the fetch method is called twice when looking up single-value references.

Resolves: #3918
Original Pull Request: #3919
2022-01-11 08:57:02 +01:00
Mark Paluch
212509f56a Upgrade to MongoDB driver 4.4.1.
Closes #3926
2022-01-10 08:52:57 +01:00
Mark Paluch
b348bb6679 Adapt to Mockito changes.
Closes #3923
2022-01-04 14:41:47 +01:00
Christoph Strobl
8be5dd3909 Adapt to changes in data-commons.
See: spring-projects/spring-data-commons#2514
Related to: #3894
2021-12-14 15:31:45 +01:00
Christoph Strobl
f2c4370584 Fix meta field mapping when computing fields for projections.
Related to: #3894
2021-12-14 14:58:53 +01:00
Christoph Strobl
fdff74f7b5 Polishing.
Add tests for projections on DBRef properties; update Java and reference documentation.

Original Pull Request: #3894
2021-12-14 11:31:00 +01:00
Mark Paluch
0070b12f95 Add general support for direct projections.
Closes: #3894
2021-12-14 11:30:45 +01:00
Mark Paluch
bafc2bebf2 Polishing.
Tweak Javadoc.

See #3898
Original pull request: #3904.
2021-12-14 09:36:19 +01:00
Christoph Strobl
7146fb33e9 Fix field inclusion in aggregation project operation.
Closes #3898
Original pull request: #3904.
2021-12-14 09:36:14 +01:00
Mark Paluch
75999d9e36 Propagate Bean ClassLoader to MongoTypeMapper.
We now set the ClassLoader from the ApplicationContext to the type mapper to ensure the type mapper has access to entities. Previously, `SimpleTypeInformationMapper` used the contextual class loader and that failed in Fork/Join-Pool threads such as parallel streams as ForkJoinPool uses the system classloader. Running e.g. a packaged Boot application sets up an application ClassLoader that has access to packaged code while the system ClassLoader does not.

Also, consistently access the MongoTypeMapper through its getter.

Closes #3905
2021-12-09 11:33:47 +01:00
John Blum
7a64025669 Edit README.
Fix punctuation, change wording, and correct case in the build-from-source section.
2021-12-07 17:58:54 -08:00
John Blum
113106037a Polish for retaining the sort order when using text search sort by score.
Closes gh-3896.
2021-12-07 17:37:24 -08:00
Christoph Strobl
132834b1e6 Retain sort order when using text search sort by score.
We now make sure to capture the position to apply sort by score.
2021-12-07 17:37:10 -08:00
Christoph Strobl
36a4b7f727 Polishing - Update outdated links in Javadoc.
Original Pull Request #3883
2021-11-16 14:26:21 +01:00
Mark Paluch
a8432f5bf1 Polishing.
Remove unused fields.

Original Pull Request #3883
2021-11-16 14:26:02 +01:00
Christoph Strobl
e5a295bb8f Guard potentially expensive log message computations.
Original Pull Request #3883
2021-11-16 14:25:40 +01:00
Mark Paluch
f7cbd4264a Migrate off SLF4J to Spring JCL.
Closes #3881
Original Pull Request #3883
2021-11-16 14:25:18 +01:00
Jens Schauder
0c37a20a0b After release cleanups.
See #3865
2021-11-12 10:59:43 +01:00
Jens Schauder
230775f98e Prepare next development iteration.
See #3865
2021-11-12 10:59:41 +01:00
Jens Schauder
b28da2eed3 Release version 3.3 GA (2021.1.0).
See #3865
2021-11-12 10:49:36 +01:00
Jens Schauder
02de914993 Prepare 3.3 GA (2021.1.0).
See #3865
2021-11-12 10:49:15 +01:00
Christoph Strobl
5b498f809e Upgrade to MongoDB 4.4.0 Drivers.
Closes: #3875
2021-11-12 07:23:57 +01:00
Mark Paluch
f94a7ee742 After release cleanups.
See #3829
2021-10-18 13:55:41 +02:00
Mark Paluch
ab0ffab488 Prepare next development iteration.
See #3829
2021-10-18 13:55:38 +02:00
Mark Paluch
85b47d66f1 Release version 3.3 RC1 (2021.1.0).
See #3829
2021-10-18 13:48:13 +02:00
Mark Paluch
0d8fe46f3b Prepare 3.3 RC1 (2021.1.0).
See #3829
2021-10-18 13:47:46 +02:00
Christoph Strobl
0bc78f99dd Upgrade to MongoDB 4.4.0-beta1 Drivers.
Closes: #3860
2021-10-14 16:21:32 +02:00
Mark Paluch
e4f450f667 Upgrade to Maven Wrapper 3.8.3.
See #3859
2021-10-11 14:30:24 +02:00
Mark Paluch
49cd44295c Add support for fluent Querydsl and Query by Example query definition.
We now support the functional fluent query definition API for imperative and reactive usage with Querydsl and Query by Example.

Page<PersonProjection> first = repository.findBy(Example.of(probe),
		it -> it.as(PersonProjection.class).project("firstname").page(PageRequest.of(0, 1, Sort.by("firstname"))));

Closes #3757
Original pull request: #3788.
2021-10-08 11:37:26 +02:00
Mark Paluch
767d97a831 Polishing.
Remove code duplications. Reuse target type computation for enum types. Refine method names.

See #3766
Closes #3837
2021-10-07 15:57:08 +02:00
Christoph Strobl
bbe8410979 Render items for collection-like properties when deriving $jsonSchema.
Closes #3766
Original pull request: #3837.
2021-10-07 15:57:08 +02:00
Mark Paluch
c0a4bdb548 Polishing.
Reformat code.

See #3853
Original pull request: #3856.
2021-10-07 15:23:44 +02:00
Christoph Strobl
673a81af0e Fix query/update document reference computation for non-id properties.
We now consider using the target keyword when computing document references. This fixes an issue where query/update statements had not been rendered correctly for references like { 'name' : ?#{#target} }.

Closes #3853
Original pull request: #3856.
2021-10-07 15:23:29 +02:00
Christoph Strobl
977032620e Upgrade to MongoDB 4.3.3 Drivers.
Closes: #3855
2021-10-06 10:51:12 +02:00
Mark Paluch
c7263e5b11 Polishing.
Reformat code.

See #3847
Original pull request: #3848.
2021-10-05 10:08:33 +02:00
Christoph Strobl
eed9b2470a Fix id conversion when storing document reference.
This commit makes sure that the string to ObjectId conversion when storing document references follows the general conversion rules.

Closes #3847
Original pull request: #3848.
2021-10-05 10:08:19 +02:00
Mark Paluch
c350be1f52 Polishing.
Reformat code.

See #3842
Original pull request: #3844.
2021-10-05 09:42:25 +02:00
Christoph Strobl
65b058ffd9 Fix SpEL evaluation in document reference lookup.
Closes #3842
Original pull request: #3844.
2021-10-05 09:42:19 +02:00
Christoph Strobl
4a5789d67e DocumentReference should consider Reference annotations.
Closes #3851
Original pull request: #3852.
2021-10-04 14:30:40 +02:00
Christoph Strobl
7b05cfad94 Update readme.
Shorten MongoDB setup section and add missing anchors.

Original Pull Request: #3833
2021-10-04 10:43:58 +02:00
John Blum
a16a9fe1fe Edit README to include complete steps for building from source.
Closes: #3833
2021-10-04 10:43:04 +02:00
Christoph Strobl
2f98a6656b Fix javadoc errors and warnings
Closes: #3835
2021-09-27 11:13:08 +02:00
Christoph Strobl
9e2f6055a3 Refine CI job triggers.
See #3696
Original pull request: #3753.
2021-09-21 15:20:36 +02:00
Mark Paluch
7f58538292 Use HTTPS in Dockerfiles for package download.
See #3696
Original pull request: #3753.
2021-09-21 15:20:31 +02:00
Christoph Strobl
2f208d712c Update CI to cover MongoDB Server 5.0.
MongoDB has alpha releases in a slightly different location on their distribution server. And they use different keys for signing these alpha releases compared to the overall package listing.

Closes #3696
Original pull request: #3753.
2021-09-21 15:20:26 +02:00
Christoph Strobl
63d9875576 Update test for MongoDB Server 5.0.
Update assertions for changed return types, add a bit of think time and disable tests for no longer supported features.

See #3696
Original pull request: #3753.
2021-09-21 15:20:01 +02:00
Mark Paluch
b7ffff4769 After release cleanups.
See #3771
2021-09-17 09:52:21 +02:00
Mark Paluch
715ae26f3c Prepare next development iteration.
See #3771
2021-09-17 09:52:18 +02:00
Mark Paluch
00350edd32 Release version 3.3 M3 (2021.1.0).
See #3771
2021-09-17 09:44:56 +02:00
Mark Paluch
38e1d0d92d Prepare 3.3 M3 (2021.1.0).
See #3771
2021-09-17 09:44:34 +02:00
Christoph Strobl
8f00ffd291 Change visibility of PersistentEntitiesFactoryBean.
Closes: #3825
2021-09-15 15:30:30 +02:00
Mark Paluch
0af8d6839e Polishing.
Reformat code, fix ticket references in tests.

See #3820
Original pull request: #3821.
2021-09-14 09:12:21 +02:00
Christoph Strobl
9b02897db5 Add configuration support for MongoDB ServerApiVersion.
Introduce FactoryBean and required options to set the ServerAPI.
Update namespace xsd and parsing.

Closes: #3820
Original pull request: #3821.
2021-09-14 09:12:09 +02:00
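
For Java-based configuration, the Server API can likely be set through the driver's client settings as well; a sketch assuming AbstractMongoClientConfiguration#configureClientSettings and the driver's ServerApi builder.

@Configuration
class MongoConfig extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "example"; // illustrative database name
	}

	@Override
	protected void configureClientSettings(MongoClientSettings.Builder builder) {
		builder.serverApi(ServerApi.builder().version(ServerApiVersion.V1).build()); // pin the server API version
	}
}
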
Christoph Strobl
99203b397a Add support for deriving json schema for encrypted properties.
This commit introduces support for creating a MongoJsonSchema containing encrypted fields for a given type based on mapping metadata.
Using the Encrypted annotation allows deriving the required encryptMetadata and encrypt properties within a given (mapping) context.

@Document
@Encrypted(keyId = "...")
static class Patient {

    // ...

    @Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic")
    private Integer ssn;

}

MongoJsonSchemaCreator schemaCreator = MongoJsonSchemaCreator.create(mappingContext);
MongoJsonSchema patientSchema = schemaCreator
    .filter(MongoJsonSchemaCreator.encryptedOnly())
    .createSchemaFor(Patient.class);

Closes: #3800
Original pull request: #3801.
2021-09-14 08:56:53 +02:00
Christoph Strobl
eda1c79315 Move and add tests to UpdateMapper.
Also update author information.

Original Pull Request: #3815
2021-09-13 14:26:46 +02:00
divyajnu08
e7150f525e Fix update mapping using nested integer keys on map structures.
Closes: #3775
Original Pull Request: #3815
2021-09-13 14:26:12 +02:00
Mark Paluch
7d6b5ae5fe Upgrade to Maven Wrapper 3.8.2.
See #3818
2021-09-10 15:37:59 +02:00
Christoph Strobl
d70e459ffe Upgrade to MongoDB Java Drivers 4.3.2
Closes: #3816
2021-09-10 10:48:22 +02:00
Mark Paluch
a26e780957 Reduce allocations in query and update mapping.
Introduce EmptyDocument and utility methods in BsonUtils. Avoid entrySet and iterator creation for document iterations/inspections.

Relates to: #3760
Original Pull Request: #3809
2021-09-09 08:00:07 +02:00
Mark Paluch
8fb0e1326b Introduce SessionSynchronization.NEVER to disable transactional participation.
SessionSynchronization.NEVER bypasses all transactional integration in cases where applications do not want to make use of transactions so that transaction inspection overhead is avoided.

Closes: #3760
Original Pull Request: #3809
2021-09-09 07:58:34 +02:00
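
A minimal sketch of opting out on an existing MongoTemplate, assuming the setSessionSynchronization setter:

// bypass transaction/session inspection entirely for this template
template.setSessionSynchronization(SessionSynchronization.NEVER);
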
Christoph Strobl
9014f770d8 Fix slice argument in query fields projection.
We now use a Collection instead of an Array to pass on $slice projection values for offset and limit.

Closes: #3811
Original pull request: #3812.
2021-09-08 14:46:31 +02:00
Christoph Strobl
f128e6df15 Fix @DocumentReference resolution for properties used in constructor.
This commit fixes an issue that prevented referenced entities from being used as constructor arguments.

Closes: #3806
Original pull request: #3810.
2021-09-08 14:35:43 +02:00
Mark Paluch
270456ed81 Polishing.
Extract query that yields no hits into constant. Guard Map-typed reference properties against empty $or.

See #3805
Original pull request: #3807.
2021-09-08 14:18:35 +02:00
Christoph Strobl
4e960a9682 Fix document reference on empty reference arrays.
This commit fixes an issue caused by empty reference arrays.

Closes #3805
Original pull request: #3807.
2021-09-08 14:18:30 +02:00
Mark Paluch
061c28f84a Polishing.
Add ticket reference to tests.

See #3803
2021-09-08 13:59:11 +02:00
Oliver Drotbohm
cba7eaba4c Polishing.
Formatting and indentation in parent project's pom.xml.

See #3803
2021-09-08 13:59:06 +02:00
Oliver Drotbohm
ada7e199a4 Properly detect all supported identifier annotations as explicitly annotated.
We now simply delegate to AnnotationBasedPersistentProperty.isIdProperty() for the detection of annotated identifiers. The previous, manual identifier check was preventing additional identifier annotations, supported by ABP, to be considered, too.

Fixes #3803.
2021-09-08 13:56:49 +02:00
Mark Paluch
977e5e4c5c Polishing.
Tweak reference documentation wording. Extract self/target source dereferencing into utility methods.

See: #3798
Original pull request: #3802.
2021-09-08 13:53:03 +02:00
Christoph Strobl
c8307d5a39 Allow one-to-many style lookups via @DocumentReference.
This commit adds support for relational-style one-to-many references using a combination of @ReadOnlyProperty and @DocumentReference.
It allows linking types without explicitly storing the linking values within the document itself.

@Document
class Publisher {

  @Id
  ObjectId id;
  // ...

  @ReadOnlyProperty
  @DocumentReference(lookup="{'publisherId':?#{#self._id} }")
  List<Book> books;
}

Closes: #3798
Original pull request: #3802.
2021-09-08 13:52:46 +02:00
Mark Paluch
dcf184888e Polishing.
Add since and author tags. Update reference docs. Fix format of ticket references in tests.

See #3708
Original pull request: #3796.
2021-09-07 09:56:47 +02:00
divyajnu08
59d0042d13 Add support for $asin and $asinh aggregation operators.
Closes #3708
Original pull request: #3796.
2021-09-07 09:56:30 +02:00
Mark Paluch
8af904b81f Polishing.
Add author and since tags. Tweak Javadoc format.

See #3709
Original pull request: #3794.
2021-09-06 15:50:58 +02:00
divya srivastava
ffceed8da9 Add support for $atan, $atan2 and $atanh aggregation operators.
Closes #3709
Original pull request: #3794.
2021-09-06 15:50:44 +02:00
Mark Paluch
34d66a276a Polishing.
Add license headers. Update Javadoc, author, and since tags. Add tests. Add toCriteriaDefinition method.

See #3790
2021-09-06 15:08:18 +02:00
divyajnu08
e71ec874ab Add support for $expr operator.
Also, allow construction of $match with an AggregationExpression.

Closes #3790
2021-09-06 15:07:06 +02:00
Mark Paluch
f24e8e5361 Avoid nested Document conversion to primitive types for fields with an explicit write target.
We now no longer attempt to convert query Documents into primitive types to avoid e.g. Document to String conversion.

Closes: #3783
Original Pull Request: #3797
2021-09-02 09:50:37 +02:00
Christoph Strobl
bf86f39b2d Fix id field target type conversion for document references.
This commit fixes an issue where a defined custom target type conversion for the id field was not properly considered when writing a document reference. Previously, e.g. a String was not correctly converted into an ObjectId, causing lookup queries to return empty results.
Converting the id property value on write solves the issue.
Includes a minor polish in the mapping layer, centralizing pointer creation within the DocumentPointerFactory.

Closes: #3782
Original pull request: #3785.
2021-08-27 09:40:11 +02:00
Mark Paluch
f662d7ca0d Polishing.
Tweak Javadoc. Add since tag, reformat code. Simplify tests. Move documentation bits into the right place.

See #3726.
Original pull request: #3765.
2021-08-27 09:35:33 +02:00
James McNee
62eb719b1e Add support for $sampleRate criteria.
Closes #3726
Original pull request: #3765.
2021-08-27 09:35:12 +02:00
Mark Paluch
69b582823a Polishing.
Add support for Pattern. Extract Regex flags translation from Criteria into RegexFlags utility class. Add since and author tags. Simplify tests. Update reference documentation.

See #3725.
Original pull request: #3781.
2021-08-27 08:47:20 +02:00
divya srivastava
297ef98239 Add support for $regexFind, $regexFindAll, and $regexMatch aggregation operators.
Closes #3725
Original pull request: #3781.
2021-08-26 12:21:55 +02:00
Mark Paluch
f71f107445 Polishing.
Reorder methods. Add since tag. Simplify assertions. Use diamond syntax.

See: #3776
Original pull request: #3777.
2021-08-25 14:58:00 +02:00
Ivan Volzhev
36e2d80d71 Relax requirement for GeoJsonMultiPoint construction allowing creation using a single point.
Only one point is required per the GeoJSON RFC, and MongoDB works just fine with a single point as well.

Closes #3776
Original pull request: #3777.
2021-08-25 14:57:08 +02:00
Mark Paluch
467536cb34 Polishing.
Update since version. Reformat code.

See: #3761.
2021-08-25 14:33:36 +02:00
sangyongchoi
302c8031f9 Add Criteria infix functions for maxDistance and minDistance.
Closes: #3761
2021-08-25 14:33:10 +02:00
Mark Paluch
7c6e951c7c Polishing.
Add author tags, tweak Javadoc style. Simplify tests. Document operator.

See #3724
Original pull request: #3759.
2021-08-25 11:13:02 +02:00
Mushtaq Ahmed
92cc2a582a Add support for $rand aggregation operator.
Closes #3724
Original pull request: #3759
2021-08-25 11:08:16 +02:00
Mark Paluch
24171b3ae2 Polishing.
Introduce factory methods to convert TimeZone/ZoneId/ZoneOffset into Mongo Timezone. Introduce TemporalUnit abstraction and converters to convert ChronoUnit and TimeUnit into TemporalUnit for date operators accepting a unit parameter.

See #3713
Original pull request: #3748.
2021-08-25 11:01:34 +02:00
Christoph Strobl
456c1ad26a Add shortcut for date aggregation operators working with timezone.
See: #3713
Original pull request: #3748.
2021-08-25 11:01:27 +02:00
Christoph Strobl
fc41793d5d Add support for $dateDiff aggregation operator.
Closes: #3713
Original pull request: #3748.
2021-08-25 11:01:17 +02:00
Christoph Strobl
afef243634 Add support for $dateAdd aggregation operator.
Closes: #3713
Original pull request: #3748.
2021-08-25 11:01:00 +02:00
Mark Paluch
869b88702d Polishing.
Fix typo in reference docs.

See #3758
2021-08-25 10:15:13 +02:00
Ryan Gibb
aca403c112 Fix a typo in MongoConverter javadoc.
Original pull request: #3758.
2021-08-25 10:14:20 +02:00
Mark Paluch
df0372eee1 Polishing.
Rename AngularDimension to AngularUnit. Tweak Javadoc. Simplify tests. Update reference docs.

See: #3710, #3714, #3728, #3730
Original pull request: #3755.
2021-08-24 16:18:17 +02:00
Christoph Strobl
c4c6267d91 Add support for $cos and $cosh aggregation operators.
Closes: #3710
Original pull request: #3755.
2021-08-24 16:18:17 +02:00
Christoph Strobl
73d5886aae Add support for $tan and $tanh aggregation operators.
Closes: #3730
Original pull request: #3755.
2021-08-24 16:18:17 +02:00
Christoph Strobl
0db47169cf Add support for $sin and $sinh aggregation operators.
Closes: #3728
Original pull request: #3755.
2021-08-24 16:18:14 +02:00
Christoph Strobl
ec16b873b7 Add support for $degreesToRadians aggregation operator.
Closes: #3714
Original pull request: #3755.
2021-08-24 16:16:01 +02:00
Mark Paluch
2a3a4cf030 Polishing.
Fix method order from earlier merges. Add missing Javadoc. Simplify tests. Update documentation.

See #3721
Original pull request: #3746.
2021-08-24 15:01:54 +02:00
Christoph Strobl
df2b2a2f68 Add support for $integral aggregation operator.
Closes: #3721
Original pull request: #3746.
2021-08-24 15:01:49 +02:00
Mark Paluch
fd0a402c99 Polishing.
See #3720
Original pull request: #3745.
2021-08-24 14:34:43 +02:00
Christoph Strobl
6bd0f758fe Extend support for $ifNull to cover multiple conditions.
Closes: #3720
Original pull request: #3745.
2021-08-24 14:34:43 +02:00
Mark Paluch
10c0203605 Polishing.
Accept window units in addition to plain strings. Document operator.

See: #3716
Original pull request: #3742.
2021-08-24 14:30:52 +02:00
Christoph Strobl
82b33331fc Add support for $derivative aggregation operator.
Closes: #3716
Original pull request: #3742.
2021-08-24 14:30:52 +02:00
Mark Paluch
75b5a548b6 Polishing.
Fix asterisk callouts.

See #3786
2021-08-24 11:23:42 +02:00
Mark Paluch
0c481feb72 Extract Aggregation Framework and GridFS docs in own source files.
Closes #3786
2021-08-24 11:09:06 +02:00
Mark Paluch
c8a791d367 Polishing.
Make fields final where possible. Update javadoc. Simplify assertions. Update reference docs.

See: #3715, See #3717, See #3727
Original pull request: #3741.
2021-08-24 11:07:40 +02:00
Christoph Strobl
510028a834 Add support for $shift aggregation Operator.
Closes: #3727
Original pull request: #3741.
2021-08-24 11:07:31 +02:00
Christoph Strobl
1a86761e2e Add support for $documentNumber aggregation operator.
Closes: #3717
Original pull request: #3741.
2021-08-24 11:07:22 +02:00
Christoph Strobl
30da62181f Add support for $rank and $denseRank aggregation operators.
Closes: #3715
Original pull request: #3741.
2021-08-24 11:06:59 +02:00
Christoph Strobl
a977b8a790 Change visibility of Reactive/MongoRepositoryFactoryBean setters.
Setters of the FactoryBean should be public.

Closes: #3779
Original pull request: #3780.
2021-08-24 10:26:44 +02:00
Christoph Strobl
f3e067f59f Add support for $expMovingAvg aggregation operator.
The SpEL support for this one is missing due to the differing argument map (N, alpha).

Closes: #3718
Original pull request: #3744.
2021-08-23 14:44:14 +02:00
Christoph Strobl
dbfd4e5c62 Polishing.
Reformat code.

See #3712
Original pull request: #3740.
2021-08-23 14:29:16 +02:00
Christoph Strobl
c574e5cf8a Add support for $covariancePop and $covarianceSamp aggregation expressions.
Closes: #3712
Original pull request: #3740.
2021-08-23 14:25:41 +02:00
Mark Paluch
f9f4c4621b Polishing.
Update javadoc and add assertions.

See #3711
Original pull request: #3739.
2021-08-23 11:21:35 +02:00
Christoph Strobl
23254c10dc Add support for $setWindowFields aggregation stage.
Add a SetWindowFieldsOperation to the aggregation framework.

The builder API allows fluent declaration of the aggregation stage as shown in the sample below.

SetWindowFieldsOperation.builder()
	.partitionByField("state")
	.sortBy(Sort.by(Direction.ASC, "date"))
	.output(AccumulatorOperators.valueOf("qty").sum())
	.within(Windows.documents().fromUnbounded().toCurrent().build())
	.as("cumulativeQuantityForState")
	.build();

Closes #3711
Original pull request: #3739.
2021-08-23 11:21:35 +02:00
Christoph Strobl
255491c446 Upgrade to MongoDB 4.3.1 Drivers.
Closes: #3778
2021-08-23 10:49:22 +02:00
Christoph Strobl
1d943d62a3 Fix build on Java 16.
Make sure to use an initialized MappingContext.

Closes: #3749
Original pull request: #3752.
2021-08-23 10:29:35 +02:00
Jens Schauder
7538b1a1a5 After release cleanups.
See #3736
2021-08-12 15:16:23 +02:00
Jens Schauder
828c074167 Prepare next development iteration.
See #3736
2021-08-12 15:16:21 +02:00
Jens Schauder
87ab1ac48c Release version 3.3 M2 (2021.1.0).
See #3736
2021-08-12 15:03:17 +02:00
Jens Schauder
454afd9877 Prepare 3.3 M2 (2021.1.0).
See #3736
2021-08-12 15:02:56 +02:00
Mark Paluch
45971b212c Polishing.
Move off deprecated classes. Add unpaged testcase for query by example.

Original Pull Request: #3754
2021-07-26 15:16:05 +02:00
Mark Paluch
68370c16fb Run unpaged query using Pageable.unpaged() through QuerydslMongoPredicateExecutor.findAll(…).
We now correctly consider unpaged queries if the Pageable is unpaged.

Closes: #3751
Original Pull Request: #3754
2021-07-26 15:15:37 +02:00
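
A usage sketch of the now correctly handled case; QPerson is the illustrative Querydsl query type of a Person document.

// an unpaged Pageable yields the whole result set as a single Page
Page<Person> all = repository.findAll(QPerson.person.lastname.eq("Matthews"), Pageable.unpaged());
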
Christoph Strobl
d2c9b47366 Fix issues related to Querydsl 5.0 upgrade.
Remove overridden methods no longer available in public api.

Closes: #3738
2021-07-22 09:03:10 +02:00
Jens Schauder
4d7ee0e741 After release cleanups.
See #3631
2021-07-16 14:19:58 +02:00
Jens Schauder
e7f3a2436d Prepare next development iteration.
See #3631
2021-07-16 14:19:56 +02:00
Jens Schauder
4ef1ff6aff Release version 3.3 M1 (2021.1.0).
See #3631
2021-07-16 14:09:25 +02:00
Jens Schauder
b6ad32d7d4 Prepare 3.3 M1 (2021.1.0).
See #3631
2021-07-16 14:08:59 +02:00
Jens Schauder
e875f9ea33 Updated changelog.
See #3631
2021-07-16 14:08:43 +02:00
Jens Schauder
9db9d16cf8 Updated changelog.
See #3681
2021-07-16 10:48:16 +02:00
Mark Paluch
f00991dc29 Polishing.
Rename Granularities/Granularity to Granularity and GranularityDefinition to provide more natural wording when using predefined granularities.

Validate presence of referenced properties through the TimeSeries annotation.

Tweak Javadoc, reformat code, add unit tests.

See #3731
Original pull request: #3732.
2021-07-16 09:44:43 +02:00
Christoph Strobl
bacbd7133e Add support for creating Time Series collection.
Introduce time series to CollectionOptions and add dedicated TimeSeries annotation to derive values from.

Closes #3731
Original pull request: #3732.
2021-07-16 09:44:35 +02:00
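
A minimal sketch of the annotation-driven variant; the attribute names are assumed from this description and the type is illustrative.

@TimeSeries(collection = "measurements", timeField = "timestamp")
class Measurement {

	Instant timestamp; // the required time field of the time series collection
	double value;
}

// creating the collection picks up the time series options from the annotation
template.createCollection(Measurement.class);
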
Mark Paluch
f38f6d67ab Polishing.
Support DBObject and Map as sources for entity materialization and map conversion.

See #3702
Original pull request: #3704.
2021-07-15 10:00:33 +02:00
Christoph Strobl
3f27e8e152 Fix raw document conversion in Collection like properties.
Along the same lines, make sure to convert map-like structures correctly if they do not come in as a Document, e.g. because they got converted to a plain Map in a post-load/pre-convert event.

Closes #3702
Original pull request: #3704.
2021-07-15 10:00:31 +02:00
Christoph Strobl
23177fef0c Custom Converter should also be applicable for simple types.
This commit fixes a regression that prevented custom converters from being applied to types considered store-native.

Original pull request: #3703.
Fixes #3670
2021-07-15 08:54:31 +02:00
Mark Paluch
f3b90c2b8a Polishing.
Reformat code. Tweak javadoc. Reject wildcard projection usage on properties with a MappingException. Omit wildcard projections when declared on document types that are used as subdocument.

See #3225
Original pull request: #3671.
2021-07-14 15:16:00 +02:00
Christoph Strobl
d57c5a9529 Add support for Wildcard Index.
Add the WildcardIndexed annotation and the programmatic WildcardIndex.

Closes #3225
Original pull request: #3671.
2021-07-14 15:16:00 +02:00
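
A minimal sketch of the annotation variant; the type and field names are illustrative.

@Document
@WildcardIndexed
class Product {

	String name;
	Map<String, Object> attributes; // arbitrary keys are covered by the $** wildcard index
}
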
Mark Paluch
986ea39f90 Upgrade to Querydsl 5.0.
Move off our own Querydsl copies, as Querydsl 5.0 ships MongoDB Document API support.
Remove package-private duplicates of Querydsl code.
Introduce SpringDataMongodbQuerySupport to provide a well-formatted toString representation of the actual query.

Original Pull Request: #3674
2021-07-14 10:15:12 +02:00
Christoph Strobl
5bd7ff1413 Upgrade to MongoDB 4.3.0 Drivers.
Closes: #3706
2021-07-14 07:44:00 +02:00
Mark Paluch
93b9f23b07 Polishing.
Fix proxy comparison.

See #3705
2021-07-13 08:42:24 +02:00
Mark Paluch
42ab7d2f63 Adapt to changes in AssertJ 3.20.
Closes #3705
2021-07-13 08:27:51 +02:00
Christoph Strobl
a6a2f0bde9 Upgrade to MongoDB 4.3.0-beta4 Drivers.
Closes: #3693
2021-07-12 14:56:44 +02:00
Oliver Drotbohm
7d0b070d1f Adapt to API consolidation in Spring Data Commons' PersistentProperty.
Closes: #3700
Original Pull Request: #3701
Related to: spring-projects/spring-data-commons#2408
2021-07-12 08:09:26 +02:00
Christoph Strobl
81bc3c599b Disable tests on Java 16 that require class-based proxies.
Original pull request: #3687.
Resolves #3656
2021-07-08 12:08:54 +02:00
Christoph Strobl
403f0019d5 Fix Optional handling in query creation and result processing.
Original pull request: #3687.
Resolves #3656
2021-07-08 12:08:50 +02:00
Christoph Strobl
4f65bb0810 Fix mapping context setup to include simple type holder.
Original pull request: #3687.
Resolves #3656
2021-07-08 12:08:32 +02:00
Christoph Strobl
ef29e69a87 Polishing.
Simplify KeyMapper current property/index setup.

Original Pull Request: #3689
2021-07-06 11:46:35 +02:00
David Julia
5cffb3c07c Fix Regression in generating queries with nested maps with numeric keys.
Maps with numeric keys work if there is only one such map with an integer key in a query, but the mapping fails when a given query contains multiple maps with numeric keys.

Take the following example for a map called outer with numeric keys holding reference to another object with a map called inner with numeric keys: Updates that are meant to generate {"$set": {"outerMap.1234.inner.5678": "hello"}} are instead generating {"$set": {"outerMap.1234.inner.inner": "hello"}}, repeating the later map property name instead of using the integer key value.

This commit adds unit tests both for the UpdateMapper and QueryMapper, which check multiple consecutive maps with numeric keys, and adds a fix in the KeyMapper. Because we cannot easily change the path parsing to somehow parse path parts corresponding to map keys differently, we address the issue in the KeyMapper. We keep track of the partial path corresponding to the current property and use it to skip adding the duplicated property name for the map to the query, and instead add the key.

This is a bit redundant in that we now have both an iterator and an index-based way of accessing the path parts, but it gets the tests passing and fixes the issue without making a large change to the current approach.

Fixes: #3688
Original Pull Request: #3689
2021-07-06 11:36:11 +02:00
Christoph Strobl
61d3a0bd1f Fix NPE when reading/mapping null value inside collection.
Closes: #3686
2021-07-01 10:50:57 +02:00
Christoph Strobl
82d67c1dbb Favor ObjectUtils over Objects for equals/hashCode.
Original Pull Request: #3684
2021-06-24 13:32:48 +02:00
Gatto
85a30ec915 Add equals and hashCode to UnwrappedMongoPersistentProperty.
Fixes #3683
Original Pull Request: #3684
2021-06-24 13:32:40 +02:00
Mark Paluch
c70c29b2c7 Updated changelog.
See #3650
2021-06-22 16:07:25 +02:00
Mark Paluch
2a5ae0da37 Updated changelog.
See #3649
2021-06-22 15:29:50 +02:00
Mark Paluch
826015e9c1 Update reference docs to use correct MongoClient.
Closes #3666
2021-06-22 14:37:05 +02:00
larsw
9dda0a2f93 Add closing quote to GeoJson javadoc.
Closes #3677
2021-06-21 13:58:07 +02:00
Christoph Strobl
7dfe460433 Fix field projection value conversion.
The field projection conversion should actually only map field names and avoid value conversion. In the MongoId case, an inclusion parameter (1) was unintentionally converted into its String representation, which causes trouble on MongoDB 4.4 servers.

Fixes: #3668
Original pull request: #3678.
2021-06-21 13:45:46 +02:00
Christoph Strobl
73a0f04933 Polishing.
Fix typo in class name and make sure MongoTestTemplate uses the configured simple types.
Remove superfluous junit extension.

See: #3659
Original pull request: #3661.
2021-06-18 14:12:48 +02:00
Christoph Strobl
a1c165921d Fix query mapper path resolution for types considered simple ones.
spring-projects/spring-data-commons#2293 changed how PersistentProperty paths get resolved and now considers potentially registered converters for those, which made the path resolution fail during the query mapping process.
This commit makes sure to capture the corresponding exception and continue with the given user input.

Fixes: #3659
Original pull request: #3661.
2021-06-18 14:12:39 +02:00
Christoph Strobl
3872b379cd Fix $or / $nor keyword mapping in query mapper.
This commit fixes an issue with the pattern used for detecting $or / $nor which also matched other keywords like $floor.

Closes: #3635
Original pull request: #3637.
2021-06-18 13:48:29 +02:00
Mark Paluch
98fe043b95 Directly import JSR305 jar.
Closes #3672
2021-06-17 09:41:46 +02:00
Mark Paluch
c217618d9d Polishing.
Reorder methods and types. Rename MongoPersistentProperty.isOmitNullProperty to writeNullValues. Adapt caching MongoPersistentProperty and add tests.

Tweak Javadoc wording, add author and since tags.

See #3407
Original pull request: #3646.
2021-06-14 11:53:24 +02:00
Divya Srivastava
b1020d19ba Add an option to @Field annotation to include/exclude null values on write.
Properties can be annotated with `@Field(write=…)` to control whether a property with a null value should be included or omitted (default) during conversion in the target Document.

Closes #3407
Original pull request: #3646.
2021-06-14 11:53:09 +02:00
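
A sketch of the new attribute; the Write constants are assumed from this description and the type is illustrative.

class Person {

	@Field(write = Field.Write.NON_NULL) // default: omit the field when the value is null
	String middleName;

	@Field(write = Field.Write.ALWAYS) // write an explicit null into the target Document
	String nickname;
}
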
Mark Paluch
a481636429 Polishing.
Add nullability annotation. Return early on null value conversion.

See #3633
Original pull request: #3643.
2021-06-09 14:13:42 +02:00
Christoph Strobl
efa9a2d408 Add Criteria.isNullValue() as alternative to Criteria.is(null).
See #3633
Original pull request: #3643.
2021-06-09 14:13:42 +02:00
Christoph Strobl
149a703ecc Fix NPE in QueryMapper when trying to apply target type on null value.
Closes #3633
Original pull request: #3643.
2021-06-09 14:13:42 +02:00
Mark Paluch
2b715c54d3 Polishing.
Reformat code.

See #3660.
Original pull request: #3662.
2021-06-09 11:33:12 +02:00
Christoph Strobl
ece261aadb Fix conversion for types having a converter registered.
Fixes: #3660
Original pull request: #3662.
2021-06-09 11:33:00 +02:00
Mark Paluch
dae0ac3b4d Remove duplicate LazyLoadingInterceptor code by reusing LazyLoadingProxyFactory.
Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:01 +02:00
Mark Paluch
5ab75eb65a Polishing.
Reduce dependencies in tests by using NoOpDbRefResolver.
Add since tags.

Tweak documentation. Extract entity references into own documentation fragment.

Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:01 +02:00
Christoph Strobl
e96ef8e18f Avoid capturing lambdas, update javadoc and add tests.
Also allow direct usage of @Reference from Spring Data Commons to define associations.

Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:00 +02:00
Mark Paluch
82af678cab Polishing
Rename ReferenceReader to ReferenceLookupDelegate.
Rename LazyLoadingProxyGenerator to LazyLoadingProxyFactory.
Rename DefaultReferenceLoader to MongoDatabaseFactoryReferenceLoader.

Reduce scope of LookupFunction and move it to ReferenceLookupDelegate.

Extract some checks into methods to reflect the underlying concepts. Simplify code, convert variables to constants where possible.

Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:00 +02:00
Christoph Strobl
6ed274bd9b Update entity linking support to derive document pointer from lookup query.
Simplify usage by computing the pointer from the lookup.
Update the reference documentation, add JavaDoc and refine API.

Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:00 +02:00
Mark Paluch
48ac7e75ba First pass of review polishing.
Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:00 +02:00
Christoph Strobl
a51c96298f Enhance support for linking entities.
Add initial support for an alternative to the existing DBRef scenario.
The enhancement allows storing and retrieving linked entities via their id or a customizable lookup query.

Original pull request: #3647.
Closes #3602.
2021-05-21 14:20:00 +02:00
Mark Paluch
f1354c4508 Revise DocumentCallback nullability constraints.
DocumentCallback is now generally non-nullable for both the input Document and the returned result, expecting EntityReader to always return a non-null object.

Also, use try-with-resources where applicable.

Closes #3648
2021-05-18 15:11:06 +02:00
Mark Paluch
ff7588f648 Updated changelog.
See #3629
2021-05-14 12:36:34 +02:00
Mark Paluch
124036fe36 Updated changelog.
See #3628
2021-05-14 12:06:37 +02:00
Greg L. Turnquist
80c5b536df Polishing. 2021-05-13 16:15:04 -05:00
Greg L. Turnquist
2ee33b1444 Polishing. 2021-05-13 15:56:39 -05:00
Greg L. Turnquist
eec6cea507 Update CI to JDK 16.
See #3603.
2021-05-13 15:52:30 -05:00
Mark Paluch
90d03d92d8 Polishing.
Let appendLimitAndOffsetIfPresent accept unary operators for adjusting limit/offset values instead of appendModifiedLimitAndOffsetIfPresent. Apply simple type extraction for Slice. Add support for aggregation result streaming.

Extend tests, add author tags, update docs.

See #3543.
Original pull request: #3645.
2021-05-11 11:50:40 +02:00
divya_jnu08
9a48e32565 Aggregation query method should be able to return Slice and Stream.
Previously, aggregation query methods could not return Slice or Stream.

interface PersonRepository extends CrudRepository<Person, String> {

  @Aggregation("{ $group: { _id : $lastname, names : { $addToSet : ?0 } } }")
  Slice<PersonAggregate> groupByLastnameAnd(String property, Pageable page);

  @Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $firstname } } }")
  Stream<PersonAggregate> groupByLastnameAndFirstnamesAsStream();
}

Closes #3543.
Original pull request: #3645.
2021-05-11 11:50:10 +02:00
Mark Paluch
ede6927b65 Introduce template method for easier customization of fragments.
Closes #3638.
2021-04-27 10:45:43 +02:00
Greg L. Turnquist
2edc29f758 Authenticate with artifactory.
See #3616.
2021-04-22 14:59:20 -05:00
Clément Petit
5bd9bcca75 Fix bullet points in aggregation framework reference documentation.
Closes: #3632
2021-04-20 08:19:46 +02:00
Greg L. Turnquist
54f75e653b Migrate to main branch.
See #3616.
2021-04-16 12:27:26 -05:00
Mark Paluch
7b33f56e33 After release cleanups.
See #3616
2021-04-14 14:30:14 +02:00
Mark Paluch
829eed7d6c Prepare next development iteration.
See #3616
2021-04-14 14:30:11 +02:00
535 changed files with 24605 additions and 13466 deletions


@@ -1,2 +1,2 @@
#Tue Feb 22 13:55:00 CET 2022
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.4/apache-maven-3.8.4-bin.zip
#Mon Oct 11 14:30:24 CEST 2021
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.3/apache-maven-3.8.3-bin.zip


@@ -1,6 +1,6 @@
= Continuous Integration
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Moore%20(main)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]


@@ -1,3 +1,3 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/main/CONTRIBUTING.adoc[here].

Jenkinsfile

@@ -9,7 +9,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/2.5.x", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/3.0.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -20,10 +20,10 @@ pipeline {
stages {
stage("Docker images") {
parallel {
stage('Publish JDK (main) + MongoDB 4.0') {
stage('Publish JDK (Java 17) + MongoDB 4.4') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-4.0/**"
changeset "ci/openjdk17-mongodb-4.4/**"
changeset "ci/pipeline.properties"
}
}
@@ -32,17 +32,17 @@ pipeline {
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.0.version']} ci/openjdk8-mongodb-4.0/")
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (main) + MongoDB 4.4') {
stage('Publish JDK (Java 17) + MongoDB 5.0') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-4.4/**"
changeset "ci/openjdk17-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
}
@@ -51,24 +51,7 @@ pipeline {
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk8-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK LTS + MongoDB 4.4') {
when {
changeset "ci/openjdk17-mongodb-4.4/**"
changeset "ci/pipeline.properties"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.lts.tag']}", "--build-arg BASE=${p['docker.java.lts.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk17-mongodb-5.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
@@ -78,10 +61,11 @@ pipeline {
}
}
stage("test: baseline (main)") {
stage("test: baseline (Java 17)") {
when {
beforeAgent(true)
anyOf {
branch '3.2.x'
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
not { triggeredBy 'UpstreamCause' }
}
}
@@ -95,7 +79,7 @@ pipeline {
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
@@ -110,37 +94,15 @@ pipeline {
stage("Test other configurations") {
when {
beforeAgent(true)
allOf {
branch '3.2.x'
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
not { triggeredBy 'UpstreamCause' }
}
}
parallel {
stage("test: mongodb 4.4 (main)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
}
stage("test: baseline (LTS)") {
stage("test: mongodb 5.0 (Java 17)") {
agent {
label 'data'
}
@@ -151,7 +113,7 @@ pipeline {
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.lts.tag']}").inside(p['docker.java.inside.basic']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
@@ -168,8 +130,9 @@ pipeline {
stage('Release to artifactory') {
when {
beforeAgent(true)
anyOf {
branch '3.2.x'
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
not { triggeredBy 'UpstreamCause' }
}
}
@@ -186,6 +149,7 @@ pipeline {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -v'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +


@@ -1,6 +1,6 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
@@ -8,10 +8,12 @@ The Spring Data MongoDB project aims to provide a familiar and consistent Spring
The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
[[code-of-conduct]]
== Code of Conduct
This project is governed by the https://github.com/spring-projects/.github/blob/e3cc2ff230d8f1dca06535aa6b5a4a23815861d4/CODE_OF_CONDUCT.md[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
[[getting-started]]
== Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java:
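The listing itself falls outside this hunk; the repository style it refers to looks roughly like the following sketch (the `Person` document, `PersonRepository`, and `PersonService` below are illustrative stand-ins, not the README's exact listing):

[source,java]
----
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Service;

class Person {

	@Id String id;
	String firstname;
	String lastname;

	Person(String firstname, String lastname) {
		this.firstname = firstname;
		this.lastname = lastname;
	}
}

interface PersonRepository extends CrudRepository<Person, String> {

	// derived query; Spring Data creates the implementation at runtime
	List<Person> findByLastname(String lastname);
}

@Service
class PersonService {

	private final PersonRepository repository;

	PersonService(PersonRepository repository) {
		this.repository = repository;
	}

	List<Person> doWork() {

		repository.save(new Person("Oliver", "Gierke"));

		return repository.findByLastname("Gierke");
	}
}
----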
@@ -59,6 +61,7 @@ class ApplicationConfig extends AbstractMongoClientConfiguration {
}
----
[[maven-configuration]]
=== Maven configuration
Add the Maven dependency:
@@ -68,24 +71,25 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
<version>${version}</version>
</dependency>
----
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository
and declare the appropriate dependency version.
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
<version>${version}-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<id>spring-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>https://repo.spring.io/libs-snapshot</url>
<url>https://repo.spring.io/snapshot</url>
</repository>
----
@@ -98,7 +102,7 @@ Some of the changes affect the initial setup configuration as well as compile/ru
.Changed XML Namespace Elements and Attributes:
|===
Element / Attribute | 2.x | 3.x
| Element / Attribute | 2.x | 3.x
| `<mongo:mongo-client />`
| Used to create a `com.mongodb.MongoClient`
@@ -116,7 +120,7 @@ Use `<mongo:client-settings cluster-hosts="..." />` instead
.Removed XML Namespace Elements and Attributes:
|===
Element / Attribute | Replacement in 3.x | Comment
| Element / Attribute | Replacement in 3.x | Comment
| `<mongo:db-factory mongo-ref="..." />`
| `<mongo:db-factory mongo-client-ref="..." />`
@@ -133,7 +137,7 @@ Element / Attribute | Replacement in 3.x | Comment
.New XML Namespace Elements and Attributes:
|===
Element | Comment
| Element | Comment
| `<mongo:db-factory mongo-client-ref="..." />`
| Replacement for `<mongo:db-factory mongo-ref="..." />`
@@ -153,7 +157,7 @@ Element | Comment
.Java API changes
|===
Type | Comment
| Type | Comment
| `MongoClientFactoryBean`
| Creates `com.mongodb.client.MongoClient` instead of `com.mongodb.MongoClient` +
@@ -174,7 +178,7 @@ Uses `MongoClientSettings` instead of `MongoClientOptions`.
.Removed Java API:
|===
2.x | Replacement in 3.x | Comment
| 2.x | Replacement in 3.x | Comment
| `MongoClientOptionsFactoryBean`
| `MongoClientSettingsFactoryBean`
@@ -226,6 +230,7 @@ static class Config extends AbstractMongoClientConfiguration {
----
====
[[getting-help]]
== Getting Help
Having trouble with Spring Data? We'd love to help!
@@ -239,6 +244,7 @@ If you are just starting out with Spring, try one of the https://spring.io/guide
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://github.com/spring-projects/spring-data-mongodb/issues[github.com/spring-projects/spring-data-mongodb/issues].
[[reporting-issues]]
== Reporting Issues
Spring Data uses GitHub as its issue tracking system to record bugs and feature requests.
@@ -249,10 +255,86 @@ If you want to raise an issue, please follow the recommendations below:
* Please provide as much information as possible with the issue report; we would like to know the version of Spring Data that you are using, the JVM version, the stack trace, etc.
* If you need to paste code or include a stack trace, use https://guides.github.com/features/mastering-markdown/[Markdown] code fences +++```+++.
[[guides]]
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
[[examples]]
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
[[building-from-source]]
== Building from Source
You don't need to build from source to use Spring Data (binaries in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
You also need JDK 1.8.
You do not need to build from source to use Spring Data. Binaries are available in https://repo.spring.io[repo.spring.io]
and accessible from Maven using the Maven configuration noted <<maven-configuration,above>>.
NOTE: Configuration for Gradle is similar to Maven.
The best way to get started is by creating a Spring Boot project using MongoDB on https://start.spring.io[start.spring.io].
Follow this https://start.spring.io/#type=maven-project&language=java&platformVersion=2.5.4&packaging=jar&jvmVersion=1.8&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb[link]
to build an imperative application and this https://start.spring.io/#type=maven-project&language=java&platformVersion=2.5.4&packaging=jar&jvmVersion=1.8&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb-reactive[link]
to build a reactive one.
However, if you want to try out the latest and greatest, Spring Data MongoDB can be easily built with the https://github.com/takari/maven-wrapper[Maven wrapper]
and minimally, JDK 8 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
In order to build Spring Data MongoDB, you will need to https://www.mongodb.com/try/download/community[download]
and https://docs.mongodb.com/manual/installation/[install a MongoDB distribution].
Once you have installed MongoDB, you need to start a MongoDB server. It is convenient to set an environment variable pointing to
your MongoDB installation directory (e.g. `MONGODB_HOME`).
To run the full test suite, a https://docs.mongodb.com/manual/tutorial/deploy-replica-set/[MongoDB Replica Set]
is required.
To run the MongoDB server, enter the following command from a command line:
[source,bash]
----
$ $MONGODB_HOME/bin/mongod --dbpath $MONGODB_HOME/runtime/data --ipv6 --port 27017 --replSet rs0
...
"msg":"Successfully connected to host"
----
Once the MongoDB server starts up, you should see the message (`msg`), "_Successfully connected to host_".
Notice the `--dbpath` option to the `mongod` command. You can set this to anything you like, but in this case, we set it to
the absolute path of a sub-directory (`runtime/data/`) under the MongoDB installation directory (in `$MONGODB_HOME`).
You need to initialize the MongoDB replica set only once, when the MongoDB server is started for the first time.
To initialize the replica set, start a mongo client:
[source,bash]
----
$ $MONGODB_HOME/bin/mongo
MongoDB server version: 5.0.0
...
----
Then enter the following command:
[source,bash]
----
mongo> rs.initiate({ _id: 'rs0', members: [ { _id: 0, host: '127.0.0.1:27017' } ] })
----
Finally, on a UNIX-based system (for example, Linux or Mac OS X) you may need to adjust the `ulimit`.
If you need to, you can adjust the `ulimit` with the following command (32768 is just a recommendation):
[source,bash]
----
$ ulimit -n 32768
----
You can use `ulimit -a` again to verify the `ulimit` for "_open files_" was set appropriately.
Now you are ready to build Spring Data MongoDB. Simply enter the following `mvnw` (Maven Wrapper) command:
[source,bash]
----
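# The actual command sits outside this hunk; the conventional Maven Wrapper invocation
# for a full build (an assumption based on the surrounding instructions, not the elided text) is:
$ ./mvnw clean install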
@@ -261,7 +343,8 @@ You also need JDK 1.8.
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular please sign the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular, please sign
the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
=== Building reference documentation
@@ -274,17 +357,7 @@ Building the documentation builds also the project without running tests.
The generated documentation is available from `target/site/reference/html/index.html`.
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
[[license]]
== License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].


@@ -7,6 +7,9 @@ ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/http/https/g' /etc/apt/sources.list ; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \


@@ -7,12 +7,17 @@ ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/http/https/g' /etc/apt/sources.list ; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget ; \
# MongoDB 5.0 release signing key
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv B00A0BD1E2C63C11 ; \
# Needed when MongoDB creates a 5.0 folder.
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/5.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-5.0.list; \
echo ${TZ} > /etc/timezone;
RUN apt-get update ; \
RUN apt-get update; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean; \
rm -rf /var/lib/apt/lists/*;


@@ -1,18 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN RUN set -eux; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list; \
echo ${TZ} > /etc/timezone;
RUN apt-get update ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean; \
rm -rf /var/lib/apt/lists/*;


@@ -1,20 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \
echo ${TZ} > /etc/timezone;
RUN apt-get update ; \
ln -T /bin/true /usr/bin/systemctl ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
rm /usr/bin/systemctl ; \
apt-get clean ; \
rm -rf /var/lib/apt/lists/* ;


@@ -1,17 +1,10 @@
# Java versions
java.main.tag=8u322-b06-jdk
java.11.tag=11.0.14.1_1-jdk
java.15.tag=15.0.2_7-jdk-hotspot
java.lts.tag=17.0.2_8-jdk
java.main.tag=17.0.2_8-jdk
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.11.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.11.tag}
docker.java.15.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/adoptopenjdk:${java.15.tag}
docker.java.lts.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.lts.tag}
# Supported versions of MongoDB
docker.mongodb.4.0.version=4.0.28
docker.mongodb.4.4.version=4.4.12
docker.mongodb.5.0.version=5.0.6

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.10</version>
<version>4.0.0-M2</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.5.10</version>
<version>3.0.0-M2</version>
</parent>
<modules>
@@ -24,10 +24,11 @@
</modules>
<properties>
<source.level>16</source.level>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.5.10</springdata.commons>
<mongo>4.2.3</mongo>
<springdata.commons>3.0.0-M2</springdata.commons>
<mongo>4.5.0</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -134,18 +135,18 @@
<repositories>
<repository>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
<releases>
<enabled>false</enabled>
</releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.10</version>
<version>4.0.0-M2</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.10</version>
<version>4.0.0-M2</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.10</version>
<version>4.0.0-M2</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -87,6 +87,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
<version>3.0.2</version>
<optional>true</optional>
</dependency>
<!-- reactive -->
<dependency>
@@ -115,27 +122,6 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava3</groupId>
<artifactId>rxjava</artifactId>
@@ -145,12 +131,6 @@
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jcdi_2.0_spec</artifactId>
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
@@ -160,31 +140,48 @@
</dependency>
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<groupId>jakarta.enterprise</groupId>
<artifactId>jakarta.enterprise.cdi-api</artifactId>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
<version>${javax-annotation-api}</version>
<groupId>jakarta.annotation</groupId>
<artifactId>jakarta.annotation-api</artifactId>
<version>${jakarta-annotation-api}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-se</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-spi</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-impl</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<!-- JSR 303 Validation -->
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<version>${validation}</version>
<optional>true</optional>
</dependency>
@@ -199,28 +196,23 @@
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>5.4.3.Final</version>
<version>7.0.1.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jakarta.el</groupId>
<artifactId>jakarta.el-api</artifactId>
<version>4.0.0</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<artifactId>jakarta.el</artifactId>
<version>4.0.2</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
@@ -230,13 +222,6 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
@@ -272,9 +257,9 @@
</dependency>
<dependency>
<groupId>javax.transaction</groupId>
<artifactId>jta</artifactId>
<version>1.1</version>
<groupId>jakarta.transaction</groupId>
<artifactId>jakarta.transaction-api</artifactId>
<version>2.0.0</version>
<scope>test</scope>
</dependency>
@@ -310,6 +295,15 @@
<scope>test</scope>
</dependency>
<!-- jMolecules -->
<dependency>
<groupId>org.jmolecules</groupId>
<artifactId>jmolecules-ddd</artifactId>
<version>${jmolecules}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>


@@ -103,19 +103,11 @@ public class BindableMongoExpression implements MongoExpression {
return new BindableMongoExpression(expressionString, codecRegistryProvider, args);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoExpression#toDocument()
*/
@Override
public Document toDocument() {
return target.get();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "BindableMongoExpression{" + "expressionString='" + expressionString + '\'' + ", args="


@@ -193,19 +193,11 @@ public class MongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
@@ -214,10 +206,6 @@ public class MongoDatabaseUtils {
}
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override
public void afterCompletion(int status) {
@@ -228,10 +216,6 @@ public class MongoDatabaseUtils {
super.afterCompletion(status);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {


@@ -1,57 +0,0 @@
/*
* Copyright 2011-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 3.0, use {@link MongoDatabaseFactory} instead.
*/
@Deprecated
public interface MongoDbFactory extends MongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
*
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase()} instead.
*/
@Deprecated
default MongoDatabase getDb() throws DataAccessException {
return getMongoDatabase();
}
/**
* Obtain a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase(String)} instead.
*/
@Deprecated
default MongoDatabase getDb(String dbName) throws DataAccessException {
return getMongoDatabase(dbName);
}
}


@@ -106,10 +106,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
this.options = options;
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override
protected Object doGetTransaction() throws TransactionException {
@@ -118,19 +114,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return new MongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
@@ -160,10 +148,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override
protected Object doSuspend(Object transaction) throws TransactionException {
@@ -173,19 +157,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected final void doCommit(DefaultTransactionStatus status) throws TransactionException {
@@ -236,10 +212,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
@@ -259,10 +231,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
@@ -270,10 +238,6 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.getRequiredResourceHolder().setRollbackOnly();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override
protected void doCleanupAfterCompletion(Object transaction) {
@@ -325,19 +289,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return dbFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override
public MongoDatabaseFactory getResourceFactory() {
return getRequiredDbFactory();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDbFactory();
@@ -505,19 +461,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
TransactionSynchronizationUtils.triggerFlush();


@@ -214,19 +214,11 @@ public class ReactiveMongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) {
@@ -237,10 +229,6 @@ public class ReactiveMongoDatabaseUtils {
return Mono.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#afterCompletion(int)
*/
@Override
public Mono<Void> afterCompletion(int status) {
@@ -256,10 +244,6 @@ public class ReactiveMongoDatabaseUtils {
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) {


@@ -110,10 +110,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doGetTransaction(org.springframework.transaction.reactive.TransactionSynchronizationManager)
*/
@Override
protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager)
throws TransactionException {
@@ -123,19 +119,11 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return new ReactiveMongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doBegin(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction,
TransactionDefinition definition) throws TransactionException {
@@ -175,10 +163,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSuspend(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction)
throws TransactionException {
@@ -192,10 +176,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doResume(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> doResume(TransactionSynchronizationManager synchronizationManager, @Nullable Object transaction,
Object suspendedResources) {
@@ -203,10 +183,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
.fromRunnable(() -> synchronizationManager.bindResource(getRequiredDatabaseFactory(), suspendedResources));
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCommit(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected final Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -243,10 +219,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doRollback(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doRollback(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) {
@@ -268,10 +240,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSetRollbackOnly(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doSetRollbackOnly(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -282,10 +250,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCleanupAfterCompletion(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager synchronizationManager,
Object transaction) {
@@ -340,10 +304,6 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return databaseFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDatabaseFactory();
@@ -509,19 +469,11 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
throw new UnsupportedOperationException("flush() not supported");


@@ -95,10 +95,6 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.sessionType = sessionType;
}
/*
* (non-Javadoc)
* @see org.aopalliance.intercept.MethodInterceptor(org.aopalliance.intercept.MethodInvocation)
*/
@Nullable
@Override
public Object invoke(MethodInvocation methodInvocation) throws Throwable {


@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.util.Version;
import org.springframework.util.StringUtils;
@@ -31,7 +31,7 @@ import com.mongodb.MongoDriverInformation;
*/
public class SpringDataMongoDB {
private static final Logger LOGGER = LoggerFactory.getLogger(SpringDataMongoDB.class);
private static final Log LOGGER = LogFactory.getLog(SpringDataMongoDB.class);
private static final Version FALLBACK_VERSION = new Version(3);
private static final MongoDriverInformation DRIVER_INFORMATION = MongoDriverInformation
@@ -68,7 +68,7 @@ public class SpringDataMongoDB {
try {
return Version.parse(versionString);
} catch (Exception e) {
LOGGER.debug("Cannot read Spring Data MongoDB version '{}'.", versionString);
LOGGER.debug(String.format("Cannot read Spring Data MongoDB version '%s'.", versionString));
}
return FALLBACK_VERSION;


@@ -25,9 +25,7 @@ import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.lang.Nullable;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
@@ -80,24 +78,6 @@ public abstract class AbstractMongoClientConfiguration extends MongoConfiguratio
return new SimpleMongoClientDatabaseFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoClientConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
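The deprecated single-package hook removed here defers to `getMappingBasePackages()`. A minimal sketch of the override it points at, assuming a hypothetical `com.acme` application package and database name:

[source,java]
----
import java.util.Collection;
import java.util.Collections;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

@Configuration
class AppConfig extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "acme"; // hypothetical database name
	}

	@Override
	protected Collection<String> getMappingBasePackages() {
		// replaces the removed getMappingBasePackage() shown above
		return Collections.singleton("com.acme");
	}
}
----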
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext(MongoCustomConversions)}. Will get {@link #customConversions()} applied.


@@ -30,10 +30,6 @@ import com.mongodb.ConnectionString;
*/
public class ConnectionStringPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String connectionString) {


@@ -34,10 +34,6 @@ import org.w3c.dom.Element;
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -46,10 +42,6 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {


@@ -80,7 +80,7 @@ import org.w3c.dom.Element;
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("jakarta.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
/* (non-Javadoc)
@@ -376,10 +376,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
this.delegates = new HashSet<>(Arrays.asList(filters));
}
/*
* (non-Javadoc)
* @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
*/
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory)
throws IOException {


@@ -47,28 +47,16 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEntityCallback.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {


@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -27,6 +28,8 @@ import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.util.Assert;
@@ -39,28 +42,16 @@ import org.springframework.util.Assert;
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
@@ -70,10 +61,6 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
@@ -81,17 +68,13 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(org.springframework.data.repository.config.PersistentEntitiesFactoryBean.class);
definition.addConstructorArgValue(new RuntimeBeanReference(MappingContext.class));
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {


@@ -35,10 +35,6 @@ import org.w3c.dom.Element;
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);


@@ -51,10 +51,6 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String OPTIONS_DELIMITER = "?";
private static final String OPTION_VALUE_DELIMITER = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String text) throws IllegalArgumentException {


@@ -62,10 +62,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -74,10 +70,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {


@@ -26,10 +26,6 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());


@@ -22,9 +22,12 @@ import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionValidationException;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientSettingsFactoryBean;
import org.springframework.data.mongodb.core.MongoServerApiFactoryBean;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -37,7 +40,6 @@ import org.w3c.dom.Element;
* @author Christoph Strobl
* @author Mark Paluch
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
@@ -112,6 +114,20 @@ abstract class MongoParsingUtils {
// Field level encryption
setPropertyReference(clientOptionsDefBuilder, settingsElement, "encryption-settings-ref", "autoEncryptionSettings");
// ServerAPI
if (StringUtils.hasText(settingsElement.getAttribute("server-api-version"))) {
MongoServerApiFactoryBean serverApiFactoryBean = new MongoServerApiFactoryBean();
serverApiFactoryBean.setVersion(settingsElement.getAttribute("server-api-version"));
try {
clientOptionsDefBuilder.addPropertyValue("serverApi", serverApiFactoryBean.getObject());
} catch (Exception exception) {
throw new BeanDefinitionValidationException("Non parsable server-api.", exception);
}
} else {
setPropertyReference(clientOptionsDefBuilder, settingsElement, "server-api-ref", "serverApi");
}
// and the rest
mongoClientBuilder.addPropertyValue("mongoClientSettings", clientOptionsDefBuilder.getBeanDefinition());
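The factory bean wired up for the `server-api-version` attribute can also be used programmatically; a minimal sketch relying only on the calls visible in this hunk (the helper class and version handling are illustrative assumptions):

[source,java]
----
import com.mongodb.ServerApi;

import org.springframework.data.mongodb.core.MongoServerApiFactoryBean;

class ServerApiSketch {

	// Mirrors what the parser above does with the server-api-version attribute value.
	static ServerApi serverApi(String version) throws Exception {

		MongoServerApiFactoryBean factoryBean = new MongoServerApiFactoryBean();
		factoryBean.setVersion(version); // the same string the XML attribute would carry

		return factoryBean.getObject(); // may fail for unparsable versions, hence the try/catch above
	}
}
----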


@@ -39,10 +39,6 @@ import org.w3c.dom.Element;
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -51,10 +47,6 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {


@@ -41,19 +41,11 @@ public class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEnti
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;


@@ -38,28 +38,16 @@ import org.springframework.util.Assert;
*/
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
@@ -74,10 +62,6 @@ class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupp
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {


@@ -32,10 +32,6 @@ import com.mongodb.ReadConcernLevel;
*/
public class ReadConcernPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
public void setAsText(@Nullable String readConcernString) {


@@ -29,10 +29,6 @@ import com.mongodb.ReadPreference;
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException {


@@ -21,8 +21,8 @@ import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -43,13 +43,9 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
* A port is a number without a leading 0 at the end of the address that is preceded by just a single :.
*/
private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address {} '{}'. Check your replica set configuration!";
private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address %s '%s'. Check your replica set configuration!";
private static final Log LOG = LogFactory.getLog(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String replicaSetString) {
@@ -88,14 +84,18 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private ServerAddress parseServerAddress(String source) {
if (!StringUtils.hasText(source)) {
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
}
return null;
}
String[] hostAndPort = extractHostAddressAndPort(source.trim());
if (hostAndPort.length > 2) {
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
}
return null;
}
@@ -105,9 +105,13 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
return port == null ? new ServerAddress(hostAddress) : new ServerAddress(hostAddress, port);
} catch (UnknownHostException e) {
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]);
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]));
}
} catch (NumberFormatException e) {
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]);
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]));
}
}
return null;

View File

@@ -26,10 +26,6 @@ import com.mongodb.WriteConcern;
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public WriteConcern convert(String source) {
WriteConcern writeConcern = WriteConcern.valueOf(source);

View File

@@ -29,10 +29,6 @@ import org.springframework.util.StringUtils;
*/
public class UUidRepresentationPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String value) {

View File

@@ -189,10 +189,6 @@ public class ChangeStreamEvent<T> {
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';

View File

@@ -17,8 +17,11 @@ package org.springframework.data.mongodb.core;
import java.util.Optional;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.timeseries.GranularityDefinition;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
@@ -42,29 +45,18 @@ public class CollectionOptions {
private @Nullable Boolean capped;
private @Nullable Collation collation;
private ValidationOptions validationOptions;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated. Can be {@literal null}.
* @param maxDocuments the maximum number of documents in the collection. Can be {@literal null}.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise. Can be {@literal null}.
* @deprecated since 2.0 please use {@link CollectionOptions#empty()} as entry point.
*/
@Deprecated
public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
this(size, maxDocuments, capped, null, ValidationOptions.none());
}
private @Nullable TimeSeriesOptions timeSeriesOptions;
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
@Nullable Collation collation, ValidationOptions validationOptions) {
@Nullable Collation collation, ValidationOptions validationOptions,
@Nullable TimeSeriesOptions timeSeriesOptions) {
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
this.collation = collation;
this.validationOptions = validationOptions;
this.timeSeriesOptions = timeSeriesOptions;
}
/**
@@ -78,7 +70,7 @@ public class CollectionOptions {
Assert.notNull(collation, "Collation must not be null!");
return new CollectionOptions(null, null, null, collation, ValidationOptions.none());
return new CollectionOptions(null, null, null, collation, ValidationOptions.none(), null);
}
/**
@@ -88,7 +80,21 @@ public class CollectionOptions {
* @since 2.0
*/
public static CollectionOptions empty() {
return new CollectionOptions(null, null, null, null, ValidationOptions.none());
return new CollectionOptions(null, null, null, null, ValidationOptions.none(), null);
}
/**
* Quick way to set up {@link CollectionOptions} for a Time Series collection. For more advanced settings use
* {@link #timeSeries(TimeSeriesOptions)}.
*
* @param timeField The name of the property which contains the date in each time series document. Must not be
* {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @see #timeSeries(TimeSeriesOptions)
* @since 3.3
*/
public static CollectionOptions timeSeries(String timeField) {
return empty().timeSeries(TimeSeriesOptions.timeSeries(timeField));
}
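A minimal usage sketch for the factory above; the collection and field names ("sensor_data", "timestamp") are made up for illustration and a MongoTemplate instance named template is assumed:

// creates a time series collection whose documents are keyed on a "timestamp" date field
template.createCollection("sensor_data", CollectionOptions.timeSeries("timestamp"));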
/**
@@ -99,7 +105,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions capped() {
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions, null);
}
/**
@@ -110,7 +116,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions maxDocuments(long maxDocuments) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
}
/**
@@ -121,7 +127,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions size(long size) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
}
/**
@@ -132,7 +138,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions collation(@Nullable Collation collation) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
}
/**
@@ -252,7 +258,20 @@ public class CollectionOptions {
public CollectionOptions validation(ValidationOptions validationOptions) {
Assert.notNull(validationOptions, "ValidationOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
}
/**
* Create new {@link CollectionOptions} with the given {@link TimeSeriesOptions}.
*
* @param timeSeriesOptions must not be {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @since 3.3
*/
public CollectionOptions timeSeries(TimeSeriesOptions timeSeriesOptions) {
Assert.notNull(timeSeriesOptions, "TimeSeriesOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions);
}
/**
@@ -303,6 +322,16 @@ public class CollectionOptions {
return validationOptions.isEmpty() ? Optional.empty() : Optional.of(validationOptions);
}
/**
* Get the {@link TimeSeriesOptions} if available.
*
* @return {@link Optional#empty()} if not specified.
* @since 3.3
*/
public Optional<TimeSeriesOptions> getTimeSeriesOptions() {
return Optional.ofNullable(timeSeriesOptions);
}
/**
* Encapsulation of ValidationOptions options.
*
@@ -398,4 +427,89 @@ public class CollectionOptions {
return !Optionals.isAnyPresent(getValidator(), getValidationAction(), getValidationLevel());
}
}
/**
* Options applicable to Time Series collections.
*
* @author Christoph Strobl
* @since 3.3
* @see <a href=
* "https://docs.mongodb.com/manual/core/timeseries-collections">https://docs.mongodb.com/manual/core/timeseries-collections</a>
*/
public static class TimeSeriesOptions {
private final String timeField;
private @Nullable final String metaField;
private final GranularityDefinition granularity;
private TimeSeriesOptions(String timeField, @Nullable String metaField, GranularityDefinition granularity) {
Assert.hasText(timeField, "Time field must not be empty or null!");
this.timeField = timeField;
this.metaField = metaField;
this.granularity = granularity;
}
/**
* Create a new instance of {@link TimeSeriesOptions} using the given field as its {@literal timeField}, the one
* that contains the date in each time series document. <br />
* {@link Field#name() Annotated fieldnames} will be considered during the mapping process.
*
* @param timeField must not be {@literal null}.
* @return new instance of {@link TimeSeriesOptions}.
*/
public static TimeSeriesOptions timeSeries(String timeField) {
return new TimeSeriesOptions(timeField, null, Granularity.DEFAULT);
}
/**
* Set the name of the field which contains metadata in each time series document. Should not be the {@literal id}
* nor the {@link TimeSeriesOptions#timeSeries(String) timeField} nor point to an {@literal array} or
* {@link java.util.Collection}. <br />
* {@link Field#name() Annotated fieldnames} will be considered during the mapping process.
*
* @param metaField must not be {@literal null}.
* @return new instance of {@link TimeSeriesOptions}.
*/
public TimeSeriesOptions metaField(String metaField) {
return new TimeSeriesOptions(timeField, metaField, granularity);
}
/**
* Select the {@link GranularityDefinition} parameter to define how data in the time series collection is organized;
* choose the value closest to the time span between incoming measurements.
*
* @return new instance of {@link TimeSeriesOptions}.
* @see Granularity
*/
public TimeSeriesOptions granularity(GranularityDefinition granularity) {
return new TimeSeriesOptions(timeField, metaField, granularity);
}
/**
* @return never {@literal null}.
*/
public String getTimeField() {
return timeField;
}
/**
* @return can be {@literal null}. Might also be an {@literal empty} {@link String}, so consider checking via
* {@link org.springframework.util.StringUtils#hasText(String)}.
*/
@Nullable
public String getMetaField() {
return metaField;
}
/**
* @return never {@literal null}.
*/
public GranularityDefinition getGranularity() {
return granularity;
}
}
}
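For the more advanced settings, a hedged sketch combining the builder methods above; the property names are assumptions made for the example:

TimeSeriesOptions options = TimeSeriesOptions.timeSeries("timestamp") // the date field of each measurement
    .metaField("sensorId")                                            // per-series meta data field
    .granularity(Granularity.MINUTES);                                // closest match to the measurement interval
CollectionOptions collectionOptions = CollectionOptions.empty().timeSeries(options);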

View File

@@ -109,10 +109,6 @@ class DefaultBulkOperations implements BulkOperations {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
@@ -125,10 +121,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
@@ -139,10 +131,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
@@ -153,10 +141,6 @@ class DefaultBulkOperations implements BulkOperations {
return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
@@ -169,10 +153,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
@@ -183,10 +163,6 @@ class DefaultBulkOperations implements BulkOperations {
return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
@@ -199,19 +175,11 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
@@ -222,10 +190,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
@@ -239,10 +203,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
@@ -255,10 +215,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#replaceOne(org.springframework.data.mongodb.core.query.Query, java.lang.Object, org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
@@ -278,10 +234,6 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public com.mongodb.bulk.BulkWriteResult execute() {

View File

@@ -112,10 +112,6 @@ public class DefaultIndexOperations implements IndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public String ensureIndex(final IndexDefinition indexDefinition) {
return execute(collection -> {
@@ -150,10 +146,6 @@ public class DefaultIndexOperations implements IndexOperations {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
execute(collection -> {
@@ -163,18 +155,10 @@ public class DefaultIndexOperations implements IndexOperations {
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return execute(new CollectionCallback<List<IndexInfo>>() {

View File

@@ -42,10 +42,6 @@ class DefaultIndexOperationsProvider implements IndexOperationsProvider {
this.mapper = mapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperationsProvider#reactiveIndexOps(java.lang.String)
*/
@Override
public IndexOperations indexOps(String collectionName, Class<?> type) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper, type);

View File

@@ -86,10 +86,6 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, collection -> {
@@ -119,26 +115,14 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
.orElse(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) //

View File

@@ -70,19 +70,11 @@ class DefaultScriptOperations implements ScriptOperations {
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
@@ -92,10 +84,6 @@ class DefaultScriptOperations implements ScriptOperations {
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
@@ -115,10 +103,6 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
@@ -135,10 +119,6 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
@@ -147,10 +127,6 @@ class DefaultScriptOperations implements ScriptOperations {
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2021-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,17 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.MongoClientSettings;
/**
* A factory bean for construction of a {@link MongoClientSettings} instance to be used with the async MongoDB driver.
* Encryption algorithms supported by MongoDB Client Side Field Level Encryption.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
* @deprecated since 3.0 - Use {@link MongoClientSettingsFactoryBean} instead.
* @since 3.3
*/
@Deprecated
public class ReactiveMongoClientSettingsFactoryBean extends MongoClientSettingsFactoryBean {
public final class EncryptionAlgorithms {
public static final String AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic";
public static final String AEAD_AES_256_CBC_HMAC_SHA_512_Random = "AEAD_AES_256_CBC_HMAC_SHA_512-Random";
}
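One likely use of these constants, sketched under the assumption that the @Encrypted mapping annotation accepts them through its algorithm attribute; the Patient type and ssn field are invented for the example:

class Patient {

    @Encrypted(algorithm = EncryptionAlgorithms.AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic)
    String ssn; // deterministic encryption keeps the field queryable by equality
}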

View File

@@ -21,27 +21,45 @@ import java.util.Map;
import java.util.Optional;
import org.bson.Document;
import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.IdentifierAccessor;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.core.CollectionOptions.TimeSeriesOptions;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.TimeSeries;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.projection.EntityProjectionIntrospector;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.TimeSeriesGranularity;
import com.mongodb.client.model.ValidationOptions;
/**
* Common operations performed on an entity in the context of its mapping metadata.
@@ -58,9 +76,31 @@ class EntityOperations {
private static final String ID_FIELD = "_id";
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
private final QueryMapper queryMapper;
EntityOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
private final EntityProjectionIntrospector introspector;
private final MongoJsonSchemaMapper schemaMapper;
EntityOperations(MongoConverter converter) {
this(converter, new QueryMapper(converter));
}
EntityOperations(MongoConverter converter, QueryMapper queryMapper) {
this(converter, converter.getMappingContext(), converter.getCustomConversions(), converter.getProjectionFactory(),
queryMapper);
}
EntityOperations(MongoConverter converter,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
CustomConversions conversions, ProjectionFactory projectionFactory, QueryMapper queryMapper) {
this.context = context;
this.queryMapper = queryMapper;
this.introspector = EntityProjectionIntrospector.create(projectionFactory,
EntityProjectionIntrospector.ProjectionPredicate.typeHierarchy()
.and(((target, underlyingType) -> !conversions.isSimpleType(target))),
context);
this.schemaMapper = new MongoJsonSchemaMapper(converter);
}
/**
@@ -225,6 +265,89 @@ class EntityOperations {
return UntypedOperations.instance();
}
/**
* Introspect the given {@link Class result type} in the context of the {@link Class entity type} to determine whether
* the returned type is a projection and which property paths participate in the projection.
*
* @param resultType the type to project on. Must not be {@literal null}.
* @param entityType the source domain type. Must not be {@literal null}.
* @return the introspection result.
* @since 3.4
* @see EntityProjectionIntrospector#introspect(Class, Class)
*/
public <M, D> EntityProjection<M, D> introspectProjection(Class<M> resultType, Class<D> entityType) {
return introspector.introspect(resultType, entityType);
}
/**
* Convert {@link CollectionOptions} to {@link CreateCollectionOptions} using {@link Class entityType} to obtain
* mapping metadata.
*
* @param collectionOptions can be {@literal null}.
* @param entityType must not be {@literal null}.
* @return never {@literal null}.
* @since 3.4
*/
public CreateCollectionOptions convertToCreateCollectionOptions(@Nullable CollectionOptions collectionOptions,
Class<?> entityType) {
Optional<Collation> collation = Optionals.firstNonEmpty(
() -> Optional.ofNullable(collectionOptions).flatMap(CollectionOptions::getCollation),
() -> forType(entityType).getCollation());//
CreateCollectionOptions result = new CreateCollectionOptions();
collation.map(Collation::toMongoCollation).ifPresent(result::collation);
if (collectionOptions == null) {
return result;
}
collectionOptions.getCapped().ifPresent(result::capped);
collectionOptions.getSize().ifPresent(result::sizeInBytes);
collectionOptions.getMaxDocuments().ifPresent(result::maxDocuments);
collectionOptions.getCollation().map(Collation::toMongoCollation).ifPresent(result::collation);
collectionOptions.getValidationOptions().ifPresent(it -> {
ValidationOptions validationOptions = new ValidationOptions();
it.getValidationAction().ifPresent(validationOptions::validationAction);
it.getValidationLevel().ifPresent(validationOptions::validationLevel);
it.getValidator().ifPresent(val -> validationOptions.validator(getMappedValidator(val, entityType)));
result.validationOptions(validationOptions);
});
collectionOptions.getTimeSeriesOptions().map(forType(entityType)::mapTimeSeriesOptions).ifPresent(it -> {
com.mongodb.client.model.TimeSeriesOptions options = new com.mongodb.client.model.TimeSeriesOptions(
it.getTimeField());
if (StringUtils.hasText(it.getMetaField())) {
options.metaField(it.getMetaField());
}
if (!Granularity.DEFAULT.equals(it.getGranularity())) {
options.granularity(TimeSeriesGranularity.valueOf(it.getGranularity().name().toUpperCase()));
}
result.timeSeriesOptions(options);
});
return result;
}
private Document getMappedValidator(Validator validator, Class<?> domainType) {
Document validationRules = validator.toDocument();
if (validationRules.containsKey("$jsonSchema")) {
return schemaMapper.mapSchema(validationRules, domainType);
}
return queryMapper.getMappedObject(validationRules, context.getPersistentEntity(domainType));
}
/**
* A representation of information about an entity.
*
@@ -369,37 +492,21 @@ class EntityOperations {
this.map = map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return ID_FIELD;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return map.get(ID_FIELD);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
return Query.query(Criteria.where(ID_FIELD).is(map.get(ID_FIELD)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
@@ -409,19 +516,11 @@ class EntityOperations {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion()
*/
@Override
public Query getQueryForVersion() {
throw new MappingException("Cannot query for version on plain Documents!");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
return MappedDocument.of(map instanceof Document //
@@ -429,47 +528,27 @@ class EntityOperations {
: new Document(map));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#incrementVersion()
*/
@Override
public T incrementVersion() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return map.get(ID_FIELD) != null;
@@ -482,10 +561,6 @@ class EntityOperations {
super(map);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
@SuppressWarnings("unchecked")
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -525,28 +600,16 @@ class EntityOperations {
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return entity.getRequiredIdProperty().getFieldName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return idAccessor.getRequiredIdentifier();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
@@ -559,10 +622,6 @@ class EntityOperations {
return Query.query(Criteria.where(idProperty.getName()).is(getId()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion(java.lang.Object)
*/
@Override
public Query getQueryForVersion() {
@@ -573,10 +632,6 @@ class EntityOperations {
.and(versionProperty.getName()).is(getVersion()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -592,10 +647,6 @@ class EntityOperations {
return MappedDocument.of(document);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#assertUpdateableIdIfNotSet()
*/
public void assertUpdateableIdIfNotSet() {
if (!entity.hasIdProperty()) {
@@ -616,38 +667,22 @@ class EntityOperations {
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#isVersionedEntity()
*/
@Override
public boolean isVersionedEntity() {
return entity.hasVersionProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getVersion()
*/
@Override
@Nullable
public Object getVersion() {
return propertyAccessor.getProperty(entity.getRequiredVersionProperty());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return entity.isNew(propertyAccessor.getBean());
@@ -682,10 +717,6 @@ class EntityOperations {
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
@@ -707,10 +738,6 @@ class EntityOperations {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MappedEntity#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
@@ -720,10 +747,6 @@ class EntityOperations {
return propertyAccessor.getProperty(versionProperty, Number.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
@@ -738,10 +761,6 @@ class EntityOperations {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#incrementVersion()
*/
@Override
public T incrementVersion() {
@@ -778,6 +797,24 @@ class EntityOperations {
* @return
*/
Optional<Collation> getCollation(Query query);
/**
* Derive the applicable {@link CollectionOptions} for the given type.
*
* @return never {@literal null}.
* @since 3.3
*/
CollectionOptions getCollectionOptions();
/**
* Map the fields of a given {@link TimeSeriesOptions} against the target domain type to consider potentially
* annotated field names.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @since 3.3
*/
TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions options);
}
/**
@@ -795,19 +832,11 @@ class EntityOperations {
return (TypedOperations) INSTANCE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {
@@ -817,6 +846,16 @@ class EntityOperations {
return query.getCollation();
}
@Override
public CollectionOptions getCollectionOptions() {
return CollectionOptions.empty();
}
@Override
public TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions options) {
return options;
}
}
/**
@@ -832,19 +871,11 @@ class EntityOperations {
this.entity = entity;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.ofNullable(entity.getCollation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {
@@ -854,6 +885,58 @@ class EntityOperations {
return Optional.ofNullable(entity.getCollation());
}
@Override
public CollectionOptions getCollectionOptions() {
CollectionOptions collectionOptions = CollectionOptions.empty();
if (entity.hasCollation()) {
collectionOptions = collectionOptions.collation(entity.getCollation());
}
if (entity.isAnnotationPresent(TimeSeries.class)) {
TimeSeries timeSeries = entity.getRequiredAnnotation(TimeSeries.class);
if (entity.getPersistentProperty(timeSeries.timeField()) == null) {
throw new MappingException(String.format("Time series field '%s' does not exist in type %s",
timeSeries.timeField(), entity.getName()));
}
TimeSeriesOptions options = TimeSeriesOptions.timeSeries(timeSeries.timeField());
if (StringUtils.hasText(timeSeries.metaField())) {
if (entity.getPersistentProperty(timeSeries.metaField()) == null) {
throw new MappingException(
String.format("Meta field '%s' does not exist in type %s", timeSeries.metaField(), entity.getName()));
}
options = options.metaField(timeSeries.metaField());
}
if (!Granularity.DEFAULT.equals(timeSeries.granularity())) {
options = options.granularity(timeSeries.granularity());
}
collectionOptions = collectionOptions.timeSeries(options);
}
return collectionOptions;
}
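A hedged sketch of the annotation-driven path handled above; the Measurement type and its properties are invented for the example:

@TimeSeries(timeField = "timestamp", metaField = "sensorId", granularity = Granularity.SECONDS)
class Measurement {

    Instant timestamp; // becomes the time series timeField
    String sensorId;   // becomes the metaField
    double value;
}

// Creating the collection for the annotated type is then expected to carry these options along,
// e.g. template.createCollection(Measurement.class).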
@Override
public TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions source) {
TimeSeriesOptions target = TimeSeriesOptions.timeSeries(mappedNameOrDefault(source.getTimeField()));
if (StringUtils.hasText(source.getMetaField())) {
target = target.metaField(mappedNameOrDefault(source.getMetaField()));
}
return target.granularity(source.getGranularity());
}
private String mappedNameOrDefault(String name) {
MongoPersistentProperty persistentProperty = entity.getPersistentProperty(name);
return persistentProperty != null ? persistentProperty.getFieldName() : name;
}
}
}

View File

@@ -15,9 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.util.CloseableIterator;
/**
* {@link ExecutableAggregationOperation} allows creation and execution of MongoDB aggregation operations in a fluent
@@ -88,12 +89,12 @@ public interface ExecutableAggregationOperation {
/**
* Apply pipeline operations as specified and stream all matching elements. <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable}
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable}
*
* @return a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
* Never {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
*/
CloseableIterator<T> stream();
Stream<T> stream();
}
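A hedged usage sketch of the Stream-returning terminal method; the Person type, the pipeline, and the static imports from Aggregation and Criteria are assumptions for the example:

// assumes: import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
//          import static org.springframework.data.mongodb.core.query.Criteria.where;
try (Stream<Person> people = template.aggregateAndReturn(Person.class)
        .by(newAggregation(match(where("age").gte(21))))
        .stream()) {
    people.forEach(System.out::println); // the underlying cursor is released when the stream is closed
}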
/**

View File

@@ -15,10 +15,11 @@
*/
package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -37,10 +38,6 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ExecutableAggregation<T> aggregateAndReturn(Class<T> domainType) {
@@ -69,10 +66,6 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithCollection#inCollection(java.lang.String)
*/
@Override
public AggregationWithAggregation<T> inCollection(String collection) {
@@ -81,10 +74,6 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithAggregation#by(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override
public TerminatingAggregation<T> by(Aggregation aggregation) {
@@ -93,21 +82,13 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#all()
*/
@Override
public AggregationResults<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#stream()
*/
@Override
public CloseableIterator<T> stream() {
public Stream<T> stream() {
return template.aggregateStream(aggregation, getCollectionName(aggregation), domainType);
}

View File

@@ -118,8 +118,8 @@ public interface ExecutableFindOperation {
/**
* Stream all matching elements.
*
* @return a {@link Stream} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed. Never
* {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
*/
Stream<T> stream();
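Correspondingly for the fluent find API, a hedged example of consuming the stream inside try-with-resources; the Person type and query are illustrative:

// assumes: import static org.springframework.data.mongodb.core.query.Criteria.where;
//          import static org.springframework.data.mongodb.core.query.Query.query;
try (Stream<Person> people = template.query(Person.class)
        .matching(query(where("lastname").is("Skywalker")))
        .stream()) {
    people.forEach(System.out::println);
}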

View File

@@ -20,12 +20,11 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.util.CloseableIterator;
import org.springframework.data.util.StreamUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -51,10 +50,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation#query(java.lang.Class)
*/
@Override
public <T> ExecutableFind<T> query(Class<T> domainType) {
@@ -74,11 +69,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
private final MongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
@Nullable private final String collection;
private final @Nullable String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
@Nullable String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
@@ -86,10 +81,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override
public FindWithProjection<T> inCollection(String collection) {
@@ -98,10 +89,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithProjection#as(Class)
*/
@Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
@@ -110,10 +97,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingFind<T> matching(Query query) {
@@ -122,10 +105,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#oneValue()
*/
@Override
public T oneValue() {
@@ -142,10 +121,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return result.iterator().next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#firstValue()
*/
@Override
public T firstValue() {
@@ -154,55 +129,31 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return ObjectUtils.isEmpty(result) ? null : result.iterator().next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#all()
*/
@Override
public List<T> all() {
return doFind(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#stream()
*/
@Override
public Stream<T> stream() {
return StreamUtils.createStreamFromIterator(doStream());
return doStream();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#count()
*/
@Override
public long count() {
return template.count(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#exists()
*/
@Override
public boolean exists() {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindDistinct#distinct(java.lang.String)
*/
@SuppressWarnings("unchecked")
@Override
public TerminatingDistinct<Object> distinct(String field) {
@@ -227,7 +178,7 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private CloseableIterator<T> doStream() {
private Stream<T> doStream() {
return template.doStream(query, domainType, getCollectionName(), returnType);
}
@@ -257,10 +208,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.CursorPreparer#prepare(com.mongodb.clientFindIterable)
*/
@Override
public FindIterable<Document> prepare(FindIterable<Document> iterable) {
@@ -295,10 +242,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
@@ -308,10 +251,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingDistinct<T> matching(Query query) {
@@ -320,10 +259,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingDistinct#all()
*/
@Override
public List<T> all() {
return delegate.doFindDistinct(field);

View File

@@ -40,10 +40,6 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.coreExecutableInsertOperation#insert(java.lan.Class)
*/
@Override
public <T> ExecutableInsert<T> insert(Class<T> domainType) {
@@ -71,10 +67,6 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.bulkMode = bulkMode;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#insert(java.lang.Class)
*/
@Override
public T one(T object) {
@@ -83,10 +75,6 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
return template.insert(object, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override
public Collection<T> all(Collection<? extends T> objects) {
@@ -95,10 +83,6 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
return template.insert(objects, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingBulkInsert#bulk(java.util.Collection)
*/
@Override
public BulkWriteResult bulk(Collection<? extends T> objects) {
@@ -108,10 +92,6 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
.insert(new ArrayList<>(objects)).execute();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override
public InsertWithBulkMode<T> inCollection(String collection) {
@@ -120,10 +100,6 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithBulkMode#withBulkMode(org.springframework.data.mongodb.core.BulkMode)
*/
@Override
public TerminatingBulkInsert<T> withBulkMode(BulkMode bulkMode) {

View File

@@ -41,10 +41,6 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.tempate = tempate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation#remove(java.lang.Class)
*/
@Override
public <T> ExecutableRemove<T> remove(Class<T> domainType) {
@@ -71,10 +67,6 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithCollection#inCollection(java.lang.String)
*/
@Override
public RemoveWithQuery<T> inCollection(String collection) {
@@ -83,10 +75,6 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
return new ExecutableRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingRemove<T> matching(Query query) {
@@ -95,28 +83,16 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
return new ExecutableRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#all()
*/
@Override
public DeleteResult all() {
return template.doRemove(getCollectionName(), query, domainType, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#one()
*/
@Override
public DeleteResult one() {
return template.doRemove(getCollectionName(), query, domainType, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override
public List<T> findAndRemove() {

View File

@@ -40,10 +40,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation#update(java.lang.Class)
*/
@Override
public <T> ExecutableUpdate<T> update(Class<T> domainType) {
@@ -85,10 +81,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override
public TerminatingUpdate<T> apply(UpdateDefinition update) {
@@ -98,10 +90,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override
public UpdateWithQuery<T> inCollection(String collection) {
@@ -111,10 +99,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
@@ -124,10 +108,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#replaceWith(Object)
*/
@Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
@@ -137,10 +117,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
@@ -150,10 +126,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
options, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public UpdateWithUpdate<T> matching(Query query) {
@@ -163,10 +135,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {
@@ -176,37 +144,21 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, resultType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#all()
*/
@Override
public UpdateResult all() {
return doUpdate(true, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#first()
*/
@Override
public UpdateResult first() {
return doUpdate(false, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#upsert()
*/
@Override
public UpdateResult upsert() {
return doUpdate(true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndModify#findAndModifyValue()
*/
@Override
public @Nullable T findAndModifyValue() {
@@ -215,10 +167,6 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndReplace#findAndReplaceValue()
*/
@Override
public @Nullable T findAndReplaceValue() {

View File

@@ -115,6 +115,10 @@ abstract class IndexConverters {
ops = ops.collation(fromDocument(indexOptions.get("collation", Document.class)));
}
if (indexOptions.containsKey("wildcardProjection")) {
ops.wildcardProjection(indexOptions.get("wildcardProjection", Document.class));
}
return ops;
};
}

View File

@@ -112,55 +112,31 @@ public class MappedDocument {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getUpdateObject()
*/
@Override
public Document getUpdateObject() {
return delegate.getUpdateObject();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#modifies(java.lang.String)
*/
@Override
public boolean modifies(String key) {
return delegate.modifies(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#inc(java.lang.String)
*/
@Override
public void inc(String version) {
delegate.inc(version);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
*/
@Override
public Boolean isIsolated() {
return delegate.isIsolated();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#hasArrayFilters()
*/
@Override
public boolean hasArrayFilters() {
return delegate.hasArrayFilters();

View File

@@ -20,13 +20,19 @@ import java.util.Collection;
import java.util.Collections;
import java.util.EnumSet;
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.Encrypted;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.ArrayJsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.EncryptedJsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.ObjectJsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;
@@ -34,10 +40,13 @@ import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.MongoJsonSchemaBuilder;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* {@link MongoJsonSchemaCreator} implementation using both {@link MongoConverter} and {@link MappingContext} to obtain
@@ -52,6 +61,8 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private final MongoConverter converter;
private final MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final Predicate<JsonSchemaPropertyContext> filter;
private final LinkedMultiValueMap<String, Class<?>> mergeProperties;
/**
* Create a new instance of {@link MappingMongoJsonSchemaCreator}.
@@ -61,27 +72,78 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
@SuppressWarnings("unchecked")
MappingMongoJsonSchemaCreator(MongoConverter converter) {
Assert.notNull(converter, "Converter must not be null!");
this.converter = converter;
this.mappingContext = (MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty>) converter
.getMappingContext();
this(converter, (MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty>) converter.getMappingContext(),
(property) -> true, new LinkedMultiValueMap<>());
}
/*
* (non-Javadoc)
* org.springframework.data.mongodb.core.MongoJsonSchemaCreator#createSchemaFor(java.lang.Class)
@SuppressWarnings("unchecked")
MappingMongoJsonSchemaCreator(MongoConverter converter,
MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
Predicate<JsonSchemaPropertyContext> filter, LinkedMultiValueMap<String, Class<?>> mergeProperties) {
Assert.notNull(converter, "Converter must not be null!");
this.converter = converter;
this.mappingContext = mappingContext;
this.filter = filter;
this.mergeProperties = mergeProperties;
}
@Override
public MongoJsonSchemaCreator filter(Predicate<JsonSchemaPropertyContext> filter) {
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, mergeProperties);
}
@Override
public PropertySpecifier property(String path) {
return types -> withTypesFor(path, types);
}
/**
* Specify additional types to be considered when rendering the schema for the given path.
*
* @param path the path using {@literal dot '.'} notation.
* @param types must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.4
*/
public MongoJsonSchemaCreator withTypesFor(String path, Class<?>... types) {
LinkedMultiValueMap<String, Class<?>> clone = mergeProperties.clone();
for (Class<?> type : types) {
clone.add(path, type);
}
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, clone);
}
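For illustration, a minimal usage sketch of this merge-property mechanism through the public MongoJsonSchemaCreator API; the Root, A and B domain types are hypothetical and only stand in for a polymorphic "value" path:

import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class PolymorphicPathSchemaSketch {

	// Hypothetical domain types: 'value' may hold either an A or a B at runtime.
	static class Root { Object value; }
	static class A { String aValue; }
	static class B { String bValue; }

	MongoJsonSchema render() {
		// Register both candidate types for the "value" path; their properties are
		// merged into the schema rendered for Root.value.
		return MongoJsonSchemaCreator.create()
				.property("value").withTypes(A.class, B.class)
				.createSchemaFor(Root.class);
	}
}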
@Override
public MongoJsonSchema createSchemaFor(Class<?> type) {
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(type);
MongoJsonSchemaBuilder schemaBuilder = MongoJsonSchema.builder();
{
Encrypted encrypted = entity.findAnnotation(Encrypted.class);
if (encrypted != null) {
Document encryptionMetadata = new Document();
Collection<Object> encryptionKeyIds = entity.getEncryptionKeyIds();
if (!CollectionUtils.isEmpty(encryptionKeyIds)) {
encryptionMetadata.append("keyId", encryptionKeyIds);
}
if (StringUtils.hasText(encrypted.algorithm())) {
encryptionMetadata.append("algorithm", encrypted.algorithm());
}
schemaBuilder.encryptionMetadata(encryptionMetadata);
}
}
List<JsonSchemaProperty> schemaProperties = computePropertiesForEntity(Collections.emptyList(), entity);
schemaBuilder.properties(schemaProperties.toArray(new JsonSchemaProperty[0]));
return schemaBuilder.build();
}
private List<JsonSchemaProperty> computePropertiesForEntity(List<MongoPersistentProperty> path,
@@ -93,6 +155,14 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
List<MongoPersistentProperty> currentPath = new ArrayList<>(path);
String stringPath = currentPath.stream().map(PersistentProperty::getName).collect(Collectors.joining("."));
stringPath = StringUtils.hasText(stringPath) ? (stringPath + "." + nested.getName()) : nested.getName();
if (!filter.test(new PropertyContext(stringPath, nested))) {
if (!mergeProperties.containsKey(stringPath)) {
continue;
}
}
if (path.contains(nested)) { // cycle guard
schemaProperties.add(createSchemaProperty(computePropertyFieldName(CollectionUtils.lastElement(currentPath)),
Object.class, false));
@@ -108,27 +178,114 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private JsonSchemaProperty computeSchemaForProperty(List<MongoPersistentProperty> path) {
String stringPath = path.stream().map(MongoPersistentProperty::getName).collect(Collectors.joining("."));
MongoPersistentProperty property = CollectionUtils.lastElement(path);
boolean required = isRequiredProperty(property);
Class<?> rawTargetType = computeTargetType(property); // target type before conversion
Class<?> targetType = converter.getTypeMapper().getWriteTargetTypeFor(rawTargetType); // conversion target type
if (property.isEntity() && ObjectUtils.nullSafeEquals(rawTargetType, targetType)) {
return createObjectSchemaPropertyForEntity(path, property, required);
if (!isCollection(property) && ObjectUtils.nullSafeEquals(rawTargetType, targetType)) {
if (property.isEntity() || mergeProperties.containsKey(stringPath)) {
List<JsonSchemaProperty> targetProperties = new ArrayList<>();
if (property.isEntity()) {
targetProperties.add(createObjectSchemaPropertyForEntity(path, property, required));
}
if (mergeProperties.containsKey(stringPath)) {
for (Class<?> theType : mergeProperties.get(stringPath)) {
ObjectJsonSchemaProperty target = JsonSchemaProperty.object(property.getName());
List<JsonSchemaProperty> nestedProperties = computePropertiesForEntity(path,
mappingContext.getRequiredPersistentEntity(theType));
targetProperties.add(createPotentiallyRequiredSchemaProperty(
target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required));
}
}
return targetProperties.size() == 1 ? targetProperties.iterator().next()
: JsonSchemaProperty.merged(targetProperties);
}
}
String fieldName = computePropertyFieldName(property);
if (property.isCollectionLike()) {
return createSchemaProperty(fieldName, targetType, required);
JsonSchemaProperty schemaProperty;
if (isCollection(property)) {
schemaProperty = createArraySchemaProperty(fieldName, property, required);
} else if (property.isMap()) {
return createSchemaProperty(fieldName, Type.objectType(), required);
schemaProperty = createSchemaProperty(fieldName, Type.objectType(), required);
} else if (ClassUtils.isAssignable(Enum.class, targetType)) {
return createEnumSchemaProperty(fieldName, targetType, required);
schemaProperty = createEnumSchemaProperty(fieldName, targetType, required);
} else {
schemaProperty = createSchemaProperty(fieldName, targetType, required);
}
return createSchemaProperty(fieldName, targetType, required);
return applyEncryptionDataIfNecessary(property, schemaProperty);
}
private JsonSchemaProperty createArraySchemaProperty(String fieldName, MongoPersistentProperty property,
boolean required) {
ArrayJsonSchemaProperty schemaProperty = JsonSchemaProperty.array(fieldName);
if (isSpecificType(property)) {
schemaProperty = potentiallyEnhanceArraySchemaProperty(property, schemaProperty);
}
return createPotentiallyRequiredSchemaProperty(schemaProperty, required);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
private ArrayJsonSchemaProperty potentiallyEnhanceArraySchemaProperty(MongoPersistentProperty property,
ArrayJsonSchemaProperty schemaProperty) {
MongoPersistentEntity<?> persistentEntity = mappingContext
.getPersistentEntity(property.getTypeInformation().getRequiredComponentType());
if (persistentEntity != null) {
List<JsonSchemaProperty> nestedProperties = computePropertiesForEntity(Collections.emptyList(), persistentEntity);
if (nestedProperties.isEmpty()) {
return schemaProperty;
}
return schemaProperty
.items(JsonSchemaObject.object().properties(nestedProperties.toArray(new JsonSchemaProperty[0])));
}
if (ClassUtils.isAssignable(Enum.class, property.getActualType())) {
List<Object> possibleValues = getPossibleEnumValues((Class<Enum>) property.getActualType());
return schemaProperty
.items(createSchemaObject(computeTargetType(property.getActualType(), possibleValues), possibleValues));
}
return schemaProperty.items(JsonSchemaObject.of(property.getActualType()));
}
private boolean isSpecificType(MongoPersistentProperty property) {
return !ClassTypeInformation.OBJECT.equals(property.getTypeInformation().getActualType());
}
private JsonSchemaProperty applyEncryptionDataIfNecessary(MongoPersistentProperty property,
JsonSchemaProperty schemaProperty) {
Encrypted encrypted = property.findAnnotation(Encrypted.class);
if (encrypted == null) {
return schemaProperty;
}
EncryptedJsonSchemaProperty enc = new EncryptedJsonSchemaProperty(schemaProperty);
if (StringUtils.hasText(encrypted.algorithm())) {
enc = enc.algorithm(encrypted.algorithm());
}
if (!ObjectUtils.isEmpty(encrypted.keyId())) {
enc = enc.keys(property.getEncryptionKeyIds());
}
return enc;
}
private JsonSchemaProperty createObjectSchemaPropertyForEntity(List<MongoPersistentProperty> path,
@@ -142,15 +299,12 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
private JsonSchemaProperty createEnumSchemaProperty(String fieldName, Class<?> targetType, boolean required) {
List<Object> possibleValues = new ArrayList<>();
List<Object> possibleValues = getPossibleEnumValues((Class<Enum>) targetType);
for (Object enumValue : EnumSet.allOf((Class) targetType)) {
possibleValues.add(converter.convertToMongoType(enumValue));
}
targetType = possibleValues.isEmpty() ? targetType : possibleValues.iterator().next().getClass();
targetType = computeTargetType(targetType, possibleValues);
return createSchemaProperty(fieldName, targetType, required, possibleValues);
}
@@ -161,14 +315,20 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
JsonSchemaProperty createSchemaProperty(String fieldName, Object type, boolean required,
Collection<?> possibleValues) {
TypedJsonSchemaObject schemaObject = createSchemaObject(type, possibleValues);
return createPotentiallyRequiredSchemaProperty(JsonSchemaProperty.named(fieldName).with(schemaObject), required);
}
private TypedJsonSchemaObject createSchemaObject(Object type, Collection<?> possibleValues) {
TypedJsonSchemaObject schemaObject = type instanceof Type ? JsonSchemaObject.of(Type.class.cast(type))
: JsonSchemaObject.of(Class.class.cast(type));
if (!CollectionUtils.isEmpty(possibleValues)) {
schemaObject = schemaObject.possibleValues(possibleValues);
}
return createPotentiallyRequiredSchemaProperty(JsonSchemaProperty.named(fieldName).with(schemaObject), required);
return schemaObject;
}
private String computePropertyFieldName(PersistentProperty property) {
@@ -199,12 +359,53 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
return mongoProperty.getFieldType() != mongoProperty.getActualType() ? Object.class : mongoProperty.getFieldType();
}
static JsonSchemaProperty createPotentiallyRequiredSchemaProperty(JsonSchemaProperty property, boolean required) {
private static Class<?> computeTargetType(Class<?> fallback, List<Object> possibleValues) {
return possibleValues.isEmpty() ? fallback : possibleValues.iterator().next().getClass();
}
if (!required) {
private <E extends Enum<E>> List<Object> getPossibleEnumValues(Class<E> targetType) {
EnumSet<E> enumSet = EnumSet.allOf(targetType);
List<Object> possibleValues = new ArrayList<>(enumSet.size());
for (Object enumValue : enumSet) {
possibleValues.add(converter.convertToMongoType(enumValue));
}
return possibleValues;
}
private static boolean isCollection(MongoPersistentProperty property) {
return property.isCollectionLike() && !property.getType().equals(byte[].class);
}
static JsonSchemaProperty createPotentiallyRequiredSchemaProperty(JsonSchemaProperty property, boolean required) {
return required ? JsonSchemaProperty.required(property) : property;
}
class PropertyContext implements JsonSchemaPropertyContext {
private final String path;
private final MongoPersistentProperty property;
public PropertyContext(String path, MongoPersistentProperty property) {
this.path = path;
this.property = property;
}
@Override
public String getPath() {
return path;
}
@Override
public MongoPersistentProperty getProperty() {
return property;
}
return JsonSchemaProperty.required(property);
@Override
public <T> MongoPersistentEntity<T> resolveEntity(MongoPersistentProperty property) {
return (MongoPersistentEntity<T>) mappingContext.getPersistentEntity(property);
}
}
}

View File

@@ -46,25 +46,16 @@ public class MongoAdmin implements MongoAdminOperations {
this.mongoClient = client;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).drop();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale", 1024)).toJson();

View File

@@ -119,27 +119,15 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends MongoClient> getObjectType() {
return MongoClient.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClient createInstance() throws Exception {
return createMongoClient(computeClientSetting());
@@ -336,10 +324,6 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
return !fromConnectionStringIsDefault ? fromConnectionString : defaultValue;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(@Nullable MongoClient instance) throws Exception {
@@ -353,6 +337,11 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
}
private String getOrDefault(Object value, String defaultValue) {
return !StringUtils.isEmpty(value) ? value.toString() : defaultValue;
if(value == null) {
return defaultValue;
}
String sValue = value.toString();
return StringUtils.hasText(sValue) ? sValue : defaultValue;
}
}

View File

@@ -36,6 +36,7 @@ import com.mongodb.MongoClientSettings.Builder;
import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
import com.mongodb.ServerAddress;
import com.mongodb.ServerApi;
import com.mongodb.WriteConcern;
import com.mongodb.connection.ClusterConnectionMode;
import com.mongodb.connection.ClusterType;
@@ -113,6 +114,7 @@ public class MongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoCli
// encryption and retry
private @Nullable AutoEncryptionSettings autoEncryptionSettings;
private @Nullable ServerApi serverApi;
/**
* @param socketConnectTimeoutMS in msec
@@ -395,6 +397,15 @@ public class MongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoCli
this.autoEncryptionSettings = autoEncryptionSettings;
}
/**
* @param serverApi can be {@literal null}.
* @see MongoClientSettings.Builder#serverApi(ServerApi)
* @since 3.3
*/
public void setServerApi(@Nullable ServerApi serverApi) {
this.serverApi = serverApi;
}
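As a hedged illustration (not part of this change set), the new setter can be used to declare the MongoDB Versioned API on the settings produced by the factory bean:

import com.mongodb.ServerApi;
import com.mongodb.ServerApiVersion;
import org.springframework.data.mongodb.core.MongoClientSettingsFactoryBean;

class ServerApiConfigSketch {

	MongoClientSettingsFactoryBean serverApiAwareSettings() {
		MongoClientSettingsFactoryBean factoryBean = new MongoClientSettingsFactoryBean();
		// Declare the Versioned API; the value is passed on to
		// MongoClientSettings.Builder#serverApi(ServerApi) when the settings are built.
		factoryBean.setServerApi(ServerApi.builder().version(ServerApiVersion.V1).build());
		return factoryBean;
	}
}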
@Override
public Class<?> getObjectType() {
return MongoClientSettings.class;
@@ -476,9 +487,11 @@ public class MongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoCli
if (retryWrites != null) {
builder = builder.retryWrites(retryWrites);
}
if (uUidRepresentation != null) {
builder.uuidRepresentation(uUidRepresentation);
builder = builder.uuidRepresentation(uUidRepresentation);
}
if (serverApi != null) {
builder = builder.serverApi(serverApi);
}
return builder.build();

View File

@@ -84,18 +84,10 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
public MongoDatabase getMongoDatabase() throws DataAccessException {
return getMongoDatabase(getDefaultDatabaseName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
@@ -118,28 +110,16 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
*/
protected abstract MongoDatabase doGetMongoDatabase(String dbName);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
closeClient();
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.Session)
*/
public MongoDatabaseFactory withSession(ClientSession session) {
return new MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session, this);
}
@@ -180,55 +160,31 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
@Override
public MongoDatabase getMongoDatabase() throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase(dbName));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoDatabaseFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#isTransactionActive()
*/
@Override
public boolean isTransactionActive() {
return session != null && session.hasActiveTransaction();

View File

@@ -1,50 +0,0 @@
/*
* Copyright 2018-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.support.PersistenceExceptionTranslator;
/**
* Common base class for usage with {@link com.mongodb.client.MongoClients}, defining common properties such as
* database name and exception translator.
* <br />
* Not intended to be used directly.
*
* @author Christoph Strobl
* @author Mark Paluch
* @param <C> Client type.
* @since 2.1
* @see SimpleMongoClientDatabaseFactory
* @deprecated since 3.0, use {@link MongoDatabaseFactorySupport} instead.
*/
@Deprecated
public abstract class MongoDbFactorySupport<C> extends MongoDatabaseFactorySupport<C> {
/**
* Create a new {@link MongoDbFactorySupport} object given {@code mongoClient}, {@code databaseName},
* {@code mongoInstanceCreated} and {@link PersistenceExceptionTranslator}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated {@literal true} if the client instance was created by a subclass of
* {@link MongoDbFactorySupport} to close the client on {@link #destroy()}.
* @param exceptionTranslator must not be {@literal null}.
*/
protected MongoDbFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) {
super(mongoClient, databaseName, mongoInstanceCreated, exceptionTranslator);
}
}

View File

@@ -88,10 +88,6 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
this.schemaMap = schemaMap;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public AutoEncryptionSettings getObject() {
@@ -109,10 +105,6 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
return source != null ? source : Collections.emptyMap();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return AutoEncryptionSettings.class;

View File

@@ -68,10 +68,6 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {

View File

@@ -15,7 +15,24 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.function.Predicate;
import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.NoOpDbRefResolver;
import org.springframework.data.mongodb.core.mapping.Encrypted;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.Unwrapped.Nullable;
import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.util.Assert;
@@ -46,6 +63,7 @@ import org.springframework.util.Assert;
* {@link org.springframework.data.annotation.Id _id} properties using types that can be converted into
* {@link org.bson.types.ObjectId} like {@link String} will be mapped to {@code type : 'object'} unless there is more
* specific information available via the {@link org.springframework.data.mongodb.core.mapping.MongoId} annotation.
* {@link Encrypted} properties will contain {@literal encrypt} information.
*
* @author Christoph Strobl
* @since 2.2
@@ -60,6 +78,111 @@ public interface MongoJsonSchemaCreator {
*/
MongoJsonSchema createSchemaFor(Class<?> type);
/**
* Create a merged {@link MongoJsonSchema} out of the individual schemas of the given types by merging their
* properties into one large {@link MongoJsonSchema schema}.
*
* @param types must not be {@literal null} nor contain {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema mergedSchemaFor(Class<?>... types) {
MongoJsonSchema[] schemas = Arrays.stream(types).map(this::createSchemaFor).toArray(MongoJsonSchema[]::new);
return MongoJsonSchema.merge(schemas);
}
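A minimal sketch of merging the individual schemas of two types into one; the Person and Employee types are hypothetical placeholders with non-conflicting properties:

import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class MergedSchemaSketch {

	// Hypothetical types; their non-conflicting properties end up in one schema.
	static class Person { String name; }
	static class Employee { String name; String company; }

	MongoJsonSchema merged() {
		return MongoJsonSchemaCreator.create().mergedSchemaFor(Person.class, Employee.class);
	}
}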
/**
* Filter matching {@link JsonSchemaProperty properties}.
*
* @param filter the {@link Predicate} to evaluate for inclusion. Must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.3
*/
MongoJsonSchemaCreator filter(Predicate<JsonSchemaPropertyContext> filter);
/**
* Entry point to specify additional behavior for a given path.
*
* @param path the path using {@literal dot '.'} notation.
* @return new instance of {@link PropertySpecifier}.
* @since 3.4
*/
PropertySpecifier property(String path);
/**
* The context in which a specific {@link #getProperty()} is encountered during schema creation.
*
* @since 3.3
*/
interface JsonSchemaPropertyContext {
/**
* The path to a given field/property in dot notation.
*
* @return never {@literal null}.
*/
String getPath();
/**
* The current property.
*
* @return never {@literal null}.
*/
MongoPersistentProperty getProperty();
/**
* Obtain the {@link MongoPersistentEntity} for a given property.
*
* @param property must not be {@literal null}.
* @param <T>
* @return {@literal null} if the property is not an entity. It is nevertheless recommended to check
* {@link PersistentProperty#isEntity()} first.
*/
@Nullable
<T> MongoPersistentEntity<T> resolveEntity(MongoPersistentProperty property);
}
/**
* A filter {@link Predicate} that matches {@link Encrypted encrypted properties} and those having nested ones.
*
* @return new instance of {@link Predicate}.
* @since 3.3
*/
static Predicate<JsonSchemaPropertyContext> encryptedOnly() {
return new Predicate<JsonSchemaPropertyContext>() {
// cycle guard
private final Set<MongoPersistentProperty> seen = new HashSet<>();
@Override
public boolean test(JsonSchemaPropertyContext context) {
return extracted(context.getProperty(), context);
}
private boolean extracted(MongoPersistentProperty property, JsonSchemaPropertyContext context) {
if (property.isAnnotationPresent(Encrypted.class)) {
return true;
}
if (!property.isEntity() || seen.contains(property)) {
return false;
}
seen.add(property);
for (MongoPersistentProperty nested : context.resolveEntity(property)) {
if (extracted(nested, context)) {
return true;
}
}
return false;
}
};
}
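A hedged usage sketch combining filter(...) with encryptedOnly(), e.g. when deriving a client-side field level encryption schema; the Patient type and algorithm value are assumptions for illustration only:

import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.Encrypted;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class EncryptedOnlySchemaSketch {

	// Hypothetical entity with a single encrypted property.
	static class Patient {
		String name;
		@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String ssn;
	}

	MongoJsonSchema encryptionSchema(MongoConverter converter) {
		// Only properties matched by the predicate (or paths leading to them) are rendered.
		return MongoJsonSchemaCreator.create(converter)
				.filter(MongoJsonSchemaCreator.encryptedOnly())
				.createSchemaFor(Patient.class);
	}
}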
/**
* Creates a new {@link MongoJsonSchemaCreator} that is aware of conversions applied by the given
* {@link MongoConverter}.
@@ -72,4 +195,56 @@ public interface MongoJsonSchemaCreator {
Assert.notNull(mongoConverter, "MongoConverter must not be null!");
return new MappingMongoJsonSchemaCreator(mongoConverter);
}
/**
* Creates a new {@link MongoJsonSchemaCreator} that is aware of type mappings and potential
* {@link org.springframework.data.spel.spi.EvaluationContextExtension extensions}.
*
* @param mappingContext must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.3
*/
static MongoJsonSchemaCreator create(MappingContext mappingContext) {
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(config -> {}));
converter.afterPropertiesSet();
return create(converter);
}
/**
* Creates a new {@link MongoJsonSchemaCreator} that does not consider potential extensions - suitable for testing. We
* recommend using {@link #create(MappingContext)}.
*
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.3
*/
static MongoJsonSchemaCreator create() {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
mappingContext.afterPropertiesSet();
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(config -> {}));
converter.afterPropertiesSet();
return create(converter);
}
/**
* @author Christoph Strobl
* @since 3.4
*/
interface PropertySpecifier {
/**
* Set additional type parameters for polymorphic ones.
*
* @param types must not be {@literal null}.
* @return the source
*/
MongoJsonSchemaCreator withTypes(Class<?>... types);
}
}

View File

@@ -20,6 +20,7 @@ import java.util.List;
import java.util.Set;
import java.util.function.Consumer;
import java.util.function.Supplier;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.data.geo.GeoResults;
@@ -32,8 +33,6 @@ import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.BasicQuery;
@@ -42,7 +41,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.CloseableIterator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -57,8 +55,7 @@ import com.mongodb.client.result.UpdateResult;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
* proxy).
* <br />
* proxy). <br />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
@@ -84,7 +81,7 @@ public interface MongoOperations extends FluentMongoOperations {
String getCollectionName(Class<?> entityClass);
/**
* Execute the a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* Execute a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* obtain the {@link Document} holding the actual command. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
@@ -124,8 +121,7 @@ public interface MongoOperations extends FluentMongoOperations {
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch);
/**
* Executes a {@link DbCallback} translating any exceptions as necessary.
* <br />
* Executes a {@link DbCallback} translating any exceptions as necessary. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. Must not
@@ -137,8 +133,7 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T execute(DbCallback<T> action);
/**
* Executes the given {@link CollectionCallback} on the entity collection of the specified class.
* <br />
* Executes the given {@link CollectionCallback} on the entity collection of the specified class. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
@@ -150,8 +145,7 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T execute(Class<?> entityClass, CollectionCallback<T> action);
/**
* Executes the given {@link CollectionCallback} on the collection of the given name.
* <br />
* Executes the given {@link CollectionCallback} on the collection of the given name. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param collectionName the name of the collection that specifies which {@link MongoCollection} instance will be
@@ -175,8 +169,7 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* <br />
* provided by the given {@link Supplier} to each and every command issued against MongoDB. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use the
* {@link SessionScoped#execute(SessionCallback, Consumer)} hook to potentially close the {@link ClientSession}.
*
@@ -211,8 +204,7 @@ public interface MongoOperations extends FluentMongoOperations {
}
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoOperations}.
* <br />
* Obtain a {@link ClientSession} bound instance of {@link MongoOperations}. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle.
*
* @param session must not be {@literal null}.
@@ -225,34 +217,34 @@ public interface MongoOperations extends FluentMongoOperations {
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link com.mongodb.client.FindIterable}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to
* be closed.
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param <T> element return type
* @return will never be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
<T> Stream<T> stream(Query query, Class<T> entityType);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link com.mongodb.client.FindIterable}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to
* be closed.
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @param <T> element return type
* @return will never be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 1.10
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
<T> Stream<T> stream(Query query, Class<T> entityType, String collectionName);
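A hedged consumption sketch for the Stream-returning variants; the Person type is hypothetical, and the cursor-backed Stream is closed via try-with-resources once fully processed:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.stream.Stream;

import org.springframework.data.mongodb.core.MongoOperations;

class StreamQuerySketch {

	// Hypothetical domain type.
	static class Person { String lastname; }

	void streamPeople(MongoOperations template) {
		// The Stream wraps a FindIterable/cursor; closing it releases the server-side resources.
		try (Stream<Person> people = template.stream(query(where("lastname").is("Matthews")), Person.class)) {
			people.forEach(person -> System.out.println(person.lastname));
		}
	}
}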
/**
* Create an uncapped collection with a name based on the provided entity class.
@@ -299,8 +291,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Get a {@link MongoCollection} by its name. The returned collection may not exists yet (except in local memory) and
* is created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first.
* <br />
* exists} first. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -309,8 +300,7 @@ public interface MongoOperations extends FluentMongoOperations {
MongoCollection<Document> getCollection(String collectionName);
/**
* Check to see if a collection with a name indicated by the entity class exists.
* <br />
* Check to see if a collection with a name indicated by the entity class exists. <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the name of the collection. Must not be {@literal null}.
@@ -319,8 +309,7 @@ public interface MongoOperations extends FluentMongoOperations {
<T> boolean collectionExists(Class<T> entityClass);
/**
* Check to see if a collection with a given name exists.
* <br />
* Check to see if a collection with a given name exists. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -329,8 +318,7 @@ public interface MongoOperations extends FluentMongoOperations {
boolean collectionExists(String collectionName);
/**
* Drop the collection with the name indicated by the entity class.
* <br />
* Drop the collection with the name indicated by the entity class. <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the collection to drop/delete. Must not be {@literal null}.
@@ -338,8 +326,7 @@ public interface MongoOperations extends FluentMongoOperations {
<T> void dropCollection(Class<T> entityClass);
/**
* Drop the collection with the given name.
* <br />
* Drop the collection with the given name. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
@@ -402,11 +389,9 @@ public interface MongoOperations extends FluentMongoOperations {
BulkOperations bulkOps(BulkMode mode, @Nullable Class<?> entityType, String collectionName);
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <br />
* Query for a list of objects of type T from the collection used by the entity class. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -416,11 +401,9 @@ public interface MongoOperations extends FluentMongoOperations {
<T> List<T> findAll(Class<T> entityClass);
/**
* Query for a list of objects of type T from the specified collection.
* <br />
* Query for a list of objects of type T from the specified collection. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -430,43 +413,6 @@ public interface MongoOperations extends FluentMongoOperations {
*/
<T> List<T> findAll(Class<T> entityClass, String collectionName);
/**
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes into account the initial document structure as well as any finalize functions.
*
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parametrized type of the returned list
* @return The results of the group operation
* @deprecated since 2.2. The {@code group} command has been removed in MongoDB Server 4.2.0. <br />
* Please use {@link #aggregate(TypedAggregation, String, Class) } with a
* {@link org.springframework.data.mongodb.core.aggregation.GroupOperation} instead.
*/
@Deprecated
<T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes int account the initial document structure
* as well as any finalize functions.
*
* @param criteria The criteria that restricts the row that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parametrized type of the returned list
* @return The results of the group operation
* @deprecated since 2.2. The {@code group} command has been removed in MongoDB Server 4.2.0. <br />
* Please use {@link #aggregate(TypedAggregation, String, Class) } with a
* {@link org.springframework.data.mongodb.core.aggregation.GroupOperation} and
* {@link org.springframework.data.mongodb.core.aggregation.MatchOperation} instead.
*/
@Deprecated
<T> GroupByResults<T> group(@Nullable Criteria criteria, String inputCollectionName, GroupBy groupBy,
Class<T> entityClass);
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
@@ -521,9 +467,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class. The name of the inputCollection is
* derived from the inputType of the aggregation.
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class. The name of the inputCollection is derived from
* the inputType of the aggregation.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
@@ -532,35 +478,37 @@ public interface MongoOperations extends FluentMongoOperations {
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
<O> Stream<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class and are returned as stream. The name
* of the inputCollection is derived from the inputType of the aggregation.
* <br />
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class and are returned as stream. The name of the
* inputCollection is derived from the inputType of the aggregation.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
<O> Stream<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class.
* <br />
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
@@ -569,17 +517,18 @@ public interface MongoOperations extends FluentMongoOperations {
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
<O> Stream<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <br />
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class.
* <br />
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
@@ -588,10 +537,11 @@ public interface MongoOperations extends FluentMongoOperations {
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
<O> Stream<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
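Similarly, a hedged sketch for streaming aggregation results; the "orders" collection and "status" criteria are assumptions for illustration:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import java.util.stream.Stream;

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

class AggregateStreamSketch {

	void streamOpenOrders(MongoOperations template) {
		Aggregation aggregation = newAggregation(match(where("status").is("OPEN")));
		// The Stream wraps an AggregateIterable; close it once fully processed.
		try (Stream<Document> results = template.aggregateStream(aggregation, "orders", Document.class)) {
			results.forEach(document -> System.out.println(document.toJson()));
		}
	}
}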
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
@@ -601,7 +551,9 @@ public interface MongoOperations extends FluentMongoOperations {
* @param reduceFunction The JavaScript reduce function
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
@@ -614,7 +566,9 @@ public interface MongoOperations extends FluentMongoOperations {
* @param mapReduceOptions Options that specify detailed map-reduce behavior.
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
@Nullable MapReduceOptions mapReduceOptions, Class<T> entityClass);
@@ -628,7 +582,9 @@ public interface MongoOperations extends FluentMongoOperations {
* @param reduceFunction The JavaScript reduce function
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
@@ -642,7 +598,9 @@ public interface MongoOperations extends FluentMongoOperations {
* @param mapReduceOptions Options that specify detailed map-reduce behavior
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
@Nullable MapReduceOptions mapReduceOptions, Class<T> entityClass);
@@ -701,11 +659,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type.
* <br />
* specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -719,11 +675,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type.
* <br />
* type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -767,11 +721,9 @@ public interface MongoOperations extends FluentMongoOperations {
boolean exists(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <br />
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -783,11 +735,9 @@ public interface MongoOperations extends FluentMongoOperations {
<T> List<T> find(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <br />
* Map the results of an ad-hoc query on the specified collection to a List of the specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1119,10 +1069,8 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
* database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* database. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1139,8 +1087,7 @@ public interface MongoOperations extends FluentMongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1159,18 +1106,17 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <br />
* This method uses an
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(Class)} for empty queries instead.
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @see #exactCount(Query, Class)
* @see #estimatedCount(Class)
*/
long count(Query query, Class<?> entityClass);
@@ -1181,25 +1127,44 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <br />
* This method uses an
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @see #exactCount(Query, String)
* @see #estimatedCount(String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @see #estimatedCount(String)
*/
long count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <br />
* based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
@@ -1214,8 +1179,7 @@ public interface MongoOperations extends FluentMongoOperations {
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <br />
* Estimate the number of documents in the given collection based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
@@ -1225,14 +1189,60 @@ public interface MongoOperations extends FluentMongoOperations {
*/
long estimatedCount(String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
* <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @since 3.4
*/
default long exactCount(Query query, Class<?> entityClass) {
return exactCount(query, entityClass, getCollectionName(entityClass));
}
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @since 3.4
*/
default long exactCount(Query query, String collectionName) {
return exactCount(query, null, collectionName);
}
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <br />
* count all matches. <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -1244,20 +1254,18 @@ public interface MongoOperations extends FluentMongoOperations {
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @since 3.4
*/
long count(Query query, @Nullable Class<?> entityClass, String collectionName);
long exactCount(Query query, @Nullable Class<?> entityClass, String collectionName);
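To make the counting trade-offs concrete, a hedged sketch (Person and mongoTemplate are assumptions) contrasting the three flavours:
// count(...) may, when configured to do so, answer an empty query from collection statistics.
long roughTotal = mongoTemplate.count(new Query(), Person.class);
// exactCount(...) always issues countDocuments(...), even for an empty query.
long exactTotal = mongoTemplate.exactCount(new Query(), Person.class);
// estimatedCount(...) reads collection statistics only and cannot apply a filter.
long estimate = mongoTemplate.estimatedCount(Person.class);
// Any non-empty filter is always executed via countDocuments(...).
long adults = mongoTemplate.count(query(where("age").gte(18)), Person.class);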
/**
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details.
* <br />
* Type Conversion"</a> for more details. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1269,11 +1277,9 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T insert(T objectToSave);
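Illustration only (Person is a hypothetical mapped entity with a String id): insert rejects an already stored document, while save performs the upsert described below:
Person person = new Person("Walter", "White");
person = mongoTemplate.insert(person);   // the generated ObjectId is written back to the id property
person.setLastname("Heisenberg");
person = mongoTemplate.save(person);     // upsert: updates the existing document
// mongoTemplate.insert(person);         // would now fail - a document with this id already exists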
/**
* Insert the object into the specified collection.
* <br />
* Insert the object into the specified collection. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1314,17 +1320,14 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <br />
* object is not already present, that is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details.
* <br />
* Type Conversion"</a> for more details. <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1335,16 +1338,14 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <br />
* is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
* <br />
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details. <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.

View File

@@ -0,0 +1,92 @@
/*
* Copyright 2021-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import com.mongodb.ServerApi;
import com.mongodb.ServerApi.Builder;
import com.mongodb.ServerApiVersion;
/**
* {@link FactoryBean} for creating {@link ServerApi} using the {@link ServerApi.Builder}.
*
* @author Christoph Strobl
* @since 3.3
*/
public class MongoServerApiFactoryBean implements FactoryBean<ServerApi> {
private String version;
private @Nullable Boolean deprecationErrors;
private @Nullable Boolean strict;
/**
* @param version the version string either as the enum name or the server version value.
* @see ServerApiVersion
*/
public void setVersion(String version) {
this.version = version;
}
/**
* @param deprecationErrors
* @see ServerApi.Builder#deprecationErrors(boolean)
*/
public void setDeprecationErrors(@Nullable Boolean deprecationErrors) {
this.deprecationErrors = deprecationErrors;
}
/**
* @param strict
* @see ServerApi.Builder#strict(boolean)
*/
public void setStrict(@Nullable Boolean strict) {
this.strict = strict;
}
@Nullable
@Override
public ServerApi getObject() throws Exception {
Builder builder = ServerApi.builder().version(version());
if (deprecationErrors != null) {
builder = builder.deprecationErrors(deprecationErrors);
}
if (strict != null) {
builder = builder.strict(strict);
}
return builder.build();
}
@Nullable
@Override
public Class<?> getObjectType() {
return ServerApi.class;
}
private ServerApiVersion version() {
try {
// lookup by name eg. 'V1'
return ObjectUtils.caseInsensitiveValueOf(ServerApiVersion.values(), version);
} catch (IllegalArgumentException e) {
// or just the version number, eg. just '1'
return ServerApiVersion.findByValue(version);
}
}
}
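A minimal usage sketch; feeding the result into MongoClientSettings is an assumption about typical client configuration and not part of this class:
MongoServerApiFactoryBean factory = new MongoServerApiFactoryBean();
factory.setVersion("V1");            // enum name; a plain "1" is resolved via ServerApiVersion.findByValue
factory.setStrict(true);
factory.setDeprecationErrors(false);
ServerApi serverApi = factory.getObject();              // FactoryBean contract declares a checked Exception
MongoClientSettings settings = MongoClientSettings.builder()
        .serverApi(serverApi)
        .build();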

View File

@@ -16,18 +16,19 @@
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.data.mapping.SimplePropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.ProjectionInformation;
import org.springframework.util.ClassUtils;
import org.springframework.data.mongodb.core.mapping.PersistentPropertyTranslator;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.util.Predicates;
/**
* Common operations performed on properties of an entity like extracting fields information for projection creation.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
class PropertyOperations {
@@ -40,37 +41,37 @@ class PropertyOperations {
/**
* For cases where {@code fields} is {@link Document#isEmpty() empty} include only fields that are required for
* creating the projection (target) type if the {@code targetType} is a {@literal DTO projection} or a
* creating the projection (target) type if the {@code EntityProjection} is a {@literal DTO projection} or a
* {@literal closed interface projection}.
*
* @param projectionFactory must not be {@literal null}.
* @param projection must not be {@literal null}.
* @param fields must not be {@literal null}.
* @param domainType must not be {@literal null}.
* @param targetType must not be {@literal null}.
* @return {@link Document} with fields to be included.
*/
Document computeFieldsForProjection(ProjectionFactory projectionFactory, Document fields, Class<?> domainType,
Class<?> targetType) {
Document computeMappedFieldsForProjection(EntityProjection<?, ?> projection,
Document fields) {
if (!fields.isEmpty() || ClassUtils.isAssignable(domainType, targetType)) {
if (!projection.isClosedProjection()) {
return fields;
}
Document projectedFields = new Document();
if (targetType.isInterface()) {
ProjectionInformation projectionInformation = projectionFactory.getProjectionInformation(targetType);
if (projectionInformation.isClosed()) {
projectionInformation.getInputProperties().forEach(it -> projectedFields.append(it.getName(), 1));
}
if (projection.getMappedType().getType().isInterface()) {
projection.forEach(it -> {
projectedFields.put(it.getPropertyPath().getSegment(), 1);
});
} else {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(targetType);
if (entity != null) {
entity.doWithProperties(
(SimplePropertyHandler) persistentProperty -> projectedFields.append(persistentProperty.getName(), 1));
// DTO projections use merged metadata between domain type and result type
PersistentPropertyTranslator translator = PersistentPropertyTranslator.create(
mappingContext.getRequiredPersistentEntity(projection.getDomainType()),
Predicates.negate(MongoPersistentProperty::hasExplicitFieldName));
MongoPersistentEntity<?> persistentEntity = mappingContext
.getRequiredPersistentEntity(projection.getMappedType());
for (MongoPersistentProperty property : persistentEntity) {
projectedFields.put(translator.translate(property).getFieldName(), 1);
}
}
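To illustrate the effect (type names are hypothetical): a closed interface projection only requests its declared accessors, whereas a DTO projection uses the translated field names of the target type:
class Person {
    @Id String id;
    String firstname, lastname, address;
}
interface PersonSummary {          // closed projection: only declared getters are fetched
    String getFirstname();
    String getLastname();
}
// mongoTemplate.query(Person.class).as(PersonSummary.class).all()
// issues the find with projection { "firstname" : 1, "lastname" : 1 }; an open projection
// (e.g. one backed by a SpEL @Value expression) keeps the full document instead.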

View File

@@ -28,6 +28,7 @@ import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
@@ -54,11 +55,10 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.CountOptions;
@@ -288,45 +288,59 @@ class QueryOperations {
return queryMapper.getMappedObject(getQueryObject(), entity);
}
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity, Class<?> targetType,
ProjectionFactory projectionFactory) {
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity,
EntityProjection<?, ?> projection) {
Document fields = new Document();
Document fields = evaluateFields(entity);
for (Entry<String, Object> entry : query.getFieldsObject().entrySet()) {
if (entity == null) {
return fields;
}
Document mappedFields;
if (!fields.isEmpty()) {
mappedFields = queryMapper.getMappedFields(fields, entity);
} else {
mappedFields = propertyOperations.computeMappedFieldsForProjection(projection, fields);
mappedFields = queryMapper.addMetaAttributes(mappedFields, entity);
}
if (entity.hasTextScoreProperty() && mappedFields.containsKey(entity.getTextScoreProperty().getFieldName())
&& !query.getQueryObject().containsKey("$text")) {
mappedFields.remove(entity.getTextScoreProperty().getFieldName());
}
if (mappedFields.isEmpty()) {
return BsonUtils.EMPTY_DOCUMENT;
}
return mappedFields;
}
private Document evaluateFields(@Nullable MongoPersistentEntity<?> entity) {
Document fields = query.getFieldsObject();
if (fields.isEmpty()) {
return BsonUtils.EMPTY_DOCUMENT;
}
Document evaluated = new Document();
for (Entry<String, Object> entry : fields.entrySet()) {
if (entry.getValue() instanceof MongoExpression) {
AggregationOperationContext ctx = entity == null ? Aggregation.DEFAULT_CONTEXT
: new RelaxedTypeBasedAggregationOperationContext(entity.getType(), mappingContext, queryMapper);
fields.put(entry.getKey(), AggregationExpression.from((MongoExpression) entry.getValue()).toDocument(ctx));
evaluated.put(entry.getKey(), AggregationExpression.from((MongoExpression) entry.getValue()).toDocument(ctx));
} else {
fields.put(entry.getKey(), entry.getValue());
evaluated.put(entry.getKey(), entry.getValue());
}
}
Document mappedFields = fields;
if (entity == null) {
return mappedFields;
}
Document projectedFields = propertyOperations.computeFieldsForProjection(projectionFactory, fields,
entity.getType(), targetType);
if (ObjectUtils.nullSafeEquals(fields, projectedFields)) {
mappedFields = queryMapper.getMappedFields(projectedFields, entity);
} else {
mappedFields = queryMapper.getMappedFields(projectedFields,
mappingContext.getRequiredPersistentEntity(targetType));
}
if (entity.hasTextScoreProperty() && !query.getQueryObject().containsKey("$text")) {
mappedFields.remove(entity.getTextScoreProperty().getFieldName());
}
return mappedFields;
return evaluated;
}
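The MongoExpression branch above handles field projections given as aggregation expressions; a sketch of how such a projection may be declared from user code (assumes the Field.project(...) API introduced in 3.2 and a hypothetical Person entity):
Query query = query(where("lastname").is("White"));
query.fields()
        .include("lastname")
        .project(MongoExpression.create("{ '$toUpper' : '$firstname' }")).as("firstnameUpper");
List<Person> result = mongoTemplate.find(query, Person.class);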
/**
@@ -388,8 +402,8 @@ class QueryOperations {
}
@Override
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity, Class<?> targetType,
ProjectionFactory projectionFactory) {
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity,
EntityProjection<?, ?> projection) {
return getMappedFields(entity);
}

View File

@@ -46,10 +46,6 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ReactiveAggregation<T> aggregateAndReturn(Class<T> domainType) {
@@ -75,10 +71,6 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.AggregationOperationWithCollection#inCollection(java.lang.String)
*/
@Override
public AggregationOperationWithAggregation<T> inCollection(String collection) {
@@ -87,10 +79,6 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return new ReactiveAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.AggregationOperationWithAggregation#by(org.springframework.data.mongodb.core.Aggregation)
*/
@Override
public TerminatingAggregationOperation<T> by(Aggregation aggregation) {
@@ -99,10 +87,6 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return new ReactiveAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.TerminatingAggregationOperation#all()
*/
@Override
public Flux<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType);

View File

@@ -46,10 +46,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation#changeStream(java.lang.Class)
*/
@Override
public <T> ReactiveChangeStream<T> changeStream(Class<T> domainType) {
@@ -76,10 +72,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithCollection#watchCollection(java.lang.String)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> watchCollection(String collection) {
@@ -88,10 +80,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, returnType, collection, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithCollection#watchCollection(java.lang.Class)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> watchCollection(Class<?> entityClass) {
@@ -100,10 +88,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return watchCollection(template.getCollectionName(entityClass));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#resumeAt(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> resumeAt(Object token) {
@@ -117,10 +101,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#resumeAfter(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> resumeAfter(Object token) {
@@ -129,10 +109,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> builder.resumeAfter((BsonValue) token));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#startAfter(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> startAfter(Object token) {
@@ -141,10 +117,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> builder.startAfter((BsonValue) token));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithOptions#withOptions(java.util.function.Consumer)
*/
@Override
public ReactiveChangeStreamSupport<T> withOptions(Consumer<ChangeStreamOptionsBuilder> optionsConsumer) {
@@ -154,10 +126,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, returnType, collection, builder.build());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithProjection#as(java.lang.Class)
*/
@Override
public <R> ChangeStreamWithFilterAndProjection<R> as(Class<R> resultType) {
@@ -166,19 +134,11 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, resultType, collection, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithFilter#filter(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> filter(Aggregation filter) {
return withOptions(builder -> builder.filter(filter));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithFilter#filter(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> filter(CriteriaDefinition by) {
@@ -188,10 +148,6 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return filter(aggregation);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.TerminatingChangeStream#listen()
*/
@Override
public Flux<ChangeStreamEvent<T>> listen() {
return template.changeStream(collection, options != null ? options : ChangeStreamOptions.empty(), returnType);
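For orientation, the fluent API whose comments are trimmed above is typically used along these lines (Person and the filter criteria are illustrative):
Flux<ChangeStreamEvent<Person>> events = reactiveTemplate.changeStream(Person.class)
        .watchCollection("people")
        .filter(where("age").gte(18))
        .listen();
events.map(ChangeStreamEvent::getBody)
        .subscribe(person -> System.out.println("changed: " + person));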

View File

@@ -44,10 +44,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation#query(java.lang.Class)
*/
@Override
public <T> ReactiveFind<T> query(Class<T> domainType) {
@@ -81,10 +77,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override
public FindWithProjection<T> inCollection(String collection) {
@@ -93,10 +85,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection#as(java.lang.Class)
*/
@Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
@@ -105,10 +93,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingFind<T> matching(Query query) {
@@ -117,10 +101,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#first()
*/
@Override
public Mono<T> first() {
@@ -130,10 +110,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return result.next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#one()
*/
@Override
public Mono<T> one() {
@@ -155,55 +131,31 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#all()
*/
@Override
public Flux<T> all() {
return doFind(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#tail()
*/
@Override
public Flux<T> tail() {
return doFind(template.new TailingQueryFindPublisherPreparer(query, domainType));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#count()
*/
@Override
public Mono<Long> count() {
return template.count(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#exists()
*/
@Override
public Mono<Boolean> exists() {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindDistinct#distinct(java.lang.String)
*/
@Override
public TerminatingDistinct<Object> distinct(String field) {
@@ -255,10 +207,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
@@ -267,10 +215,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new DistinctOperationSupport<>((ReactiveFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
@SuppressWarnings("unchecked")
public TerminatingDistinct<T> matching(Query query) {
@@ -280,10 +224,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new DistinctOperationSupport<>((ReactiveFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core..ReactiveFindOperation.TerminatingDistinct#all()
*/
@Override
public Flux<T> all() {
return delegate.doFindDistinct(field);

View File

@@ -38,10 +38,6 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation#insert(java.lang.Class)
*/
@Override
public <T> ReactiveInsert<T> insert(Class<T> domainType) {
@@ -63,10 +59,6 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.TerminatingInsert#one(java.lang.Object)
*/
@Override
public Mono<T> one(T object) {
@@ -75,10 +67,6 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
return template.insert(object, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override
public Flux<T> all(Collection<? extends T> objects) {
@@ -87,10 +75,6 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
return template.insert(objects, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override
public ReactiveInsert<T> inCollection(String collection) {

View File

@@ -42,7 +42,6 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.transaction.reactive.TransactionalOperator;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -58,8 +57,7 @@ import com.mongodb.reactivestreams.client.MongoCollection;
* <p>
* Implemented by {@link ReactiveMongoTemplate}. Not often used but a useful option for extensibility and testability
* (as it can be easily mocked, stubbed, or be the target of a JDK proxy). Command execution using
* {@link ReactiveMongoOperations} is deferred until subscriber subscribes to the {@link Publisher}.
* <br />
* {@link ReactiveMongoOperations} is deferred until subscriber subscribes to the {@link Publisher}. <br />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
@@ -91,7 +89,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
ReactiveIndexOperations indexOps(Class<?> entityClass);
/**
* Execute the a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
@@ -120,8 +118,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Document> executeCommand(Document command, @Nullable ReadPreference readPreference);
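As a small, illustrative example of the command API described above:
Mono<Document> buildInfo = reactiveTemplate.executeCommand("{ buildInfo: 1 }");
buildInfo.map(result -> result.getString("version"))
        .subscribe(version -> System.out.println("Connected to MongoDB " + version));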
/**
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary.
* <br />
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. Must not
@@ -132,8 +129,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> execute(ReactiveDatabaseCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class.
* <br />
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
@@ -144,8 +140,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name.
* <br />
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name. <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param collectionName the name of the collection that specifies which {@link MongoCollection} instance will be
@@ -158,8 +153,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* <br />
* provided by the given {@link Supplier} to each and every command issued against MongoDB. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
@@ -177,8 +171,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding a new {@link ClientSession}
* with given {@literal sessionOptions} to each and every command issued against MongoDB.
* <br />
* with given {@literal sessionOptions} to each and every command issued against MongoDB. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
@@ -204,48 +197,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
ReactiveSessionScoped withSession(Publisher<ClientSession> sessionProvider);
/**
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
* <br />
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}. <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle.
*
* @param session must not be {@literal null}.
* @return {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
* @since 2.1
*/
ReactiveMongoOperations withSession(ClientSession session);
/**
* Initiate a new {@link ClientSession} and obtain a {@link ClientSession session} bound instance of
* {@link ReactiveSessionScoped}. Starts the transaction and adds the {@link ClientSession} to each and every command
* issued against MongoDB.
* <br />
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @deprecated since 2.2. Use {@code @Transactional} or {@link TransactionalOperator}.
*/
@Deprecated
ReactiveSessionScoped inTransaction();
/**
* Obtain a {@link ClientSession session} bound instance of {@link ReactiveSessionScoped}, start the transaction and
* bind the {@link ClientSession} provided by the given {@link Publisher} to each and every command issued against
* MongoDB.
* <br />
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @param sessionProvider must not be {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
* @deprecated since 2.2. Use {@code @Transactional} or {@link TransactionalOperator}.
*/
@Deprecated
ReactiveSessionScoped inTransaction(Publisher<ClientSession> sessionProvider);
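The removed inTransaction(...) variants are superseded by Spring's reactive transaction support; a hedged replacement sketch (databaseFactory, reactiveTemplate and Person are assumptions):
ReactiveMongoTransactionManager txManager = new ReactiveMongoTransactionManager(databaseFactory);
TransactionalOperator rxtx = TransactionalOperator.create(txManager);
Mono<Person> result = reactiveTemplate.insert(person)
        .flatMap(saved -> reactiveTemplate.remove(query(where("status").is("STALE")), Person.class)
                .thenReturn(saved))
        .as(rxtx::transactional);   // commits on success, rolls back on error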
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -292,8 +251,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Get a {@link MongoCollection} by name. The returned collection may not exist yet (except in local memory) and is
* created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first.
* <br />
* exists} first. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection.
@@ -302,8 +260,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<MongoCollection<Document>> getCollection(String collectionName);
/**
* Check to see if a collection with a name indicated by the entity class exists.
* <br />
* Check to see if a collection with a name indicated by the entity class exists. <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the name of the collection. Must not be {@literal null}.
@@ -312,8 +269,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<Boolean> collectionExists(Class<T> entityClass);
/**
* Check to see if a collection with a given name exists.
* <br />
* Check to see if a collection with a given name exists. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -322,8 +278,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Boolean> collectionExists(String collectionName);
/**
* Drop the collection with the name indicated by the entity class.
* <br />
* Drop the collection with the name indicated by the entity class. <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the collection to drop/delete. Must not be {@literal null}.
@@ -331,8 +286,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<Void> dropCollection(Class<T> entityClass);
/**
* Drop the collection with the given name.
* <br />
* Drop the collection with the given name. <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
@@ -340,11 +294,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Void> dropCollection(String collectionName);
/**
* Query for a {@link Flux} of objects of type T from the collection used by the entity class.
* <br />
* Query for a {@link Flux} of objects of type T from the collection used by the entity class. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -354,11 +306,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> findAll(Class<T> entityClass);
/**
* Query for a {@link Flux} of objects of type T from the specified collection.
* <br />
* Query for a {@link Flux} of objects of type T from the specified collection. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -370,11 +320,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type.
* <br />
* specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -387,11 +335,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type.
* <br />
* type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -437,8 +383,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a {@link Flux} of the specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -450,11 +395,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> find(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type.
* <br />
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -565,11 +508,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation.
* <br />
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given entity class and are returned as stream. The name of the
* inputCollection is derived from the {@link TypedAggregation#getInputType() aggregation input type}.
* <br />
* inputCollection is derived from the {@link TypedAggregation#getInputType() aggregation input type}. <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size causes an
* {@link IllegalArgumentException}.
@@ -583,11 +524,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation.
* <br />
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given {@code outputType}. The name of the inputCollection is derived from the
* {@code inputType}.
* <br />
* {@code inputType}. <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size causes an
* {@link IllegalArgumentException}.
@@ -603,10 +542,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
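A short illustrative aggregation that streams its results (Person and its fields are assumptions; static imports of Aggregation.* and Criteria.where are used):
TypedAggregation<Person> aggregation = newAggregation(Person.class,
        match(where("age").gte(18)),
        group("lastname").count().as("members"),
        sort(Sort.Direction.DESC, "members"));
Flux<Document> households = reactiveTemplate.aggregate(aggregation, Document.class);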
/**
* Execute an aggregation operation.
* <br />
* The raw results will be mapped to the given entity class.
* <br />
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given entity class. <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size causes an
* {@link IllegalArgumentException}.
@@ -901,10 +838,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
* database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* database. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -920,8 +855,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -939,18 +873,17 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <br />
* This method uses an
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(Class)} for empty queries instead.
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @see #exactCount(Query, Class)
* @see #estimatedCount(Class)
*/
Mono<Long> count(Query query, Class<?> entityClass);
@@ -961,18 +894,17 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <br />
* This method uses an
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @see #estimatedCount(String)
* @see #exactCount(Query, String)
*/
Mono<Long> count(Query query, String collectionName);
@@ -982,26 +914,24 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <br />
* This method uses an
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #estimatedCount(String)
* @see #exactCount(Query, Class, String)
*/
Mono<Long> count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <br />
* based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
@@ -1016,8 +946,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <br />
* Estimate the number of documents in the given collection based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
@@ -1028,16 +957,82 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Long> estimatedCount(String collectionName);
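
A small usage fragment for the collection-name based estimate, continuing the hypothetical names from the sketch further up; the "orders" collection name is made up.

// Estimate based on collection statistics only; as noted above, the value can be
// unreliable inside transactions or on sharded clusters.
Mono<Long> estimate = operations.estimatedCount("orders");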
/**
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
* <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @since 3.4
*/
default Mono<Long> exactCount(Query query, Class<?> entityClass) {
return exactCount(query, entityClass, getCollectionName(entityClass));
}
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @since 3.4
*/
default Mono<Long> exactCount(Query query, String collectionName) {
return exactCount(query, null, collectionName);
}
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the application's needs, use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @since 3.4
*/
Mono<Long> exactCount(Query query, @Nullable Class<?> entityClass, String collectionName);
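
To contrast the two code paths described above, a fragment reusing the hypothetical operations and Person names from the earlier sketch:

// exactCount always issues the countDocuments-based execution, even for an empty query,
// keeping shard, session and transaction semantics at the cost of a full count.
Mono<Long> exact = operations.exactCount(new Query(), Person.class);

// count(...) with an empty query may be answered via estimatedCount instead.
Mono<Long> possiblyEstimated = operations.count(new Query(), Person.class);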
/**
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion</a> for more details.
* <br />
* Type Conversion</a> for more details. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1049,11 +1044,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<T> insert(T objectToSave);
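
A short sketch of the insert/save semantics described above, reusing the hypothetical Person type with a String id; all names are illustrative only.

Person person = new Person();

// insert(...) stores a new document; the String id is populated from the generated ObjectId.
Mono<Person> inserted = operations.insert(person);

// save(...) is an upsert: it inserts when absent and replaces the existing document otherwise.
Mono<Person> upserted = inserted.flatMap(operations::save);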
/**
* Insert the object into the specified collection.
* <br />
* Insert the object into the specified collection. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1093,16 +1086,13 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> insertAll(Collection<? extends T> objectsToSave);
/**
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion</a> for more details.
* <br />
* Type Conversion</a> for more details. <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1139,17 +1129,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <br />
* object is not already present, that is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion</a> for more details.
* <br />
* Type Conversion</a> for more details. <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1160,15 +1147,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <br />
* is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1179,15 +1165,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <br />
* object is not already present, that is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation"> Spring's Type Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation"> Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the saved object.
@@ -1196,15 +1181,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <br />
* is an 'upsert'. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1477,11 +1461,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1496,11 +1478,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
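
Assuming this Javadoc belongs to the tail(Query, Class) overloads, a fragment showing how such an infinite stream is typically consumed and eventually cancelled; it requires a capped collection and reuses the hypothetical names from the earlier sketches.

// The Flux only terminates when the Subscription is cancelled.
reactor.core.Disposable tailing = operations
        .tail(query(where("active").is(true)), Person.class)
        .subscribe(person -> System.out.println("received " + person.id));

// ... later, stop consuming the tailable cursor.
tailing.dispose();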
@@ -1516,11 +1496,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> for all events in
* the configured default database via the reactive infrastructure. Use the optionally provided {@link Aggregation} to
* filter events. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumeToken}
* for resuming change streams.
*
@@ -1540,11 +1518,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> for all events in
* the given collection via the reactive infrastructure. Use the optionally provided {@link Aggregation} to filter
* events. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <br />
* {@link Subscription#cancel() canceled}. <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumeToken}
* for resuming change streams.
*
@@ -1565,11 +1541,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> via the reactive
* infrastructure. Use the optionally provided {@link Aggregation} to filter events. The stream will not be completed
* unless the {@link org.reactivestreams.Subscription} is {@link Subscription#cancel() canceled}.
* <br />
* unless the {@link org.reactivestreams.Subscription} is {@link Subscription#cancel() canceled}. <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumeToken}
* for resuming change streams.
*
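
A hedged sketch of subscribing to a change stream as described above, assuming the collection-level changeStream(collectionName, options, targetType) overload; the collection name, filter and variable names are assumptions reusing the earlier sketches.

ChangeStreamOptions options = ChangeStreamOptions.builder()
        .filter(Aggregation.newAggregation(Aggregation.match(where("operationType").is("insert"))))
        .build();

operations.changeStream("person", options, Person.class)
        .map(ChangeStreamEvent::getBody)   // mapped body; getRaw() would expose the unmodified payload
        .subscribe(person -> System.out.println("inserted: " + person.id));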
@@ -1599,7 +1573,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param options additional options like output collection. Must not be {@literal null}.
* @return a {@link Flux} emitting the result document sequence. Never {@literal null}.
* @since 2.1
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, Class<T> resultType, String mapFunction,
String reduceFunction, MapReduceOptions options);
@@ -1617,7 +1593,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param options additional options like output collection. Must not be {@literal null}.
* @return a {@link Flux} emitting the result document sequence. Never {@literal null}.
* @since 2.1
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, String inputCollectionName, Class<T> resultType,
String mapFunction, String reduceFunction, MapReduceOptions options);


@@ -41,10 +41,6 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
this.tempate = tempate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation#remove(java.lang.Class)
*/
@Override
public <T> ReactiveRemove<T> remove(Class<T> domainType) {
@@ -68,10 +64,6 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.RemoveWithCollection#inCollection(String)
*/
@Override
public RemoveWithQuery<T> inCollection(String collection) {
@@ -80,10 +72,6 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return new ReactiveRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.Query)
*/
@Override
public TerminatingRemove<T> matching(Query query) {
@@ -92,10 +80,6 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return new ReactiveRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.TerminatingRemove#all()
*/
@Override
public Mono<DeleteResult> all() {
@@ -104,10 +88,6 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return template.doRemove(collectionName, query, domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override
public Flux<T> findAndRemove() {


@@ -42,10 +42,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation#update(java.lang.Class)
*/
@Override
public <T> ReactiveUpdate<T> update(Class<T> domainType) {
@@ -83,10 +79,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override
public TerminatingUpdate<T> apply(org.springframework.data.mongodb.core.query.UpdateDefinition update) {
@@ -96,10 +88,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override
public UpdateWithQuery<T> inCollection(String collection) {
@@ -109,28 +97,16 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#first()
*/
@Override
public Mono<UpdateResult> first() {
return doUpdate(false, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#upsert()
*/
@Override
public Mono<UpdateResult> upsert() {
return doUpdate(true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndModify#findAndModify()
*/
@Override
public Mono<T> findAndModify() {
@@ -141,10 +117,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndReplace#findAndReplace()
*/
@Override
public Mono<T> findAndReplace() {
return template.findAndReplace(query, replacement,
@@ -152,10 +124,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
getCollectionName(), targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.Query)
*/
@Override
public UpdateWithUpdate<T> matching(Query query) {
@@ -165,19 +133,11 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#all()
*/
@Override
public Mono<UpdateResult> all() {
return doUpdate(true, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
@@ -187,10 +147,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#replaceWith(java.lang.Object)
*/
@Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
@@ -200,10 +156,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
@@ -213,10 +165,6 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {


@@ -75,28 +75,16 @@ public class SimpleMongoClientDatabaseFactory extends MongoDatabaseFactorySuppor
super(mongoClient, databaseName, mongoInstanceCreated, new MongoExceptionTranslator());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return getMongoClient().startSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#closeClient()
*/
@Override
protected void closeClient() {
getMongoClient().close();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#doGetMongoDatabase(java.lang.String)
*/
@Override
protected MongoDatabase doGetMongoDatabase(String dbName) {
return getMongoClient().getDatabase(dbName);


@@ -1,74 +0,0 @@
/*
* Copyright 2018-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.ConnectionString;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;
/**
* Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
*
* @author Christoph Strobl
* @since 2.1
* @deprecated since 3.0, use {@link SimpleMongoClientDatabaseFactory} instead.
*/
@Deprecated
public class SimpleMongoClientDbFactory extends SimpleMongoClientDatabaseFactory {
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance for the given {@code connectionString}.
*
* @param connectionString connection coordinates for a database connection. Must contain a database name and must not
* be {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/connection-string/">MongoDB Connection String reference</a>
*/
public SimpleMongoClientDbFactory(String connectionString) {
this(new ConnectionString(connectionString));
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param connectionString connection coordinates for a database connection. Must contain also a database name and not
* be {@literal null}.
*/
public SimpleMongoClientDbFactory(ConnectionString connectionString) {
this(MongoClients.create(connectionString), connectionString.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
*/
public SimpleMongoClientDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated
*/
private SimpleMongoClientDbFactory(MongoClient mongoClient, String databaseName, boolean mongoInstanceCreated) {
super(mongoClient, databaseName, mongoInstanceCreated);
}
}


@@ -97,18 +97,10 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getMongoDatabase()
*/
public Mono<MongoDatabase> getMongoDatabase() throws DataAccessException {
return getMongoDatabase(databaseName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getMongoDatabase(java.lang.String)
*/
public Mono<MongoDatabase> getMongoDatabase(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
@@ -133,36 +125,20 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getCodecRegistry()
*/
@Override
public CodecRegistry getCodecRegistry() {
return this.mongo.getDatabase(databaseName).getCodecRegistry();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public Mono<ClientSession> getSession(ClientSessionOptions options) {
return Mono.from(mongo.startSession(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public ReactiveMongoDatabaseFactory withSession(ClientSession session) {
return new ClientSessionBoundMongoDbFactory(session, this);
@@ -186,64 +162,36 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase()
*/
@Override
public Mono<MongoDatabase> getMongoDatabase() throws DataAccessException {
return delegate.getMongoDatabase().map(this::decorateDatabase);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase(java.lang.String)
*/
@Override
public Mono<MongoDatabase> getMongoDatabase(String dbName) throws DataAccessException {
return delegate.getMongoDatabase(dbName).map(this::decorateDatabase);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getCodecRegistry()
*/
@Override
public CodecRegistry getCodecRegistry() {
return delegate.getCodecRegistry();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public Mono<ClientSession> getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public ReactiveMongoDatabaseFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#isTransactionActive()
*/
@Override
public boolean isTransactionActive() {
return session != null && session.hasActiveTransaction();


@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022. the original author or authors.
* Copyright 2016-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,9 +44,6 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return toDocument(this.value, context);


@@ -142,11 +142,118 @@ public class AccumulatorOperators {
return usesFieldRef() ? StdDevSamp.stdDevSampOf(fieldReference) : StdDevSamp.stdDevSampOf(expression);
}
/**
* Creates new {@link AggregationExpression} that uses the previous input (field/expression) and the value of the
* given field to calculate the population covariance of the two.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link CovariancePop}.
* @since 3.3
*/
public CovariancePop covariancePop(String fieldReference) {
return covariancePop().and(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that uses the previous input (field/expression) and the result of the
* given {@link AggregationExpression expression} to calculate the population covariance of the two.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link CovariancePop}.
* @since 3.3
*/
public CovariancePop covariancePop(AggregationExpression expression) {
return covariancePop().and(expression);
}
private CovariancePop covariancePop() {
return usesFieldRef() ? CovariancePop.covariancePopOf(fieldReference) : CovariancePop.covariancePopOf(expression);
}
/**
* Creates new {@link AggregationExpression} that uses the previous input (field/expression) and the value of the
* given field to calculate the sample covariance of the two.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link CovarianceSamp}.
* @since 3.3
*/
public CovarianceSamp covarianceSamp(String fieldReference) {
return covarianceSamp().and(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that uses the previous input (field/expression) and the result of the
* given {@link AggregationExpression expression} to calculate the sample covariance of the two.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link CovarianceSamp}.
* @since 3.3
*/
public CovarianceSamp covarianceSamp(AggregationExpression expression) {
return covarianceSamp().and(expression);
}
private CovarianceSamp covarianceSamp() {
return usesFieldRef() ? CovarianceSamp.covarianceSampOf(fieldReference)
: CovarianceSamp.covarianceSampOf(expression);
}
/**
* Creates a new {@link ExpMovingAvgBuilder} to build an {@link AggregationExpression expMovingAvg} that calculates
* the exponential moving average of numeric values.
*
* @return new instance of {@link ExpMovingAvg}.
* @since 3.3
*/
public ExpMovingAvgBuilder expMovingAvg() {
ExpMovingAvg expMovingAvg = usesFieldRef() ? ExpMovingAvg.expMovingAvgOf(fieldReference)
: ExpMovingAvg.expMovingAvgOf(expression);
return new ExpMovingAvgBuilder() {
@Override
public ExpMovingAvg historicalDocuments(int numberOfHistoricalDocuments) {
return expMovingAvg.n(numberOfHistoricalDocuments);
}
@Override
public ExpMovingAvg alpha(double exponentialDecayValue) {
return expMovingAvg.alpha(exponentialDecayValue);
}
};
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* Builder for {@link ExpMovingAvg}.
*
* @since 3.3
*/
public interface ExpMovingAvgBuilder {
/**
* Define the number of historical documents with significant mathematical weight.
*
* @param numberOfHistoricalDocuments
* @return new instance of {@link ExpMovingAvg}.
*/
ExpMovingAvg historicalDocuments(int numberOfHistoricalDocuments);
/**
* Define the exponential decay value.
*
* @param exponentialDecayValue
* @return new instance of {@link ExpMovingAvg}.
*/
ExpMovingAvg alpha(double exponentialDecayValue);
}
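
For illustration, how these additions could be reached through the accumulator factory; AccumulatorOperators.valueOf(…) as the entry point and the field names are assumptions, not part of this hunk.

AggregationExpression covPop  = AccumulatorOperators.valueOf("quantity").covariancePop("price");
AggregationExpression covSamp = AccumulatorOperators.valueOf("quantity").covarianceSamp("price");

// Exponential moving average: configure either N or alpha, never both.
AggregationExpression ema = AccumulatorOperators.valueOf("price").expMovingAvg().alpha(0.75);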
/**
* {@link AggregationExpression} for {@code $sum}.
*
@@ -227,9 +334,6 @@ public class AccumulatorOperators {
return new Sum(append(value));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -310,9 +414,6 @@ public class AccumulatorOperators {
return new Avg(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -393,9 +494,6 @@ public class AccumulatorOperators {
return new Max(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -476,9 +574,6 @@ public class AccumulatorOperators {
return new Min(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -559,9 +654,6 @@ public class AccumulatorOperators {
return new StdDevPop(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -642,9 +734,6 @@ public class AccumulatorOperators {
return new StdDevSamp(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -658,4 +747,185 @@ public class AccumulatorOperators {
return super.toDocument(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $covariancePop}.
*
* @author Christoph Strobl
* @since 3.3
*/
public static class CovariancePop extends AbstractAggregationExpression {
private CovariancePop(Object value) {
super(value);
}
/**
* Creates new {@link CovariancePop}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link CovariancePop}.
*/
public static CovariancePop covariancePopOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new CovariancePop(asFields(fieldReference));
}
/**
* Creates new {@link CovariancePop}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link CovariancePop}.
*/
public static CovariancePop covariancePopOf(AggregationExpression expression) {
return new CovariancePop(Collections.singletonList(expression));
}
/**
* Creates new {@link CovariancePop} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link CovariancePop}.
*/
public CovariancePop and(String fieldReference) {
return new CovariancePop(append(asFields(fieldReference)));
}
/**
* Creates new {@link CovariancePop} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link CovariancePop}.
*/
public CovariancePop and(AggregationExpression expression) {
return new CovariancePop(append(expression));
}
@Override
protected String getMongoMethod() {
return "$covariancePop";
}
}
/**
* {@link AggregationExpression} for {@code $covarianceSamp}.
*
* @author Christoph Strobl
* @since 3.3
*/
public static class CovarianceSamp extends AbstractAggregationExpression {
private CovarianceSamp(Object value) {
super(value);
}
/**
* Creates new {@link CovarianceSamp}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link CovarianceSamp}.
*/
public static CovarianceSamp covarianceSampOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new CovarianceSamp(asFields(fieldReference));
}
/**
* Creates new {@link CovarianceSamp}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link CovarianceSamp}.
*/
public static CovarianceSamp covarianceSampOf(AggregationExpression expression) {
return new CovarianceSamp(Collections.singletonList(expression));
}
/**
* Creates new {@link CovarianceSamp} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link CovarianceSamp}.
*/
public CovarianceSamp and(String fieldReference) {
return new CovarianceSamp(append(asFields(fieldReference)));
}
/**
* Creates new {@link CovarianceSamp} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link CovarianceSamp}.
*/
public CovarianceSamp and(AggregationExpression expression) {
return new CovarianceSamp(append(expression));
}
@Override
protected String getMongoMethod() {
return "$covarianceSamp";
}
}
/**
* {@link ExpMovingAvg} calculates the exponential moving average of numeric values.
*
* @author Christoph Strobl
* @since 3.3
*/
public static class ExpMovingAvg extends AbstractAggregationExpression {
private ExpMovingAvg(Object value) {
super(value);
}
/**
* Create a new {@link ExpMovingAvg} by defining the field holding the value to be used as input.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link ExpMovingAvg}.
*/
public static ExpMovingAvg expMovingAvgOf(String fieldReference) {
return new ExpMovingAvg(Collections.singletonMap("input", Fields.field(fieldReference)));
}
/**
* Create a new {@link ExpMovingAvg} by defining the {@link AggregationExpression expression} to compute the value
* to be used as input.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link ExpMovingAvg}.
*/
public static ExpMovingAvg expMovingAvgOf(AggregationExpression expression) {
return new ExpMovingAvg(Collections.singletonMap("input", expression));
}
/**
* Define the number of historical documents with significant mathematical weight. <br />
* Specify either {@link #n(int) N} or {@link #alpha(double) alpha}. Not both!
*
* @param numberOfHistoricalDocuments
* @return new instance of {@link ExpMovingAvg}.
*/
public ExpMovingAvg n/*umber of historical documents*/(int numberOfHistoricalDocuments) {
return new ExpMovingAvg(append("N", numberOfHistoricalDocuments));
}
/**
* Define the exponential decay value. <br />
* Specify either {@link #alpha(double) alpha} or {@link #n(int) N}. Not both!
*
* @param exponentialDecayValue
* @return new instance of {@link ExpMovingAvg}.
*/
public ExpMovingAvg alpha(double exponentialDecayValue) {
return new ExpMovingAvg(append("alpha", exponentialDecayValue));
}
@Override
protected String getMongoMethod() {
return "$expMovingAvg";
}
}
}
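
Derived from the code above (not verified output), the two mutually exclusive renderings of $expMovingAvg roughly look as follows; the price field is made up.

ExpMovingAvg.expMovingAvgOf("price").n(2);        // ~ { $expMovingAvg: { input: "$price", N: 2 } }
ExpMovingAvg.expMovingAvgOf("price").alpha(0.75); // ~ { $expMovingAvg: { input: "$price", alpha: 0.75 } }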


@@ -99,10 +99,6 @@ public class AddFieldsOperation extends DocumentEnhancingOperation {
return new AddFieldsOperationBuilder(getValueMap());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.DocumentEnhancingOperation#mongoOperator()
*/
@Override
protected String mongoOperator() {
return "$addFields";
@@ -201,4 +197,5 @@ public class AddFieldsOperation extends DocumentEnhancingOperation {
AddFieldsOperationBuilder withValueOfExpression(String operation, Object... values);
}
}
}


@@ -226,8 +226,7 @@ public class Aggregation {
}
/**
* Obtain an {@link AddFieldsOperationBuilder builder} instance to create a new {@link AddFieldsOperation}.
* <br />
* Obtain an {@link AddFieldsOperationBuilder builder} instance to create a new {@link AddFieldsOperation}. <br />
* Starting in version 4.2, MongoDB adds a new aggregation pipeline stage {@link AggregationUpdate#set $set} that is
* an alias for {@code $addFields}.
*
@@ -435,18 +434,6 @@ public class Aggregation {
return new SortByCountOperation(groupAndSortExpression);
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return new instance of {@link SkipOperation}.
* @deprecated prepare to get this one removed in favor of {@link #skip(long)}.
*/
@Deprecated
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
@@ -499,6 +486,17 @@ public class Aggregation {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link MatchOperation} using the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link MatchOperation}.
* @since 3.3
*/
public static MatchOperation match(AggregationExpression expression) {
return new MatchOperation(expression);
}
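
A fragment showing the new expression-based overload; AggregationSpELExpression (touched elsewhere in this changeset) is just one possible AggregationExpression, and the field names are made up.

MatchOperation qualifying = Aggregation.match(
        AggregationSpELExpression.expressionOf("quantity > expectedQuantity"));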
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines output field that contains the calculated distance.
@@ -714,8 +712,7 @@ public class Aggregation {
}
/**
* Converts this {@link Aggregation} specification to a {@link Document}.
* <br />
* Converts this {@link Aggregation} specification to a {@link Document}. <br />
* MongoDB requires as of 3.6 cursor-based aggregation. Use {@link #toPipeline(AggregationOperationContext)} to render
* an aggregation pipeline.
*
@@ -730,10 +727,6 @@ public class Aggregation {
return options.applyAndReturnPotentiallyChangedCommand(command);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDocument("__collection__", DEFAULT_CONTEXT));
@@ -777,10 +770,6 @@ public class Aggregation {
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());


@@ -1,108 +0,0 @@
/*
* Copyright 2015-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.springframework.util.Assert;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.7
* @deprecated since 1.10. Please use {@link ArithmeticOperators} and {@link ComparisonOperators} instead.
*/
@Deprecated
public enum AggregationFunctionExpressions {
SIZE, CMP, EQ, GT, GTE, LT, LTE, NE, SUBTRACT, ADD, MULTIPLY;
/**
* Returns an {@link AggregationExpression} build from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return new instance of {@link AggregationExpression}.
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.7
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.size());
for (Object value : values) {
args.add(unpack(value, context));
}
return new Document("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}


@@ -80,28 +80,16 @@ class AggregationOperationRenderer {
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(org.bson.Document, java.lang.Class)
*/
@Override
public Document getMappedObject(Document document, @Nullable Class<?> type) {
return document;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new DirectFieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new DirectFieldReference(new ExposedField(new AggregationField(name), true));


@@ -339,9 +339,6 @@ public class AggregationOptions {
return !maxTime.isZero() && !maxTime.isNegative();
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDocument().toJson();


@@ -77,10 +77,6 @@ public class AggregationResults<T> implements Iterable<T> {
return mappedResults.size() == 1 ? mappedResults.get(0) : null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}


@@ -64,9 +64,6 @@ public class AggregationSpELExpression implements AggregationExpression {
return new AggregationSpELExpression(expressionString, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return (Document) TRANSFORMER.transform(rawExpression, context, parameters);


@@ -242,48 +242,26 @@ public class AggregationUpdate extends Aggregation implements UpdateDefinition {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
*/
@Override
public Boolean isIsolated() {
return isolated;
}
/*
* Returns an update document containing the update pipeline.
* The resulting document needs to be unwrapped to be used with update operations.
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getUpdateObject()
*/
@Override
public Document getUpdateObject() {
return new Document("", toPipeline(Aggregation.DEFAULT_CONTEXT));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#modifies(java.lang.String)
*/
@Override
public boolean modifies(String key) {
return keysTouched.contains(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#inc(java.lang.String)
*/
@Override
public void inc(String key) {
set(new SetOperation(key, ArithmeticOperators.valueOf(key).add(1)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return Collections.emptyList();


@@ -35,6 +35,7 @@ import org.springframework.util.Assert;
* @author Christoph Strobl
* @author Mark Paluch
* @author Shashank Sharma
* @author Divya Srivastava
* @since 1.0
*/
public class ArrayOperators {
@@ -362,6 +363,38 @@ public class ArrayOperators {
return usesExpression() ? ArrayToObject.arrayValueOfToObject(expression) : ArrayToObject.arrayToObject(values);
}
/**
* Creates new {@link AggregationExpression} that returns the first element in the associated array.
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @return new instance of {@link First}.
* @since 3.4
*/
public First first() {
if (usesFieldRef()) {
return First.firstOf(fieldReference);
}
return usesExpression() ? First.firstOf(expression) : First.first(values);
}
/**
* Creates new {@link AggregationExpression} that returns the last element in the given array.
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @return new instance of {@link Last}.
* @since 3.4
*/
public Last last() {
if (usesFieldRef()) {
return Last.lastOf(fieldReference);
}
return usesExpression() ? Last.lastOf(expression) : Last.last(values);
}
/**
* @author Christoph Strobl
@@ -612,10 +645,6 @@ public class ArrayOperators {
return new FilterExpressionBuilder().filter(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(final AggregationOperationContext context) {
return toFilter(ExposedFields.from(as), context);
@@ -736,10 +765,6 @@ public class ArrayOperators {
return new FilterExpressionBuilder();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.InputBuilder#filter(java.util.List)
*/
@Override
public AsBuilder filter(List<?> array) {
@@ -748,10 +773,6 @@ public class ArrayOperators {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.InputBuilder#filter(org.springframework.data.mongodb.core.aggregation.Field)
*/
@Override
public AsBuilder filter(Field field) {
@@ -760,10 +781,6 @@ public class ArrayOperators {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.AsBuilder#as(java.lang.String)
*/
@Override
public ConditionBuilder as(String variableName) {
@@ -772,10 +789,6 @@ public class ArrayOperators {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.ConditionBuilder#by(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public Filter by(AggregationExpression condition) {
@@ -784,10 +797,6 @@ public class ArrayOperators {
return filter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.ConditionBuilder#by(java.lang.String)
*/
@Override
public Filter by(String expression) {
@@ -796,10 +805,6 @@ public class ArrayOperators {
return filter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.ConditionBuilder#by(org.bson.Document)
*/
@Override
public Filter by(Document expression) {
@@ -1244,9 +1249,6 @@ public class ArrayOperators {
this.reduceExpressions = reduceExpressions;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -1433,9 +1435,6 @@ public class ArrayOperators {
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return new Document(propertyName, aggregationExpression.toDocument(context));
@@ -1803,13 +1802,117 @@ public class ArrayOperators {
return new ArrayToObject(expression);
}
@Override
protected String getMongoMethod() {
return "$arrayToObject";
}
}
/**
* {@link AggregationExpression} for {@code $first} that returns the first element in an array. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @author Divya Srivastava
* @author Christoph Strobl
* @since 3.4
*/
public static class First extends AbstractAggregationExpression {
private First(Object value) {
super(value);
}
/**
* Returns the first element in the given array.
*
* @param array must not be {@literal null}.
* @return new instance of {@link First}.
*/
public static First first(Object array) {
return new First(array);
}
/**
* Returns the first element in the array pointed to by the given {@link Field field reference}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link First}.
*/
public static First firstOf(String fieldReference) {
return new First(Fields.field(fieldReference));
}
/**
* Returns the first element of the array computed by the given {@link AggregationExpression expression}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link First}.
*/
public static First firstOf(AggregationExpression expression) {
return new First(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AbstractAggregationExpression#getMongoMethod()
*/
@Override
protected String getMongoMethod() {
return "$arrayToObject";
return "$first";
}
}
/**
* {@link AggregationExpression} for {@code $last} that returns the last element in an array. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @author Divya Srivastava
* @author Christoph Strobl
* @since 3.4
*/
public static class Last extends AbstractAggregationExpression {
private Last(Object value) {
super(value);
}
/**
* Returns the last element in the given array.
*
* @param array must not be {@literal null}.
* @return new instance of {@link Last}.
*/
public static Last last(Object array) {
return new Last(array);
}
/**
* Returns the last element in the array pointed to by the given {@link Field field reference}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Last}.
*/
public static Last lastOf(String fieldReference) {
return new Last(Fields.field(fieldReference));
}
/**
* Returns the last element of the array computed by the given {@link AggregationExpression expression}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Last}.
*/
public static Last lastOf(AggregationExpression expression) {
return new Last(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AbstractAggregationExpression#getMongoMethod()
*/
@Override
protected String getMongoMethod() {
return "$last";
}
}
}
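
To connect the new first()/last() factory methods with the First and Last expressions above, here is a short, hedged projection sketch; the field names are illustrative only, and, as the Javadoc notes, $first and $last on arrays require MongoDB 4.4 or later.

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ArrayOperators;

class FirstLastSketch {

	// Projects the first and last entries of the "scores" array via $first / $last.
	static Aggregation firstAndLastScore() {
		return Aggregation.newAggregation(
				Aggregation.project()
						.and(ArrayOperators.arrayOf("scores").first()).as("firstScore")
						.and(ArrayOperators.arrayOf("scores").last()).as("lastScore"));
	}
}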


@@ -88,9 +88,6 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -107,10 +104,6 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
return new Document(getOperator(), options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$bucketAuto";
@@ -144,33 +137,21 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
return new BucketAutoOperation(this, buckets, granularity.getMongoRepresentation());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketAutoOperation newBucketOperation(Outputs outputs) {
return new BucketAutoOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketAutoOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketAutoOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketAutoOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(String fieldName) {
return new BucketAutoOperationOutputBuilder(Fields.field(fieldName), this);
@@ -192,9 +173,6 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
@@ -223,9 +201,6 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
@@ -270,9 +245,6 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Granularity#toMongoGranularity()
*/
@Override
public String getMongoRepresentation() {
return granularity;
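
For orientation, the andOutput(...) and andOutputExpression(...) builder methods touched above are typically used as in the following hedged sketch; the field names and bucket count are illustrative only.

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.BucketAutoOperation;

class BucketAutoSketch {

	// $bucketAuto: distribute documents into five evenly populated buckets by "price"
	// and collect the matching "title" values per bucket.
	static BucketAutoOperation priceBuckets() {
		return Aggregation.bucketAuto("price", 5)
				.andOutput("title").push().as("titles");
	}
}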


@@ -84,9 +84,6 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
this.defaultBucket = defaultBucket;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -103,10 +100,6 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
return new Document(getOperator(), options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$bucket";
@@ -143,33 +136,21 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
return new BucketOperation(this, newBoundaries, defaultBucket);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketOperation newBucketOperation(Outputs outputs) {
return new BucketOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketOperationOutputBuilder andOutput(String fieldName) {
return new BucketOperationOutputBuilder(Fields.field(fieldName), this);
@@ -191,9 +172,6 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);
@@ -221,9 +199,6 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);


@@ -141,9 +141,6 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
});
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -159,9 +156,6 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
return document;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return outputs.asExposedFields();
@@ -454,9 +448,6 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
return outputs.isEmpty();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -540,10 +531,6 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
this.values = operationOutput.values;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -636,9 +623,6 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return (Document) TRANSFORMER.transform(expression, context, params);
@@ -665,9 +649,6 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
this.expression = expression;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return expression.toDocument(context);


@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
@@ -235,7 +236,7 @@ public class ConditionalOperators {
*
* @author Mark Paluch
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/">https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/">https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/</a>
*/
public static class IfNull implements AggregationExpression {
@@ -251,7 +252,8 @@ public class ConditionalOperators {
/**
* Creates new {@link IfNull}.
*
* @param fieldReference the field to check for a {@literal null} value, field reference must not be {@literal null}.
* @param fieldReference the field to check for a {@literal null} value, field reference must not be
* {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder ifNull(String fieldReference) {
@@ -264,7 +266,7 @@ public class ConditionalOperators {
* Creates new {@link IfNull}.
*
* @param expression the expression to check for a {@literal null} value, field reference must not be
* {@literal null}.
* {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder ifNull(AggregationExpression expression) {
@@ -273,28 +275,34 @@ public class ConditionalOperators {
return new IfNullOperatorBuilder().ifNull(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
List<Object> list = new ArrayList<Object>();
if (condition instanceof Field) {
list.add(context.getReference((Field) condition).toString());
} else if (condition instanceof AggregationExpression) {
list.add(((AggregationExpression) condition).toDocument(context));
if (condition instanceof Collection) {
for (Object val : ((Collection) this.condition)) {
list.add(mapCondition(val, context));
}
} else {
list.add(condition);
list.add(mapCondition(condition, context));
}
list.add(resolve(value, context));
return new Document("$ifNull", list);
}
private Object mapCondition(Object condition, AggregationOperationContext context) {
if (condition instanceof Field) {
return context.getReference((Field) condition).toString();
} else if (condition instanceof AggregationExpression) {
return ((AggregationExpression) condition).toDocument(context);
} else {
return condition;
}
}
private Object resolve(Object value, AggregationOperationContext context) {
if (value instanceof Field) {
@@ -315,28 +323,48 @@ public class ConditionalOperators {
/**
* @param fieldReference the field to check for a {@literal null} value, field reference must not be
* {@literal null}.
* {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder ifNull(String fieldReference);
/**
* @param expression the expression to check for a {@literal null} value, field name must not be {@literal null}
* or empty.
* @return the {@link ThenBuilder}
* or empty.
* @return the {@link ThenBuilder}.
*/
ThenBuilder ifNull(AggregationExpression expression);
}
/**
* @author Christoph Strobl
* @since 3.3
*/
public interface OrBuilder {
/**
* @param fieldReference the field to check for a {@literal null} value, field reference must not be
* {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder orIfNull(String fieldReference);
/**
* @param expression the expression to check for a {@literal null} value, must not be {@literal null}.
* @return the {@link ThenBuilder}.
*/
ThenBuilder orIfNull(AggregationExpression expression);
}
/**
* @author Mark Paluch
*/
public interface ThenBuilder {
public interface ThenBuilder extends OrBuilder {
/**
* @param value the value to be used if the {@code $ifNull} condition evaluates {@literal true}. Can be a
* {@link Document}, a value that is supported by MongoDB or a value that can be converted to a MongoDB
* representation but must not be {@literal null}.
* {@link Document}, a value that is supported by MongoDB or a value that can be converted to a MongoDB
* representation but must not be {@literal null}.
* @return new instance of {@link IfNull}.
*/
IfNull then(Object value);
@@ -361,9 +389,10 @@ public class ConditionalOperators {
*/
static final class IfNullOperatorBuilder implements IfNullBuilder, ThenBuilder {
private @Nullable Object condition;
private @Nullable List<Object> conditions;
private IfNullOperatorBuilder() {
conditions = new ArrayList<>();
}
/**
@@ -375,50 +404,45 @@ public class ConditionalOperators {
return new IfNullOperatorBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(java.lang.String)
*/
public ThenBuilder ifNull(String fieldReference) {
Assert.hasText(fieldReference, "FieldReference name must not be null or empty!");
this.condition = Fields.field(fieldReference);
this.conditions.add(Fields.field(fieldReference));
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression name must not be null or empty!");
this.condition = expression;
this.conditions.add(expression);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#then(java.lang.Object)
*/
public IfNull then(Object value) {
return new IfNull(condition, value);
@Override
public ThenBuilder orIfNull(String fieldReference) {
return ifNull(fieldReference);
}
@Override
public ThenBuilder orIfNull(AggregationExpression expression) {
return ifNull(expression);
}
public IfNull then(Object value) {
return new IfNull(conditions, value);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(java.lang.String)
*/
public IfNull thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IfNull(condition, Fields.field(fieldReference));
return new IfNull(conditions, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
public IfNull thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IfNull(condition, expression);
return new IfNull(conditions, expression);
}
}
}
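
The new OrBuilder contract lets additional candidates be chained onto $ifNull before the default value. A minimal, hedged sketch with illustrative field names; the comment shows the operator shape it renders to.

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;

class IfNullSketch {

	// Renders to { $ifNull: ["$description", "$shortDescription", "n/a"] }. More than two
	// input expressions for $ifNull require MongoDB 5.0 on the server.
	static Aggregation descriptionFallback() {
		return Aggregation.newAggregation(
				Aggregation.project("item")
						.and(ConditionalOperators.ifNull("description")
								.orIfNull("shortDescription")
								.then("n/a"))
						.as("description"));
	}
}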
@@ -458,7 +482,7 @@ public class ConditionalOperators {
public static Switch switchCases(List<CaseOperator> conditions) {
Assert.notNull(conditions, "Conditions must not be null!");
return new Switch(Collections.<String, Object>singletonMap("branches", new ArrayList<CaseOperator>(conditions)));
return new Switch(Collections.<String, Object> singletonMap("branches", new ArrayList<CaseOperator>(conditions)));
}
/**
@@ -500,9 +524,6 @@ public class ConditionalOperators {
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -545,7 +566,7 @@ public class ConditionalOperators {
* @author Mark Paluch
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/cond/">https://docs.mongodb.com/manual/reference/operator/aggregation/cond/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/cond/">https://docs.mongodb.com/manual/reference/operator/aggregation/cond/</a>
*/
public static class Cond implements AggregationExpression {
@@ -590,10 +611,6 @@ public class ConditionalOperators {
this.otherwiseValue = otherwiseValue;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -806,8 +823,8 @@ public class ConditionalOperators {
/**
* @param value the value to be used if the condition evaluates {@literal true}. Can be a {@link Document}, a
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder then(Object value);
@@ -832,8 +849,8 @@ public class ConditionalOperators {
/**
* @param value the value to be used if the condition evaluates {@literal false}. Can be a {@link Document}, a
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwise(Object value);
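
The when/then/otherwise contract above (implemented by ConditionalExpressionBuilder in the following hunks) reads most easily with a concrete projection. A hedged sketch with illustrative field names:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
import org.springframework.data.mongodb.core.query.Criteria;

class CondSketch {

	// $cond: project a "discount" of 30 when qty is at least 250, otherwise 20.
	static Aggregation discountProjection() {
		return Aggregation.newAggregation(
				Aggregation.project("item")
						.and(ConditionalOperators.Cond.newBuilder()
								.when(Criteria.where("qty").gte(250))
								.then(30)
								.otherwise(20))
						.as("discount"));
	}
}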
@@ -861,8 +878,7 @@ public class ConditionalOperators {
private @Nullable Object condition;
private @Nullable Object thenValue;
private ConditionalExpressionBuilder() {
}
private ConditionalExpressionBuilder() {}
/**
* Creates a new builder for {@link Cond}.
@@ -873,9 +889,6 @@ public class ConditionalOperators {
return new ConditionalExpressionBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.bson.Document)
*/
@Override
public ConditionalExpressionBuilder when(Document booleanExpression) {
@@ -885,9 +898,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ThenBuilder when(CriteriaDefinition criteria) {
@@ -896,9 +906,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder when(AggregationExpression expression) {
@@ -907,9 +914,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(java.lang.String)
*/
@Override
public ThenBuilder when(String booleanField) {
@@ -918,9 +922,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#then(java.lang.Object)
*/
@Override
public OtherwiseBuilder then(Object thenValue) {
@@ -929,9 +930,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(java.lang.String)
*/
@Override
public OtherwiseBuilder thenValueOf(String fieldReference) {
@@ -940,9 +938,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public OtherwiseBuilder thenValueOf(AggregationExpression expression) {
@@ -951,9 +946,6 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwise(java.lang.Object)
*/
@Override
public Cond otherwise(Object otherwiseValue) {
@@ -961,9 +953,6 @@ public class ConditionalOperators {
return new Cond(condition, thenValue, otherwiseValue);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(java.lang.String)
*/
@Override
public Cond otherwiseValueOf(String fieldReference) {
@@ -971,9 +960,6 @@ public class ConditionalOperators {
return new Cond(condition, thenValue, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public Cond otherwiseValueOf(AggregationExpression expression) {

Some files were not shown because too many files have changed in this diff.