Compare commits

..

92 Commits

Author SHA1 Message Date
Mark Paluch
547b96e132 Release version 3.2.9 (2021.0.9).
See #3934
2022-02-18 10:24:48 +01:00
Mark Paluch
2d6ab2bb73 Prepare 3.2.9 (2021.0.9).
See #3934
2022-02-18 10:24:13 +01:00
Christoph Strobl
edd67c087c Serialize values for debug output safely in AbstractMongoEventListener.
We now make sure that codec configuration will not cause an exception when debug logging is turned on.

Resolves: #3968
Original Pull Request: #3970
2022-02-18 10:13:34 +01:00
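
A minimal sketch of such a lifecycle listener, assuming a hypothetical Account document type; the base class emits DEBUG output for each event, which is where the safe serialization from the commit above applies:

import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;

// Hypothetical document type used only for illustration.
class Account {
    String id;
}

class AccountEventListener extends AbstractMongoEventListener<Account> {

    @Override
    public void onBeforeConvert(BeforeConvertEvent<Account> event) {
        // Invoked for each Account about to be converted; the base class's
        // debug logging of the event source is what the fix hardens.
        System.out.println("About to convert account " + event.getSource().id);
    }
}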
Christoph Strobl
5cd74e6f96 Update copyright year to 2022.
See: #3966
2022-02-16 10:29:00 +01:00
Greg L. Turnquist
6bc6821942 Update CI properties.
See #3934
2022-02-14 14:38:45 -06:00
blu10ph
6cad96709d Avoid obtaining mapped sort multiple times for mapReduce.
Apply already mapped sort for map reduce instead of running the source document through the mapping layer again.

Closes: #3960
2022-02-11 11:37:41 +01:00
Greg L. Turnquist
badfbb50c1 Use Harbor Proxy for containers.
Leverage internal infrastructure for pulling Docker container images. Reduces pressure on Docker Hub and reduces risk of hitting rate limits.

See #3954.
Related https://github.com/spring-projects/spring-data-build/issues/1630.
2022-02-07 10:54:59 -06:00
Mark Paluch
a1e8fb70ee Polishing.
Extract docker credentials into properties file.
Use tabs for indentation instead of spaces.

See #3949
2022-02-03 15:38:01 +01:00
Greg L. Turnquist
97eff9bdb7 Externalize build properties.
By reading a properties file from an external location, it is possible to inject a consistent set of properties from Spring Data Build. This also supports repeatable builds.

Closes #3949.
2022-02-03 15:20:29 +01:00
Mihail Cornescu
10ca58c579 Add IgnoreCase to repository queries documentation.
Update reference documentation and add the missing IgnoreCase keyword.

Closes: #3916
Original Pull Request: #3950
2022-02-02 13:20:11 +01:00
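
For illustration, a derived query using the IgnoreCase keyword could look like the following; the Person type and its lastname property are assumptions:

import java.util.List;
import org.springframework.data.repository.CrudRepository;

// Hypothetical document type with a lastname property.
class Person {
    String id;
    String lastname;
}

interface PersonRepository extends CrudRepository<Person, String> {

    // IgnoreCase makes the generated query match lastname case-insensitively.
    List<Person> findByLastnameIgnoreCase(String lastname);
}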
Christoph Strobl
5a3b4f8698 After release cleanups.
See #3878
2022-01-14 10:00:57 +01:00
Christoph Strobl
fe2c4fd4e7 Prepare next development iteration.
See #3878
2022-01-14 10:00:53 +01:00
Christoph Strobl
20a8b5e5f1 Release version 3.2.8 (2021.0.8).
See #3878
2022-01-14 09:47:47 +01:00
Christoph Strobl
3c25ed8378 Prepare 3.2.8 (2021.0.8).
See #3878
2022-01-14 09:47:06 +01:00
Mark Paluch
6c21eab84b Polishing.
Simplify assertions, reformat code.

See #3921
Original pull request: #3930.
2022-01-13 11:07:21 +01:00
Christoph Strobl
f00e8ed93c Use index instead of iterator to map position and map keys for updates.
This commit removes usage of the iterator and replaces map key and positional parameter mappings with an index based token lookup.

Closes #3921
Original pull request: #3930.
2022-01-13 11:06:42 +01:00
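
The kind of update keys the mapper resolves, sketched with plain Update definitions; the property names are assumptions:

import org.springframework.data.mongodb.core.query.Update;

class UpdateKeyExamples {

    static Update positionalAndMapKeyUpdate() {
        // "$" is a positional parameter and "1234" a numeric map key; both
        // are now resolved via an index-based token lookup instead of an iterator.
        return new Update()
                .set("grades.$.score", 42)
                .set("settings.1234.enabled", true);
    }
}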
Mark Paluch
34a35bd489 Polishing.
Tweak Javadoc.

See #3898
Original pull request: #3904.
2021-12-14 09:37:05 +01:00
Christoph Strobl
ba2b65cfd5 Fix field inclusion in aggregation project operation.
Closes #3898
Original pull request: #3904.
2021-12-14 09:37:05 +01:00
Jens Schauder
2c9975e8db After release cleanups.
See #3863
2021-11-12 10:38:57 +01:00
Jens Schauder
781ba63226 Prepare next development iteration.
See #3863
2021-11-12 10:38:54 +01:00
Jens Schauder
d855a0b07d Release version 3.2.7 (2021.0.7).
See #3863
2021-11-12 10:27:20 +01:00
Jens Schauder
d809ee0104 Prepare 3.2.7 (2021.0.7).
See #3863
2021-11-12 10:26:42 +01:00
Mark Paluch
d138296123 After release cleanups.
See #3828
2021-10-18 11:19:44 +02:00
Mark Paluch
e11560dffc Prepare next development iteration.
See #3828
2021-10-18 11:19:41 +02:00
Mark Paluch
4fbf01467e Release version 3.2.6 (2021.0.6).
See #3828
2021-10-18 11:11:44 +02:00
Mark Paluch
012d1245b0 Prepare 3.2.6 (2021.0.6).
See #3828
2021-10-18 11:10:48 +02:00
Mark Paluch
d8eb0f124a After release cleanups.
See #3769
2021-09-17 09:27:38 +02:00
Mark Paluch
bfeb896c70 Prepare next development iteration.
See #3769
2021-09-17 09:27:35 +02:00
Mark Paluch
efffc936fa Release version 3.2.5 (2021.0.5).
See #3769
2021-09-17 09:18:31 +02:00
Mark Paluch
563a3fb845 Prepare 3.2.5 (2021.0.5).
See #3769
2021-09-17 09:17:28 +02:00
Christoph Strobl
50ae6fd045 Change visibility of PersistentEntitiesFactoryBean.
Closes: #3825
2021-09-15 15:30:10 +02:00
Christoph Strobl
ae0e240334 Move and add tests to UpdateMapper.
Also update author information.

Original Pull Request: #3815
2021-09-13 14:36:15 +02:00
divyajnu08
852a461429 Fix update mapping using nested integer keys on map structures.
Closes: #3775
Original Pull Request: #3815
2021-09-13 14:36:05 +02:00
Mark Paluch
2cbed2a052 Upgrade to Maven Wrapper 3.8.2.
See #3819
2021-09-10 15:39:33 +02:00
Mark Paluch
95667edec3 Reduce allocations in query and update mapping.
Introduce EmptyDocument and utility methods in BsonUtils. Avoid entrySet and iterator creation for document iterations/inspections.

Original Pull Request: #3809
2021-09-09 08:21:22 +02:00
Mark Paluch
c1a52de8e5 Introduce SessionSynchronization.NEVER to disable transactional participation.
SessionSynchronization.NEVER bypasses all transactional integration for applications that do not want to use transactions, avoiding the overhead of transaction inspection.

Original Pull Request: #3809
2021-09-09 08:14:16 +02:00
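
A minimal sketch of opting a template out of transactional participation; the connection string and database name are placeholders:

import com.mongodb.client.MongoClients;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SessionSynchronization;

class TemplateConfiguration {

    static MongoTemplate mongoTemplate() {
        MongoTemplate template = new MongoTemplate(MongoClients.create("mongodb://localhost:27017"), "test");
        // Skip transaction inspection entirely for applications that never use transactions.
        template.setSessionSynchronization(SessionSynchronization.NEVER);
        return template;
    }
}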
Christoph Strobl
7e94c1bdc3 Fix slice argument in query fields projection.
We now use a Collection instead of an Array to pass on $slice projection values for offset and limit.

Closes: #3811
Original pull request: #3812.
2021-09-08 14:47:41 +02:00
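
For reference, a query using a $slice field projection with offset and limit; the field names are assumptions:

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class SliceProjectionExample {

    static Query recentComments() {
        Query query = Query.query(Criteria.where("title").is("intro"));
        // Project 5 array elements starting at offset 10; this offset/limit pair
        // is what is now passed on as a Collection rather than an array.
        query.fields().slice("comments", 10, 5);
        return query;
    }
}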
Mark Paluch
c3259e395c Avoid nested Document conversion to primitive types for fields with an explicit write target.
We no longer attempt to convert query Documents into primitive types, avoiding e.g. Document-to-String conversion.

Closes: #3783
Original Pull Request: #3797
2021-09-02 10:23:14 +02:00
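
Assuming "explicit write target" refers to mappings such as @Field(targetType = ...), an affected property could look like this; the Invoice type is hypothetical:

import java.math.BigDecimal;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.FieldType;

class Invoice {

    // Explicit write target: the value is written as Decimal128, and per the
    // commit above, query values for such a path are no longer coerced into
    // primitive types during mapping.
    @Field(targetType = FieldType.DECIMAL128)
    BigDecimal total;
}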
Mark Paluch
3526b6a2d8 Polishing.
Reorder methods. Add since tag. Simplify assertions. Use diamond syntax.

See: #3776
Original pull request: #3777.
2021-08-25 14:58:41 +02:00
Ivan Volzhev
cb70a97ea8 Relax requirement for GeoJsonMultiPoint construction allowing creation using a single point.
Only one point is required per the GeoJSON RFC, and MongoDB works fine with a single point as well.

Closes #3776
Original pull request: #3777.
2021-08-25 14:58:41 +02:00
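
A sketch of constructing a multi-point from a single coordinate, using the list-based constructor and placeholder coordinates:

import java.util.Collections;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint;

class MultiPointExample {

    static GeoJsonMultiPoint singlePointMultiPoint() {
        // A single position is valid GeoJSON; previously at least two points were required.
        return new GeoJsonMultiPoint(Collections.singletonList(new Point(12.492, 41.890)));
    }
}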
Mark Paluch
52415bc702 Polishing.
Update since version. Reformat code.

See: #3761.
2021-08-25 14:33:45 +02:00
sangyongchoi
43de140842 Add Criteria infix functions for maxDistance and minDistance.
Closes: #3761
2021-08-25 14:33:45 +02:00
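
The Kotlin infix functions correspond to the existing Criteria methods; a rough Java equivalent with a placeholder field name and distances:

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.Criteria;

class NearQueryExample {

    static Criteria nearWithDistanceBounds() {
        // Constrain a geo query to results between minDistance and maxDistance (radians here).
        return Criteria.where("location")
                .nearSphere(new Point(12.492, 41.890))
                .minDistance(0.001)
                .maxDistance(0.01);
    }
}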
Mark Paluch
15b000ecce Polishing.
Fix typo in reference docs.

See #3758
2021-08-25 10:15:19 +02:00
Ryan Gibb
e428b9b977 Fix a typo in MongoConverter javadoc.
Original pull request: #3758.
2021-08-25 10:15:19 +02:00
Mark Paluch
6e38610ac1 Polishing.
Fix asterisk callouts.

See #3786
2021-08-24 11:19:41 +02:00
Mark Paluch
e7af70efca Extract Aggregation Framework and GridFS docs in own source files.
Closes #3786
2021-08-24 11:08:52 +02:00
Christoph Strobl
39f5f91261 Change visibility of Reactive/MongoRepositoryFactoryBean setters.
Setters of the FactoryBean should be public.

Closes: #3779
Original pull request: #3780.
2021-08-24 10:26:59 +02:00
Jens Schauder
c48daa6d56 After release cleanups.
See #3735
2021-08-12 11:37:31 +02:00
Jens Schauder
11356cd20f Prepare next development iteration.
See #3735
2021-08-12 11:37:29 +02:00
Jens Schauder
7385262c47 Release version 3.2.4 (2021.0.4).
See #3735
2021-08-12 11:22:50 +02:00
Jens Schauder
259938588a Prepare 3.2.4 (2021.0.4).
See #3735
2021-08-12 11:22:26 +02:00
Mark Paluch
a1b4e3fc55 Run unpaged query using Pageable.unpaged() through QuerydslMongoPredicateExecutor.findAll(…).
We now correctly consider unpaged queries if the Pageable is unpaged.

Closes: #3751
Original Pull Request: #3754
2021-07-26 15:16:30 +02:00
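
A sketch of the affected call path, assuming a hypothetical Person repository and its Querydsl-generated QPerson query type:

import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;
import org.springframework.data.repository.CrudRepository;

// Person is a hypothetical document type; QPerson is its Querydsl-generated query type.
interface PersonRepository extends CrudRepository<Person, String>, QuerydslPredicateExecutor<Person> {
}

class UnpagedLookup {

    static Page<Person> allSmiths(PersonRepository repository) {
        // With the fix, Pageable.unpaged() no longer applies paging restrictions to the query.
        return repository.findAll(QPerson.person.lastname.eq("Smith"), Pageable.unpaged());
    }
}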
Jens Schauder
76479820bc After release cleanups.
See #3682
2021-07-16 11:51:05 +02:00
Jens Schauder
c47bbc4a20 Prepare next development iteration.
See #3682
2021-07-16 11:51:02 +02:00
Jens Schauder
74791d0bca Release version 3.2.3 (2021.0.3).
See #3682
2021-07-16 11:35:22 +02:00
Jens Schauder
f4d2287011 Prepare 3.2.3 (2021.0.3).
See #3682
2021-07-16 11:34:27 +02:00
Jens Schauder
ab6ba194c1 Updated changelog.
See #3682
2021-07-16 11:34:20 +02:00
Mark Paluch
595a346705 Polishing.
Support DBObject and Map as sources for entity materialization and map conversion.

See #3702
Original pull request: #3704.
2021-07-15 10:00:00 +02:00
Christoph Strobl
08c5e5a810 Fix raw document conversion in Collection like properties.
Along the way, make sure map-like structures are converted correctly if they do not come in as a Document, e.g. because they were converted to a plain Map by a post-load or pre-convert event.

Closes #3702
Original pull request: #3704.
2021-07-15 09:59:50 +02:00
Christoph Strobl
f987217c3c Custom Converter should also be applicable for simple types.
This commit fixes a regression that prevented custom converters from being applied to types considered store-native.

Original pull request: #3703.
Fixes #3670
2021-07-15 09:00:35 +02:00
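
A sketch of a writing converter for a type the store already treats as simple; the Instant-to-String conversion is just an illustration:

import java.time.Instant;
import java.util.Collections;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

class ConverterConfiguration {

    @WritingConverter
    enum InstantToStringConverter implements Converter<Instant, String> {
        INSTANCE;

        @Override
        public String convert(Instant source) {
            return source.toString();
        }
    }

    static MongoCustomConversions customConversions() {
        // With the regression fixed, this converter applies even though Instant
        // is considered a store-native simple type.
        return new MongoCustomConversions(Collections.singletonList(InstantToStringConverter.INSTANCE));
    }
}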
Christoph Strobl
92a22978c2 Polishing.
Simplify KeyMapper current property/index setup.

Original Pull Request: #3689
2021-07-06 12:56:10 +02:00
David Julia
2e2e076b5b Fix Regression in generating queries with nested maps with numeric keys.
Maps with numeric keys work as long as only a single map with an integer key is involved, but a query that contains multiple maps with numeric keys fails.

Take the following example of a map called outerMap with numeric keys whose values reference another object holding a map called inner, also with numeric keys: updates that are meant to generate {"$set": {"outerMap.1234.inner.5678": "hello"}} instead generate {"$set": {"outerMap.1234.inner.inner": "hello"}}, repeating the inner map's property name instead of using the integer key value.

This commit adds unit tests both for the UpdateMapper and QueryMapper, which check multiple consecutive maps with numeric keys, and adds a fix in the KeyMapper. Because we cannot easily change the path parsing to somehow parse path parts corresponding to map keys differently, we address the issue in the KeyMapper. We keep track of the partial path corresponding to the current property and use it to skip adding the duplicated property name for the map to the query, and instead add the key.

This is a bit redundant in that we now have both an iterator and an index-based way of accessing the path parts, but it gets the tests passing and fixes the issue without making a large change to the current approach.

Fixes: #3688
Original Pull Request: #3689
2021-07-06 12:05:58 +02:00
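
The failing case from the description, expressed as a plain Update (the path segments are taken from the commit message):

import org.springframework.data.mongodb.core.query.Update;

class NestedNumericKeyUpdate {

    static Update helloUpdate() {
        // Expected mapping: {"$set": {"outerMap.1234.inner.5678": "hello"}}.
        // Before the fix the key "5678" was replaced by the repeated property
        // name, yielding {"$set": {"outerMap.1234.inner.inner": "hello"}}.
        return new Update().set("outerMap.1234.inner.5678", "hello");
    }
}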
Christoph Strobl
0c50d97887 Fix NPE when reading/mapping null value inside collection.
Closes: #3686
2021-07-01 11:16:13 +02:00
Christoph Strobl
c10d4b6af0 Favor ObjectUtils over Objects for equals/hashCode.
Original Pull Request: #3684
2021-06-24 13:43:55 +02:00
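
For reference, the equals/hashCode pattern this favors, sketched on a hypothetical value type:

import org.springframework.util.ObjectUtils;

class ValueHolder {

    private final Object value;

    ValueHolder(Object value) {
        this.value = value;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (!(obj instanceof ValueHolder)) {
            return false;
        }
        // Null-safe and array-aware, unlike java.util.Objects.equals for arrays.
        return ObjectUtils.nullSafeEquals(value, ((ValueHolder) obj).value);
    }

    @Override
    public int hashCode() {
        return ObjectUtils.nullSafeHashCode(value);
    }
}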
Gatto
6644ac6875 Add equals and hashCode to UnwrappedMongoPersistentProperty.
Fixes #3683
Original Pull Request: #3684
2021-06-24 13:42:12 +02:00
Mark Paluch
708def0df1 After release cleanups.
See #3650
2021-06-22 16:05:22 +02:00
Mark Paluch
889e5d52bb Prepare next development iteration.
See #3650
2021-06-22 16:05:19 +02:00
Mark Paluch
8930091b33 Release version 3.2.2 (2021.0.2).
See #3650
2021-06-22 15:52:27 +02:00
Mark Paluch
b1d750efed Prepare 3.2.2 (2021.0.2).
See #3650
2021-06-22 15:51:38 +02:00
Mark Paluch
7a19593f02 Updated changelog.
See #3650
2021-06-22 15:51:33 +02:00
Mark Paluch
9021445ccd Updated changelog.
See #3649
2021-06-22 15:29:52 +02:00
Mark Paluch
db92c37502 Update reference docs to use correct MongoClient.
Closes #3666
2021-06-22 14:37:02 +02:00
larsw
99e5e2596e Add closing quote to GeoJson javadoc.
Closes #3677
2021-06-21 13:58:40 +02:00
Christoph Strobl
d0bf0e2e62 Fix field projection value conversion.
The field projection conversion should actually only map field names and avoid value conversion. In the MongoId case an inclusion parameter (1) was unintentionally converted into its String representation which causes trouble on Mongo 4.4 servers.

Fixes: #3668
Original pull request: #3678.
2021-06-21 13:45:54 +02:00
Christoph Strobl
28efb3afbe Polishing.
Fix typo in class name and make sure MongoTestTemplate uses the configured simple types.

See: #3659
Original pull request: #3661.
2021-06-18 14:27:18 +02:00
Christoph Strobl
99eb849c93 Fix query mapper path resolution for types considered simple ones.
spring-projects/spring-data-commons#2293 changed how PersistentProperty paths are resolved and now considers potentially registered converters for them, which made path resolution fail during the query mapping process.
This commit catches the corresponding exception and continues with the given user input.

Fixes: #3659
Original pull request: #3661.
2021-06-18 14:13:33 +02:00
Christoph Strobl
d33aa682e5 Fix $or / $nor keyword mapping in query mapper.
This commit fixes an issue with the pattern used for detecting $or / $nor which also matched other keywords like $floor.

Closes: #3635
Original pull request: #3637.
2021-06-18 13:49:37 +02:00
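
An example of the keyword in question built through the Criteria API; the field names are placeholders:

import org.springframework.data.mongodb.core.query.Criteria;

class OrCriteriaExample {

    static Criteria activeOrYoung() {
        // Produces a top-level $or; the detection pattern no longer mistakes
        // other keywords such as $floor for $or / $nor.
        return new Criteria().orOperator(
                Criteria.where("active").is(true),
                Criteria.where("age").lt(30));
    }
}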
Mark Paluch
f6db089f6f Polishing.
Add nullability annotation. Return early on null value conversion.

See #3633
Original pull request: #3643.
2021-06-09 14:14:49 +02:00
Christoph Strobl
13ae5e17bb Fix NPE in QueryMapper when trying to apply target type on null value.
Closes #3633
Original pull request: #3643.
2021-06-09 14:14:44 +02:00
Mark Paluch
ee203bf22a Polishing.
Reformat code.

See #3660.
Original pull request: #3662.
2021-06-09 11:34:20 +02:00
Christoph Strobl
990696ba11 Fix conversion for types having a converter registered.
Fixes: #3660
Original pull request: #3662.
2021-06-09 11:33:34 +02:00
Mark Paluch
4bc2f108fe After release cleanups.
See #3629
2021-05-14 12:34:20 +02:00
Mark Paluch
5064ba5b24 Prepare next development iteration.
See #3629
2021-05-14 12:34:16 +02:00
Mark Paluch
1179ded140 Release version 3.2.1 (2021.0.1).
See #3629
2021-05-14 12:23:55 +02:00
Mark Paluch
e12700c00b Prepare 3.2.1 (2021.0.1).
See #3629
2021-05-14 12:23:21 +02:00
Mark Paluch
0507adab20 Updated changelog.
See #3629
2021-05-14 12:23:14 +02:00
Mark Paluch
375ddf8afb Updated changelog.
See #3628
2021-05-14 12:06:39 +02:00
Mark Paluch
283cf06dc1 Introduce template method for easier customization of fragments.
Closes #3638.
2021-04-27 10:46:01 +02:00
Greg L. Turnquist
10a8456581 Authenticate with artifactory.
See #3616.
2021-04-22 15:01:43 -05:00
Clément Petit
e751a43cdf Fix bullet points in aggregation framework reference documentation.
Closes: #3632
2021-04-20 08:22:33 +02:00
Mark Paluch
c987ba5f83 After release cleanups.
See #3616
2021-04-14 14:30:40 +02:00
Mark Paluch
950bae0306 Prepare next development iteration.
See #3616
2021-04-14 14:30:36 +02:00
697 changed files with 16310 additions and 34858 deletions


@@ -1,2 +1,2 @@
#Fri Jun 03 09:32:40 CEST 2022 #Fri Sep 10 15:39:33 CEST 2021
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.5/apache-maven-3.8.5-bin.zip distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.2/apache-maven-3.8.2-bin.zip


@@ -1,6 +1,6 @@
= Continuous Integration = Continuous Integration
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Moore%20(main)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]


@@ -1,3 +1,3 @@
= Spring Data contribution guidelines = Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/main/CONTRIBUTING.adoc[here]. You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

Jenkinsfile (138 changed lines)

@@ -9,7 +9,7 @@ pipeline {
triggers { triggers {
pollSCM 'H/10 * * * *' pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/main", threshold: hudson.model.Result.SUCCESS) upstream(upstreamProjects: "spring-data-commons/2.5.x", threshold: hudson.model.Result.SUCCESS)
} }
options { options {
@@ -20,10 +20,10 @@ pipeline {
stages { stages {
stage("Docker images") { stage("Docker images") {
parallel { parallel {
stage('Publish JDK (Java 17) + MongoDB 4.4') { stage('Publish JDK (main) + MongoDB 4.0') {
when { when {
anyOf { anyOf {
changeset "ci/openjdk17-mongodb-4.4/**" changeset "ci/openjdk8-mongodb-4.0/**"
changeset "ci/pipeline.properties" changeset "ci/pipeline.properties"
} }
} }
@@ -32,17 +32,17 @@ pipeline {
steps { steps {
script { script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/") def image = docker.build("springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.0.version']} ci/openjdk8-mongodb-4.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) { docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push() image.push()
} }
} }
} }
} }
stage('Publish JDK (Java 17) + MongoDB 5.0') { stage('Publish JDK (main) + MongoDB 4.4') {
when { when {
anyOf { anyOf {
changeset "ci/openjdk17-mongodb-5.0/**" changeset "ci/openjdk8-mongodb-4.4/**"
changeset "ci/pipeline.properties" changeset "ci/pipeline.properties"
} }
} }
@@ -51,26 +51,24 @@ pipeline {
steps { steps {
script { script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk17-mongodb-5.0/") def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk8-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) { docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push() image.push()
} }
} }
} }
} }
stage('Publish JDK (Java 17) + MongoDB 6.0') { stage('Publish JDK 15 + MongoDB 4.4') {
when { when {
anyOf { changeset "ci/openjdk15-mongodb-4.4/**"
changeset "ci/openjdk17-mongodb-6.0/**"
changeset "ci/pipeline.properties" changeset "ci/pipeline.properties"
} }
}
agent { label 'data' } agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') } options { timeout(time: 30, unit: 'MINUTES') }
steps { steps {
script { script {
def image = docker.build("springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.6.0.version']} ci/openjdk17-mongodb-6.0/") def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.15.tag']}", "--build-arg BASE=${p['docker.java.15.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk15-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) { docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push() image.push()
} }
@@ -80,11 +78,10 @@ pipeline {
} }
} }
stage("test: baseline (Java 17)") { stage("test: baseline (main)") {
when { when {
beforeAgent(true)
anyOf { anyOf {
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP") branch '3.2.x'
not { triggeredBy 'UpstreamCause' } not { triggeredBy 'UpstreamCause' }
} }
} }
@@ -93,10 +90,43 @@ pipeline {
} }
options { timeout(time: 30, unit: 'MINUTES') } options { timeout(time: 30, unit: 'MINUTES') }
environment { environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}") ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
} }
steps { steps {
script { script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
}
stage("Test other configurations") {
when {
allOf {
branch '3.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
parallel {
stage("test: mongodb 4.4 (main)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) { docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log' sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &' sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
@@ -108,68 +138,38 @@ pipeline {
} }
} }
} }
stage("Test other configurations") {
when {
beforeAgent(true)
allOf {
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP")
not { triggeredBy 'UpstreamCause' }
} }
}
parallel {
stage("test: MongoDB 5.0 (Java 17)") { stage("test: baseline (jdk15)") {
agent { agent {
label 'data' label 'data'
} }
options { timeout(time: 30, unit: 'MINUTES') } options { timeout(time: 30, unit: 'MINUTES') }
environment { environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}") ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
} }
steps { steps {
script { script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) { docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.15.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log' sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &' sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10' sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"' sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15' sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B' sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
} }
} }
} }
} }
stage("test: MongoDB 6.0 (Java 17)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-6.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongosh --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
} }
} }
} }
stage('Release to artifactory') { stage('Release to artifactory') {
when { when {
beforeAgent(true)
anyOf { anyOf {
branch(pattern: "main|(\\d\\.\\d\\.x)", comparator: "REGEXP") branch '3.2.x'
not { triggeredBy 'UpstreamCause' } not { triggeredBy 'UpstreamCause' }
} }
} }
@@ -179,13 +179,13 @@ pipeline {
options { timeout(time: 20, unit: 'MINUTES') } options { timeout(time: 20, unit: 'MINUTES') }
environment { environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}") ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
} }
steps { steps {
script { script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) { docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -v'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' + sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' + '-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " + "-Dartifactory.username=${ARTIFACTORY_USR} " +
@@ -200,6 +200,36 @@ pipeline {
} }
} }
stage('Publish documentation') {
when {
branch '3.2.x'
}
agent {
label 'data'
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.distribution-repository=temp-private-local " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
}
}
}
}
post { post {
changed { changed {
script { script {


@@ -1,19 +1,17 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://spring.io/projects/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://spring.io/projects/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]] = Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://spring.io/projects/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services. The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database. The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
[[code-of-conduct]]
== Code of Conduct == Code of Conduct
This project is governed by the https://github.com/spring-projects/.github/blob/e3cc2ff230d8f1dca06535aa6b5a4a23815861d4/CODE_OF_CONDUCT.md[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io. This project is governed by the https://github.com/spring-projects/.github/blob/e3cc2ff230d8f1dca06535aa6b5a4a23815861d4/CODE_OF_CONDUCT.md[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
[[getting-started]]
== Getting Started == Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java: Here is a quick teaser of an application using Spring Data Repositories in Java:
@@ -61,7 +59,6 @@ class ApplicationConfig extends AbstractMongoClientConfiguration {
} }
---- ----
[[maven-configuration]]
=== Maven configuration === Maven configuration
Add the Maven dependency: Add the Maven dependency:
@@ -71,25 +68,24 @@ Add the Maven dependency:
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>${version}</version> <version>${version}.RELEASE</version>
</dependency> </dependency>
---- ----
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
and declare the appropriate dependency version.
[source,xml] [source,xml]
---- ----
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>${version}-SNAPSHOT</version> <version>${version}.BUILD-SNAPSHOT</version>
</dependency> </dependency>
<repository> <repository>
<id>spring-snapshot</id> <id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name> <name>Spring Snapshot Repository</name>
<url>https://repo.spring.io/snapshot</url> <url>https://repo.spring.io/libs-snapshot</url>
</repository> </repository>
---- ----
@@ -102,7 +98,7 @@ Some of the changes affect the initial setup configuration as well as compile/ru
.Changed XML Namespace Elements and Attributes: .Changed XML Namespace Elements and Attributes:
|=== |===
| Element / Attribute | 2.x | 3.x Element / Attribute | 2.x | 3.x
| `<mongo:mongo-client />` | `<mongo:mongo-client />`
| Used to create a `com.mongodb.MongoClient` | Used to create a `com.mongodb.MongoClient`
@@ -115,12 +111,12 @@ Use `<mongo:client-settings cluster-hosts="..." />` instead
| `<mongo:db-factory writeConcern="..." />` | `<mongo:db-factory writeConcern="..." />`
| NONE, NORMAL, SAFE, FSYNC_SAFE, REPLICAS_SAFE, MAJORITY | NONE, NORMAL, SAFE, FSYNC_SAFE, REPLICAS_SAFE, MAJORITY
| W1, W2, W3, UNACKNOWLEDGED, ACKNOWLEDGED, JOURNALED, MAJORITY | W1, W2, W3, UNAKNOWLEDGED, AKNOWLEDGED, JOURNALED, MAJORITY
|=== |===
.Removed XML Namespace Elements and Attributes: .Removed XML Namespace Elements and Attributes:
|=== |===
| Element / Attribute | Replacement in 3.x | Comment Element / Attribute | Replacement in 3.x | Comment
| `<mongo:db-factory mongo-ref="..." />` | `<mongo:db-factory mongo-ref="..." />`
| `<mongo:db-factory mongo-client-ref="..." />` | `<mongo:db-factory mongo-client-ref="..." />`
@@ -137,7 +133,7 @@ Use `<mongo:client-settings cluster-hosts="..." />` instead
.New XML Namespace Elements and Attributes: .New XML Namespace Elements and Attributes:
|=== |===
| Element | Comment Element | Comment
| `<mongo:db-factory mongo-client-ref="..." />` | `<mongo:db-factory mongo-client-ref="..." />`
| Replacement for `<mongo:db-factory mongo-ref="..." />` | Replacement for `<mongo:db-factory mongo-ref="..." />`
@@ -157,7 +153,7 @@ Use `<mongo:client-settings cluster-hosts="..." />` instead
.Java API changes .Java API changes
|=== |===
| Type | Comment Type | Comment
| `MongoClientFactoryBean` | `MongoClientFactoryBean`
| Creates `com.mongodb.client.MongoClient` instead of `com.mongodb.MongoClient` + | Creates `com.mongodb.client.MongoClient` instead of `com.mongodb.MongoClient` +
@@ -178,7 +174,7 @@ Uses `MongoClientSettings` instead of `MongoClientOptions`.
.Removed Java API: .Removed Java API:
|=== |===
| 2.x | Replacement in 3.x | Comment 2.x | Replacement in 3.x | Comment
| `MongoClientOptionsFactoryBean` | `MongoClientOptionsFactoryBean`
| `MongoClientSettingsFactoryBean` | `MongoClientSettingsFactoryBean`
@@ -230,7 +226,6 @@ static class Config extends AbstractMongoClientConfiguration {
---- ----
==== ====
[[getting-help]]
== Getting Help == Getting Help
Having trouble with Spring Data? We'd love to help! Having trouble with Spring Data? We'd love to help!
@@ -244,7 +239,6 @@ If you are just starting out with Spring, try one of the https://spring.io/guide
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter]. You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://github.com/spring-projects/spring-data-mongodb/issues[github.com/spring-projects/spring-data-mongodb/issues]. * Report bugs with Spring Data MongoDB at https://github.com/spring-projects/spring-data-mongodb/issues[github.com/spring-projects/spring-data-mongodb/issues].
[[reporting-issues]]
== Reporting Issues == Reporting Issues
Spring Data uses Github as issue tracking system to record bugs and feature requests. Spring Data uses Github as issue tracking system to record bugs and feature requests.
@@ -255,96 +249,19 @@ If you want to raise an issue, please follow the recommendations below:
* Please provide as much information as possible with the issue report, we like to know the version of Spring Data that you are using, the JVM version, Stacktrace, etc. * Please provide as much information as possible with the issue report, we like to know the version of Spring Data that you are using, the JVM version, Stacktrace, etc.
* If you need to paste code, or include a stack trace use https://guides.github.com/features/mastering-markdown/[Markdown] code fences +++```+++. * If you need to paste code, or include a stack trace use https://guides.github.com/features/mastering-markdown/[Markdown] code fences +++```+++.
[[guides]]
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
[[examples]]
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
[[building-from-source]]
== Building from Source == Building from Source
You do not need to build from source to use Spring Data. Binaries are available in https://repo.spring.io[repo.spring.io] You don't need to build from source to use Spring Data (binaries in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
and accessible from Maven using the Maven configuration noted <<maven-configuration,above>>. You also need JDK 1.8.
NOTE: Configuration for Gradle is similar to Maven.
The best way to get started is by creating a Spring Boot project using MongoDB on https://start.spring.io[start.spring.io].
Follow this https://start.spring.io/#type=maven-project&language=java&platformVersion=3.0.0&packaging=jar&jvmVersion=17&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb[link]
to build an imperative application and this https://start.spring.io/#type=maven-project&language=java&platformVersion=3.0.0&packaging=jar&jvmVersion=17&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb-reactive[link]
to build a reactive one.
However, if you want to try out the latest and greatest, Spring Data MongoDB can be easily built with the https://github.com/takari/maven-wrapper[Maven wrapper]
and minimally, JDK 17 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
In order to build Spring Data MongoDB, you will need to https://www.mongodb.com/try/download/community[download]
and https://docs.mongodb.com/manual/installation/[install a MongoDB distribution].
Once you have installed MongoDB, you need to start a MongoDB server. It is convenient to set an environment variable to
your MongoDB installation directory (e.g. `MONGODB_HOME`).
To run the full test suite, a https://docs.mongodb.com/manual/tutorial/deploy-replica-set/[MongoDB Replica Set]
is required.
To run the MongoDB server enter the following command from a command-line:
[source,bash]
----
$ $MONGODB_HOME/bin/mongod --dbpath $MONGODB_HOME/runtime/data --ipv6 --port 27017 --replSet rs0
...
"msg":"Successfully connected to host"
----
Once the MongoDB server starts up, you should see the message (`msg`), "_Successfully connected to host_".
Notice the `--dbpath` option to the `mongod` command. You can set this to anything you like, but in this case, we set
the absolute path to a sub-directory (`runtime/data/`) under the MongoDB installation directory (in `$MONGODB_HOME`).
You need to initialize the MongoDB replica set only once on the first time the MongoDB server is started.
To initialize the replica set, start a mongo client:
[source,bash]
----
$ $MONGODB_HOME/bin/mongo
MongoDB server version: 5.0.0
...
----
Then enter the following command:
[source,bash]
----
mongo> rs.initiate({ _id: 'rs0', members: [ { _id: 0, host: '127.0.0.1:27017' } ] })
----
Finally, on UNIX-based system (for example, Linux or Mac OS X) you may need to adjust the `ulimit`.
In case you need to, you can adjust the `ulimit` with the following command (32768 is just a recommendation):
[source,bash]
----
$ ulimit -n 32768
----
You can use `ulimit -a` again to verify the `ulimit` for "_open files_" was set appropriately.
Now you are ready to build Spring Data MongoDB. Simply enter the following `mvnw` (Maven Wrapper) command:
[source,bash] [source,bash]
---- ----
$ ./mvnw clean install $ ./mvnw clean install
---- ----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.8.0 or above]. If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular, please sign _Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular please sign the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
=== Building reference documentation === Building reference documentation
@@ -357,7 +274,17 @@ Building the documentation builds also the project without running tests.
The generated documentation is available from `target/site/reference/html/index.html`. The generated documentation is available from `target/site/reference/html/index.html`.
[[license]] == Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
== License == License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license]. Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].


@@ -7,15 +7,12 @@ ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \ RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \ apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \ apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \ echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \ echo ${TZ} > /etc/timezone;
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \ RUN apt-get update ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \ apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean && \ apt-get clean; \
rm -rf /var/lib/apt/lists/* rm -rf /var/lib/apt/lists/*;


@@ -0,0 +1,18 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \
echo ${TZ} > /etc/timezone;
RUN apt-get update ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean; \
rm -rf /var/lib/apt/lists/*;


@@ -1,23 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 5.0 release signing key
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv B00A0BD1E2C63C11 && \
# Needed when MongoDB creates a 5.0 folder.
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/5.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-5.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*


@@ -1,23 +0,0 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 6.0 release signing key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | apt-key add - && \
# Needed when MongoDB creates a 6.0 folder.
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/6.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-6.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*


@@ -0,0 +1,18 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN RUN set -eux; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list; \
echo ${TZ} > /etc/timezone;
RUN apt-get update ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean; \
rm -rf /var/lib/apt/lists/*;


@@ -0,0 +1,20 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \
echo ${TZ} > /etc/timezone;
RUN apt-get update ; \
ln -T /bin/true /usr/bin/systemctl ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
rm /usr/bin/systemctl ; \
apt-get clean ; \
rm -rf /var/lib/apt/lists/* ;


@@ -1,19 +1,23 @@
# Java versions # Java versions
java.main.tag=17.0.2_8-jdk java.main.tag=8u312-b07-jdk
java.11.tag=11.0.13_8-jdk
java.15.tag=15.0.2_7-jdk-hotspot
# Docker container images - standard # Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag} docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.11.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.11.tag}
docker.java.15.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/adoptopenjdk:${java.15.tag}
# Supported versions of MongoDB # Supported versions of MongoDB
docker.mongodb.4.4.version=4.4.12 docker.mongodb.4.0.version=4.0.23
docker.mongodb.5.0.version=5.0.6 docker.mongodb.4.4.version=4.4.4
docker.mongodb.6.0.version=6.0.0 docker.mongodb.5.0.version=5.0.3
# Supported versions of Redis # Supported versions of Redis
docker.redis.6.version=6.2.6 docker.redis.6.version=6.2.4
# Supported versions of Cassandra # Supported versions of Cassandra
docker.cassandra.3.version=3.11.12 docker.cassandra.3.version=3.11.10
# Docker environment settings # Docker environment settings
docker.java.inside.basic=-v $HOME:/tmp/jenkins-home docker.java.inside.basic=-v $HOME:/tmp/jenkins-home

pom.xml (28 changed lines)

@@ -5,17 +5,17 @@
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-RC1</version> <version>3.2.9</version>
<packaging>pom</packaging> <packaging>pom</packaging>
<name>Spring Data MongoDB</name> <name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description> <description>MongoDB support for Spring Data</description>
<url>https://spring.io/projects/spring-data-mongodb</url> <url>https://projects.spring.io/spring-data-mongodb</url>
<parent> <parent>
<groupId>org.springframework.data.build</groupId> <groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId> <artifactId>spring-data-parent</artifactId>
<version>3.0.0-RC1</version> <version>2.5.9</version>
</parent> </parent>
<modules> <modules>
@@ -26,8 +26,8 @@
<properties> <properties>
<project.type>multi</project.type> <project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id> <dist.id>spring-data-mongodb</dist.id>
<springdata.commons>3.0.0-RC1</springdata.commons> <springdata.commons>2.5.9</springdata.commons>
<mongo>4.8.0-beta0</mongo> <mongo>4.2.3</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams> <mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version> <jmh.version>1.19</jmh.version>
</properties> </properties>
@@ -112,17 +112,6 @@
</developer> </developer>
</developers> </developers>
<scm>
<connection>scm:git:https://github.com/spring-projects/spring-data-mongodb.git</connection>
<developerConnection>scm:git:git@github.com:spring-projects/spring-data-mongodb.git</developerConnection>
<url>https://github.com/spring-projects/spring-data-mongodb</url>
</scm>
<issueManagement>
<system>GitHub</system>
<url>https://github.com/spring-projects/spring-data-mongodb/issues</url>
</issueManagement>
<profiles> <profiles>
<profile> <profile>
<id>benchmarks</id> <id>benchmarks</id>
@@ -145,11 +134,8 @@
<repositories> <repositories>
<repository> <repository>
<id>spring-libs-milestone</id> <id>spring-libs-release</id>
<url>https://repo.spring.io/libs-milestone</url> <url>https://repo.spring.io/libs-release</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository> </repository>
<repository> <repository>
<id>sonatype-libs-snapshot</id> <id>sonatype-libs-snapshot</id>


@@ -7,7 +7,7 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-RC1</version> <version>3.2.9</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>


@@ -322,7 +322,7 @@ public class AbstractMicrobenchmark {
try { try {
ResultsWriter.forUri(uri).write(results); ResultsWriter.forUri(uri).write(results);
} catch (Exception e) { } catch (Exception e) {
System.err.println(String.format("Cannot save benchmark results to '%s'; Error was %s", uri, e)); System.err.println(String.format("Cannot save benchmark results to '%s'. Error was %s.", uri, e));
} }
} }
} }


@@ -1,6 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
@@ -15,18 +14,13 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-RC1</version> <version>3.2.9</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>
<properties> <properties>
<project.root>${basedir}/..</project.root> <project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key> <dist.key>SDMONGO</dist.key>
<!-- Observability -->
<micrometer-docs-generator.inputPath>${maven.multiModuleProjectDirectory}/spring-data-mongodb/</micrometer-docs-generator.inputPath>
<micrometer-docs-generator.inclusionPattern>.*</micrometer-docs-generator.inclusionPattern>
<micrometer-docs-generator.outputPath>${maven.multiModuleProjectDirectory}/target/</micrometer-docs-generator.outputPath>
</properties> </properties>
<build> <build>
@@ -35,63 +29,12 @@
<groupId>org.apache.maven.plugins</groupId> <groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId> <artifactId>maven-assembly-plugin</artifactId>
</plugin> </plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<execution>
<id>generate-metrics-metadata</id>
<phase>prepare-package</phase>
<goals>
<goal>java</goal>
</goals>
<configuration>
<mainClass>io.micrometer.docs.metrics.DocsFromSources
</mainClass>
</configuration>
</execution>
<execution>
<id>generate-tracing-metadata</id>
<phase>prepare-package</phase>
<goals>
<goal>java</goal>
</goals>
<configuration>
<mainClass>io.micrometer.docs.spans.DocsFromSources
</mainClass>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-docs-generator-spans</artifactId>
<version>${micrometer-docs-generator}</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-docs-generator-metrics</artifactId>
<version>${micrometer-docs-generator}</version>
<type>jar</type>
</dependency>
</dependencies>
<configuration>
<includePluginDependencies>true</includePluginDependencies>
<arguments>
<argument>${micrometer-docs-generator.inputPath}</argument>
<argument>${micrometer-docs-generator.inclusionPattern}</argument>
<argument>${micrometer-docs-generator.outputPath}</argument>
</arguments>
</configuration>
</plugin>
<plugin> <plugin>
<groupId>org.asciidoctor</groupId> <groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId> <artifactId>asciidoctor-maven-plugin</artifactId>
<configuration> <configuration>
<attributes> <attributes>
<mongo-reactivestreams>${mongo.reactivestreams} <mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
</mongo-reactivestreams>
<reactor>${reactor}</reactor> <reactor>${reactor}</reactor>
</attributes> </attributes>
</configuration> </configuration>
@@ -100,15 +43,4 @@
</build> </build>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-plugins-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</pluginRepository>
</pluginRepositories>
</project> </project>

View File

@@ -1,7 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
@@ -13,7 +11,7 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-RC1</version> <version>3.2.9</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>
@@ -89,13 +87,6 @@
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
<version>3.0.2</version>
<optional>true</optional>
</dependency>
<!-- reactive --> <!-- reactive -->
<dependency> <dependency>
@@ -124,6 +115,27 @@
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<dependency> <dependency>
<groupId>io.reactivex.rxjava3</groupId> <groupId>io.reactivex.rxjava3</groupId>
<artifactId>rxjava</artifactId> <artifactId>rxjava</artifactId>
@@ -133,6 +145,12 @@
<!-- CDI --> <!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 --> <!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jcdi_2.0_spec</artifactId>
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency> <dependency>
<groupId>javax.interceptor</groupId> <groupId>javax.interceptor</groupId>
@@ -142,48 +160,31 @@
</dependency> </dependency>
<dependency> <dependency>
<groupId>jakarta.enterprise</groupId> <groupId>javax.enterprise</groupId>
<artifactId>jakarta.enterprise.cdi-api</artifactId> <artifactId>cdi-api</artifactId>
<version>${cdi}</version> <version>${cdi}</version>
<scope>provided</scope> <scope>provided</scope>
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
<dependency> <dependency>
<groupId>jakarta.annotation</groupId> <groupId>javax.annotation</groupId>
<artifactId>jakarta.annotation-api</artifactId> <artifactId>javax.annotation-api</artifactId>
<version>${jakarta-annotation-api}</version> <version>${javax-annotation-api}</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.apache.openwebbeans</groupId> <groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-se</artifactId> <artifactId>openwebbeans-se</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-spi</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-impl</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version> <version>${webbeans}</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<!-- JSR 303 Validation --> <!-- JSR 303 Validation -->
<dependency> <dependency>
<groupId>jakarta.validation</groupId> <groupId>javax.validation</groupId>
<artifactId>jakarta.validation-api</artifactId> <artifactId>validation-api</artifactId>
<version>${validation}</version> <version>${validation}</version>
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
@@ -196,37 +197,30 @@
</dependency> </dependency>
<dependency> <dependency>
<groupId>io.micrometer</groupId> <groupId>org.hibernate</groupId>
<artifactId>micrometer-observation</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.hibernate.validator</groupId>
<artifactId>hibernate-validator</artifactId> <artifactId>hibernate-validator</artifactId>
<version>7.0.1.Final</version> <version>5.4.3.Final</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency> <dependency>
<groupId>jakarta.el</groupId> <groupId>org.glassfish</groupId>
<artifactId>jakarta.el-api</artifactId> <artifactId>javax.el</artifactId>
<version>4.0.0</version> <version>3.0.1-b11</version>
<scope>provided</scope> <scope>test</scope>
<optional>true</optional>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.glassfish</groupId> <groupId>joda-time</groupId>
<artifactId>jakarta.el</artifactId> <artifactId>joda-time</artifactId>
<version>4.0.2</version> <version>${jodatime}</version>
<scope>provided</scope> <scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
@@ -236,6 +230,13 @@
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency> <dependency>
<groupId>nl.jqno.equalsverifier</groupId> <groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId> <artifactId>equalsverifier</artifactId>
@@ -271,9 +272,9 @@
</dependency> </dependency>
<dependency> <dependency>
<groupId>jakarta.transaction</groupId> <groupId>javax.transaction</groupId>
<artifactId>jakarta.transaction-api</artifactId> <artifactId>jta</artifactId>
<version>2.0.0</version> <version>1.1</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
@@ -304,43 +305,11 @@
<dependency> <dependency>
<groupId>io.mockk</groupId> <groupId>io.mockk</groupId>
<artifactId>mockk-jvm</artifactId> <artifactId>mockk</artifactId>
<version>${mockk}</version> <version>${mockk}</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>com.github.tomakehurst</groupId>
<artifactId>wiremock-jre8-standalone</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.micrometer</groupId>
<artifactId>micrometer-tracing-integration-test</artifactId>
<scope>test</scope>
</dependency>
<!-- jMolecules -->
<dependency>
<groupId>org.jmolecules</groupId>
<artifactId>jmolecules-ddd</artifactId>
<version>${jmolecules}</version>
<scope>test</scope>
</dependency>
</dependencies> </dependencies>
<build> <build>
@@ -365,11 +334,8 @@
<goal>test-process</goal> <goal>test-process</goal>
</goals> </goals>
<configuration> <configuration>
<outputDirectory>target/generated-test-sources <outputDirectory>target/generated-test-sources</outputDirectory>
</outputDirectory> <processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
<processor>
org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
</processor>
</configuration> </configuration>
</execution> </execution>
</executions> </executions>
@@ -389,9 +355,7 @@
<exclude>**/ReactivePerformanceTests.java</exclude> <exclude>**/ReactivePerformanceTests.java</exclude>
</excludes> </excludes>
<systemPropertyVariables> <systemPropertyVariables>
<java.util.logging.config.file> <java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
src/test/resources/logging.properties
</java.util.logging.config.file>
<reactor.trace.cancel>true</reactor.trace.cancel> <reactor.trace.cancel>true</reactor.trace.cancel>
</systemPropertyVariables> </systemPropertyVariables>
</configuration> </configuration>

View File

@@ -31,7 +31,7 @@ import org.springframework.util.StringUtils;
* expression. The expression will be wrapped within <code>{ ... }</code> if necessary. The actual parsing and parameter * expression. The expression will be wrapped within <code>{ ... }</code> if necessary. The actual parsing and parameter
* binding of placeholders like {@code ?0} is delayed upon first call on the target {@link Document} via * binding of placeholders like {@code ?0} is delayed upon first call on the target {@link Document} via
* {@link #toDocument()}. * {@link #toDocument()}.
* <br /> * <p />
* *
* <pre class="code"> * <pre class="code">
* $toUpper : $name -> { '$toUpper' : '$name' } * $toUpper : $name -> { '$toUpper' : '$name' }
@@ -103,11 +103,19 @@ public class BindableMongoExpression implements MongoExpression {
return new BindableMongoExpression(expressionString, codecRegistryProvider, args); return new BindableMongoExpression(expressionString, codecRegistryProvider, args);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoExpression#toDocument()
*/
@Override @Override
public Document toDocument() { public Document toDocument() {
return target.get(); return target.get();
} }
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override @Override
public String toString() { public String toString() {
return "BindableMongoExpression{" + "expressionString='" + expressionString + '\'' + ", args=" return "BindableMongoExpression{" + "expressionString='" + expressionString + '\'' + ", args="

View File

@@ -62,7 +62,7 @@ public interface CodecRegistryProvider {
*/ */
default <T> Optional<Codec<T>> getCodecFor(Class<T> type) { default <T> Optional<Codec<T>> getCodecFor(Class<T> type) {
Assert.notNull(type, "Type must not be null"); Assert.notNull(type, "Type must not be null!");
try { try {
return Optional.of(getCodecRegistry().get(type)); return Optional.of(getCodecRegistry().get(type));
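
A hedged illustration of the default method shown above; the provider instance is assumed to come from existing configuration:

import org.bson.Document;
import org.bson.codecs.Codec;

import org.springframework.data.mongodb.CodecRegistryProvider;

class CodecLookupSketch {

	void logCodec(CodecRegistryProvider provider) {
		// getCodecFor(...) yields Optional.empty() when no codec is registered for the type.
		provider.getCodecFor(Document.class)
				.map(Codec::getEncoderClass)
				.ifPresent(type -> System.out.println("Codec available for " + type.getName()));
	}
}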

View File

@@ -20,8 +20,8 @@ import org.springframework.util.StringUtils;
/** /**
* Helper class featuring helper methods for working with MongoDb collections. * Helper class featuring helper methods for working with MongoDb collections.
* <br /> * <p/>
* <br /> * <p/>
* Mainly intended for internal use within the framework. * Mainly intended for internal use within the framework.
* *
* @author Thomas Risberg * @author Thomas Risberg

View File

@@ -30,7 +30,7 @@ import com.mongodb.client.MongoDatabase;
* Helper class for managing a {@link MongoDatabase} instances via {@link MongoDatabaseFactory}. Used for obtaining * Helper class for managing a {@link MongoDatabase} instances via {@link MongoDatabaseFactory}. Used for obtaining
* {@link ClientSession session bound} resources, such as {@link MongoDatabase} and * {@link ClientSession session bound} resources, such as {@link MongoDatabase} and
* {@link com.mongodb.client.MongoCollection} suitable for transactional usage. * {@link com.mongodb.client.MongoCollection} suitable for transactional usage.
* <br /> * <p />
* <strong>Note:</strong> Intended for internal usage only. * <strong>Note:</strong> Intended for internal usage only.
* *
* @author Christoph Strobl * @author Christoph Strobl
@@ -43,7 +43,7 @@ public class MongoDatabaseUtils {
/** /**
* Obtain the default {@link MongoDatabase database} form the given {@link MongoDatabaseFactory factory} using * Obtain the default {@link MongoDatabase database} form the given {@link MongoDatabaseFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}. * {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -56,7 +56,7 @@ public class MongoDatabaseUtils {
/** /**
* Obtain the default {@link MongoDatabase database} form the given {@link MongoDatabaseFactory factory}. * Obtain the default {@link MongoDatabase database} form the given {@link MongoDatabaseFactory factory}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -71,7 +71,7 @@ public class MongoDatabaseUtils {
/** /**
* Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDatabaseFactory factory} using * Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDatabaseFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}. * {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -85,7 +85,7 @@ public class MongoDatabaseUtils {
/** /**
* Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDatabaseFactory factory}. * Obtain the {@link MongoDatabase database} with given name form the given {@link MongoDatabaseFactory factory}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -102,7 +102,7 @@ public class MongoDatabaseUtils {
private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDatabaseFactory factory, private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) { SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "Factory must not be null"); Assert.notNull(factory, "Factory must not be null!");
if (sessionSynchronization == SessionSynchronization.NEVER if (sessionSynchronization == SessionSynchronization.NEVER
|| !TransactionSynchronizationManager.isSynchronizationActive()) { || !TransactionSynchronizationManager.isSynchronizationActive()) {
@@ -193,11 +193,19 @@ public class MongoDatabaseUtils {
this.resourceHolder = resourceHolder; this.resourceHolder = resourceHolder;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override @Override
protected boolean shouldReleaseBeforeCompletion() { protected boolean shouldReleaseBeforeCompletion() {
return false; return false;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override @Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) { protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
@@ -206,6 +214,10 @@ public class MongoDatabaseUtils {
} }
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override @Override
public void afterCompletion(int status) { public void afterCompletion(int status) {
@@ -216,6 +228,10 @@ public class MongoDatabaseUtils {
super.afterCompletion(status); super.afterCompletion(status);
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override @Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) { protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {
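
As the Javadoc above notes, callers should obtain the database through this helper so that an active transaction is honoured. A minimal sketch, assuming a configured MongoDatabaseFactory and an illustrative "orders" collection:

import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoDatabaseUtils;
import org.springframework.data.mongodb.SessionSynchronization;

import com.mongodb.client.MongoDatabase;

class DatabaseLookupSketch {

	long countOrders(MongoDatabaseFactory factory) {
		// Joins an ongoing MongoDB transaction if synchronization is active on the current thread.
		MongoDatabase database = MongoDatabaseUtils.getDatabase(factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
		return database.getCollection("orders").countDocuments();
	}
}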

View File

@@ -0,0 +1,57 @@
/*
* Copyright 2011-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 3.0, use {@link MongoDatabaseFactory} instead.
*/
@Deprecated
public interface MongoDbFactory extends MongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
*
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase()} instead.
*/
@Deprecated
default MongoDatabase getDb() throws DataAccessException {
return getMongoDatabase();
}
/**
* Obtain a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase(String)} instead.
*/
@Deprecated
default MongoDatabase getDb(String dbName) throws DataAccessException {
return getMongoDatabase(dbName);
}
}
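
Since the deprecated methods above simply delegate, migrating callers is a drop-in change; a sketch, with the factory wiring assumed to exist elsewhere:

import org.springframework.data.mongodb.MongoDatabaseFactory;

import com.mongodb.client.MongoDatabase;

class FactoryMigrationSketch {

	MongoDatabase database(MongoDatabaseFactory factory) {
		// Replaces the deprecated MongoDbFactory#getDb() call.
		return factory.getMongoDatabase();
	}
}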

View File

@@ -18,7 +18,7 @@ package org.springframework.data.mongodb;
/** /**
* Wrapper object for MongoDB expressions like {@code $toUpper : $name} that manifest as {@link org.bson.Document} when * Wrapper object for MongoDB expressions like {@code $toUpper : $name} that manifest as {@link org.bson.Document} when
* passed on to the driver. * passed on to the driver.
* <br /> * <p />
* A set of predefined {@link MongoExpression expressions}, including a * A set of predefined {@link MongoExpression expressions}, including a
* {@link org.springframework.data.mongodb.core.aggregation.AggregationSpELExpression SpEL based variant} for method * {@link org.springframework.data.mongodb.core.aggregation.AggregationSpELExpression SpEL based variant} for method
* like expressions (eg. {@code toUpper(name)}) are available via the * like expressions (eg. {@code toUpper(name)}) are available via the
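
A hedged sketch of the two entry points mentioned in the Javadoc above; the expression strings are illustrative only:

import org.springframework.data.mongodb.MongoExpression;
import org.springframework.data.mongodb.core.aggregation.AggregationSpELExpression;

class ExpressionSketch {

	void expressions() {
		// Raw MongoDB expression, rendered as a Document once handed to the driver.
		MongoExpression raw = MongoExpression.create("{ '$toUpper' : '$name' }");

		// SpEL-based variant for method-like expressions such as toUpper(name).
		AggregationSpELExpression spel = AggregationSpELExpression.expressionOf("toUpper(name)");
	}
}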

View File

@@ -1,81 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.Arrays;
import java.util.function.Consumer;
import org.springframework.data.domain.ManagedTypes;
/**
* @author Christoph Strobl
* @since 4.0
*/
public final class MongoManagedTypes implements ManagedTypes {
private final ManagedTypes delegate;
private MongoManagedTypes(ManagedTypes types) {
this.delegate = types;
}
/**
* Wraps an existing {@link ManagedTypes} object with {@link MongoManagedTypes}.
*
* @param managedTypes
* @return
*/
public static MongoManagedTypes from(ManagedTypes managedTypes) {
return new MongoManagedTypes(managedTypes);
}
/**
* Factory method used to construct {@link MongoManagedTypes} from the given array of {@link Class types}.
*
* @param types array of {@link Class types} used to initialize the {@link ManagedTypes}; must not be {@literal null}.
* @return new instance of {@link MongoManagedTypes} initialized from {@link Class types}.
*/
public static MongoManagedTypes from(Class<?>... types) {
return fromIterable(Arrays.asList(types));
}
/**
* Factory method used to construct {@link MongoManagedTypes} from the given, required {@link Iterable} of
* {@link Class types}.
*
* @param types {@link Iterable} of {@link Class types} used to initialize the {@link ManagedTypes}; must not be
* {@literal null}.
* @return new instance of {@link MongoManagedTypes} initialized the given, required {@link Iterable} of {@link Class
* types}.
*/
public static MongoManagedTypes fromIterable(Iterable<? extends Class<?>> types) {
return from(ManagedTypes.fromIterable(types));
}
/**
* Factory method to return an empty {@link MongoManagedTypes} object.
*
* @return an empty {@link MongoManagedTypes} object.
*/
public static MongoManagedTypes empty() {
return from(ManagedTypes.empty());
}
@Override
public void forEach(Consumer<Class<?>> action) {
delegate.forEach(action);
}
}
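
A sketch of how the factory methods above might be used when pre-registering domain types; note that this type only exists on the 4.0 side of the comparison, and Person and Order are placeholder classes:

import org.springframework.data.mongodb.MongoManagedTypes;

class ManagedTypesSketch {

	MongoManagedTypes managedTypes() {
		// Registers the aggregate types known up front, e.g. for ahead-of-time configuration.
		return MongoManagedTypes.from(Person.class, Order.class);
	}

	static class Person {}

	static class Order {}
}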

View File

@@ -24,7 +24,7 @@ import com.mongodb.client.ClientSession;
/** /**
* MongoDB specific {@link ResourceHolderSupport resource holder}, wrapping a {@link ClientSession}. * MongoDB specific {@link ResourceHolderSupport resource holder}, wrapping a {@link ClientSession}.
* {@link MongoTransactionManager} binds instances of this class to the thread. * {@link MongoTransactionManager} binds instances of this class to the thread.
* <br /> * <p />
* <strong>Note:</strong> Intended for internal usage only. * <strong>Note:</strong> Intended for internal usage only.
* *
* @author Christoph Strobl * @author Christoph Strobl
@@ -68,7 +68,7 @@ class MongoResourceHolder extends ResourceHolderSupport {
ClientSession session = getSession(); ClientSession session = getSession();
if (session == null) { if (session == null) {
throw new IllegalStateException("No session available"); throw new IllegalStateException("No session available!");
} }
return session; return session;

View File

@@ -37,18 +37,18 @@ import com.mongodb.client.ClientSession;
/** /**
* A {@link org.springframework.transaction.PlatformTransactionManager} implementation that manages * A {@link org.springframework.transaction.PlatformTransactionManager} implementation that manages
* {@link ClientSession} based transactions for a single {@link MongoDatabaseFactory}. * {@link ClientSession} based transactions for a single {@link MongoDatabaseFactory}.
* <br /> * <p />
* Binds a {@link ClientSession} from the specified {@link MongoDatabaseFactory} to the thread. * Binds a {@link ClientSession} from the specified {@link MongoDatabaseFactory} to the thread.
* <br /> * <p />
* {@link TransactionDefinition#isReadOnly() Readonly} transactions operate on a {@link ClientSession} and enable causal * {@link TransactionDefinition#isReadOnly() Readonly} transactions operate on a {@link ClientSession} and enable causal
* consistency, and also {@link ClientSession#startTransaction() start}, {@link ClientSession#commitTransaction() * consistency, and also {@link ClientSession#startTransaction() start}, {@link ClientSession#commitTransaction()
* commit} or {@link ClientSession#abortTransaction() abort} a transaction. * commit} or {@link ClientSession#abortTransaction() abort} a transaction.
* <br /> * <p />
* Application code is required to retrieve the {@link com.mongodb.client.MongoDatabase} via * Application code is required to retrieve the {@link com.mongodb.client.MongoDatabase} via
* {@link MongoDatabaseUtils#getDatabase(MongoDatabaseFactory)} instead of a standard * {@link MongoDatabaseUtils#getDatabase(MongoDatabaseFactory)} instead of a standard
* {@link MongoDatabaseFactory#getMongoDatabase()} call. Spring classes such as * {@link MongoDatabaseFactory#getMongoDatabase()} call. Spring classes such as
* {@link org.springframework.data.mongodb.core.MongoTemplate} use this strategy implicitly. * {@link org.springframework.data.mongodb.core.MongoTemplate} use this strategy implicitly.
* <br /> * <p />
* By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. One may override * By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. One may override
* {@link #doCommit(MongoTransactionObject)} to implement the * {@link #doCommit(MongoTransactionObject)} to implement the
* <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a> * <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a>
@@ -69,11 +69,11 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
/** /**
* Create a new {@link MongoTransactionManager} for bean-style usage. * Create a new {@link MongoTransactionManager} for bean-style usage.
* <br /> * <p />
* <strong>Note:</strong>The {@link MongoDatabaseFactory db factory} has to be * <strong>Note:</strong>The {@link MongoDatabaseFactory db factory} has to be
* {@link #setDbFactory(MongoDatabaseFactory) set} before using the instance. Use this constructor to prepare a * {@link #setDbFactory(MongoDatabaseFactory) set} before using the instance. Use this constructor to prepare a
* {@link MongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}. * {@link MongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}.
* <br /> * <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining * Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}. * {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
* *
@@ -100,12 +100,16 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
*/ */
public MongoTransactionManager(MongoDatabaseFactory dbFactory, @Nullable TransactionOptions options) { public MongoTransactionManager(MongoDatabaseFactory dbFactory, @Nullable TransactionOptions options) {
Assert.notNull(dbFactory, "DbFactory must not be null"); Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory; this.dbFactory = dbFactory;
this.options = options; this.options = options;
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override @Override
protected Object doGetTransaction() throws TransactionException { protected Object doGetTransaction() throws TransactionException {
@@ -114,11 +118,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return new MongoTransactionObject(resourceHolder); return new MongoTransactionObject(resourceHolder);
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override @Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException { protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder(); return extractMongoTransaction(transaction).hasResourceHolder();
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override @Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException { protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
@@ -148,6 +160,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder); TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override @Override
protected Object doSuspend(Object transaction) throws TransactionException { protected Object doSuspend(Object transaction) throws TransactionException {
@@ -157,11 +173,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory()); return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override @Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) { protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources); TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override @Override
protected final void doCommit(DefaultTransactionStatus status) throws TransactionException { protected final void doCommit(DefaultTransactionStatus status) throws TransactionException {
@@ -188,8 +212,8 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
* By default those labels are ignored, nevertheless one might check for * By default those labels are ignored, nevertheless one might check for
* {@link MongoException#UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL transient commit errors labels} and retry the * {@link MongoException#UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL transient commit errors labels} and retry the
* commit. <br /> * commit. <br />
* <pre>
* <code> * <code>
* <pre>
* int retries = 3; * int retries = 3;
* do { * do {
* try { * try {
@@ -202,8 +226,8 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
* } * }
* Thread.sleep(500); * Thread.sleep(500);
* } while (--retries > 0); * } while (--retries > 0);
* </code>
* </pre> * </pre>
* </code>
* *
* @param transactionObject never {@literal null}. * @param transactionObject never {@literal null}.
* @throws Exception in case of transaction errors. * @throws Exception in case of transaction errors.
@@ -212,6 +236,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.commitTransaction(); transactionObject.commitTransaction();
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override @Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException { protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
@@ -231,6 +259,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
} }
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override @Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException { protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
@@ -238,6 +270,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.getRequiredResourceHolder().setRollbackOnly(); transactionObject.getRequiredResourceHolder().setRollbackOnly();
} }
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override @Override
protected void doCleanupAfterCompletion(Object transaction) { protected void doCleanupAfterCompletion(Object transaction) {
@@ -266,7 +302,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
*/ */
public void setDbFactory(MongoDatabaseFactory dbFactory) { public void setDbFactory(MongoDatabaseFactory dbFactory) {
Assert.notNull(dbFactory, "DbFactory must not be null"); Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory; this.dbFactory = dbFactory;
} }
@@ -289,11 +325,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return dbFactory; return dbFactory;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override @Override
public MongoDatabaseFactory getResourceFactory() { public MongoDatabaseFactory getResourceFactory() {
return getRequiredDbFactory(); return getRequiredDbFactory();
} }
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override @Override
public void afterPropertiesSet() { public void afterPropertiesSet() {
getRequiredDbFactory(); getRequiredDbFactory();
@@ -315,7 +359,7 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
private MongoDatabaseFactory getRequiredDbFactory() { private MongoDatabaseFactory getRequiredDbFactory() {
Assert.state(dbFactory != null, Assert.state(dbFactory != null,
"MongoTransactionManager operates upon a MongoDbFactory; Did you forget to provide one; It's required"); "MongoTransactionManager operates upon a MongoDbFactory. Did you forget to provide one? It's required.");
return dbFactory; return dbFactory;
} }
@@ -450,22 +494,30 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
private MongoResourceHolder getRequiredResourceHolder() { private MongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present; o_O"); Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present. o_O");
return resourceHolder; return resourceHolder;
} }
private ClientSession getRequiredSession() { private ClientSession getRequiredSession() {
ClientSession session = getSession(); ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null"); Assert.state(session != null, "A Session is required but it turned out to be null.");
return session; return session;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override @Override
public boolean isRollbackOnly() { public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly(); return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override @Override
public void flush() { public void flush() {
TransactionSynchronizationUtils.triggerFlush(); TransactionSynchronizationUtils.triggerFlush();
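
For orientation, a minimal programmatic setup matching the Javadoc above; the MongoDatabaseFactory is assumed to be configured elsewhere and the "orders" collection is illustrative:

import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoDatabaseUtils;
import org.springframework.data.mongodb.MongoTransactionManager;
import org.springframework.transaction.support.TransactionTemplate;

class TransactionSketch {

	void runInTransaction(MongoDatabaseFactory factory) {
		MongoTransactionManager transactionManager = new MongoTransactionManager(factory);
		TransactionTemplate template = new TransactionTemplate(transactionManager);

		template.executeWithoutResult(status -> {
			// Database access inside the callback participates in the MongoDB transaction.
			MongoDatabaseUtils.getDatabase(factory).getCollection("orders").drop();
		});
	}
}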

View File

@@ -36,7 +36,7 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
* Helper class for managing reactive {@link MongoDatabase} instances via {@link ReactiveMongoDatabaseFactory}. Used for * Helper class for managing reactive {@link MongoDatabase} instances via {@link ReactiveMongoDatabaseFactory}. Used for
* obtaining {@link ClientSession session bound} resources, such as {@link MongoDatabase} and {@link MongoCollection} * obtaining {@link ClientSession session bound} resources, such as {@link MongoDatabase} and {@link MongoCollection}
* suitable for transactional usage. * suitable for transactional usage.
* <br /> * <p />
* <strong>Note:</strong> Intended for internal usage only. * <strong>Note:</strong> Intended for internal usage only.
* *
* @author Mark Paluch * @author Mark Paluch
@@ -75,7 +75,7 @@ public class ReactiveMongoDatabaseUtils {
/** /**
* Obtain the default {@link MongoDatabase database} form the given {@link ReactiveMongoDatabaseFactory factory} using * Obtain the default {@link MongoDatabase database} form the given {@link ReactiveMongoDatabaseFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}. * {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -88,7 +88,7 @@ public class ReactiveMongoDatabaseUtils {
/** /**
* Obtain the default {@link MongoDatabase database} form the given {@link ReactiveMongoDatabaseFactory factory}. * Obtain the default {@link MongoDatabase database} form the given {@link ReactiveMongoDatabaseFactory factory}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -104,7 +104,7 @@ public class ReactiveMongoDatabaseUtils {
/** /**
* Obtain the {@link MongoDatabase database} with given name form the given {@link ReactiveMongoDatabaseFactory * Obtain the {@link MongoDatabase database} with given name form the given {@link ReactiveMongoDatabaseFactory
* factory} using {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}. * factory} using {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -119,7 +119,7 @@ public class ReactiveMongoDatabaseUtils {
/** /**
* Obtain the {@link MongoDatabase database} with given name form the given {@link ReactiveMongoDatabaseFactory * Obtain the {@link MongoDatabase database} with given name form the given {@link ReactiveMongoDatabaseFactory
* factory}. * factory}.
* <br /> * <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber * Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}. * {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
* *
@@ -136,7 +136,7 @@ public class ReactiveMongoDatabaseUtils {
private static Mono<MongoDatabase> doGetMongoDatabase(@Nullable String dbName, ReactiveMongoDatabaseFactory factory, private static Mono<MongoDatabase> doGetMongoDatabase(@Nullable String dbName, ReactiveMongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) { SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "DatabaseFactory must not be null"); Assert.notNull(factory, "DatabaseFactory must not be null!");
if (sessionSynchronization == SessionSynchronization.NEVER) { if (sessionSynchronization == SessionSynchronization.NEVER) {
return getMongoDatabaseOrDefault(dbName, factory); return getMongoDatabaseOrDefault(dbName, factory);
@@ -214,11 +214,19 @@ public class ReactiveMongoDatabaseUtils {
this.resourceHolder = resourceHolder; this.resourceHolder = resourceHolder;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#shouldReleaseBeforeCompletion()
*/
@Override @Override
protected boolean shouldReleaseBeforeCompletion() { protected boolean shouldReleaseBeforeCompletion() {
return false; return false;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override @Override
protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) { protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) {
@@ -229,6 +237,10 @@ public class ReactiveMongoDatabaseUtils {
return Mono.empty(); return Mono.empty();
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#afterCompletion(int)
*/
@Override @Override
public Mono<Void> afterCompletion(int status) { public Mono<Void> afterCompletion(int status) {
@@ -244,6 +256,10 @@ public class ReactiveMongoDatabaseUtils {
}); });
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override @Override
protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) { protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -24,7 +24,7 @@ import com.mongodb.reactivestreams.client.ClientSession;
/** /**
* MongoDB specific resource holder, wrapping a {@link ClientSession}. {@link ReactiveMongoTransactionManager} binds * MongoDB specific resource holder, wrapping a {@link ClientSession}. {@link ReactiveMongoTransactionManager} binds
* instances of this class to the subscriber context. * instances of this class to the subscriber context.
* <br /> * <p />
* <strong>Note:</strong> Intended for internal usage only. * <strong>Note:</strong> Intended for internal usage only.
* *
* @author Mark Paluch * @author Mark Paluch

View File

@@ -38,21 +38,21 @@ import com.mongodb.reactivestreams.client.ClientSession;
* A {@link org.springframework.transaction.ReactiveTransactionManager} implementation that manages * A {@link org.springframework.transaction.ReactiveTransactionManager} implementation that manages
* {@link com.mongodb.reactivestreams.client.ClientSession} based transactions for a single * {@link com.mongodb.reactivestreams.client.ClientSession} based transactions for a single
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory}. * {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory}.
* <br /> * <p />
* Binds a {@link ClientSession} from the specified * Binds a {@link ClientSession} from the specified
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory} to the subscriber * {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory} to the subscriber
* {@link reactor.util.context.Context}. * {@link reactor.util.context.Context}.
* <br /> * <p />
* {@link org.springframework.transaction.TransactionDefinition#isReadOnly() Readonly} transactions operate on a * {@link org.springframework.transaction.TransactionDefinition#isReadOnly() Readonly} transactions operate on a
* {@link ClientSession} and enable causal consistency, and also {@link ClientSession#startTransaction() start}, * {@link ClientSession} and enable causal consistency, and also {@link ClientSession#startTransaction() start},
* {@link com.mongodb.reactivestreams.client.ClientSession#commitTransaction() commit} or * {@link com.mongodb.reactivestreams.client.ClientSession#commitTransaction() commit} or
* {@link ClientSession#abortTransaction() abort} a transaction. * {@link ClientSession#abortTransaction() abort} a transaction.
* <br /> * <p />
* Application code is required to retrieve the {@link com.mongodb.reactivestreams.client.MongoDatabase} via * Application code is required to retrieve the {@link com.mongodb.reactivestreams.client.MongoDatabase} via
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory)} instead * {@link org.springframework.data.mongodb.ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory)} instead
* of a standard {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase()} call. Spring * of a standard {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase()} call. Spring
* classes such as {@link org.springframework.data.mongodb.core.ReactiveMongoTemplate} use this strategy implicitly. * classes such as {@link org.springframework.data.mongodb.core.ReactiveMongoTemplate} use this strategy implicitly.
* <br /> * <p />
* By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. You can override * By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. You can override
* {@link #doCommit(TransactionSynchronizationManager, ReactiveMongoTransactionObject)} to implement the * {@link #doCommit(TransactionSynchronizationManager, ReactiveMongoTransactionObject)} to implement the
* <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a> * <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a>
@@ -71,11 +71,11 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
/** /**
* Create a new {@link ReactiveMongoTransactionManager} for bean-style usage. * Create a new {@link ReactiveMongoTransactionManager} for bean-style usage.
* <br /> * <p />
* <strong>Note:</strong>The {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory db factory} has to * <strong>Note:</strong>The {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory db factory} has to
* be {@link #setDatabaseFactory(ReactiveMongoDatabaseFactory)} set} before using the instance. Use this constructor * be {@link #setDatabaseFactory(ReactiveMongoDatabaseFactory)} set} before using the instance. Use this constructor
* to prepare a {@link ReactiveMongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}. * to prepare a {@link ReactiveMongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}.
* <br /> * <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining * Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}. * {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
* *
@@ -104,12 +104,16 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
public ReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory, public ReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory,
@Nullable TransactionOptions options) { @Nullable TransactionOptions options) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null"); Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
this.databaseFactory = databaseFactory; this.databaseFactory = databaseFactory;
this.options = options; this.options = options;
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doGetTransaction(org.springframework.transaction.reactive.TransactionSynchronizationManager)
*/
@Override @Override
protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager) protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager)
throws TransactionException { throws TransactionException {
@@ -119,11 +123,19 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return new ReactiveMongoTransactionObject(resourceHolder); return new ReactiveMongoTransactionObject(resourceHolder);
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override @Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException { protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder(); return extractMongoTransaction(transaction).hasResourceHolder();
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doBegin(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override @Override
protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction, protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction,
TransactionDefinition definition) throws TransactionException { TransactionDefinition definition) throws TransactionException {
@@ -163,6 +175,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
}); });
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSuspend(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override @Override
protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction) protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction)
throws TransactionException { throws TransactionException {
@@ -176,6 +192,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
}); });
} }
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doResume(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> doResume(TransactionSynchronizationManager synchronizationManager, @Nullable Object transaction,
Object suspendedResources) {
@@ -183,6 +203,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
.fromRunnable(() -> synchronizationManager.bindResource(getRequiredDatabaseFactory(), suspendedResources));
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCommit(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
+ */
@Override
protected final Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -219,6 +243,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return transactionObject.commitTransaction();
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doRollback(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
+ */
@Override
protected Mono<Void> doRollback(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) {
@@ -240,6 +268,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSetRollbackOnly(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
+ */
@Override
protected Mono<Void> doSetRollbackOnly(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -250,6 +282,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCleanupAfterCompletion(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
+ */
@Override
protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager synchronizationManager,
Object transaction) {
@@ -281,7 +317,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
*/
public void setDatabaseFactory(ReactiveMongoDatabaseFactory databaseFactory) {
- Assert.notNull(databaseFactory, "DatabaseFactory must not be null");
+ Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
this.databaseFactory = databaseFactory;
}
@@ -304,6 +340,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return databaseFactory;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
+ */
@Override
public void afterPropertiesSet() {
getRequiredDatabaseFactory();
@@ -323,7 +363,7 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
private ReactiveMongoDatabaseFactory getRequiredDatabaseFactory() {
Assert.state(databaseFactory != null,
- "ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory; Did you forget to provide one; It's required");
+ "ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory. Did you forget to provide one? It's required.");
return databaseFactory;
}
@@ -458,22 +498,30 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
private ReactiveMongoResourceHolder getRequiredResourceHolder() {
- Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present; o_O");
+ Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present. o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
- Assert.state(session != null, "A Session is required but it turned out to be null");
+ Assert.state(session != null, "A Session is required but it turned out to be null.");
return session;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
+ */
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.transaction.support.SmartTransactionObject#flush()
+ */
@Override
public void flush() {
throw new UnsupportedOperationException("flush() not supported");

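Editor's note on the ReactiveMongoTransactionManager hunks above: the changes are limited to assertion messages and the (non-Javadoc) comments, the way the transaction manager is wired does not change. A minimal configuration sketch, assuming a ReactiveMongoDatabaseFactory bean is available (class and bean names here are illustrative, not part of the diff):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoTransactionManager;
import org.springframework.transaction.reactive.TransactionalOperator;

@Configuration
class TransactionConfig {

	// The transaction manager shown in the diff operates on a ReactiveMongoDatabaseFactory.
	@Bean
	ReactiveMongoTransactionManager transactionManager(ReactiveMongoDatabaseFactory factory) {
		return new ReactiveMongoTransactionManager(factory);
	}

	// TransactionalOperator can then wrap reactive pipelines in a MongoDB transaction.
	@Bean
	TransactionalOperator transactionalOperator(ReactiveMongoTransactionManager txManager) {
		return TransactionalOperator.create(txManager);
	}
}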

@@ -35,7 +35,7 @@ import com.mongodb.session.ClientSession;
/**
* {@link MethodInterceptor} implementation looking up and invoking an alternative target method having
* {@link ClientSession} as its first argument. This allows seamless integration with the existing code base.
- * <br />
+ * <p />
* The {@link MethodInterceptor} is aware of methods on {@code MongoCollection} that my return new instances of itself
* like (eg. {@link com.mongodb.reactivestreams.client.MongoCollection#withWriteConcern(WriteConcern)} and decorate them
* if not already proxied.
@@ -76,13 +76,13 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
Class<D> databaseType, ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
ClientSessionOperator<C> collectionDecorator) {
- Assert.notNull(session, "ClientSession must not be null");
- Assert.notNull(target, "Target must not be null");
- Assert.notNull(sessionType, "SessionType must not be null");
- Assert.notNull(databaseType, "Database type must not be null");
- Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null");
- Assert.notNull(collectionType, "Collection type must not be null");
- Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null");
+ Assert.notNull(session, "ClientSession must not be null!");
+ Assert.notNull(target, "Target must not be null!");
+ Assert.notNull(sessionType, "SessionType must not be null!");
+ Assert.notNull(databaseType, "Database type must not be null!");
+ Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null!");
+ Assert.notNull(collectionType, "Collection type must not be null!");
+ Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null!");
this.session = session;
this.target = target;
@@ -95,6 +95,10 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.sessionType = sessionType;
}
+ /*
+ * (non-Javadoc)
+ * @see org.aopalliance.intercept.MethodInterceptor(org.aopalliance.intercept.MethodInvocation)
+ */
@Nullable
@Override
public Object invoke(MethodInvocation methodInvocation) throws Throwable {


@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb;
- import org.apache.commons.logging.Log;
- import org.apache.commons.logging.LogFactory;
+ import org.slf4j.Logger;
+ import org.slf4j.LoggerFactory;
import org.springframework.data.util.Version;
import org.springframework.util.StringUtils;
@@ -31,7 +31,7 @@ import com.mongodb.MongoDriverInformation;
*/
public class SpringDataMongoDB {
- private static final Log LOGGER = LogFactory.getLog(SpringDataMongoDB.class);
+ private static final Logger LOGGER = LoggerFactory.getLogger(SpringDataMongoDB.class);
private static final Version FALLBACK_VERSION = new Version(3);
private static final MongoDriverInformation DRIVER_INFORMATION = MongoDriverInformation
@@ -48,7 +48,7 @@ public class SpringDataMongoDB {
/**
* Fetches the "Implementation-Version" manifest attribute from the jar file.
- * <br />
+ * <p />
* Note that some ClassLoaders do not expose the package metadata, hence this class might not be able to determine the
* version in all environments. In this case the current Major version is returned as a fallback.
*
@@ -68,7 +68,7 @@ public class SpringDataMongoDB {
try {
return Version.parse(versionString);
} catch (Exception e) {
- LOGGER.debug(String.format("Cannot read Spring Data MongoDB version '%s'.", versionString));
+ LOGGER.debug("Cannot read Spring Data MongoDB version '{}'.", versionString);
}
return FALLBACK_VERSION;

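Editor's note: the SpringDataMongoDB hunks above show the same debug statement against two logging APIs. SLF4J supports '{}' placeholders, while Apache Commons Logging does not, so the message has to be pre-formatted. A small side-by-side sketch of that difference (class and method names are illustrative, not from the diff):

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

class VersionLogging {

	private static final Log COMMONS_LOG = LogFactory.getLog(VersionLogging.class);
	private static final org.slf4j.Logger SLF4J_LOG = org.slf4j.LoggerFactory.getLogger(VersionLogging.class);

	static void logUnparsableVersion(String versionString) {

		// SLF4J: parameterized message, rendered only if DEBUG is enabled.
		SLF4J_LOG.debug("Cannot read Spring Data MongoDB version '{}'.", versionString);

		// Commons Logging: no placeholder support, so the message is formatted up front.
		if (COMMONS_LOG.isDebugEnabled()) {
			COMMONS_LOG.debug(String.format("Cannot read Spring Data MongoDB version '%s'.", versionString));
		}
	}
}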

@@ -1,40 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.data.aot.TypeContributor;
import org.springframework.data.repository.aot.AotRepositoryContext;
import org.springframework.data.repository.aot.RepositoryRegistrationAotProcessor;
/**
* @author Christoph Strobl
*/
public class AotMongoRepositoryPostProcessor extends RepositoryRegistrationAotProcessor {
private final LazyLoadingProxyAotProcessor lazyLoadingProxyAotProcessor = new LazyLoadingProxyAotProcessor();
@Override
protected void contribute(AotRepositoryContext repositoryContext, GenerationContext generationContext) {
// do some custom type registration here
super.contribute(repositoryContext, generationContext);
repositoryContext.getResolvedTypes().stream().filter(MongoAotPredicates.IS_SIMPLE_TYPE.negate()).forEach(type -> {
TypeContributor.contribute(type, it -> true, generationContext);
lazyLoadingProxyAotProcessor.registerLazyLoadingProxyIfNeeded(type, generationContext);
});
}
}


@@ -1,103 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.aot.hint.TypeReference;
import org.springframework.core.ResolvableType;
import org.springframework.core.annotation.AnnotatedElementUtils;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.data.annotation.Reference;
import org.springframework.data.aot.TypeUtils;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxyFactory.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.DocumentReference;
/**
* @author Christoph Strobl
* @since 4.0
*/
class LazyLoadingProxyAotProcessor {
private boolean generalLazyLoadingProxyContributed = false;
void registerLazyLoadingProxyIfNeeded(Class<?> type, GenerationContext generationContext) {
Set<Field> refFields = getFieldsWithAnnotationPresent(type, Reference.class);
if (refFields.isEmpty()) {
return;
}
refFields.stream() //
.filter(LazyLoadingProxyAotProcessor::isLazyLoading) //
.forEach(field -> {
if (!generalLazyLoadingProxyContributed) {
generationContext.getRuntimeHints().proxies().registerJdkProxy(
TypeReference.of(org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class),
TypeReference.of(org.springframework.aop.SpringProxy.class),
TypeReference.of(org.springframework.aop.framework.Advised.class),
TypeReference.of(org.springframework.core.DecoratingProxy.class));
generalLazyLoadingProxyContributed = true;
}
if (field.getType().isInterface()) {
List<Class<?>> interfaces = new ArrayList<>(
TypeUtils.resolveTypesInSignature(ResolvableType.forField(field, type)));
interfaces.add(0, org.springframework.data.mongodb.core.convert.LazyLoadingProxy.class);
interfaces.add(org.springframework.aop.SpringProxy.class);
interfaces.add(org.springframework.aop.framework.Advised.class);
interfaces.add(org.springframework.core.DecoratingProxy.class);
generationContext.getRuntimeHints().proxies().registerJdkProxy(interfaces.toArray(Class[]::new));
} else {
LazyLoadingProxyFactory.resolveProxyType(field.getType(), () -> LazyLoadingInterceptor.none());
}
});
}
private static boolean isLazyLoading(Field field) {
if (AnnotatedElementUtils.isAnnotated(field, DBRef.class)) {
return AnnotatedElementUtils.findMergedAnnotation(field, DBRef.class).lazy();
}
if (AnnotatedElementUtils.isAnnotated(field, DocumentReference.class)) {
return AnnotatedElementUtils.findMergedAnnotation(field, DocumentReference.class).lazy();
}
return false;
}
private static Set<Field> getFieldsWithAnnotationPresent(Class<?> type, Class<? extends Annotation> annotation) {
Set<Field> fields = new LinkedHashSet<>();
for (Field field : type.getDeclaredFields()) {
if (MergedAnnotations.from(field).get(annotation).isPresent()) {
fields.add(field);
}
}
return fields;
}
}


@@ -1,31 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.util.function.Predicate;
import org.springframework.data.aot.TypeUtils;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
/**
* @author Christoph Strobl
* @since 4.0
*/
class MongoAotPredicates {
static final Predicate<Class<?>> IS_SIMPLE_TYPE = (type) -> MongoSimpleTypes.HOLDER.isSimpleType(type) || TypeUtils.type(type).isPartOf("org.bson");
}


@@ -1,56 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import org.springframework.aot.generate.GenerationContext;
import org.springframework.core.ResolvableType;
import org.springframework.data.aot.ManagedTypesBeanRegistrationAotProcessor;
import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
/**
* @author Christoph Strobl
* @since 2022/06
*/
class MongoManagedTypesBeanRegistrationAotProcessor extends ManagedTypesBeanRegistrationAotProcessor {
private final LazyLoadingProxyAotProcessor lazyLoadingProxyAotProcessor = new LazyLoadingProxyAotProcessor();
public MongoManagedTypesBeanRegistrationAotProcessor() {
setModuleIdentifier("mongo");
}
@Override
protected boolean isMatch(@Nullable Class<?> beanType, @Nullable String beanName) {
return isMongoManagedTypes(beanType) || super.isMatch(beanType, beanName);
}
protected boolean isMongoManagedTypes(@Nullable Class<?> beanType) {
return beanType != null && ClassUtils.isAssignable(MongoManagedTypes.class, beanType);
}
@Override
protected void contributeType(ResolvableType type, GenerationContext generationContext) {
if (MongoAotPredicates.IS_SIMPLE_TYPE.test(type.toClass())) {
return;
}
super.contributeType(type, generationContext);
lazyLoadingProxyAotProcessor.registerLazyLoadingProxyIfNeeded(type.toClass(), generationContext);
}
}
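
Editor's note: the deleted LazyLoadingProxyAotProcessor above only contributes proxy hints for reference properties that are marked lazy, as checked by isLazyLoading(Field). For orientation, a domain model that would trigger that path could look like the following sketch (entity and field names are made up):

import java.util.List;

import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.DocumentReference;

@Document
class Book {

	// Lazy @DBRef: resolved through a LazyLoadingProxy, hence the JDK proxy hints registered above.
	@DBRef(lazy = true) //
	private Author author;

	// A lazy @DocumentReference is detected by the same isLazyLoading(Field) check.
	@DocumentReference(lazy = true) //
	private List<Review> reviews;
}

@Document
class Author {}

@Document
class Review {}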


@@ -1,68 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.aot;
import java.util.Arrays;
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.RuntimeHints;
import org.springframework.aot.hint.RuntimeHintsRegistrar;
import org.springframework.aot.hint.TypeReference;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAfterSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeSaveCallback;
import org.springframework.data.repository.util.ReactiveWrappers;
import org.springframework.lang.Nullable;
/**
* {@link RuntimeHintsRegistrar} for repository types and entity callbacks.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 4.0
*/
class MongoRuntimeHints implements RuntimeHintsRegistrar {
private static final boolean PROJECT_REACTOR_PRESENT = ReactiveWrappers
.isAvailable(ReactiveWrappers.ReactiveLibrary.PROJECT_REACTOR);
@Override
public void registerHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
hints.reflection().registerTypes(
Arrays.asList(TypeReference.of("org.springframework.data.mongodb.repository.support.SimpleMongoRepository"),
TypeReference.of(BeforeConvertCallback.class), TypeReference.of(BeforeSaveCallback.class),
TypeReference.of(AfterConvertCallback.class), TypeReference.of(AfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
if (PROJECT_REACTOR_PRESENT) {
hints.reflection()
.registerTypes(Arrays.asList(
TypeReference.of("org.springframework.data.mongodb.repository.support.SimpleReactiveMongoRepository"),
TypeReference.of(ReactiveBeforeConvertCallback.class), TypeReference.of(ReactiveBeforeSaveCallback.class),
TypeReference.of(ReactiveAfterConvertCallback.class), TypeReference.of(ReactiveAfterSaveCallback.class)),
builder -> builder.withMembers(MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
MemberCategory.INVOKE_PUBLIC_METHODS));
}
}
}


@@ -25,7 +25,9 @@ import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
+ import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
+ import org.springframework.lang.Nullable;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
@@ -78,12 +80,30 @@ public abstract class AbstractMongoClientConfiguration extends MongoConfiguratio
return new SimpleMongoClientDatabaseFactory(mongoClient(), getDatabaseName());
}
+ /**
+ * Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
+ * class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
+ * {@link AbstractMongoClientConfiguration} the base package will be considered {@code com.acme} unless the method is
+ * overridden to implement alternate behavior.
+ *
+ * @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
+ * entities.
+ * @deprecated use {@link #getMappingBasePackages()} instead.
+ */
+ @Deprecated
+ @Nullable
+ protected String getMappingBasePackage() {
+ Package mappingBasePackage = getClass().getPackage();
+ return mappingBasePackage == null ? null : mappingBasePackage.getName();
+ }
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
- * {@link #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)}. Will get {@link #customConversions()} applied.
+ * {@link #mongoMappingContext(MongoCustomConversions)}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
- * @see #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)
+ * @see #mongoMappingContext(MongoCustomConversions)
* @see #mongoDbFactory()
*/
@Bean

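Editor's note: the restored Javadoc above describes the scanning default, namely that the package of the concrete configuration class becomes the mapping base package, and that getMappingBasePackages() is the preferred way to change it. A minimal sketch of that convention, reusing the com.acme example from the Javadoc (everything else is illustrative):

package com.acme;

import java.util.Collection;
import java.util.Collections;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

// Lives in com.acme, so com.acme is scanned for mapped @Document classes by default.
@Configuration
class AppConfig extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "database";
	}

	// Preferred over the deprecated getMappingBasePackage(): scan an explicit package instead.
	@Override
	protected Collection<String> getMappingBasePackages() {
		return Collections.singleton("com.acme.domain");
	}
}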

@@ -84,10 +84,10 @@ public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurat
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #reactiveMongoDbFactory()} and
- * {@link #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)}. Will get {@link #customConversions()} applied.
+ * {@link #mongoMappingContext(MongoCustomConversions)}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
- * @see #mongoMappingContext(MongoCustomConversions, org.springframework.data.mongodb.MongoManagedTypes)
+ * @see #mongoMappingContext(MongoCustomConversions)
* @see #reactiveMongoDbFactory()
* @return never {@literal null}.
*/


@@ -30,6 +30,10 @@ import com.mongodb.ConnectionString;
*/
public class ConnectionStringPropertyEditor extends PropertyEditorSupport {
+ /*
+ * (non-Javadoc)
+ * @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
+ */
@Override
public void setAsText(@Nullable String connectionString) {


@@ -34,6 +34,10 @@ import org.w3c.dom.Element;
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
+ */
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -42,6 +46,10 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
+ */
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {


@@ -24,6 +24,7 @@ import java.util.List;
import java.util.Set;
import org.springframework.beans.BeanMetadataElement;
+ import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
@@ -63,7 +64,6 @@ import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
@@ -80,7 +80,7 @@ import org.w3c.dom.Element;
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
- private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("jakarta.validation.Validator",
+ private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
/* (non-Javadoc)
@@ -135,7 +135,9 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
new BeanComponentDefinition(indexOperationsProviderBuilder.getBeanDefinition(), "indexOperationsProvider"));
}
- if (!registry.containsBeanDefinition(INDEX_HELPER_BEAN_NAME)) {
+ try {
+ registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
+ } catch (NoSuchBeanDefinitionException ignored) {
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
@@ -149,7 +151,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
- if (validatingMongoEventListener != null && !registry.containsBeanDefinition(VALIDATING_EVENT_LISTENER_BEAN_NAME)) {
+ if (validatingMongoEventListener != null) {
parserContext.registerBeanComponent(
new BeanComponentDefinition(validatingMongoEventListener, VALIDATING_EVENT_LISTENER_BEAN_NAME));
}
@@ -163,16 +165,15 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
String disableValidation = element.getAttribute("disable-validation");
- boolean validationDisabled = StringUtils.hasText(disableValidation) && Boolean.parseBoolean(disableValidation);
+ boolean validationDisabled = StringUtils.hasText(disableValidation) && Boolean.valueOf(disableValidation);
if (!validationDisabled) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition();
- RuntimeBeanReference validator = getValidator(element, parserContext);
+ RuntimeBeanReference validator = getValidator(builder, parserContext);
if (validator != null) {
builder.getRawBeanDefinition().setBeanClass(ValidatingMongoEventListener.class);
- builder.getRawBeanDefinition().setSource(element);
builder.addConstructorArgValue(validator);
return builder.getBeanDefinition();
@@ -194,6 +195,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
validatorDef.setSource(source);
validatorDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE);
String validatorName = parserContext.getReaderContext().registerWithGeneratedName(validatorDef);
+ parserContext.registerBeanComponent(new BeanComponentDefinition(validatorDef, validatorName));
return new RuntimeBeanReference(validatorName);
}
@@ -253,7 +255,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
- context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured",
+ context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
element);
return;
}
@@ -374,6 +376,10 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
this.delegates = new HashSet<>(Arrays.asList(filters));
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
+ */
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory)
throws IOException {


@@ -47,16 +47,28 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
+ */
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEntityCallback.class;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
+ */
@Override
protected boolean shouldGenerateId() {
return true;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
+ */
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {


@@ -18,10 +18,11 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
+ import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
- import org.springframework.core.Ordered;
+ import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
@@ -35,42 +36,68 @@ import org.springframework.util.Assert;
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
*/
- class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport implements Ordered {
+ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
+ */
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoAuditing.class;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
+ */
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
+ */
@Override
- protected void postProcess(BeanDefinitionBuilder builder, AuditingConfiguration configuration,
- BeanDefinitionRegistry registry) {
- builder.setFactoryMethod("from").addConstructorArgReference("mongoMappingContext");
+ public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
+ Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
+ Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
+ super.registerBeanDefinitions(annotationMetadata, registry);
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
+ */
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
- Assert.notNull(configuration, "AuditingConfiguration must not be null");
- return configureDefaultAuditHandlerAttributes(configuration,
- BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class));
+ Assert.notNull(configuration, "AuditingConfiguration must not be null!");
+ BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
+ BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
+ definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
+ builder.addConstructorArgValue(definition.getBeanDefinition());
+ return configureDefaultAuditHandlerAttributes(configuration, builder);
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
+ */
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
- Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null");
- Assert.notNull(registry, "BeanDefinitionRegistry must not be null");
+ Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
+ Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEntityCallback.class);
@@ -81,8 +108,4 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport impl
AuditingEntityCallback.class.getName(), registry);
}
- @Override
- public int getOrder() {
- return Ordered.LOWEST_PRECEDENCE;
- }
}

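Editor's note: both shapes of MongoAuditingRegistrar above end up registering an IsNewAwareAuditingHandler plus the AuditingEntityCallback behind @EnableMongoAuditing, so from an application's point of view the feature is used the same way. A short usage sketch (entity and field names are illustrative):

import java.time.Instant;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.config.EnableMongoAuditing;
import org.springframework.data.mongodb.core.mapping.Document;

@Configuration
@EnableMongoAuditing
class AuditingConfig {}

@Document
class Order {

	// Populated by the AuditingEntityCallback that the registrar above wires up.
	@CreatedDate Instant createdAt;
	@LastModifiedDate Instant modifiedAt;
}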

@@ -35,6 +35,10 @@ import org.w3c.dom.Element;
*/
public class MongoClientParser implements BeanDefinitionParser {
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
+ */
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);


@@ -30,7 +30,6 @@ import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
- import org.springframework.data.mongodb.MongoManagedTypes;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions.MongoConverterConfigurationAdapter;
import org.springframework.data.mongodb.core.mapping.Document;
@@ -77,13 +76,14 @@ public abstract class MongoConfigurationSupport {
*
* @see #getMappingBasePackages()
* @return
+ * @throws ClassNotFoundException
*/
@Bean
- public MongoMappingContext mongoMappingContext(MongoCustomConversions customConversions,
- MongoManagedTypes mongoManagedTypes) {
+ public MongoMappingContext mongoMappingContext(MongoCustomConversions customConversions)
+ throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
- mappingContext.setManagedTypes(mongoManagedTypes);
+ mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions.getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
@@ -91,16 +91,6 @@ public abstract class MongoConfigurationSupport {
return mappingContext;
}
- /**
- * @return new instance of {@link MongoManagedTypes}.
- * @throws ClassNotFoundException
- * @since 4.0
- */
- @Bean
- public MongoManagedTypes mongoManagedTypes() throws ClassNotFoundException {
- return MongoManagedTypes.fromIterable(getInitialEntitySet());
- }
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the
@@ -182,7 +172,8 @@ public abstract class MongoConfigurationSupport {
/**
* Configures whether to abbreviate field names for domain objects by configuring a
- * {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created.
+ * {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
+ * customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/

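Editor's note: the mongoMappingContext hunk above differs only in how entity classes reach the MongoMappingContext, either as MongoManagedTypes or via setInitialEntitySet(getInitialEntitySet()). A standalone sketch of the setInitialEntitySet(...) wiring under the assumption of a hand-picked entity class (all names are illustrative):

import java.util.Collections;

import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class MappingContextSetup {

	static MongoMappingContext mappingContext() {

		MongoCustomConversions conversions = new MongoCustomConversions(Collections.emptyList());

		MongoMappingContext context = new MongoMappingContext();
		context.setInitialEntitySet(Collections.singleton(Person.class)); // what getInitialEntitySet() supplies
		context.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
		context.afterPropertiesSet();
		return context;
	}

	@Document
	static class Person {}
}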

@@ -51,6 +51,10 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String OPTIONS_DELIMITER = "?";
private static final String OPTION_VALUE_DELIMITER = "&";
+ /*
+ * (non-Javadoc)
+ * @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
+ */
@Override
public void setAsText(@Nullable String text) throws IllegalArgumentException {
@@ -117,7 +121,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
- String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'", authMechanism));
+ String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
@@ -194,7 +198,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
- throw new IllegalArgumentException(String.format("Query parameter '%s' has no value", optionArgs[0]));
+ throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
@@ -209,21 +213,21 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
if (source.length != 2) {
throw new IllegalArgumentException(
- "Credentials need to specify username and password like in 'username:password@database'");
+ "Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
- throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'");
+ throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
- throw new IllegalArgumentException("Credentials need to specify username");
+ throw new IllegalArgumentException("Credentials need to specify username!");
}
}
@@ -231,7 +235,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
try {
return URLDecoder.decode(it, "UTF-8");
} catch (UnsupportedEncodingException e) {
- throw new IllegalArgumentException("o_O UTF-8 not supported", e);
+ throw new IllegalArgumentException("o_O UTF-8 not supported!", e);
}
}
}

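Editor's note: the assertion messages above spell out the credential format the editor expects, 'username:password@database', optionally followed by the '?' options delimiter and '&'-separated options. A quick, purely illustrative use of the property editor (credential values are made up):

import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

class CredentialParsing {

	public static void main(String[] args) {

		MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();

		// username:password@database, as documented by the verify* assertions above
		editor.setAsText("jon:warandpeace@snarfblat");

		// getValue() holds the parsed credentials
		System.out.println(editor.getValue());
	}
}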

@@ -62,6 +62,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
+ */
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -70,6 +74,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
+ */
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
@@ -163,7 +171,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
if (element.getAttributes().getLength() > allowedAttributesCount) {
- parserContext.getReaderContext().error("Configure either MongoDB " + type + " or details individually",
+ parserContext.getReaderContext().error("Configure either MongoDB " + type + " or details individually!",
parserContext.extractSource(element));
}


@@ -26,6 +26,10 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.NamespaceHandler#init()
+ */
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());


@@ -22,12 +22,9 @@ import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
- import org.springframework.beans.factory.support.BeanDefinitionValidationException;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientSettingsFactoryBean;
- import org.springframework.data.mongodb.core.MongoServerApiFactoryBean;
- import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -40,6 +37,7 @@ import org.w3c.dom.Element;
* @author Christoph Strobl
* @author Mark Paluch
*/
+ @SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
@@ -114,20 +112,6 @@ abstract class MongoParsingUtils {
// Field level encryption
setPropertyReference(clientOptionsDefBuilder, settingsElement, "encryption-settings-ref", "autoEncryptionSettings");
- // ServerAPI
- if (StringUtils.hasText(settingsElement.getAttribute("server-api-version"))) {
- MongoServerApiFactoryBean serverApiFactoryBean = new MongoServerApiFactoryBean();
- serverApiFactoryBean.setVersion(settingsElement.getAttribute("server-api-version"));
- try {
- clientOptionsDefBuilder.addPropertyValue("serverApi", serverApiFactoryBean.getObject());
- } catch (Exception exception) {
- throw new BeanDefinitionValidationException("Non parsable server-api.", exception);
- }
- } else {
- setPropertyReference(clientOptionsDefBuilder, settingsElement, "server-api-ref", "serverApi");
- }
// and the rest
mongoClientBuilder.addPropertyValue("mongoClientSettings", clientOptionsDefBuilder.getBeanDefinition());


@@ -39,6 +39,10 @@ import org.w3c.dom.Element;
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
+ */
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -47,6 +51,10 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
+ */
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {


@@ -41,11 +41,19 @@ public class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEnti
this.converter = converter;
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.FactoryBean#getObject()
+ */
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
+ /*
+ * (non-Javadoc)
+ * @see org.springframework.beans.factory.FactoryBean#getObjectType()
+ */
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;


@@ -18,9 +18,11 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation; import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition; import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder; import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry; import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar; import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler; import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport; import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration; import org.springframework.data.auditing.config.AuditingConfiguration;
@@ -32,42 +34,56 @@ import org.springframework.util.Assert;
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableReactiveMongoAuditing} annotation. * {@link ImportBeanDefinitionRegistrar} to enable {@link EnableReactiveMongoAuditing} annotation.
* *
* @author Mark Paluch * @author Mark Paluch
* @author Christoph Strobl
* @since 3.1 * @since 3.1
*/ */
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport { class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override @Override
protected Class<? extends Annotation> getAnnotation() { protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class; return EnableReactiveMongoAuditing.class;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override @Override
protected String getAuditingHandlerBeanName() { protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler"; return "reactiveMongoAuditingHandler";
} }
- @Override
- protected void postProcess(BeanDefinitionBuilder builder, AuditingConfiguration configuration,
- 		BeanDefinitionRegistry registry) {
- 	builder.setFactoryMethod("from").addConstructorArgReference("mongoMappingContext");
- }
+ /*
+  * (non-Javadoc)
+  * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
+  */
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
- 	Assert.notNull(configuration, "AuditingConfiguration must not be null");
- 	return configureDefaultAuditHandlerAttributes(configuration,
- 			BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class));
+ 	Assert.notNull(configuration, "AuditingConfiguration must not be null!");
+ 	BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class);
+ 	BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
+ 	definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
+ 	builder.addConstructorArgValue(definition.getBeanDefinition());
+ 	return configureDefaultAuditHandlerAttributes(configuration, builder);
}
+ /*
+  * (non-Javadoc)
+  * @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
+  */
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
		BeanDefinitionRegistry registry) {
- 	Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null");
- 	Assert.notNull(registry, "BeanDefinitionRegistry must not be null");
+ 	Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
+ 	Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
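For orientation, both sides of the hunk above back the @EnableReactiveMongoAuditing annotation and register the reactiveMongoAuditingHandler bean. A minimal, assumed configuration sketch (the class name and the auditor value are illustrative and not taken from this diff):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.ReactiveAuditorAware;
import org.springframework.data.mongodb.config.EnableReactiveMongoAuditing;

import reactor.core.publisher.Mono;

@Configuration
@EnableReactiveMongoAuditing // processed by the registrar shown above
class ReactiveAuditingConfig {

	// Illustrative auditor source; a real application would resolve the current user reactively.
	@Bean
	ReactiveAuditorAware<String> auditorProvider() {
		return () -> Mono.just("system");
	}
}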


@@ -32,6 +32,10 @@ import com.mongodb.ReadConcernLevel;
*/ */
public class ReadConcernPropertyEditor extends PropertyEditorSupport { public class ReadConcernPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override @Override
public void setAsText(@Nullable String readConcernString) { public void setAsText(@Nullable String readConcernString) {


@@ -29,6 +29,10 @@ import com.mongodb.ReadPreference;
*/ */
public class ReadPreferencePropertyEditor extends PropertyEditorSupport { public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override @Override
public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException { public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException {


@@ -21,8 +21,8 @@ import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
- import org.apache.commons.logging.Log;
- import org.apache.commons.logging.LogFactory;
+ import org.slf4j.Logger;
+ import org.slf4j.LoggerFactory;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -43,9 +43,13 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
 * A port is a number without a leading 0 at the end of the address that is proceeded by just a single :.
 */
private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
- private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address %s '%s'; Check your replica set configuration";
+ private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address {} '{}'. Check your replica set configuration!";
- private static final Log LOG = LogFactory.getLog(ServerAddressPropertyEditor.class);
+ private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override @Override
public void setAsText(@Nullable String replicaSetString) { public void setAsText(@Nullable String replicaSetString) {
@@ -68,7 +72,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
if (serverAddresses.isEmpty()) {
	throw new IllegalArgumentException(
- 			"Could not resolve at least one server of the replica set configuration; Validate your config");
+ 			"Could not resolve at least one server of the replica set configuration! Validate your config!");
}
setValue(serverAddresses.toArray(new ServerAddress[serverAddresses.size()]));
@@ -84,18 +88,14 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private ServerAddress parseServerAddress(String source) {
	if (!StringUtils.hasText(source)) {
- 		if(LOG.isWarnEnabled()) {
- 			LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
- 		}
+ 		LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
		return null;
	}
	String[] hostAndPort = extractHostAddressAndPort(source.trim());
	if (hostAndPort.length > 2) {
- 		if(LOG.isWarnEnabled()) {
- 			LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
- 		}
+ 		LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
		return null;
	}
@@ -105,13 +105,9 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
	return port == null ? new ServerAddress(hostAddress) : new ServerAddress(hostAddress, port);
} catch (UnknownHostException e) {
- 	if(LOG.isWarnEnabled()) {
- 		LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]));
- 	}
+ 	LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]);
} catch (NumberFormatException e) {
- 	if(LOG.isWarnEnabled()) {
- 		LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]));
- 	}
+ 	LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]);
}
return null;
@@ -125,7 +121,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
*/ */
private String[] extractHostAddressAndPort(String addressAndPortSource) { private String[] extractHostAddressAndPort(String addressAndPortSource) {
Assert.notNull(addressAndPortSource, "Address and port source must not be null"); Assert.notNull(addressAndPortSource, "Address and port source must not be null!");
String[] hostAndPort = addressAndPortSource.split(HOST_PORT_SPLIT_PATTERN); String[] hostAndPort = addressAndPortSource.split(HOST_PORT_SPLIT_PATTERN);
String hostAddress = hostAndPort[0]; String hostAddress = hostAndPort[0];
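To make the split pattern above concrete, here is a small self-contained sketch that applies the same regex to two made-up address strings; the host values are illustrative only:

import java.util.Arrays;

class HostPortSplitExample {

	// Same pattern as above: split on a ':' that is not preceded by ':' and is followed only by digits up to the end.
	private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";

	public static void main(String[] args) {
		// Plain host and port -> two tokens: [localhost, 27017]
		System.out.println(Arrays.toString("localhost:27017".split(HOST_PORT_SPLIT_PATTERN)));
		// IPv6-style address without a trailing port -> stays in one token because no colon is followed by digits only.
		System.out.println(Arrays.toString("fe80::a00:27ff:fe07".split(HOST_PORT_SPLIT_PATTERN)));
	}
}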


@@ -26,6 +26,10 @@ import com.mongodb.WriteConcern;
*/ */
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> { public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public WriteConcern convert(String source) { public WriteConcern convert(String source) {
WriteConcern writeConcern = WriteConcern.valueOf(source); WriteConcern writeConcern = WriteConcern.valueOf(source);
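For context, the converter above delegates to WriteConcern.valueOf for well-known names; a tiny, assumed usage sketch (the concern name is just an example):

import org.springframework.data.mongodb.config.StringToWriteConcernConverter;

import com.mongodb.WriteConcern;

class WriteConcernConversionSketch {

	public static void main(String[] args) {
		// Well-known names such as MAJORITY resolve to the matching driver constant.
		WriteConcern majority = new StringToWriteConcernConverter().convert("MAJORITY");
		System.out.println(majority);
	}
}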


@@ -29,6 +29,10 @@ import org.springframework.util.StringUtils;
*/ */
public class UUidRepresentationPropertyEditor extends PropertyEditorSupport { public class UUidRepresentationPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override @Override
public void setAsText(@Nullable String value) { public void setAsText(@Nullable String value) {


@@ -15,6 +15,8 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.Collections;
import java.util.List; import java.util.List;
import java.util.Optional; import java.util.Optional;
import java.util.stream.Collectors; import java.util.stream.Collectors;
@@ -22,16 +24,22 @@ import java.util.stream.Collectors;
import org.bson.Document; import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext; import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation; import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext; import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions.DomainTypeMapping; import org.springframework.data.mongodb.core.aggregation.AggregationOptions.DomainTypeMapping;
import org.springframework.data.mongodb.core.aggregation.CountOperation;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext; import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext; import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation; import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper; import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity; import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty; import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.Lazy; import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable; import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
/** /**


@@ -24,14 +24,10 @@ import org.springframework.data.util.Pair;
import com.mongodb.bulk.BulkWriteResult; import com.mongodb.bulk.BulkWriteResult;
/** /**
- * Bulk operations for insert/update/remove actions on a collection. Bulk operations are available since MongoDB 2.6 and
- * make use of low level bulk commands on the protocol level. This interface defines a fluent API to add multiple single
- * operations or list of similar operations in sequence which can then eventually be executed by calling
+ * Bulk operations for insert/update/remove actions on a collection. These bulks operation are available since MongoDB
+ * 2.6 and make use of low level bulk commands on the protocol level. This interface defines a fluent API to add
+ * multiple single operations or list of similar operations in sequence which can then eventually be executed by calling
 * {@link #execute()}.
- * <p>
- * Bulk operations are issued as one batch that pulls together all insert, update, and delete operations. Operations
- * that require individual operation results such as optimistic locking (using {@code @Version}) are not supported and
- * the version field remains not populated.
 *
 * @author Tobias Trelle
 * @author Oliver Gierke
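As a usage sketch for the interface documented above (collection, field and query values are invented for illustration), a caller obtains BulkOperations from MongoOperations, queues operations and finishes with a single execute():

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.bulk.BulkWriteResult;

class BulkOperationsSketch {

	BulkWriteResult cleanUp(MongoOperations operations) {
		// One bulk per collection; the mode decides whether MongoDB stops at the first error (ORDERED) or keeps going (UNORDERED).
		BulkOperations bulkOps = operations.bulkOps(BulkMode.UNORDERED, "people"); // hypothetical collection name
		bulkOps.updateMulti(Query.query(Criteria.where("lastName").is(null)), new Update().set("lastName", "unknown"));
		bulkOps.remove(Query.query(Criteria.where("firstName").is("")));
		return bulkOps.execute(); // all queued operations go out as a single bulk command
	}
}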


@@ -36,29 +36,21 @@ import com.mongodb.client.model.changestream.OperationType;
* *
* @author Christoph Strobl * @author Christoph Strobl
* @author Mark Paluch * @author Mark Paluch
* @author Myroslav Kosinskyi
* @since 2.1 * @since 2.1
*/ */
public class ChangeStreamEvent<T> { public class ChangeStreamEvent<T> {
@SuppressWarnings("rawtypes") // @SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_FULL_DOCUMENT_UPDATER = AtomicReferenceFieldUpdater private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "convertedFullDocument"); .newUpdater(ChangeStreamEvent.class, Object.class, "converted");
@SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "convertedFullDocumentBeforeChange");
private final @Nullable ChangeStreamDocument<Document> raw; private final @Nullable ChangeStreamDocument<Document> raw;
private final Class<T> targetType; private final Class<T> targetType;
private final MongoConverter converter; private final MongoConverter converter;
// accessed through CONVERTED_FULL_DOCUMENT_UPDATER. // accessed through CONVERTED_UPDATER.
private volatile @Nullable T convertedFullDocument; private volatile @Nullable T converted;
// accessed through CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER.
private volatile @Nullable T convertedFullDocumentBeforeChange;
/** /**
* @param raw can be {@literal null}. * @param raw can be {@literal null}.
@@ -155,43 +147,27 @@ public class ChangeStreamEvent<T> {
@Nullable @Nullable
public T getBody() { public T getBody() {
if (raw == null || raw.getFullDocument() == null) { if (raw == null) {
return null; return null;
} }
return getConvertedFullDocument(raw.getFullDocument()); Document fullDocument = raw.getFullDocument();
if (fullDocument == null) {
return targetType.cast(fullDocument);
} }
/** return getConverted(fullDocument);
* Get the potentially converted {@link ChangeStreamDocument#getFullDocumentBeforeChange() document} before being changed.
*
* @return {@literal null} when {@link #getRaw()} or {@link ChangeStreamDocument#getFullDocumentBeforeChange()} is
* {@literal null}.
* @since 4.0
*/
@Nullable
public T getBodyBeforeChange() {
if (raw == null || raw.getFullDocumentBeforeChange() == null) {
return null;
}
return getConvertedFullDocumentBeforeChange(raw.getFullDocumentBeforeChange());
} }
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
private T getConvertedFullDocumentBeforeChange(Document fullDocument) { private T getConverted(Document fullDocument) {
return (T) doGetConverted(fullDocument, CONVERTED_FULL_DOCUMENT_BEFORE_CHANGE_UPDATER); return (T) doGetConverted(fullDocument);
} }
@SuppressWarnings("unchecked") private Object doGetConverted(Document fullDocument) {
private T getConvertedFullDocument(Document fullDocument) {
return (T) doGetConverted(fullDocument, CONVERTED_FULL_DOCUMENT_UPDATER);
}
private Object doGetConverted(Document fullDocument, AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> updater) { Object result = CONVERTED_UPDATER.get(this);
Object result = updater.get(this);
if (result != null) { if (result != null) {
return result; return result;
@@ -200,19 +176,23 @@ public class ChangeStreamEvent<T> {
if (ClassUtils.isAssignable(Document.class, fullDocument.getClass())) { if (ClassUtils.isAssignable(Document.class, fullDocument.getClass())) {
result = converter.read(targetType, fullDocument); result = converter.read(targetType, fullDocument);
return updater.compareAndSet(this, null, result) ? result : updater.get(this); return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
} }
if (converter.getConversionService().canConvert(fullDocument.getClass(), targetType)) { if (converter.getConversionService().canConvert(fullDocument.getClass(), targetType)) {
result = converter.getConversionService().convert(fullDocument, targetType); result = converter.getConversionService().convert(fullDocument, targetType);
return updater.compareAndSet(this, null, result) ? result : updater.get(this); return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
} }
throw new IllegalArgumentException( throw new IllegalArgumentException(
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType)); String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
} }
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override @Override
public String toString() { public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}'; return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';


@@ -32,7 +32,6 @@ import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument; import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument; import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.model.changestream.FullDocumentBeforeChange;
/** /**
* Options applicable to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Streams</a>. Intended * Options applicable to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Streams</a>. Intended
@@ -41,7 +40,6 @@ import com.mongodb.client.model.changestream.FullDocumentBeforeChange;
* *
* @author Christoph Strobl * @author Christoph Strobl
* @author Mark Paluch * @author Mark Paluch
* @author Myroslav Kosinskyi
* @since 2.1 * @since 2.1
*/ */
public class ChangeStreamOptions { public class ChangeStreamOptions {
@@ -49,7 +47,6 @@ public class ChangeStreamOptions {
private @Nullable Object filter; private @Nullable Object filter;
private @Nullable BsonValue resumeToken; private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup; private @Nullable FullDocument fullDocumentLookup;
private @Nullable FullDocumentBeforeChange fullDocumentBeforeChangeLookup;
private @Nullable Collation collation; private @Nullable Collation collation;
private @Nullable Object resumeTimestamp; private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED; private Resume resume = Resume.UNDEFINED;
@@ -77,14 +74,6 @@ public class ChangeStreamOptions {
return Optional.ofNullable(fullDocumentLookup); return Optional.ofNullable(fullDocumentLookup);
} }
/**
* @return {@link Optional#empty()} if not set.
* @since 4.0
*/
public Optional<FullDocumentBeforeChange> getFullDocumentBeforeChangeLookup() {
return Optional.ofNullable(fullDocumentBeforeChangeLookup);
}
/** /**
* @return {@link Optional#empty()} if not set. * @return {@link Optional#empty()} if not set.
*/ */
@@ -159,7 +148,7 @@ public class ChangeStreamOptions {
} }
throw new IllegalArgumentException( throw new IllegalArgumentException(
"o_O that should actually not happen; The timestamp should be an Instant or a BsonTimestamp but was " "o_O that should actually not happen. The timestamp should be an Instant or a BsonTimestamp but was "
+ ObjectUtils.nullSafeClassName(timestamp)); + ObjectUtils.nullSafeClassName(timestamp));
} }
@@ -181,9 +170,6 @@ public class ChangeStreamOptions {
if (!ObjectUtils.nullSafeEquals(this.fullDocumentLookup, that.fullDocumentLookup)) { if (!ObjectUtils.nullSafeEquals(this.fullDocumentLookup, that.fullDocumentLookup)) {
return false; return false;
} }
if (!ObjectUtils.nullSafeEquals(this.fullDocumentBeforeChangeLookup, that.fullDocumentBeforeChangeLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.collation, that.collation)) { if (!ObjectUtils.nullSafeEquals(this.collation, that.collation)) {
return false; return false;
} }
@@ -198,7 +184,6 @@ public class ChangeStreamOptions {
int result = ObjectUtils.nullSafeHashCode(filter); int result = ObjectUtils.nullSafeHashCode(filter);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeToken); result = 31 * result + ObjectUtils.nullSafeHashCode(resumeToken);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentLookup); result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentBeforeChangeLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(collation); result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeTimestamp); result = 31 * result + ObjectUtils.nullSafeHashCode(resumeTimestamp);
result = 31 * result + ObjectUtils.nullSafeHashCode(resume); result = 31 * result + ObjectUtils.nullSafeHashCode(resume);
@@ -235,7 +220,6 @@ public class ChangeStreamOptions {
private @Nullable Object filter; private @Nullable Object filter;
private @Nullable BsonValue resumeToken; private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup; private @Nullable FullDocument fullDocumentLookup;
private @Nullable FullDocumentBeforeChange fullDocumentBeforeChangeLookup;
private @Nullable Collation collation; private @Nullable Collation collation;
private @Nullable Object resumeTimestamp; private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED; private Resume resume = Resume.UNDEFINED;
@@ -250,7 +234,7 @@ public class ChangeStreamOptions {
*/ */
public ChangeStreamOptionsBuilder collation(Collation collation) { public ChangeStreamOptionsBuilder collation(Collation collation) {
Assert.notNull(collation, "Collation must not be null nor empty"); Assert.notNull(collation, "Collation must not be null nor empty!");
this.collation = collation; this.collation = collation;
return this; return this;
@@ -258,13 +242,13 @@ public class ChangeStreamOptions {
/** /**
* Set the filter to apply. * Set the filter to apply.
* <br /> * <p/>
* Fields on aggregation expression root level are prefixed to map to fields contained in * Fields on aggregation expression root level are prefixed to map to fields contained in
* {@link ChangeStreamDocument#getFullDocument() fullDocument}. However {@literal operationType}, {@literal ns}, * {@link ChangeStreamDocument#getFullDocument() fullDocument}. However {@literal operationType}, {@literal ns},
* {@literal documentKey} and {@literal fullDocument} are reserved words that will be omitted, and therefore taken * {@literal documentKey} and {@literal fullDocument} are reserved words that will be omitted, and therefore taken
* as given, during the mapping procedure. You may want to have a look at the * as given, during the mapping procedure. You may want to have a look at the
* <a href="https://docs.mongodb.com/manual/reference/change-events/">structure of Change Events</a>. * <a href="https://docs.mongodb.com/manual/reference/change-events/">structure of Change Events</a>.
* <br /> * <p/>
* Use {@link org.springframework.data.mongodb.core.aggregation.TypedAggregation} to ensure filter expressions are * Use {@link org.springframework.data.mongodb.core.aggregation.TypedAggregation} to ensure filter expressions are
* mapped to domain type fields. * mapped to domain type fields.
* *
@@ -274,7 +258,7 @@ public class ChangeStreamOptions {
*/ */
public ChangeStreamOptionsBuilder filter(Aggregation filter) { public ChangeStreamOptionsBuilder filter(Aggregation filter) {
Assert.notNull(filter, "Filter must not be null"); Assert.notNull(filter, "Filter must not be null!");
this.filter = filter; this.filter = filter;
return this; return this;
@@ -303,7 +287,7 @@ public class ChangeStreamOptions {
*/ */
public ChangeStreamOptionsBuilder resumeToken(BsonValue resumeToken) { public ChangeStreamOptionsBuilder resumeToken(BsonValue resumeToken) {
Assert.notNull(resumeToken, "ResumeToken must not be null"); Assert.notNull(resumeToken, "ResumeToken must not be null!");
this.resumeToken = resumeToken; this.resumeToken = resumeToken;
@@ -332,38 +316,12 @@ public class ChangeStreamOptions {
*/ */
public ChangeStreamOptionsBuilder fullDocumentLookup(FullDocument lookup) { public ChangeStreamOptionsBuilder fullDocumentLookup(FullDocument lookup) {
Assert.notNull(lookup, "Lookup must not be null"); Assert.notNull(lookup, "Lookup must not be null!");
this.fullDocumentLookup = lookup; this.fullDocumentLookup = lookup;
return this; return this;
} }
/**
* Set the {@link FullDocumentBeforeChange} lookup to use.
*
* @param lookup must not be {@literal null}.
* @return this.
* @since 4.0
*/
public ChangeStreamOptionsBuilder fullDocumentBeforeChangeLookup(FullDocumentBeforeChange lookup) {
Assert.notNull(lookup, "Lookup must not be null");
this.fullDocumentBeforeChangeLookup = lookup;
return this;
}
/**
* Return the full document before being changed if it is available.
*
* @return this.
* @since 4.0
* @see #fullDocumentBeforeChangeLookup(FullDocumentBeforeChange)
*/
public ChangeStreamOptionsBuilder returnFullDocumentBeforeChange() {
return fullDocumentBeforeChangeLookup(FullDocumentBeforeChange.WHEN_AVAILABLE);
}
/** /**
* Set the cluster time to resume from. * Set the cluster time to resume from.
* *
@@ -372,7 +330,7 @@ public class ChangeStreamOptions {
*/ */
public ChangeStreamOptionsBuilder resumeAt(Instant resumeTimestamp) { public ChangeStreamOptionsBuilder resumeAt(Instant resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null"); Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp; this.resumeTimestamp = resumeTimestamp;
return this; return this;
@@ -387,7 +345,7 @@ public class ChangeStreamOptions {
*/ */
public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) { public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null"); Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp; this.resumeTimestamp = resumeTimestamp;
return this; return this;
@@ -433,7 +391,6 @@ public class ChangeStreamOptions {
options.filter = this.filter; options.filter = this.filter;
options.resumeToken = this.resumeToken; options.resumeToken = this.resumeToken;
options.fullDocumentLookup = this.fullDocumentLookup; options.fullDocumentLookup = this.fullDocumentLookup;
options.fullDocumentBeforeChangeLookup = this.fullDocumentBeforeChangeLookup;
options.collation = this.collation; options.collation = this.collation;
options.resumeTimestamp = this.resumeTimestamp; options.resumeTimestamp = this.resumeTimestamp;
options.resume = this.resume; options.resume = this.resume;
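A hedged usage sketch of the builder touched in this hunk, restricted to options that appear on both sides of the diff; the resume instant is an arbitrary example value:

import java.time.Instant;

import org.springframework.data.mongodb.core.ChangeStreamOptions;

import com.mongodb.client.model.changestream.FullDocument;

class ChangeStreamOptionsSketch {

	ChangeStreamOptions options() {
		return ChangeStreamOptions.builder()
				.fullDocumentLookup(FullDocument.UPDATE_LOOKUP) // fetch the current document for update events
				.resumeAt(Instant.now().minusSeconds(60)) // arbitrary example: replay the last minute
				.build();
	}
}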


@@ -17,11 +17,8 @@ package org.springframework.data.mongodb.core;
import java.util.Optional; import java.util.Optional;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Collation; import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema; import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.timeseries.GranularityDefinition;
import org.springframework.data.mongodb.core.validation.Validator; import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.util.Optionals; import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable; import org.springframework.lang.Nullable;
@@ -45,20 +42,29 @@ public class CollectionOptions {
private @Nullable Boolean capped; private @Nullable Boolean capped;
private @Nullable Collation collation; private @Nullable Collation collation;
private ValidationOptions validationOptions; private ValidationOptions validationOptions;
- private @Nullable TimeSeriesOptions timeSeriesOptions;
- private @Nullable CollectionChangeStreamOptions changeStreamOptions;
+ /**
+  * Constructs a new <code>CollectionOptions</code> instance.
+  *
+  * @param size the collection size in bytes, this data space is preallocated. Can be {@literal null}.
+  * @param maxDocuments the maximum number of documents in the collection. Can be {@literal null}.
+  * @param capped true to created a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
+  *          false otherwise. Can be {@literal null}.
+  * @deprecated since 2.0 please use {@link CollectionOptions#empty()} as entry point.
+  */
+ @Deprecated
+ public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
+ 	this(size, maxDocuments, capped, null, ValidationOptions.none());
+ }
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
- 		@Nullable Collation collation, ValidationOptions validationOptions,
- 		@Nullable TimeSeriesOptions timeSeriesOptions, @Nullable CollectionChangeStreamOptions changeStreamOptions) {
+ 		@Nullable Collation collation, ValidationOptions validationOptions) {
	this.maxDocuments = maxDocuments;
	this.size = size;
	this.capped = capped;
	this.collation = collation;
	this.validationOptions = validationOptions;
- 	this.timeSeriesOptions = timeSeriesOptions;
- 	this.changeStreamOptions = changeStreamOptions;
}
/** /**
@@ -70,9 +76,9 @@ public class CollectionOptions {
*/ */
public static CollectionOptions just(Collation collation) { public static CollectionOptions just(Collation collation) {
Assert.notNull(collation, "Collation must not be null"); Assert.notNull(collation, "Collation must not be null!");
return new CollectionOptions(null, null, null, collation, ValidationOptions.none(), null, null); return new CollectionOptions(null, null, null, collation, ValidationOptions.none());
} }
/** /**
@@ -82,33 +88,7 @@ public class CollectionOptions {
* @since 2.0 * @since 2.0
*/ */
public static CollectionOptions empty() { public static CollectionOptions empty() {
return new CollectionOptions(null, null, null, null, ValidationOptions.none(), null, null); return new CollectionOptions(null, null, null, null, ValidationOptions.none());
}
/**
* Quick way to set up {@link CollectionOptions} for a Time Series collection. For more advanced settings use
* {@link #timeSeries(TimeSeriesOptions)}.
*
* @param timeField The name of the property which contains the date in each time series document. Must not be
* {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @see #timeSeries(TimeSeriesOptions)
* @since 3.3
*/
public static CollectionOptions timeSeries(String timeField) {
return empty().timeSeries(TimeSeriesOptions.timeSeries(timeField));
}
/**
* Quick way to set up {@link CollectionOptions} for emitting (pre & post) change events.
*
* @return new instance of {@link CollectionOptions}.
* @see #changeStream(CollectionChangeStreamOptions)
* @see CollectionChangeStreamOptions#preAndPostImages(boolean)
* @since 4.0
*/
public static CollectionOptions emitChangedRevisions() {
return empty().changeStream(CollectionChangeStreamOptions.preAndPostImages(true));
} }
/** /**
@@ -119,7 +99,7 @@ public class CollectionOptions {
* @since 2.0 * @since 2.0
*/ */
public CollectionOptions capped() { public CollectionOptions capped() {
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions, timeSeriesOptions, changeStreamOptions); return new CollectionOptions(size, maxDocuments, true, collation, validationOptions);
} }
/** /**
@@ -130,7 +110,7 @@ public class CollectionOptions {
* @since 2.0 * @since 2.0
*/ */
public CollectionOptions maxDocuments(long maxDocuments) { public CollectionOptions maxDocuments(long maxDocuments) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions, changeStreamOptions); return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
} }
/** /**
@@ -141,7 +121,7 @@ public class CollectionOptions {
* @since 2.0 * @since 2.0
*/ */
public CollectionOptions size(long size) { public CollectionOptions size(long size) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions, changeStreamOptions); return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
} }
/** /**
@@ -152,7 +132,7 @@ public class CollectionOptions {
* @since 2.0 * @since 2.0
*/ */
public CollectionOptions collation(@Nullable Collation collation) { public CollectionOptions collation(@Nullable Collation collation) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions, changeStreamOptions); return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
} }
/** /**
@@ -244,7 +224,7 @@ public class CollectionOptions {
*/ */
public CollectionOptions schemaValidationLevel(ValidationLevel validationLevel) { public CollectionOptions schemaValidationLevel(ValidationLevel validationLevel) {
Assert.notNull(validationLevel, "ValidationLevel must not be null"); Assert.notNull(validationLevel, "ValidationLevel must not be null!");
return validation(validationOptions.validationLevel(validationLevel)); return validation(validationOptions.validationLevel(validationLevel));
} }
@@ -258,7 +238,7 @@ public class CollectionOptions {
*/ */
public CollectionOptions schemaValidationAction(ValidationAction validationAction) { public CollectionOptions schemaValidationAction(ValidationAction validationAction) {
Assert.notNull(validationAction, "ValidationAction must not be null"); Assert.notNull(validationAction, "ValidationAction must not be null!");
return validation(validationOptions.validationAction(validationAction)); return validation(validationOptions.validationAction(validationAction));
} }
@@ -271,34 +251,8 @@ public class CollectionOptions {
*/ */
public CollectionOptions validation(ValidationOptions validationOptions) { public CollectionOptions validation(ValidationOptions validationOptions) {
Assert.notNull(validationOptions, "ValidationOptions must not be null"); Assert.notNull(validationOptions, "ValidationOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions, changeStreamOptions); return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
* Create new {@link CollectionOptions} with the given {@link TimeSeriesOptions}.
*
* @param timeSeriesOptions must not be {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @since 3.3
*/
public CollectionOptions timeSeries(TimeSeriesOptions timeSeriesOptions) {
Assert.notNull(timeSeriesOptions, "TimeSeriesOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions, changeStreamOptions);
}
/**
* Create new {@link CollectionOptions} with the given {@link TimeSeriesOptions}.
*
* @param changeStreamOptions must not be {@literal null}.
* @return new instance of {@link CollectionOptions}.
* @since 3.3
*/
public CollectionOptions changeStream(CollectionChangeStreamOptions changeStreamOptions) {
Assert.notNull(changeStreamOptions, "ChangeStreamOptions must not be null");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions, timeSeriesOptions, changeStreamOptions);
} }
/** /**
@@ -349,26 +303,6 @@ public class CollectionOptions {
return validationOptions.isEmpty() ? Optional.empty() : Optional.of(validationOptions); return validationOptions.isEmpty() ? Optional.empty() : Optional.of(validationOptions);
} }
/**
* Get the {@link TimeSeriesOptions} if available.
*
* @return {@link Optional#empty()} if not specified.
* @since 3.3
*/
public Optional<TimeSeriesOptions> getTimeSeriesOptions() {
return Optional.ofNullable(timeSeriesOptions);
}
/**
* Get the {@link CollectionChangeStreamOptions} if available.
*
* @return {@link Optional#empty()} if not specified.
* @since 4.0
*/
public Optional<CollectionChangeStreamOptions> getChangeStreamOptions() {
return Optional.ofNullable(changeStreamOptions);
}
/** /**
* Encapsulation of ValidationOptions options. * Encapsulation of ValidationOptions options.
* *
@@ -451,7 +385,7 @@ public class CollectionOptions {
/** /**
* Get the {@code validationAction} to perform. * Get the {@code validationAction} to perform.
* *
- * @return {@link Optional#empty()} if not set.
+ * @return @return {@link Optional#empty()} if not set.
*/ */
public Optional<ValidationAction> getValidationAction() { public Optional<ValidationAction> getValidationAction() {
return Optional.ofNullable(validationAction); return Optional.ofNullable(validationAction);
@@ -464,117 +398,4 @@ public class CollectionOptions {
return !Optionals.isAnyPresent(getValidator(), getValidationAction(), getValidationLevel()); return !Optionals.isAnyPresent(getValidator(), getValidationAction(), getValidationLevel());
} }
} }
/**
* Encapsulation of options applied to define collections change stream behaviour.
*
* @author Christoph Strobl
* @since 4.0
*/
public static class CollectionChangeStreamOptions {
private final boolean preAndPostImages;
private CollectionChangeStreamOptions(boolean emitChangedRevisions) {
this.preAndPostImages = emitChangedRevisions;
}
/**
* Output the version of a document before and after changes (the document pre- and post-images).
*
* @return new instance of {@link CollectionChangeStreamOptions}.
*/
public static CollectionChangeStreamOptions preAndPostImages(boolean emitChangedRevisions) {
return new CollectionChangeStreamOptions(true);
}
public boolean getPreAndPostImages() {
return preAndPostImages;
}
}
/**
* Options applicable to Time Series collections.
*
* @author Christoph Strobl
* @since 3.3
* @see <a href=
* "https://docs.mongodb.com/manual/core/timeseries-collections">https://docs.mongodb.com/manual/core/timeseries-collections</a>
*/
public static class TimeSeriesOptions {
private final String timeField;
private @Nullable final String metaField;
private final GranularityDefinition granularity;
private TimeSeriesOptions(String timeField, @Nullable String metaField, GranularityDefinition granularity) {
Assert.hasText(timeField, "Time field must not be empty or null");
this.timeField = timeField;
this.metaField = metaField;
this.granularity = granularity;
}
/**
* Create a new instance of {@link TimeSeriesOptions} using the given field as its {@literal timeField}. The one,
* that contains the date in each time series document. <br />
* {@link Field#name() Annotated fieldnames} will be considered during the mapping process.
*
* @param timeField must not be {@literal null}.
* @return new instance of {@link TimeSeriesOptions}.
*/
public static TimeSeriesOptions timeSeries(String timeField) {
return new TimeSeriesOptions(timeField, null, Granularity.DEFAULT);
}
/**
* Set the name of the field which contains metadata in each time series document. Should not be the {@literal id}
* nor {@link TimeSeriesOptions#timeSeries(String)} timeField} nor point to an {@literal array} or
* {@link java.util.Collection}. <br />
* {@link Field#name() Annotated fieldnames} will be considered during the mapping process.
*
* @param metaField must not be {@literal null}.
* @return new instance of {@link TimeSeriesOptions}.
*/
public TimeSeriesOptions metaField(String metaField) {
return new TimeSeriesOptions(timeField, metaField, granularity);
}
/**
* Select the {@link GranularityDefinition} parameter to define how data in the time series collection is organized.
* Select one that is closest to the time span between incoming measurements.
*
* @return new instance of {@link TimeSeriesOptions}.
* @see Granularity
*/
public TimeSeriesOptions granularity(GranularityDefinition granularity) {
return new TimeSeriesOptions(timeField, metaField, granularity);
}
/**
* @return never {@literal null}.
*/
public String getTimeField() {
return timeField;
}
/**
* @return can be {@literal null}. Might be an {@literal empty} {@link String} as well, so maybe check via
* {@link org.springframework.util.StringUtils#hasText(String)}.
*/
@Nullable
public String getMetaField() {
return metaField;
}
/**
* @return never {@literal null}.
*/
public GranularityDefinition getGranularity() {
return granularity;
}
}
} }
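A short usage sketch of the fluent API shared by both sides of this diff; the collation locale and size limits are arbitrary example values:

import org.springframework.data.mongodb.core.CollectionOptions;
import org.springframework.data.mongodb.core.query.Collation;

class CollectionOptionsSketch {

	CollectionOptions cappedLogCollection() {
		// Each call returns a new immutable CollectionOptions instance.
		return CollectionOptions.empty()
				.capped()
				.size(1024 * 1024) // preallocated size in bytes
				.maxDocuments(10_000) // upper bound for the capped collection
				.collation(Collation.of("en"));
	}
}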


@@ -23,8 +23,8 @@ import java.util.List;
import java.util.Map; import java.util.Map;
import org.bson.Document; import org.bson.Document;
import org.springframework.data.geo.Point; import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.MetricConversion;
import org.springframework.lang.Nullable; import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
@@ -38,7 +38,7 @@ import org.springframework.util.ObjectUtils;
*/ */
class CountQuery { class CountQuery {
- private final Document source;
+ private Document source;
private CountQuery(Document source) { private CountQuery(Document source) {
this.source = source; this.source = source;
@@ -101,7 +101,7 @@ class CountQuery {
} }
if (valueToInspect instanceof Collection) { if (valueToInspect instanceof Collection) {
return requiresRewrite((Collection<?>) valueToInspect); return requiresRewrite((Collection) valueToInspect);
} }
return false; return false;
@@ -157,14 +157,12 @@ class CountQuery {
* @param $and potentially existing {@code $and} condition. * @param $and potentially existing {@code $and} condition.
* @return the rewritten query {@link Document}. * @return the rewritten query {@link Document}.
*/ */
@SuppressWarnings("unchecked")
private static Document createGeoWithin(String key, Document source, @Nullable Object $and) { private static Document createGeoWithin(String key, Document source, @Nullable Object $and) {
boolean spheric = source.containsKey("$nearSphere"); boolean spheric = source.containsKey("$nearSphere");
Object $near = spheric ? source.get("$nearSphere") : source.get("$near"); Object $near = spheric ? source.get("$nearSphere") : source.get("$near");
Number maxDistance = getMaxDistance(source, $near, spheric); Number maxDistance = source.containsKey("$maxDistance") ? (Number) source.get("$maxDistance") : Double.MAX_VALUE;
List<Object> $centerMax = Arrays.asList(toCenterCoordinates($near), maxDistance); List<Object> $centerMax = Arrays.asList(toCenterCoordinates($near), maxDistance);
Document $geoWithinMax = new Document("$geoWithin", Document $geoWithinMax = new Document("$geoWithin",
new Document(spheric ? "$centerSphere" : "$center", $centerMax)); new Document(spheric ? "$centerSphere" : "$center", $centerMax));
@@ -178,51 +176,23 @@ class CountQuery {
Document $geoWithinMin = new Document("$geoWithin", Document $geoWithinMin = new Document("$geoWithin",
new Document(spheric ? "$centerSphere" : "$center", $centerMin)); new Document(spheric ? "$centerSphere" : "$center", $centerMin));
List<Document> criteria; List<Document> criteria = new ArrayList<>();
if ($and != null) { if ($and != null) {
if ($and instanceof Collection) { if ($and instanceof Collection) {
Collection<Document> andElements = (Collection<Document>) $and; criteria.addAll((Collection) $and);
criteria = new ArrayList<>(andElements.size() + 2);
criteria.addAll(andElements);
} else { } else {
throw new IllegalArgumentException( throw new IllegalArgumentException(
"Cannot rewrite query as it contains an '$and' element that is not a Collection: Offending element: " "Cannot rewrite query as it contains an '$and' element that is not a Collection!: Offending element: "
+ $and); + $and);
} }
} else {
criteria = new ArrayList<>(2);
} }
criteria.add(new Document("$nor", Collections.singletonList(new Document(key, $geoWithinMin)))); criteria.add(new Document("$nor", Collections.singletonList(new Document(key, $geoWithinMin))));
criteria.add(new Document(key, $geoWithinMax)); criteria.add(new Document(key, $geoWithinMax));
return new Document("$and", criteria); return new Document("$and", criteria);
} }
private static Number getMaxDistance(Document source, Object $near, boolean spheric) {
Number maxDistance = Double.MAX_VALUE;
if (source.containsKey("$maxDistance")) { // legacy coordinate pair
return (Number) source.get("$maxDistance");
}
if ($near instanceof Document nearDoc) {
if (nearDoc.containsKey("$maxDistance")) {
maxDistance = (Number) nearDoc.get("$maxDistance");
// geojson is in Meters but we need radians x/(6378.1*1000)
if (spheric && nearDoc.containsKey("$geometry")) {
maxDistance = MetricConversion.metersToRadians(maxDistance.doubleValue());
}
}
}
return maxDistance;
}
private static boolean containsNear(Document source) { private static boolean containsNear(Document source) {
return source.containsKey("$near") || source.containsKey("$nearSphere"); return source.containsKey("$near") || source.containsKey("$nearSphere");
} }
@@ -246,16 +216,10 @@ class CountQuery {
return Arrays.asList(((Point) value).getX(), ((Point) value).getY()); return Arrays.asList(((Point) value).getX(), ((Point) value).getY());
} }
- if (value instanceof Document document) {
- 	if (document.containsKey("x")) {
- 		return Arrays.asList(document.get("x"), document.get("y"));
- 	}
- 	if (document.containsKey("$geometry")) {
- 		Document geoJsonPoint = document.get("$geometry", Document.class);
- 		return geoJsonPoint.get("coordinates");
- 	}
- }
+ if (value instanceof Document && ((Document) value).containsKey("x")) {
+ 	Document point = (Document) value;
+ 	return Arrays.asList(point.get("x"), point.get("y"));
+ }
return value; return value;
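To illustrate what the rewrite above produces: a count cannot run a $near/$nearSphere query, so both variants replace it with a ring built from two $geoWithin circles. The sketch below only mirrors that shape with plain driver Documents; coordinates and distances are arbitrary example values:

import java.util.Arrays;
import java.util.Collections;

import org.bson.Document;

class GeoCountRewriteSketch {

	public static void main(String[] args) {
		// Original query shape (legacy coordinate pair): not usable with count().
		Document near = new Document("location",
				new Document("$near", Arrays.asList(12.5, 41.9)).append("$maxDistance", 0.2).append("$minDistance", 0.05));

		// Rewritten shape: inside the outer circle, but not inside the inner one.
		Document outer = new Document("location",
				new Document("$geoWithin", new Document("$center", Arrays.asList(Arrays.asList(12.5, 41.9), 0.2))));
		Document inner = new Document("location",
				new Document("$geoWithin", new Document("$center", Arrays.asList(Arrays.asList(12.5, 41.9), 0.05))));
		Document rewritten = new Document("$and",
				Arrays.asList(new Document("$nor", Collections.singletonList(inner)), outer));

		System.out.println(near.toJson());
		System.out.println(rewritten.toJson());
	}
}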


@@ -61,8 +61,8 @@ public interface CursorPreparer extends ReadPreferenceAware {
default FindIterable<Document> initiateFind(MongoCollection<Document> collection, default FindIterable<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindIterable<Document>> find) { Function<MongoCollection<Document>, FindIterable<Document>> find) {
Assert.notNull(collection, "Collection must not be null"); Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null"); Assert.notNull(find, "Find function must not be null!");
if (hasReadPreference()) { if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference()); collection = collection.withReadPreference(getReadPreference());


@@ -90,9 +90,9 @@ class DefaultBulkOperations implements BulkOperations {
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName, DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) { BulkOperationContext bulkOperationContext) {
Assert.notNull(mongoOperations, "MongoOperations must not be null"); Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "CollectionName must not be null nor empty"); Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null"); Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
this.mongoOperations = mongoOperations; this.mongoOperations = mongoOperations;
this.collectionName = collectionName; this.collectionName = collectionName;
@@ -109,10 +109,14 @@ class DefaultBulkOperations implements BulkOperations {
this.defaultWriteConcern = defaultWriteConcern; this.defaultWriteConcern = defaultWriteConcern;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override @Override
public BulkOperations insert(Object document) { public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null"); Assert.notNull(document, "Document must not be null!");
maybeEmitEvent(new BeforeConvertEvent<>(document, collectionName)); maybeEmitEvent(new BeforeConvertEvent<>(document, collectionName));
Object source = maybeInvokeBeforeConvertCallback(document); Object source = maybeInvokeBeforeConvertCallback(document);
@@ -121,30 +125,42 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override @Override
public BulkOperations insert(List<? extends Object> documents) { public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null"); Assert.notNull(documents, "Documents must not be null!");
documents.forEach(this::insert); documents.forEach(this::insert);
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override @Override
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) { public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null"); Assert.notNull(update, "Update must not be null!");
return updateOne(Collections.singletonList(Pair.of(query, update))); return updateOne(Collections.singletonList(Pair.of(query, update)));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override @Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) { public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null"); Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) { for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false); update(update.getFirst(), update.getSecond(), false, false);
@@ -153,20 +169,28 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override @Override
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) { public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null"); Assert.notNull(update, "Update must not be null!");
return updateMulti(Collections.singletonList(Pair.of(query, update))); return updateMulti(Collections.singletonList(Pair.of(query, update)));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override @Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) { public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null"); Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) { for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true); update(update.getFirst(), update.getSecond(), false, true);
@@ -175,11 +199,19 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override @Override
public BulkOperations upsert(Query query, Update update) { public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true); return update(query, update, true, true);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override @Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) { public BulkOperations upsert(List<Pair<Query, Update>> updates) {
@@ -190,10 +222,14 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public BulkOperations remove(Query query) { public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
DeleteOptions deleteOptions = new DeleteOptions(); DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation); query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
@@ -203,10 +239,14 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override @Override
public BulkOperations remove(List<Query> removes) { public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null"); Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) { for (Query query : removes) {
remove(query); remove(query);
@@ -215,12 +255,16 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#replaceOne(org.springframework.data.mongodb.core.query.Query, java.lang.Object, org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override @Override
public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) { public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
Assert.notNull(replacement, "Replacement must not be null"); Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(options, "Options must not be null"); Assert.notNull(options, "Options must not be null!");
ReplaceOptions replaceOptions = new ReplaceOptions(); ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(options.isUpsert()); replaceOptions.upsert(options.isUpsert());
@@ -234,6 +278,10 @@ class DefaultBulkOperations implements BulkOperations {
return this; return this;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override @Override
public com.mongodb.bulk.BulkWriteResult execute() { public com.mongodb.bulk.BulkWriteResult execute() {
@@ -241,7 +289,7 @@ class DefaultBulkOperations implements BulkOperations {
com.mongodb.bulk.BulkWriteResult result = mongoOperations.execute(collectionName, this::bulkWriteTo); com.mongodb.bulk.BulkWriteResult result = mongoOperations.execute(collectionName, this::bulkWriteTo);
Assert.state(result != null, "Result must not be null"); Assert.state(result != null, "Result must not be null.");
models.forEach(this::maybeEmitAfterSaveEvent); models.forEach(this::maybeEmitAfterSaveEvent);
models.forEach(this::maybeInvokeAfterSaveCallback); models.forEach(this::maybeInvokeAfterSaveCallback);
@@ -308,8 +356,8 @@ class DefaultBulkOperations implements BulkOperations {
*/ */
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) { private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null"); Assert.notNull(update, "Update must not be null!");
UpdateOptions options = computeUpdateOptions(query, update, upsert); UpdateOptions options = computeUpdateOptions(query, update, upsert);
@@ -470,7 +518,7 @@ class DefaultBulkOperations implements BulkOperations {
return options.ordered(false); return options.ordered(false);
} }
throw new IllegalStateException("BulkMode was null"); throw new IllegalStateException("BulkMode was null!");
} }
/** /**
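For orientation, a minimal usage sketch of the BulkOperations API that the hunks above back; the "people" collection name and the field names are placeholders, not taken from this diff:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.bulk.BulkWriteResult;

class BulkOperationsSketch {

	// Queue several writes against the hypothetical "people" collection and flush them in one batch.
	BulkWriteResult deactivateAndPurge(MongoOperations template) {

		BulkOperations bulkOps = template.bulkOps(BulkMode.UNORDERED, "people");
		bulkOps.updateMulti(query(where("lastSeen").is(null)), new Update().set("active", false));
		bulkOps.remove(query(where("active").is(false))); // queued as a delete-many model

		return bulkOps.execute(); // sends the queued models to the server in one round trip
	}
}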


@@ -83,9 +83,9 @@ public class DefaultIndexOperations implements IndexOperations {
public DefaultIndexOperations(MongoDatabaseFactory mongoDbFactory, String collectionName, QueryMapper queryMapper, public DefaultIndexOperations(MongoDatabaseFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
@Nullable Class<?> type) { @Nullable Class<?> type) {
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null"); Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
Assert.notNull(collectionName, "Collection name can not be null"); Assert.notNull(collectionName, "Collection name can not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null"); Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.collectionName = collectionName; this.collectionName = collectionName;
this.mapper = queryMapper; this.mapper = queryMapper;
@@ -103,8 +103,8 @@ public class DefaultIndexOperations implements IndexOperations {
*/ */
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, @Nullable Class<?> type) { public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, @Nullable Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null"); Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty"); Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations; this.mongoOperations = mongoOperations;
this.mapper = new QueryMapper(mongoOperations.getConverter()); this.mapper = new QueryMapper(mongoOperations.getConverter());
@@ -112,6 +112,10 @@ public class DefaultIndexOperations implements IndexOperations {
this.type = type; this.type = type;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public String ensureIndex(final IndexDefinition indexDefinition) { public String ensureIndex(final IndexDefinition indexDefinition) {
return execute(collection -> { return execute(collection -> {
@@ -146,6 +150,10 @@ public class DefaultIndexOperations implements IndexOperations {
return null; return null;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) { public void dropIndex(final String name) {
execute(collection -> { execute(collection -> {
@@ -155,10 +163,18 @@ public class DefaultIndexOperations implements IndexOperations {
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() { public void dropAllIndexes() {
dropIndex("*"); dropIndex("*");
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() { public List<IndexInfo> getIndexInfo() {
return execute(new CollectionCallback<List<IndexInfo>>() { return execute(new CollectionCallback<List<IndexInfo>>() {
@@ -172,8 +188,7 @@ public class DefaultIndexOperations implements IndexOperations {
private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) { private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) {
- int available = cursor.available();
- List<IndexInfo> indexInfoList = available > 0 ? new ArrayList<>(available) : new ArrayList<>();
+ List<IndexInfo> indexInfoList = new ArrayList<>();
while (cursor.hasNext()) { while (cursor.hasNext()) {
@@ -190,7 +205,7 @@ public class DefaultIndexOperations implements IndexOperations {
@Nullable @Nullable
public <T> T execute(CollectionCallback<T> callback) { public <T> T execute(CollectionCallback<T> callback) {
Assert.notNull(callback, "CollectionCallback must not be null"); Assert.notNull(callback, "CollectionCallback must not be null!");
if (type != null) { if (type != null) {
return mongoOperations.execute(type, callback); return mongoOperations.execute(type, callback);
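A short sketch of the IndexOperations entry point whose implementation changes above; the collection and field names are illustrative only:

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexInfo;

class IndexOperationsSketch {

	// Create a unique index on the hypothetical "people" collection and list what exists afterwards.
	void ensureLastNameIndex(MongoOperations template) {

		template.indexOps("people").ensureIndex(new Index().on("lastName", Sort.Direction.ASC).unique());

		for (IndexInfo info : template.indexOps("people").getIndexInfo()) {
			System.out.println(info.getName() + " -> " + info.getIndexFields());
		}
	}
}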


@@ -42,6 +42,10 @@ class DefaultIndexOperationsProvider implements IndexOperationsProvider {
this.mapper = mapper; this.mapper = mapper;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperationsProvider#reactiveIndexOps(java.lang.String)
*/
@Override @Override
public IndexOperations indexOps(String collectionName, Class<?> type) { public IndexOperations indexOps(String collectionName, Class<?> type) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper, type); return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper, type);


@@ -76,9 +76,9 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
private DefaultReactiveIndexOperations(ReactiveMongoOperations mongoOperations, String collectionName, private DefaultReactiveIndexOperations(ReactiveMongoOperations mongoOperations, String collectionName,
QueryMapper queryMapper, Optional<Class<?>> type) { QueryMapper queryMapper, Optional<Class<?>> type) {
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null"); Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null!");
Assert.notNull(collectionName, "Collection must not be null"); Assert.notNull(collectionName, "Collection must not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null"); Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.mongoOperations = mongoOperations; this.mongoOperations = mongoOperations;
this.collectionName = collectionName; this.collectionName = collectionName;
@@ -86,6 +86,10 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
this.type = type; this.type = type;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) { public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, collection -> { return mongoOperations.execute(collectionName, collection -> {
@@ -115,14 +119,26 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
.orElse(null); .orElse(null);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) { public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then(); return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() { public Mono<Void> dropAllIndexes() {
return dropIndex("*"); return dropIndex("*");
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() { public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) // return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) //
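The reactive counterpart follows the same shape; a sketch assuming an already configured ReactiveMongoOperations, with "people" as a hypothetical collection name:

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.index.Index;

import reactor.core.publisher.Mono;

class ReactiveIndexOperationsSketch {

	// Same index creation as the blocking variant, but nothing happens until the Mono is subscribed.
	Mono<String> ensureLastNameIndex(ReactiveMongoOperations template) {
		return template.indexOps("people")
				.ensureIndex(new Index().on("lastName", Sort.Direction.ASC).unique());
	}
}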


@@ -31,6 +31,7 @@ import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript; import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript; import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils; import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
@@ -64,29 +65,41 @@ class DefaultScriptOperations implements ScriptOperations {
*/ */
public DefaultScriptOperations(MongoOperations mongoOperations) { public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null"); Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations; this.mongoOperations = mongoOperations;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override @Override
public NamedMongoScript register(ExecutableMongoScript script) { public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script)); return register(new NamedMongoScript(generateScriptName(), script));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override @Override
public NamedMongoScript register(NamedMongoScript script) { public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null"); Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME); mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script; return script;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override @Override
public Object execute(final ExecutableMongoScript script, final Object... args) { public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null"); Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() { return mongoOperations.execute(new DbCallback<Object>() {
@@ -102,10 +115,14 @@ class DefaultScriptOperations implements ScriptOperations {
}); });
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override @Override
public Object call(final String scriptName, final Object... args) { public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty"); Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() { return mongoOperations.execute(new DbCallback<Object>() {
@@ -118,14 +135,22 @@ class DefaultScriptOperations implements ScriptOperations {
}); });
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override @Override
public boolean exists(String scriptName) { public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty"); Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME); return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override @Override
public Set<String> getScriptNames() { public Set<String> getScriptNames() {
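ScriptOperations, whose default implementation changes above, is used roughly as follows. This is illustrative only; the underlying server-side JavaScript execution is not available on newer MongoDB server versions:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptOperationsSketch {

	// Register a server-side JavaScript function and invoke it by name.
	Object registerAndCall(MongoOperations template) {

		ScriptOperations scriptOps = template.scriptOps();
		scriptOps.register(new NamedMongoScript("echo", new ExecutableMongoScript("function(x) { return x; }")));

		return scriptOps.call("echo", "hello");
	}
}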


@@ -1,60 +0,0 @@
/*
* Copyright 2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.lang.Nullable;
/**
* Delegate class to encapsulate lifecycle event configuration and publishing.
*
* @author Mark Paluch
* @since 4.0
* @see ApplicationEventPublisher
*/
class EntityLifecycleEventDelegate {
private @Nullable ApplicationEventPublisher publisher;
private boolean eventsEnabled = true;
public void setPublisher(@Nullable ApplicationEventPublisher publisher) {
this.publisher = publisher;
}
public boolean isEventsEnabled() {
return eventsEnabled;
}
public void setEventsEnabled(boolean eventsEnabled) {
this.eventsEnabled = eventsEnabled;
}
/**
* Publish an application event if event publishing is enabled.
*
* @param event the application event.
*/
public void publishEvent(Object event) {
if (canPublishEvent()) {
publisher.publishEvent(event);
}
}
private boolean canPublishEvent() {
return publisher != null && eventsEnabled;
}
}
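EntityLifecycleEventDelegate is an internal helper of the newer line (left column) and does not exist in 3.2.9. The events it gates are consumed by ordinary lifecycle listeners; a hedged sketch of such a listener, registered as a Spring bean:

import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;

// Only fires when lifecycle event publication is enabled, which is what the delegate above toggles.
class LoggingMongoEventListener extends AbstractMongoEventListener<Object> {

	@Override
	public void onBeforeConvert(BeforeConvertEvent<Object> event) {
		System.out.println("about to convert: " + event.getSource());
	}
}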


@@ -23,43 +23,25 @@ import java.util.Optional;
import org.bson.Document; import org.bson.Document;
import org.springframework.core.convert.ConversionService; import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException; import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.IdentifierAccessor; import org.springframework.data.mapping.IdentifierAccessor;
import org.springframework.data.mapping.MappingException; import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity; import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor; import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext; import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor; import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.core.CollectionOptions.TimeSeriesOptions;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter; import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity; import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty; import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes; import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.TimeSeries;
import org.springframework.data.mongodb.core.query.Collation; import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria; import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query; import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.projection.EntityProjectionIntrospector;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable; import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.ClassUtils; import org.springframework.util.ClassUtils;
import org.springframework.util.LinkedMultiValueMap; import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap; import org.springframework.util.MultiValueMap;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.ChangeStreamPreAndPostImagesOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.TimeSeriesGranularity;
import com.mongodb.client.model.ValidationOptions;
/** /**
* Common operations performed on an entity in the context of it's mapping metadata. * Common operations performed on an entity in the context of it's mapping metadata.
@@ -76,31 +58,9 @@ class EntityOperations {
private static final String ID_FIELD = "_id"; private static final String ID_FIELD = "_id";
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context; private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
- private final QueryMapper queryMapper;
- private final EntityProjectionIntrospector introspector;
- private final MongoJsonSchemaMapper schemaMapper;
- EntityOperations(MongoConverter converter) {
- 	this(converter, new QueryMapper(converter));
- }
- EntityOperations(MongoConverter converter, QueryMapper queryMapper) {
- 	this(converter, converter.getMappingContext(), converter.getCustomConversions(), converter.getProjectionFactory(),
- 			queryMapper);
- }
- EntityOperations(MongoConverter converter,
- 		MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
- 		CustomConversions conversions, ProjectionFactory projectionFactory, QueryMapper queryMapper) {
+ EntityOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
  	this.context = context;
- 	this.queryMapper = queryMapper;
- 	this.introspector = EntityProjectionIntrospector.create(projectionFactory,
- 			EntityProjectionIntrospector.ProjectionPredicate.typeHierarchy()
- 					.and(((target, underlyingType) -> !conversions.isSimpleType(target))),
- 			context);
- 	this.schemaMapper = new MongoJsonSchemaMapper(converter);
  }
/** /**
@@ -112,7 +72,7 @@ class EntityOperations {
@SuppressWarnings({ "unchecked", "rawtypes" }) @SuppressWarnings({ "unchecked", "rawtypes" })
<T> Entity<T> forEntity(T entity) { <T> Entity<T> forEntity(T entity) {
Assert.notNull(entity, "Bean must not be null"); Assert.notNull(entity, "Bean must not be null!");
if (entity instanceof String) { if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString())); return new UnmappedEntity(parse(entity.toString()));
@@ -135,8 +95,8 @@ class EntityOperations {
@SuppressWarnings({ "unchecked", "rawtypes" }) @SuppressWarnings({ "unchecked", "rawtypes" })
<T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) { <T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
Assert.notNull(entity, "Bean must not be null"); Assert.notNull(entity, "Bean must not be null!");
Assert.notNull(conversionService, "ConversionService must not be null"); Assert.notNull(conversionService, "ConversionService must not be null!");
if (entity instanceof String) { if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString())); return new UnmappedEntity(parse(entity.toString()));
@@ -171,17 +131,10 @@ class EntityOperations {
if (entityClass == null) { if (entityClass == null) {
throw new InvalidDataAccessApiUsageException( throw new InvalidDataAccessApiUsageException(
"No class parameter provided, entity collection can't be determined"); "No class parameter provided, entity collection can't be determined!");
} }
- MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(entityClass);
- if (persistentEntity == null) {
- 	throw new MappingException(String.format(
- 			"Cannot determine collection name from type '%s'. Is it a store native type?", entityClass.getName()));
- }
- return persistentEntity.getCollection();
+ return context.getRequiredPersistentEntity(entityClass).getCollection();
} }
public Query getByIdInQuery(Collection<?> entities) { public Query getByIdInQuery(Collection<?> entities) {
@@ -208,7 +161,7 @@ class EntityOperations {
*/ */
public String getIdPropertyName(Class<?> type) { public String getIdPropertyName(Class<?> type) {
Assert.notNull(type, "Type must not be null"); Assert.notNull(type, "Type must not be null!");
MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(type); MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(type);
@@ -247,12 +200,12 @@ class EntityOperations {
try { try {
return Document.parse(source); return Document.parse(source);
} catch (org.bson.json.JsonParseException o_O) { } catch (org.bson.json.JsonParseException o_O) {
throw new MappingException("Could not parse given String to save into a JSON document", o_O); throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
} catch (RuntimeException o_O) { } catch (RuntimeException o_O) {
// legacy 3.x exception // legacy 3.x exception
if (ClassUtils.matchesTypeName(o_O.getClass(), "JSONParseException")) { if (ClassUtils.matchesTypeName(o_O.getClass(), "JSONParseException")) {
throw new MappingException("Could not parse given String to save into a JSON document", o_O); throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
} }
throw o_O; throw o_O;
} }
@@ -272,92 +225,6 @@ class EntityOperations {
return UntypedOperations.instance(); return UntypedOperations.instance();
} }
/**
* Introspect the given {@link Class result type} in the context of the {@link Class entity type} whether the returned
* type is a projection and what property paths are participating in the projection.
*
* @param resultType the type to project on. Must not be {@literal null}.
* @param entityType the source domain type. Must not be {@literal null}.
* @return the introspection result.
* @since 3.4
* @see EntityProjectionIntrospector#introspect(Class, Class)
*/
public <M, D> EntityProjection<M, D> introspectProjection(Class<M> resultType, Class<D> entityType) {
return introspector.introspect(resultType, entityType);
}
/**
* Convert {@link CollectionOptions} to {@link CreateCollectionOptions} using {@link Class entityType} to obtain
* mapping metadata.
*
* @param collectionOptions
* @param entityType
* @return
* @since 3.4
*/
public CreateCollectionOptions convertToCreateCollectionOptions(@Nullable CollectionOptions collectionOptions,
Class<?> entityType) {
Optional<Collation> collation = Optionals.firstNonEmpty(
() -> Optional.ofNullable(collectionOptions).flatMap(CollectionOptions::getCollation),
() -> forType(entityType).getCollation());//
CreateCollectionOptions result = new CreateCollectionOptions();
collation.map(Collation::toMongoCollation).ifPresent(result::collation);
if (collectionOptions == null) {
return result;
}
collectionOptions.getCapped().ifPresent(result::capped);
collectionOptions.getSize().ifPresent(result::sizeInBytes);
collectionOptions.getMaxDocuments().ifPresent(result::maxDocuments);
collectionOptions.getCollation().map(Collation::toMongoCollation).ifPresent(result::collation);
collectionOptions.getValidationOptions().ifPresent(it -> {
ValidationOptions validationOptions = new ValidationOptions();
it.getValidationAction().ifPresent(validationOptions::validationAction);
it.getValidationLevel().ifPresent(validationOptions::validationLevel);
it.getValidator().ifPresent(val -> validationOptions.validator(getMappedValidator(val, entityType)));
result.validationOptions(validationOptions);
});
collectionOptions.getTimeSeriesOptions().map(forType(entityType)::mapTimeSeriesOptions).ifPresent(it -> {
com.mongodb.client.model.TimeSeriesOptions options = new com.mongodb.client.model.TimeSeriesOptions(
it.getTimeField());
if (StringUtils.hasText(it.getMetaField())) {
options.metaField(it.getMetaField());
}
if (!Granularity.DEFAULT.equals(it.getGranularity())) {
options.granularity(TimeSeriesGranularity.valueOf(it.getGranularity().name().toUpperCase()));
}
result.timeSeriesOptions(options);
});
collectionOptions.getChangeStreamOptions().ifPresent(it -> result
.changeStreamPreAndPostImagesOptions(new ChangeStreamPreAndPostImagesOptions(it.getPreAndPostImages())));
return result;
}
private Document getMappedValidator(Validator validator, Class<?> domainType) {
Document validationRules = validator.toDocument();
if (validationRules.containsKey("$jsonSchema")) {
return schemaMapper.mapSchema(validationRules, domainType);
}
return queryMapper.getMappedObject(validationRules, context.getPersistentEntity(domainType));
}
/** /**
* A representation of information about an entity. * A representation of information about an entity.
* *
@@ -502,21 +369,37 @@ class EntityOperations {
this.map = map; this.map = map;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override @Override
public String getIdFieldName() { public String getIdFieldName() {
return ID_FIELD; return ID_FIELD;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override @Override
public Object getId() { public Object getId() {
return map.get(ID_FIELD); return map.get(ID_FIELD);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override @Override
public Query getByIdQuery() { public Query getByIdQuery() {
return Query.query(Criteria.where(ID_FIELD).is(map.get(ID_FIELD))); return Query.query(Criteria.where(ID_FIELD).is(map.get(ID_FIELD)));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#populateIdIfNecessary(java.lang.Object)
*/
@Nullable @Nullable
@Override @Override
public T populateIdIfNecessary(@Nullable Object id) { public T populateIdIfNecessary(@Nullable Object id) {
@@ -526,11 +409,19 @@ class EntityOperations {
return map; return map;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion()
*/
@Override @Override
public Query getQueryForVersion() { public Query getQueryForVersion() {
throw new MappingException("Cannot query for version on plain Documents"); throw new MappingException("Cannot query for version on plain Documents!");
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override @Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) { public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
return MappedDocument.of(map instanceof Document // return MappedDocument.of(map instanceof Document //
@@ -538,27 +429,47 @@ class EntityOperations {
: new Document(map)); : new Document(map));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#initializeVersionProperty()
*/
@Override @Override
public T initializeVersionProperty() { public T initializeVersionProperty() {
return map; return map;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#getVersion()
*/
@Override @Override
@Nullable @Nullable
public Number getVersion() { public Number getVersion() {
return null; return null;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#incrementVersion()
*/
@Override @Override
public T incrementVersion() { public T incrementVersion() {
return map; return map;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override @Override
public T getBean() { public T getBean() {
return map; return map;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override @Override
public boolean isNew() { public boolean isNew() {
return map.get(ID_FIELD) != null; return map.get(ID_FIELD) != null;
@@ -571,6 +482,10 @@ class EntityOperations {
super(map); super(map);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override @Override
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) { public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -610,21 +525,33 @@ class EntityOperations {
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor); return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override @Override
public String getIdFieldName() { public String getIdFieldName() {
return entity.getRequiredIdProperty().getFieldName(); return entity.getRequiredIdProperty().getFieldName();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override @Override
public Object getId() { public Object getId() {
return idAccessor.getRequiredIdentifier(); return idAccessor.getRequiredIdentifier();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override @Override
public Query getByIdQuery() { public Query getByIdQuery() {
if (!entity.hasIdProperty()) { if (!entity.hasIdProperty()) {
throw new MappingException("No id property found for object of type " + entity.getType()); throw new MappingException("No id property found for object of type " + entity.getType() + "!");
} }
MongoPersistentProperty idProperty = entity.getRequiredIdProperty(); MongoPersistentProperty idProperty = entity.getRequiredIdProperty();
@@ -632,6 +559,10 @@ class EntityOperations {
return Query.query(Criteria.where(idProperty.getName()).is(getId())); return Query.query(Criteria.where(idProperty.getName()).is(getId()));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion(java.lang.Object)
*/
@Override @Override
public Query getQueryForVersion() { public Query getQueryForVersion() {
@@ -642,6 +573,10 @@ class EntityOperations {
.and(versionProperty.getName()).is(getVersion())); .and(versionProperty.getName()).is(getVersion()));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override @Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) { public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -657,6 +592,10 @@ class EntityOperations {
return MappedDocument.of(document); return MappedDocument.of(document);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#assertUpdateableIdIfNotSet()
*/
public void assertUpdateableIdIfNotSet() { public void assertUpdateableIdIfNotSet() {
if (!entity.hasIdProperty()) { if (!entity.hasIdProperty()) {
@@ -672,27 +611,43 @@ class EntityOperations {
if (!MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(property.getType())) { if (!MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(property.getType())) {
throw new InvalidDataAccessApiUsageException( throw new InvalidDataAccessApiUsageException(
String.format("Cannot autogenerate id of type %s for entity of type %s", property.getType().getName(), String.format("Cannot autogenerate id of type %s for entity of type %s!", property.getType().getName(),
entity.getType().getName())); entity.getType().getName()));
} }
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#isVersionedEntity()
*/
@Override @Override
public boolean isVersionedEntity() { public boolean isVersionedEntity() {
return entity.hasVersionProperty(); return entity.hasVersionProperty();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getVersion()
*/
@Override @Override
@Nullable @Nullable
public Object getVersion() { public Object getVersion() {
return propertyAccessor.getProperty(entity.getRequiredVersionProperty()); return propertyAccessor.getProperty(entity.getRequiredVersionProperty());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override @Override
public T getBean() { public T getBean() {
return propertyAccessor.getBean(); return propertyAccessor.getBean();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override @Override
public boolean isNew() { public boolean isNew() {
return entity.isNew(propertyAccessor.getBean()); return entity.isNew(propertyAccessor.getBean());
@@ -727,6 +682,10 @@ class EntityOperations {
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService)); new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#populateIdIfNecessary(java.lang.Object)
*/
@Nullable @Nullable
@Override @Override
public T populateIdIfNecessary(@Nullable Object id) { public T populateIdIfNecessary(@Nullable Object id) {
@@ -748,6 +707,10 @@ class EntityOperations {
return propertyAccessor.getBean(); return propertyAccessor.getBean();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MappedEntity#getVersion()
*/
@Override @Override
@Nullable @Nullable
public Number getVersion() { public Number getVersion() {
@@ -757,6 +720,10 @@ class EntityOperations {
return propertyAccessor.getProperty(versionProperty, Number.class); return propertyAccessor.getProperty(versionProperty, Number.class);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#initializeVersionProperty()
*/
@Override @Override
public T initializeVersionProperty() { public T initializeVersionProperty() {
@@ -771,6 +738,10 @@ class EntityOperations {
return propertyAccessor.getBean(); return propertyAccessor.getBean();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#incrementVersion()
*/
@Override @Override
public T incrementVersion() { public T incrementVersion() {
@@ -807,24 +778,6 @@ class EntityOperations {
* @return * @return
*/ */
Optional<Collation> getCollation(Query query); Optional<Collation> getCollation(Query query);
/**
* Derive the applicable {@link CollectionOptions} for the given type.
*
* @return never {@literal null}.
* @since 3.3
*/
CollectionOptions getCollectionOptions();
/**
* Map the fields of a given {@link TimeSeriesOptions} against the target domain type to consider potentially
* annotated field names.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @since 3.3
*/
TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions options);
} }
/** /**
@@ -842,11 +795,19 @@ class EntityOperations {
return (TypedOperations) INSTANCE; return (TypedOperations) INSTANCE;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override @Override
public Optional<Collation> getCollation() { public Optional<Collation> getCollation() {
return Optional.empty(); return Optional.empty();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public Optional<Collation> getCollation(Query query) { public Optional<Collation> getCollation(Query query) {
@@ -856,16 +817,6 @@ class EntityOperations {
return query.getCollation(); return query.getCollation();
} }
@Override
public CollectionOptions getCollectionOptions() {
return CollectionOptions.empty();
}
@Override
public TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions options) {
return options;
}
} }
/** /**
@@ -881,11 +832,19 @@ class EntityOperations {
this.entity = entity; this.entity = entity;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override @Override
public Optional<Collation> getCollation() { public Optional<Collation> getCollation() {
return Optional.ofNullable(entity.getCollation()); return Optional.ofNullable(entity.getCollation());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public Optional<Collation> getCollation(Query query) { public Optional<Collation> getCollation(Query query) {
@@ -895,58 +854,6 @@ class EntityOperations {
return Optional.ofNullable(entity.getCollation()); return Optional.ofNullable(entity.getCollation());
} }
@Override
public CollectionOptions getCollectionOptions() {
CollectionOptions collectionOptions = CollectionOptions.empty();
if (entity.hasCollation()) {
collectionOptions = collectionOptions.collation(entity.getCollation());
}
if (entity.isAnnotationPresent(TimeSeries.class)) {
TimeSeries timeSeries = entity.getRequiredAnnotation(TimeSeries.class);
if (entity.getPersistentProperty(timeSeries.timeField()) == null) {
throw new MappingException(String.format("Time series field '%s' does not exist in type %s",
timeSeries.timeField(), entity.getName()));
}
TimeSeriesOptions options = TimeSeriesOptions.timeSeries(timeSeries.timeField());
if (StringUtils.hasText(timeSeries.metaField())) {
if (entity.getPersistentProperty(timeSeries.metaField()) == null) {
throw new MappingException(
String.format("Meta field '%s' does not exist in type %s", timeSeries.metaField(), entity.getName()));
}
options = options.metaField(timeSeries.metaField());
}
if (!Granularity.DEFAULT.equals(timeSeries.granularity())) {
options = options.granularity(timeSeries.granularity());
}
collectionOptions = collectionOptions.timeSeries(options);
}
return collectionOptions;
}
@Override
public TimeSeriesOptions mapTimeSeriesOptions(TimeSeriesOptions source) {
TimeSeriesOptions target = TimeSeriesOptions.timeSeries(mappedNameOrDefault(source.getTimeField()));
if (StringUtils.hasText(source.getMetaField())) {
target = target.metaField(mappedNameOrDefault(source.getMetaField()));
}
return target.granularity(source.getGranularity());
}
private String mappedNameOrDefault(String name) {
MongoPersistentProperty persistentProperty = entity.getPersistentProperty(name);
return persistentProperty != null ? persistentProperty.getFieldName() : name;
}
} }
} }
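The block removed above (left column) maps @TimeSeries metadata into collection options; that annotation ships with the newer line (3.3+), not with 3.2.9. A sketch of what it drives, with hypothetical type, collection, and field names:

import java.time.Instant;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.TimeSeries;
import org.springframework.data.mongodb.core.timeseries.Granularity;

// Hypothetical time-series document; the annotation feeds the CollectionOptions mapping shown above.
@TimeSeries(collection = "measurements", timeField = "timestamp", metaField = "sensorId", granularity = Granularity.HOURS)
class Measurement {

	@Id String id;
	Instant timestamp;
	String sensorId;
	double value;
}

class TimeSeriesSketch {

	void createTimeSeriesCollection(MongoOperations template) {
		template.createCollection(Measurement.class); // derives the time-series options from @TimeSeries
	}
}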


@@ -15,10 +15,9 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation; import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults; import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.util.CloseableIterator;
/** /**
* {@link ExecutableAggregationOperation} allows creation and execution of MongoDB aggregation operations in a fluent * {@link ExecutableAggregationOperation} allows creation and execution of MongoDB aggregation operations in a fluent
@@ -89,12 +88,12 @@ public interface ExecutableAggregationOperation {
/** /**
* Apply pipeline operations as specified and stream all matching elements. <br /> * Apply pipeline operations as specified and stream all matching elements. <br />
- 	 * Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable}
+ 	 * Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable}
  	 *
- 	 * @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
- 	 *         through a try-with-resources clause).
+ 	 * @return a {@link CloseableIterator} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
+ 	 *         Never {@literal null}.
  	 */
- 	Stream<T> stream();
+ 	CloseableIterator<T> stream();
} }
/** /**
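A usage sketch of the fluent aggregation API whose terminal stream() method differs between the two sides. This is written against the newer Stream-returning signature; on 3.2.x the same call yields a CloseableIterator that must be closed explicitly. Collection and field names are placeholders:

import java.util.stream.Stream;

import org.bson.Document;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

class AggregationStreamSketch {

	// Fluent aggregation against the hypothetical "people" collection; on 3.2.x
	// the terminal stream() returns a CloseableIterator instead of a java.util.stream.Stream.
	void printLastNames(MongoOperations template) {

		Aggregation aggregation = Aggregation.newAggregation(
				Aggregation.project("lastName"),
				Aggregation.sort(Sort.Direction.ASC, "lastName"));

		try (Stream<Document> stream = template.aggregateAndReturn(Document.class)
				.inCollection("people")
				.by(aggregation)
				.stream()) {
			stream.map(document -> document.getString("lastName")).forEach(System.out::println);
		}
	}
}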


@@ -15,11 +15,10 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation; import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults; import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation; import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
@@ -38,10 +37,14 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.template = template; this.template = template;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override @Override
public <T> ExecutableAggregation<T> aggregateAndReturn(Class<T> domainType) { public <T> ExecutableAggregation<T> aggregateAndReturn(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null"); Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableAggregationSupport<>(template, domainType, null, null); return new ExecutableAggregationSupport<>(template, domainType, null, null);
} }
@@ -66,29 +69,45 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.collection = collection; this.collection = collection;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithCollection#inCollection(java.lang.String)
*/
@Override @Override
public AggregationWithAggregation<T> inCollection(String collection) { public AggregationWithAggregation<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty"); Assert.hasText(collection, "Collection must not be null nor empty!");
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection); return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithAggregation#by(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override @Override
public TerminatingAggregation<T> by(Aggregation aggregation) { public TerminatingAggregation<T> by(Aggregation aggregation) {
Assert.notNull(aggregation, "Aggregation must not be null"); Assert.notNull(aggregation, "Aggregation must not be null!");
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection); return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#all()
*/
@Override @Override
public AggregationResults<T> all() { public AggregationResults<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType); return template.aggregate(aggregation, getCollectionName(aggregation), domainType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#stream()
*/
@Override @Override
public Stream<T> stream() { public CloseableIterator<T> stream() {
return template.aggregateStream(aggregation, getCollectionName(aggregation), domainType); return template.aggregateStream(aggregation, getCollectionName(aggregation), domainType);
} }


@@ -118,14 +118,14 @@ public interface ExecutableFindOperation {
/** /**
* Stream all matching elements. * Stream all matching elements.
* *
- 	 * @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
- 	 *         through a try-with-resources clause).
+ 	 * @return a {@link Stream} that wraps the a Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed. Never
+ 	 *         {@literal null}.
*/ */
Stream<T> stream(); Stream<T> stream();
/** /**
* Get the number of matching elements. * Get the number of matching elements.
* <br /> * <p />
* This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation * This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation
* execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard, * execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard,
* session and transaction compliance. In case an inaccurate count satisfies the applications needs use * session and transaction compliance. In case an inaccurate count satisfies the applications needs use


@@ -20,11 +20,12 @@ import java.util.Optional;
import java.util.stream.Stream; import java.util.stream.Stream;
import org.bson.Document; import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException; import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery; import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query; import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils; import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.util.CloseableIterator;
import org.springframework.data.util.StreamUtils;
import org.springframework.lang.Nullable; import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
@@ -50,10 +51,14 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.template = template; this.template = template;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation#query(java.lang.Class)
*/
@Override @Override
public <T> ExecutableFind<T> query(Class<T> domainType) { public <T> ExecutableFind<T> query(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null"); Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableFindSupport<>(template, domainType, domainType, null, ALL_QUERY); return new ExecutableFindSupport<>(template, domainType, domainType, null, ALL_QUERY);
} }
@@ -69,11 +74,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
private final MongoTemplate template; private final MongoTemplate template;
private final Class<?> domainType; private final Class<?> domainType;
private final Class<T> returnType; private final Class<T> returnType;
private final @Nullable String collection; @Nullable private final String collection;
private final Query query; private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType, ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
@Nullable String collection, Query query) { String collection, Query query) {
this.template = template; this.template = template;
this.domainType = domainType; this.domainType = domainType;
this.returnType = returnType; this.returnType = returnType;
@@ -81,30 +86,46 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.query = query; this.query = query;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override @Override
public FindWithProjection<T> inCollection(String collection) { public FindWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty"); Assert.hasText(collection, "Collection name must not be null nor empty!");
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query); return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithProjection#as(Class)
*/
@Override @Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) { public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
Assert.notNull(returnType, "ReturnType must not be null"); Assert.notNull(returnType, "ReturnType must not be null!");
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query); return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public TerminatingFind<T> matching(Query query) { public TerminatingFind<T> matching(Query query) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query); return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#oneValue()
*/
@Override @Override
public T oneValue() { public T oneValue() {
@@ -115,12 +136,16 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
} }
if (result.size() > 1) { if (result.size() > 1) {
throw new IncorrectResultSizeDataAccessException("Query " + asString() + " returned non unique result", 1); throw new IncorrectResultSizeDataAccessException("Query " + asString() + " returned non unique result.", 1);
} }
return result.iterator().next(); return result.iterator().next();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#firstValue()
*/
@Override @Override
public T firstValue() { public T firstValue() {
@@ -129,36 +154,60 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return ObjectUtils.isEmpty(result) ? null : result.iterator().next(); return ObjectUtils.isEmpty(result) ? null : result.iterator().next();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#all()
*/
@Override @Override
public List<T> all() { public List<T> all() {
return doFind(null); return doFind(null);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#stream()
*/
@Override @Override
public Stream<T> stream() { public Stream<T> stream() {
return doStream(); return StreamUtils.createStreamFromIterator(doStream());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override @Override
public TerminatingFindNear<T> near(NearQuery nearQuery) { public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType); return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#count()
*/
@Override @Override
public long count() { public long count() {
return template.count(query, domainType, getCollectionName()); return template.count(query, domainType, getCollectionName());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#exists()
*/
@Override @Override
public boolean exists() { public boolean exists() {
return template.exists(query, domainType, getCollectionName()); return template.exists(query, domainType, getCollectionName());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindDistinct#distinct(java.lang.String)
*/
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
@Override @Override
public TerminatingDistinct<Object> distinct(String field) { public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null"); Assert.notNull(field, "Field must not be null!");
return new DistinctOperationSupport(this, field); return new DistinctOperationSupport(this, field);
} }
@@ -178,7 +227,7 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
returnType == domainType ? (Class<T>) Object.class : returnType); returnType == domainType ? (Class<T>) Object.class : returnType);
} }
private Stream<T> doStream() { private CloseableIterator<T> doStream() {
return template.doStream(query, domainType, getCollectionName(), returnType); return template.doStream(query, domainType, getCollectionName(), returnType);
} }
@@ -208,6 +257,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.delegate = delegate; this.delegate = delegate;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.CursorPreparer#prepare(com.mongodb.client.FindIterable)
*/
@Override @Override
public FindIterable<Document> prepare(FindIterable<Document> iterable) { public FindIterable<Document> prepare(FindIterable<Document> iterable) {
@@ -242,23 +295,35 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.field = field; this.field = field;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override @Override
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
public <R> TerminatingDistinct<R> as(Class<R> resultType) { public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null"); Assert.notNull(resultType, "ResultType must not be null!");
return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field); return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public TerminatingDistinct<T> matching(Query query) { public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field); return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingDistinct#all()
*/
@Override @Override
public List<T> all() { public List<T> all() {
return delegate.doFindDistinct(field); return delegate.doFindDistinct(field);
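The fluent find chain assembled by ExecutableFindSupport above can be used as follows; the collection and field names are placeholders, and the returned Stream must be closed once consumed:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.stream.Stream;

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;

class FluentFindSketch {

	// Streams the matching documents; the underlying cursor is closed with the Stream.
	void printActiveLastNames(MongoOperations template) {

		try (Stream<Document> stream = template.query(Document.class)
				.inCollection("people")
				.matching(query(where("active").is(true)))
				.stream()) {
			stream.map(document -> document.getString("lastName")).forEach(System.out::println);
		}
	}
}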


@@ -40,10 +40,14 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.template = template; this.template = template;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#insert(java.lang.Class)
*/
@Override @Override
public <T> ExecutableInsert<T> insert(Class<T> domainType) { public <T> ExecutableInsert<T> insert(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null"); Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableInsertSupport<>(template, domainType, null, null); return new ExecutableInsertSupport<>(template, domainType, null, null);
} }
@@ -67,43 +71,63 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.bulkMode = bulkMode; this.bulkMode = bulkMode;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#one(java.lang.Object)
*/
@Override @Override
public T one(T object) { public T one(T object) {
Assert.notNull(object, "Object must not be null"); Assert.notNull(object, "Object must not be null!");
return template.insert(object, getCollectionName()); return template.insert(object, getCollectionName());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override @Override
public Collection<T> all(Collection<? extends T> objects) { public Collection<T> all(Collection<? extends T> objects) {
Assert.notNull(objects, "Objects must not be null"); Assert.notNull(objects, "Objects must not be null!");
return template.insert(objects, getCollectionName()); return template.insert(objects, getCollectionName());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingBulkInsert#bulk(java.util.Collection)
*/
@Override @Override
public BulkWriteResult bulk(Collection<? extends T> objects) { public BulkWriteResult bulk(Collection<? extends T> objects) {
Assert.notNull(objects, "Objects must not be null"); Assert.notNull(objects, "Objects must not be null!");
return template.bulkOps(bulkMode != null ? bulkMode : BulkMode.ORDERED, domainType, getCollectionName()) return template.bulkOps(bulkMode != null ? bulkMode : BulkMode.ORDERED, domainType, getCollectionName())
.insert(new ArrayList<>(objects)).execute(); .insert(new ArrayList<>(objects)).execute();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override @Override
public InsertWithBulkMode<T> inCollection(String collection) { public InsertWithBulkMode<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty"); Assert.hasText(collection, "Collection must not be null nor empty.");
return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode); return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithBulkMode#withBulkMode(org.springframework.data.mongodb.core.BulkMode)
*/
@Override @Override
public TerminatingBulkInsert<T> withBulkMode(BulkMode bulkMode) { public TerminatingBulkInsert<T> withBulkMode(BulkMode bulkMode) {
Assert.notNull(bulkMode, "BulkMode must not be null"); Assert.notNull(bulkMode, "BulkMode must not be null!");
return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode); return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode);
} }
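
A hedged sketch of how the insert support above is typically driven through MongoTemplate; the Jedi type, values, and collection name are made up for illustration.

import java.util.List;

import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.bulk.BulkWriteResult;

class InsertExample {

    // Hypothetical domain type used only to illustrate the fluent API.
    static class Jedi {
        String name;
        Jedi(String name) { this.name = name; }
    }

    BulkWriteResult insertMany(MongoTemplate template) {

        List<Jedi> jedi = List.of(new Jedi("luke"), new Jedi("yoda"));

        return template.insert(Jedi.class)           // ExecutableInsertOperation#insert(Class)
                .inCollection("star-wars")           // InsertWithCollection#inCollection(String)
                .withBulkMode(BulkMode.UNORDERED)    // InsertWithBulkMode#withBulkMode(BulkMode)
                .bulk(jedi);                         // TerminatingBulkInsert#bulk(Collection)
    }
}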

View File

@@ -187,9 +187,7 @@ public interface ExecutableMapReduceOperation {
* *
* @author Christoph Strobl * @author Christoph Strobl
* @since 2.1 * @since 2.1
* @deprecated since 4.0 in favor of {@link org.springframework.data.mongodb.core.aggregation}.
*/ */
@Deprecated
interface MapReduceWithOptions<T> { interface MapReduceWithOptions<T> {
/** /**
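
Since the hunk above concerns the map-reduce options contract that is deprecated in favor of the aggregation framework, a rough sketch of the aggregation-based equivalent follows; the collection and field names are invented for the example.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.group;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;

import org.bson.Document;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

class AggregationInsteadOfMapReduce {

    AggregationResults<Document> countByLastname(MongoTemplate template) {

        // Equivalent of a map/reduce that emits (lastname, 1) and sums the values.
        Aggregation aggregation = newAggregation(group("lastname").count().as("total"));

        return template.aggregate(aggregation, "star-wars", Document.class);
    }
}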

View File

@@ -37,7 +37,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
ExecutableMapReduceOperationSupport(MongoTemplate template) { ExecutableMapReduceOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null"); Assert.notNull(template, "Template must not be null!");
this.template = template; this.template = template;
} }
@@ -48,7 +48,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public <T> ExecutableMapReduceSupport<T> mapReduce(Class<T> domainType) { public <T> ExecutableMapReduceSupport<T> mapReduce(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null"); Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableMapReduceSupport<>(template, domainType, domainType, null, ALL_QUERY, null, null, null); return new ExecutableMapReduceSupport<>(template, domainType, domainType, null, ALL_QUERY, null, null, null);
} }
@@ -101,7 +101,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public MapReduceWithProjection<T> inCollection(String collection) { public MapReduceWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty"); Assert.hasText(collection, "Collection name must not be null nor empty!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction, return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options); reduceFunction, options);
@@ -114,7 +114,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public TerminatingMapReduce<T> matching(Query query) { public TerminatingMapReduce<T> matching(Query query) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction, return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options); reduceFunction, options);
@@ -127,7 +127,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public <R> MapReduceWithQuery<R> as(Class<R> resultType) { public <R> MapReduceWithQuery<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null"); Assert.notNull(resultType, "ResultType must not be null!");
return new ExecutableMapReduceSupport<>(template, domainType, resultType, collection, query, mapFunction, return new ExecutableMapReduceSupport<>(template, domainType, resultType, collection, query, mapFunction,
reduceFunction, options); reduceFunction, options);
@@ -140,7 +140,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public ExecutableMapReduce<T> with(MapReduceOptions options) { public ExecutableMapReduce<T> with(MapReduceOptions options) {
Assert.notNull(options, "Options must not be null; Please consider empty MapReduceOptions#options() instead"); Assert.notNull(options, "Options must not be null! Please consider empty MapReduceOptions#options() instead.");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction, return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options); reduceFunction, options);
@@ -153,7 +153,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public MapReduceWithReduceFunction<T> map(String mapFunction) { public MapReduceWithReduceFunction<T> map(String mapFunction) {
Assert.hasText(mapFunction, "MapFunction name must not be null nor empty"); Assert.hasText(mapFunction, "MapFunction name must not be null nor empty!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction, return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options); reduceFunction, options);
@@ -166,7 +166,7 @@ class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperatio
@Override @Override
public ExecutableMapReduce<T> reduce(String reduceFunction) { public ExecutableMapReduce<T> reduce(String reduceFunction) {
Assert.hasText(reduceFunction, "ReduceFunction name must not be null nor empty"); Assert.hasText(reduceFunction, "ReduceFunction name must not be null nor empty!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction, return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options); reduceFunction, options);
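
For completeness, a minimal sketch of invoking map-reduce through MongoTemplate directly; the map/reduce functions, collection name, and result type are illustrative assumptions.

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.Query;

class MapReduceExample {

    // Hypothetical value type; map-reduce results carry the emitted key as id and the reduced value.
    static class LastnameCount {
        String id;
        float value;
    }

    MapReduceResults<LastnameCount> countByLastname(MongoTemplate template) {

        String map = "function() { emit(this.lastname, 1); }";
        String reduce = "function(key, values) { return Array.sum(values); }";

        return template.mapReduce(new Query(), "star-wars", map, reduce, LastnameCount.class);
    }
}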

View File

@@ -41,10 +41,14 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.tempate = tempate; this.tempate = tempate;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation#remove(java.lang.Class)
*/
@Override @Override
public <T> ExecutableRemove<T> remove(Class<T> domainType) { public <T> ExecutableRemove<T> remove(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null"); Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableRemoveSupport<>(tempate, domainType, ALL_QUERY, null); return new ExecutableRemoveSupport<>(tempate, domainType, ALL_QUERY, null);
} }
@@ -67,32 +71,52 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.collection = collection; this.collection = collection;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithCollection#inCollection(java.lang.String)
*/
@Override @Override
public RemoveWithQuery<T> inCollection(String collection) { public RemoveWithQuery<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty"); Assert.hasText(collection, "Collection must not be null nor empty!");
return new ExecutableRemoveSupport<>(template, domainType, query, collection); return new ExecutableRemoveSupport<>(template, domainType, query, collection);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public TerminatingRemove<T> matching(Query query) { public TerminatingRemove<T> matching(Query query) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
return new ExecutableRemoveSupport<>(template, domainType, query, collection); return new ExecutableRemoveSupport<>(template, domainType, query, collection);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#all()
*/
@Override @Override
public DeleteResult all() { public DeleteResult all() {
return template.doRemove(getCollectionName(), query, domainType, true); return template.doRemove(getCollectionName(), query, domainType, true);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#one()
*/
@Override @Override
public DeleteResult one() { public DeleteResult one() {
return template.doRemove(getCollectionName(), query, domainType, false); return template.doRemove(getCollectionName(), query, domainType, false);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override @Override
public List<T> findAndRemove() { public List<T> findAndRemove() {
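
A short usage sketch of the remove support above; the domain type, collection, and criteria are hypothetical.

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.client.result.DeleteResult;

class RemoveExample {

    // Hypothetical domain type used only to illustrate the fluent API.
    static class Jedi {
        String lastname;
    }

    DeleteResult removeSkywalkers(MongoTemplate template) {

        return template.remove(Jedi.class)                        // ExecutableRemoveOperation#remove(Class)
                .inCollection("star-wars")                        // RemoveWithCollection#inCollection(String)
                .matching(query(where("lastname").is("skywalker")))
                .all();                                           // TerminatingRemove#all() -> DeleteResult
    }
}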

View File

@@ -89,7 +89,7 @@ public interface ExecutableUpdateOperation {
/** /**
* Trigger * Trigger
* <a href="https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndReplace/">findOneAndReplace</a> * <a href="https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndReplace/">findOneAndReplace<a/>
* execution by calling one of the terminating methods. * execution by calling one of the terminating methods.
* *
* @author Mark Paluch * @author Mark Paluch

View File

@@ -40,10 +40,14 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.template = template; this.template = template;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation#update(java.lang.Class)
*/
@Override @Override
public <T> ExecutableUpdate<T> update(Class<T> domainType) { public <T> ExecutableUpdate<T> update(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null"); Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, ALL_QUERY, null, null, null, null, null, domainType); return new ExecutableUpdateSupport<>(template, domainType, ALL_QUERY, null, null, null, null, null, domainType);
} }
@@ -81,84 +85,128 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.targetType = targetType; this.targetType = targetType;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override @Override
public TerminatingUpdate<T> apply(UpdateDefinition update) { public TerminatingUpdate<T> apply(UpdateDefinition update) {
Assert.notNull(update, "Update must not be null"); Assert.notNull(update, "Update must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType); findAndReplaceOptions, replacement, targetType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override @Override
public UpdateWithQuery<T> inCollection(String collection) { public UpdateWithQuery<T> inCollection(String collection) {
Assert.hasText(collection, "Collection must not be null nor empty"); Assert.hasText(collection, "Collection must not be null nor empty!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType); findAndReplaceOptions, replacement, targetType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override @Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) { public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
Assert.notNull(options, "Options must not be null"); Assert.notNull(options, "Options must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, options, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, options,
findAndReplaceOptions, replacement, targetType); findAndReplaceOptions, replacement, targetType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#replaceWith(Object)
*/
@Override @Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) { public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
Assert.notNull(replacement, "Replacement must not be null"); Assert.notNull(replacement, "Replacement must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType); findAndReplaceOptions, replacement, targetType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override @Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) { public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
Assert.notNull(options, "Options must not be null"); Assert.notNull(options, "Options must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
options, replacement, targetType); options, replacement, targetType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override @Override
public UpdateWithUpdate<T> matching(Query query) { public UpdateWithUpdate<T> matching(Query query) {
Assert.notNull(query, "Query must not be null"); Assert.notNull(query, "Query must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, targetType); findAndReplaceOptions, replacement, targetType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override @Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) { public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null"); Assert.notNull(resultType, "ResultType must not be null!");
return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions, return new ExecutableUpdateSupport<>(template, domainType, query, update, collection, findAndModifyOptions,
findAndReplaceOptions, replacement, resultType); findAndReplaceOptions, replacement, resultType);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#all()
*/
@Override @Override
public UpdateResult all() { public UpdateResult all() {
return doUpdate(true, false); return doUpdate(true, false);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#first()
*/
@Override @Override
public UpdateResult first() { public UpdateResult first() {
return doUpdate(false, false); return doUpdate(false, false);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#upsert()
*/
@Override @Override
public UpdateResult upsert() { public UpdateResult upsert() {
return doUpdate(true, true); return doUpdate(true, true);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndModify#findAndModifyValue()
*/
@Override @Override
public @Nullable T findAndModifyValue() { public @Nullable T findAndModifyValue() {
@@ -167,6 +215,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
getCollectionName()); getCollectionName());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndReplace#findAndReplaceValue()
*/
@Override @Override
public @Nullable T findAndReplaceValue() { public @Nullable T findAndReplaceValue() {
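
A hedged sketch of driving the update support above via MongoTemplate, covering both the plain update path and the findAndModify path; type, field, and collection names are illustrative.

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.client.result.UpdateResult;

class UpdateExample {

    // Hypothetical domain type used only to illustrate the fluent API.
    static class Jedi {
        String lastname;
        String side;
    }

    UpdateResult turnFirstMatch(MongoTemplate template) {

        return template.update(Jedi.class)                                   // ExecutableUpdateOperation#update(Class)
                .inCollection("star-wars")                                   // UpdateWithCollection#inCollection(String)
                .matching(query(where("lastname").is("skywalker")))          // UpdateWithQuery#matching(Query)
                .apply(new Update().set("side", "dark"))                     // UpdateWithUpdate#apply(UpdateDefinition)
                .first();                                                    // TerminatingUpdate#first()
    }

    Jedi turnAndReturnNew(MongoTemplate template) {

        return template.update(Jedi.class)
                .matching(query(where("lastname").is("skywalker")))
                .apply(new Update().set("side", "dark"))
                .withOptions(FindAndModifyOptions.options().returnNew(true)) // FindAndModifyWithOptions#withOptions
                .findAndModifyValue();                                       // TerminatingFindAndModify#findAndModifyValue()
    }
}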

View File

@@ -35,7 +35,7 @@ public class FindAndModifyOptions {
private static final FindAndModifyOptions NONE = new FindAndModifyOptions() { private static final FindAndModifyOptions NONE = new FindAndModifyOptions() {
private static final String ERROR_MSG = "FindAndModifyOptions.none() cannot be changed; Please use FindAndModifyOptions.options() instead"; private static final String ERROR_MSG = "FindAndModifyOptions.none() cannot be changed. Please use FindAndModifyOptions.options() instead.";
@Override @Override
public FindAndModifyOptions returnNew(boolean returnNew) { public FindAndModifyOptions returnNew(boolean returnNew) {

View File

@@ -17,7 +17,7 @@ package org.springframework.data.mongodb.core;
/** /**
* Options for * Options for
* <a href="https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndReplace/">findOneAndReplace</a>. * <a href="https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndReplace/">findOneAndReplace<a/>.
* <br /> * <br />
* Defaults to * Defaults to
* <dl> * <dl>
@@ -38,7 +38,7 @@ public class FindAndReplaceOptions {
private static final FindAndReplaceOptions NONE = new FindAndReplaceOptions() { private static final FindAndReplaceOptions NONE = new FindAndReplaceOptions() {
private static final String ERROR_MSG = "FindAndReplaceOptions.none() cannot be changed; Please use FindAndReplaceOptions.options() instead"; private static final String ERROR_MSG = "FindAndReplaceOptions.none() cannot be changed. Please use FindAndReplaceOptions.options() instead.";
@Override @Override
public FindAndReplaceOptions returnNew() { public FindAndReplaceOptions returnNew() {

View File

@@ -61,8 +61,8 @@ public interface FindPublisherPreparer extends ReadPreferenceAware {
default FindPublisher<Document> initiateFind(MongoCollection<Document> collection, default FindPublisher<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindPublisher<Document>> find) { Function<MongoCollection<Document>, FindPublisher<Document>> find) {
Assert.notNull(collection, "Collection must not be null"); Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null"); Assert.notNull(find, "Find function must not be null!");
if (hasReadPreference()) { if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference()); collection = collection.withReadPreference(getReadPreference());

View File

@@ -39,7 +39,7 @@ class GeoCommandStatistics {
*/ */
private GeoCommandStatistics(Document source) { private GeoCommandStatistics(Document source) {
Assert.notNull(source, "Source document must not be null"); Assert.notNull(source, "Source document must not be null!");
this.source = source; this.source = source;
} }
@@ -51,7 +51,7 @@ class GeoCommandStatistics {
*/ */
public static GeoCommandStatistics from(Document commandResult) { public static GeoCommandStatistics from(Document commandResult) {
Assert.notNull(commandResult, "Command result must not be null"); Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats"); Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((Document) stats); return stats == null ? NONE : new GeoCommandStatistics((Document) stats);

View File

@@ -115,10 +115,6 @@ abstract class IndexConverters {
ops = ops.collation(fromDocument(indexOptions.get("collation", Document.class))); ops = ops.collation(fromDocument(indexOptions.get("collation", Document.class)));
} }
if (indexOptions.containsKey("wildcardProjection")) {
ops.wildcardProjection(indexOptions.get("wildcardProjection", Document.class));
}
return ops; return ops;
}; };
} }
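
The hunk above toggles handling of the wildcardProjection index option. Purely as an illustration of that option, a driver-level sketch follows; the collection, field name, and projection are assumptions.

import org.bson.Document;

import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.IndexOptions;

class WildcardIndexExample {

    String createWildcardIndex(MongoCollection<Document> collection) {

        // Limit the wildcard index to the "attributes" subtree via wildcardProjection.
        IndexOptions options = new IndexOptions()
                .wildcardProjection(new Document("attributes", 1));

        // Wildcard key pattern; createIndex returns the generated index name.
        return collection.createIndex(new Document("$**", 1), options);
    }
}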

View File

@@ -97,16 +97,6 @@ public class MappedDocument {
return this.document; return this.document;
} }
/**
* Updates the documents {@link #ID_FIELD}.
*
* @param value the {@literal _id} value to set.
* @since 3.4.3
*/
public void updateId(Object value) {
document.put(ID_FIELD, value);
}
/** /**
* An {@link UpdateDefinition} that indicates that the {@link #getUpdateObject() update object} has already been * An {@link UpdateDefinition} that indicates that the {@link #getUpdateObject() update object} has already been
* mapped to the specific domain type. * mapped to the specific domain type.
@@ -122,31 +112,55 @@ public class MappedDocument {
this.delegate = delegate; this.delegate = delegate;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getUpdateObject()
*/
@Override @Override
public Document getUpdateObject() { public Document getUpdateObject() {
return delegate.getUpdateObject(); return delegate.getUpdateObject();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#modifies(java.lang.String)
*/
@Override @Override
public boolean modifies(String key) { public boolean modifies(String key) {
return delegate.modifies(key); return delegate.modifies(key);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#inc(java.lang.String)
*/
@Override @Override
public void inc(String version) { public void inc(String version) {
delegate.inc(version); delegate.inc(version);
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
*/
@Override @Override
public Boolean isIsolated() { public Boolean isIsolated() {
return delegate.isIsolated(); return delegate.isIsolated();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override @Override
public List<ArrayFilter> getArrayFilters() { public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters(); return delegate.getArrayFilters();
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#hasArrayFilters()
*/
@Override @Override
public boolean hasArrayFilters() { public boolean hasArrayFilters() {
return delegate.hasArrayFilters(); return delegate.hasArrayFilters();

View File

@@ -20,19 +20,13 @@ import java.util.Collection;
import java.util.Collections; import java.util.Collections;
import java.util.EnumSet; import java.util.EnumSet;
import java.util.List; import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.PersistentProperty; import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.context.MappingContext; import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MongoConverter; import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.Encrypted;
import org.springframework.data.mongodb.core.mapping.Field; import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity; import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty; import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.ArrayJsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.EncryptedJsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.ObjectJsonSchemaProperty; import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.ObjectJsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject; import org.springframework.data.mongodb.core.schema.JsonSchemaObject;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type; import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;
@@ -40,13 +34,10 @@ import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema; import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.MongoJsonSchemaBuilder; import org.springframework.data.mongodb.core.schema.MongoJsonSchema.MongoJsonSchemaBuilder;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject; import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.ClassUtils; import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils; import org.springframework.util.CollectionUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/** /**
* {@link MongoJsonSchemaCreator} implementation using both {@link MongoConverter} and {@link MappingContext} to obtain * {@link MongoJsonSchemaCreator} implementation using both {@link MongoConverter} and {@link MappingContext} to obtain
@@ -61,8 +52,6 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private final MongoConverter converter; private final MongoConverter converter;
private final MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext; private final MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final Predicate<JsonSchemaPropertyContext> filter;
private final LinkedMultiValueMap<String, Class<?>> mergeProperties;
/** /**
* Create a new instance of {@link MappingMongoJsonSchemaCreator}. * Create a new instance of {@link MappingMongoJsonSchemaCreator}.
@@ -72,78 +61,27 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
MappingMongoJsonSchemaCreator(MongoConverter converter) { MappingMongoJsonSchemaCreator(MongoConverter converter) {
this(converter, (MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty>) converter.getMappingContext(), Assert.notNull(converter, "Converter must not be null!");
(property) -> true, new LinkedMultiValueMap<>());
}
@SuppressWarnings("unchecked")
MappingMongoJsonSchemaCreator(MongoConverter converter,
MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
Predicate<JsonSchemaPropertyContext> filter, LinkedMultiValueMap<String, Class<?>> mergeProperties) {
Assert.notNull(converter, "Converter must not be null");
this.converter = converter; this.converter = converter;
this.mappingContext = mappingContext; this.mappingContext = (MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty>) converter
this.filter = filter; .getMappingContext();
this.mergeProperties = mergeProperties;
} }
@Override /*
public MongoJsonSchemaCreator filter(Predicate<JsonSchemaPropertyContext> filter) { * (non-Javadoc)
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, mergeProperties); * org.springframework.data.mongodb.core.MongoJsonSchemaCreator#createSchemaFor(java.lang.Class)
}
@Override
public PropertySpecifier property(String path) {
return types -> withTypesFor(path, types);
}
/**
* Specify additional types to be considered when rendering the schema for the given path.
*
* @param path the path using {@literal dot '.'} notation.
* @param types must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.4
*/ */
public MongoJsonSchemaCreator withTypesFor(String path, Class<?>... types) {
LinkedMultiValueMap<String, Class<?>> clone = mergeProperties.clone();
for (Class<?> type : types) {
clone.add(path, type);
}
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, clone);
}
@Override @Override
public MongoJsonSchema createSchemaFor(Class<?> type) { public MongoJsonSchema createSchemaFor(Class<?> type) {
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(type); MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(type);
MongoJsonSchemaBuilder schemaBuilder = MongoJsonSchema.builder(); MongoJsonSchemaBuilder schemaBuilder = MongoJsonSchema.builder();
{
Encrypted encrypted = entity.findAnnotation(Encrypted.class);
if (encrypted != null) {
Document encryptionMetadata = new Document();
Collection<Object> encryptionKeyIds = entity.getEncryptionKeyIds();
if (!CollectionUtils.isEmpty(encryptionKeyIds)) {
encryptionMetadata.append("keyId", encryptionKeyIds);
}
if (StringUtils.hasText(encrypted.algorithm())) {
encryptionMetadata.append("algorithm", encrypted.algorithm());
}
schemaBuilder.encryptionMetadata(encryptionMetadata);
}
}
List<JsonSchemaProperty> schemaProperties = computePropertiesForEntity(Collections.emptyList(), entity); List<JsonSchemaProperty> schemaProperties = computePropertiesForEntity(Collections.emptyList(), entity);
schemaBuilder.properties(schemaProperties.toArray(new JsonSchemaProperty[0])); schemaBuilder.properties(schemaProperties.toArray(new JsonSchemaProperty[0]));
return schemaBuilder.build(); return schemaBuilder.build();
} }
private List<JsonSchemaProperty> computePropertiesForEntity(List<MongoPersistentProperty> path, private List<JsonSchemaProperty> computePropertiesForEntity(List<MongoPersistentProperty> path,
@@ -155,14 +93,6 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
List<MongoPersistentProperty> currentPath = new ArrayList<>(path); List<MongoPersistentProperty> currentPath = new ArrayList<>(path);
String stringPath = currentPath.stream().map(PersistentProperty::getName).collect(Collectors.joining("."));
stringPath = StringUtils.hasText(stringPath) ? (stringPath + "." + nested.getName()) : nested.getName();
if (!filter.test(new PropertyContext(stringPath, nested))) {
if (!mergeProperties.containsKey(stringPath)) {
continue;
}
}
if (path.contains(nested)) { // cycle guard if (path.contains(nested)) { // cycle guard
schemaProperties.add(createSchemaProperty(computePropertyFieldName(CollectionUtils.lastElement(currentPath)), schemaProperties.add(createSchemaProperty(computePropertyFieldName(CollectionUtils.lastElement(currentPath)),
Object.class, false)); Object.class, false));
@@ -178,114 +108,27 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private JsonSchemaProperty computeSchemaForProperty(List<MongoPersistentProperty> path) { private JsonSchemaProperty computeSchemaForProperty(List<MongoPersistentProperty> path) {
String stringPath = path.stream().map(MongoPersistentProperty::getName).collect(Collectors.joining("."));
MongoPersistentProperty property = CollectionUtils.lastElement(path); MongoPersistentProperty property = CollectionUtils.lastElement(path);
boolean required = isRequiredProperty(property); boolean required = isRequiredProperty(property);
Class<?> rawTargetType = computeTargetType(property); // target type before conversion Class<?> rawTargetType = computeTargetType(property); // target type before conversion
Class<?> targetType = converter.getTypeMapper().getWriteTargetTypeFor(rawTargetType); // conversion target type Class<?> targetType = converter.getTypeMapper().getWriteTargetTypeFor(rawTargetType); // conversion target type
if (!isCollection(property) && ObjectUtils.nullSafeEquals(rawTargetType, targetType)) { if (property.isEntity() && ObjectUtils.nullSafeEquals(rawTargetType, targetType)) {
if (property.isEntity() || mergeProperties.containsKey(stringPath)) { return createObjectSchemaPropertyForEntity(path, property, required);
List<JsonSchemaProperty> targetProperties = new ArrayList<>();
if (property.isEntity()) {
targetProperties.add(createObjectSchemaPropertyForEntity(path, property, required));
}
if (mergeProperties.containsKey(stringPath)) {
for (Class<?> theType : mergeProperties.get(stringPath)) {
ObjectJsonSchemaProperty target = JsonSchemaProperty.object(property.getName());
List<JsonSchemaProperty> nestedProperties = computePropertiesForEntity(path,
mappingContext.getRequiredPersistentEntity(theType));
targetProperties.add(createPotentiallyRequiredSchemaProperty(
target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required));
}
}
return targetProperties.size() == 1 ? targetProperties.iterator().next()
: JsonSchemaProperty.merged(targetProperties);
}
} }
String fieldName = computePropertyFieldName(property); String fieldName = computePropertyFieldName(property);
JsonSchemaProperty schemaProperty; if (property.isCollectionLike()) {
if (isCollection(property)) { return createSchemaProperty(fieldName, targetType, required);
schemaProperty = createArraySchemaProperty(fieldName, property, required);
} else if (property.isMap()) { } else if (property.isMap()) {
schemaProperty = createSchemaProperty(fieldName, Type.objectType(), required); return createSchemaProperty(fieldName, Type.objectType(), required);
} else if (ClassUtils.isAssignable(Enum.class, targetType)) { } else if (ClassUtils.isAssignable(Enum.class, targetType)) {
schemaProperty = createEnumSchemaProperty(fieldName, targetType, required); return createEnumSchemaProperty(fieldName, targetType, required);
} else {
schemaProperty = createSchemaProperty(fieldName, targetType, required);
} }
return applyEncryptionDataIfNecessary(property, schemaProperty); return createSchemaProperty(fieldName, targetType, required);
}
private JsonSchemaProperty createArraySchemaProperty(String fieldName, MongoPersistentProperty property,
boolean required) {
ArrayJsonSchemaProperty schemaProperty = JsonSchemaProperty.array(fieldName);
if (isSpecificType(property)) {
schemaProperty = potentiallyEnhanceArraySchemaProperty(property, schemaProperty);
}
return createPotentiallyRequiredSchemaProperty(schemaProperty, required);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
private ArrayJsonSchemaProperty potentiallyEnhanceArraySchemaProperty(MongoPersistentProperty property,
ArrayJsonSchemaProperty schemaProperty) {
MongoPersistentEntity<?> persistentEntity = mappingContext
.getPersistentEntity(property.getTypeInformation().getRequiredComponentType());
if (persistentEntity != null) {
List<JsonSchemaProperty> nestedProperties = computePropertiesForEntity(Collections.emptyList(), persistentEntity);
if (nestedProperties.isEmpty()) {
return schemaProperty;
}
return schemaProperty
.items(JsonSchemaObject.object().properties(nestedProperties.toArray(new JsonSchemaProperty[0])));
}
if (ClassUtils.isAssignable(Enum.class, property.getActualType())) {
List<Object> possibleValues = getPossibleEnumValues((Class<Enum>) property.getActualType());
return schemaProperty
.items(createSchemaObject(computeTargetType(property.getActualType(), possibleValues), possibleValues));
}
return schemaProperty.items(JsonSchemaObject.of(property.getActualType()));
}
private boolean isSpecificType(MongoPersistentProperty property) {
return !TypeInformation.OBJECT.equals(property.getTypeInformation().getActualType());
}
private JsonSchemaProperty applyEncryptionDataIfNecessary(MongoPersistentProperty property,
JsonSchemaProperty schemaProperty) {
Encrypted encrypted = property.findAnnotation(Encrypted.class);
if (encrypted == null) {
return schemaProperty;
}
EncryptedJsonSchemaProperty enc = new EncryptedJsonSchemaProperty(schemaProperty);
if (StringUtils.hasText(encrypted.algorithm())) {
enc = enc.algorithm(encrypted.algorithm());
}
if (!ObjectUtils.isEmpty(encrypted.keyId())) {
enc = enc.keys(property.getEncryptionKeyIds());
}
return enc;
} }
private JsonSchemaProperty createObjectSchemaPropertyForEntity(List<MongoPersistentProperty> path, private JsonSchemaProperty createObjectSchemaPropertyForEntity(List<MongoPersistentProperty> path,
@@ -299,12 +142,15 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required); target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required);
} }
@SuppressWarnings({ "unchecked", "rawtypes" })
private JsonSchemaProperty createEnumSchemaProperty(String fieldName, Class<?> targetType, boolean required) { private JsonSchemaProperty createEnumSchemaProperty(String fieldName, Class<?> targetType, boolean required) {
List<Object> possibleValues = getPossibleEnumValues((Class<Enum>) targetType); List<Object> possibleValues = new ArrayList<>();
targetType = computeTargetType(targetType, possibleValues); for (Object enumValue : EnumSet.allOf((Class) targetType)) {
possibleValues.add(converter.convertToMongoType(enumValue));
}
targetType = possibleValues.isEmpty() ? targetType : possibleValues.iterator().next().getClass();
return createSchemaProperty(fieldName, targetType, required, possibleValues); return createSchemaProperty(fieldName, targetType, required, possibleValues);
} }
@@ -315,20 +161,14 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
JsonSchemaProperty createSchemaProperty(String fieldName, Object type, boolean required, JsonSchemaProperty createSchemaProperty(String fieldName, Object type, boolean required,
Collection<?> possibleValues) { Collection<?> possibleValues) {
TypedJsonSchemaObject schemaObject = createSchemaObject(type, possibleValues);
return createPotentiallyRequiredSchemaProperty(JsonSchemaProperty.named(fieldName).with(schemaObject), required);
}
private TypedJsonSchemaObject createSchemaObject(Object type, Collection<?> possibleValues) {
TypedJsonSchemaObject schemaObject = type instanceof Type ? JsonSchemaObject.of(Type.class.cast(type)) TypedJsonSchemaObject schemaObject = type instanceof Type ? JsonSchemaObject.of(Type.class.cast(type))
: JsonSchemaObject.of(Class.class.cast(type)); : JsonSchemaObject.of(Class.class.cast(type));
if (!CollectionUtils.isEmpty(possibleValues)) { if (!CollectionUtils.isEmpty(possibleValues)) {
schemaObject = schemaObject.possibleValues(possibleValues); schemaObject = schemaObject.possibleValues(possibleValues);
} }
return schemaObject;
return createPotentiallyRequiredSchemaProperty(JsonSchemaProperty.named(fieldName).with(schemaObject), required);
} }
private String computePropertyFieldName(PersistentProperty property) { private String computePropertyFieldName(PersistentProperty property) {
@@ -359,53 +199,12 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
return mongoProperty.getFieldType() != mongoProperty.getActualType() ? Object.class : mongoProperty.getFieldType(); return mongoProperty.getFieldType() != mongoProperty.getActualType() ? Object.class : mongoProperty.getFieldType();
} }
private static Class<?> computeTargetType(Class<?> fallback, List<Object> possibleValues) {
return possibleValues.isEmpty() ? fallback : possibleValues.iterator().next().getClass();
}
private <E extends Enum<E>> List<Object> getPossibleEnumValues(Class<E> targetType) {
EnumSet<E> enumSet = EnumSet.allOf(targetType);
List<Object> possibleValues = new ArrayList<>(enumSet.size());
for (Object enumValue : enumSet) {
possibleValues.add(converter.convertToMongoType(enumValue));
}
return possibleValues;
}
private static boolean isCollection(MongoPersistentProperty property) {
return property.isCollectionLike() && !property.getType().equals(byte[].class);
}
static JsonSchemaProperty createPotentiallyRequiredSchemaProperty(JsonSchemaProperty property, boolean required) { static JsonSchemaProperty createPotentiallyRequiredSchemaProperty(JsonSchemaProperty property, boolean required) {
return required ? JsonSchemaProperty.required(property) : property;
}
class PropertyContext implements JsonSchemaPropertyContext { if (!required) {
private final String path;
private final MongoPersistentProperty property;
public PropertyContext(String path, MongoPersistentProperty property) {
this.path = path;
this.property = property;
}
@Override
public String getPath() {
return path;
}
@Override
public MongoPersistentProperty getProperty() {
return property; return property;
} }
@Override return JsonSchemaProperty.required(property);
public <T> MongoPersistentEntity<T> resolveEntity(MongoPersistentProperty property) {
return (MongoPersistentEntity<T>) mappingContext.getPersistentEntity(property);
}
} }
} }
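
A minimal sketch of using the schema creator shown above, assuming a MongoConverter is at hand; the domain type is hypothetical.

import org.bson.Document;

import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class SchemaCreatorExample {

    // Hypothetical domain type used only to illustrate schema derivation.
    static class Jedi {
        String name;
    }

    Document deriveValidationRules(MongoConverter converter) {

        MongoJsonSchema schema = MongoJsonSchemaCreator.create(converter) // factory method taking the converter
                .createSchemaFor(Jedi.class);

        return schema.toDocument(); // { "$jsonSchema" : { ... } }
    }
}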

View File

@@ -57,8 +57,8 @@ public class MongoAction {
public MongoAction(@Nullable WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation, public MongoAction(@Nullable WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, @Nullable Class<?> entityType, @Nullable Document document, @Nullable Document query) { String collectionName, @Nullable Class<?> entityType, @Nullable Document document, @Nullable Document query) {
Assert.hasText(collectionName, "Collection name must not be null or empty"); Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(mongoActionOperation, "MongoActionOperation must not be null"); Assert.notNull(mongoActionOperation, "MongoActionOperation must not be null!");
this.defaultWriteConcern = defaultWriteConcern; this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation; this.mongoActionOperation = mongoActionOperation;

View File

@@ -42,20 +42,29 @@ public class MongoAdmin implements MongoAdminOperations {
*/ */
public MongoAdmin(MongoClient client) { public MongoAdmin(MongoClient client) {
Assert.notNull(client, "Client must not be null"); Assert.notNull(client, "Client must not be null!");
this.mongoClient = client; this.mongoClient = client;
} }
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation @ManagedOperation
public void dropDatabase(String databaseName) { public void dropDatabase(String databaseName) {
getDB(databaseName).drop(); getDB(databaseName).drop();
} }
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation @ManagedOperation
public void createDatabase(String databaseName) { public void createDatabase(String databaseName) {
getDB(databaseName); getDB(databaseName);
} }
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation @ManagedOperation
public String getDatabaseStats(String databaseName) { public String getDatabaseStats(String databaseName) {
return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale", 1024)).toJson(); return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale", 1024)).toJson();
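
For context, a small sketch of the MongoAdmin operations touched above; the connection string and database name are placeholders.

import org.springframework.data.mongodb.core.MongoAdmin;

import com.mongodb.client.MongoClients;

class MongoAdminExample {

    void manageDatabases() {

        MongoAdmin admin = new MongoAdmin(MongoClients.create("mongodb://localhost:27017"));

        admin.createDatabase("spring-data");                        // obtains the database handle; actual creation is deferred until first write
        System.out.println(admin.getDatabaseStats("spring-data"));  // dbStats command, scaled to KB
        admin.dropDatabase("spring-data");
    }
}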

View File

@@ -119,15 +119,27 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator; this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
} }
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends MongoClient> getObjectType() { public Class<? extends MongoClient> getObjectType() {
return MongoClient.class; return MongoClient.class;
} }
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable @Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) { public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex); return exceptionTranslator.translateExceptionIfPossible(ex);
} }
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override @Override
protected MongoClient createInstance() throws Exception { protected MongoClient createInstance() throws Exception {
return createMongoClient(computeClientSetting()); return createMongoClient(computeClientSetting());
@@ -146,7 +158,7 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
protected MongoClientSettings computeClientSetting() { protected MongoClientSettings computeClientSetting() {
if (connectionString != null && (StringUtils.hasText(host) || port != null)) { if (connectionString != null && (StringUtils.hasText(host) || port != null)) {
throw new IllegalStateException("ConnectionString and host/port configuration exclude one another"); throw new IllegalStateException("ConnectionString and host/port configuration exclude one another!");
} }
ConnectionString connectionString = this.connectionString != null ? this.connectionString ConnectionString connectionString = this.connectionString != null ? this.connectionString
@@ -324,6 +336,10 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
return !fromConnectionStringIsDefault ? fromConnectionString : defaultValue; return !fromConnectionStringIsDefault ? fromConnectionString : defaultValue;
} }
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override @Override
protected void destroyInstance(@Nullable MongoClient instance) throws Exception { protected void destroyInstance(@Nullable MongoClient instance) throws Exception {
@@ -337,11 +353,6 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
} }
private String getOrDefault(Object value, String defaultValue) { private String getOrDefault(Object value, String defaultValue) {
return !StringUtils.isEmpty(value) ? value.toString() : defaultValue;
if(value == null) {
return defaultValue;
}
String sValue = value.toString();
return StringUtils.hasText(sValue) ? sValue : defaultValue;
} }
} }
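
A brief configuration sketch for the factory bean above, using host/port rather than a connection string (the two are mutually exclusive, as the IllegalStateException in this hunk shows); host and port values are examples.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;

@Configuration
class MongoClientConfig {

    @Bean
    public MongoClientFactoryBean mongoClient() {

        MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
        factoryBean.setHost("localhost"); // do not combine with setConnectionString(...)
        factoryBean.setPort(27017);
        return factoryBean;
    }
}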

View File

@@ -36,7 +36,6 @@ import com.mongodb.MongoClientSettings.Builder;
import com.mongodb.ReadConcern; import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference; import com.mongodb.ReadPreference;
import com.mongodb.ServerAddress; import com.mongodb.ServerAddress;
import com.mongodb.ServerApi;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
import com.mongodb.connection.ClusterConnectionMode; import com.mongodb.connection.ClusterConnectionMode;
import com.mongodb.connection.ClusterType; import com.mongodb.connection.ClusterType;
@@ -114,7 +113,6 @@ public class MongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoCli
// encryption and retry // encryption and retry
private @Nullable AutoEncryptionSettings autoEncryptionSettings; private @Nullable AutoEncryptionSettings autoEncryptionSettings;
private @Nullable ServerApi serverApi;
/** /**
* @param socketConnectTimeoutMS in msec * @param socketConnectTimeoutMS in msec
@@ -397,15 +395,6 @@ public class MongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoCli
this.autoEncryptionSettings = autoEncryptionSettings; this.autoEncryptionSettings = autoEncryptionSettings;
} }
/**
* @param serverApi can be {@literal null}.
* @see MongoClientSettings.Builder#serverApi(ServerApi)
* @since 3.3
*/
public void setServerApi(@Nullable ServerApi serverApi) {
this.serverApi = serverApi;
}
@Override @Override
public Class<?> getObjectType() { public Class<?> getObjectType() {
return MongoClientSettings.class; return MongoClientSettings.class;
@@ -487,11 +476,9 @@ public class MongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoCli
if (retryWrites != null) { if (retryWrites != null) {
builder = builder.retryWrites(retryWrites); builder = builder.retryWrites(retryWrites);
} }
if (uUidRepresentation != null) { if (uUidRepresentation != null) {
builder = builder.uuidRepresentation(uUidRepresentation); builder.uuidRepresentation(uUidRepresentation);
}
if (serverApi != null) {
builder = builder.serverApi(serverApi);
} }
return builder.build(); return builder.build();
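
Since the hunk above wires uuidRepresentation and, on one side, serverApi into the settings builder, here is a rough driver-level sketch of the same builder calls; the connection string is a placeholder.

import org.bson.UuidRepresentation;

import com.mongodb.ConnectionString;
import com.mongodb.MongoClientSettings;
import com.mongodb.ServerApi;
import com.mongodb.ServerApiVersion;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

class ClientSettingsExample {

    MongoClient createClient() {

        MongoClientSettings settings = MongoClientSettings.builder()
                .applyConnectionString(new ConnectionString("mongodb://localhost:27017"))
                .uuidRepresentation(UuidRepresentation.JAVA_LEGACY)
                .serverApi(ServerApi.builder().version(ServerApiVersion.V1).build()) // stable API opt-in
                .build();

        return MongoClients.create(settings);
    }
}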

View File

@@ -44,8 +44,8 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
super(message); super(message);
Assert.notNull(writeResult, "WriteResult must not be null"); Assert.notNull(writeResult, "WriteResult must not be null!");
Assert.notNull(actionOperation, "MongoActionOperation must not be null"); Assert.notNull(actionOperation, "MongoActionOperation must not be null!");
this.writeResult = writeResult; this.writeResult = writeResult;
this.actionOperation = actionOperation; this.actionOperation = actionOperation;

View File

@@ -33,7 +33,7 @@ import com.mongodb.client.MongoDatabase;
/** /**
* Common base class for usage with both {@link com.mongodb.client.MongoClients} defining common properties such as * Common base class for usage with both {@link com.mongodb.client.MongoClients} defining common properties such as
* database name and exception translator. * database name and exception translator.
* <br /> * <p/>
* Not intended to be used directly. * Not intended to be used directly.
* *
* @author Christoph Strobl * @author Christoph Strobl
@@ -64,10 +64,10 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
protected MongoDatabaseFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated, protected MongoDatabaseFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) { PersistenceExceptionTranslator exceptionTranslator) {
Assert.notNull(mongoClient, "MongoClient must not be null"); Assert.notNull(mongoClient, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty"); Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"), Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs"); "Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongoClient = mongoClient; this.mongoClient = mongoClient;
this.databaseName = databaseName; this.databaseName = databaseName;
@@ -84,14 +84,22 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.writeConcern = writeConcern; this.writeConcern = writeConcern;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
public MongoDatabase getMongoDatabase() throws DataAccessException { public MongoDatabase getMongoDatabase() throws DataAccessException {
return getMongoDatabase(getDefaultDatabaseName()); return getMongoDatabase(getDefaultDatabaseName());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override @Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException { public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty"); Assert.hasText(dbName, "Database name must not be empty!");
MongoDatabase db = doGetMongoDatabase(dbName); MongoDatabase db = doGetMongoDatabase(dbName);
@@ -110,16 +118,28 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
*/ */
protected abstract MongoDatabase doGetMongoDatabase(String dbName); protected abstract MongoDatabase doGetMongoDatabase(String dbName);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
public void destroy() throws Exception { public void destroy() throws Exception {
if (mongoInstanceCreated) { if (mongoInstanceCreated) {
closeClient(); closeClient();
} }
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() { public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator; return this.exceptionTranslator;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.Session)
*/
public MongoDatabaseFactory withSession(ClientSession session) { public MongoDatabaseFactory withSession(ClientSession session) {
return new MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session, this); return new MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session, this);
} }
@@ -160,31 +180,55 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.delegate = delegate; this.delegate = delegate;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
@Override @Override
public MongoDatabase getMongoDatabase() throws DataAccessException { public MongoDatabase getMongoDatabase() throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase()); return proxyMongoDatabase(delegate.getMongoDatabase());
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase(dbName));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoDatabaseFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#isTransactionActive()
*/
@Override
public boolean isTransactionActive() {
return session != null && session.hasActiveTransaction();

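For orientation, a minimal usage sketch of the factory API touched above; this is not part of the diff, and the SimpleMongoClientDatabaseFactory subclass, the local connection string and the database name "test" are illustrative assumptions.

import com.mongodb.ClientSessionOptions;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;

class DatabaseFactoryUsageSketch {

    MongoDatabase sessionBoundDatabase() {
        // The database name is validated by the MongoDatabaseFactorySupport constructor shown above.
        MongoClient client = MongoClients.create("mongodb://localhost:27017");
        MongoDatabaseFactory factory = new SimpleMongoClientDatabaseFactory(client, "test");

        // withSession(...) returns the ClientSessionBoundMongoDbFactory decorator from this diff;
        // isTransactionActive() reports false until the session has started a transaction.
        ClientSession session = factory.getSession(ClientSessionOptions.builder().build());
        MongoDatabaseFactory sessionBound = factory.withSession(session);
        return sessionBound.getMongoDatabase();
    }
}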

@@ -0,0 +1,50 @@
/*
* Copyright 2018-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.support.PersistenceExceptionTranslator;
/**
* Common base class for usage with both {@link com.mongodb.client.MongoClients} defining common properties such as
* database name and exception translator.
* <p/>
* Not intended to be used directly.
*
* @author Christoph Strobl
* @author Mark Paluch
* @param <C> Client type.
* @since 2.1
* @see SimpleMongoClientDatabaseFactory
* @deprecated since 3.0, use {@link MongoDatabaseFactorySupport} instead.
*/
@Deprecated
public abstract class MongoDbFactorySupport<C> extends MongoDatabaseFactorySupport<C> {
/**
* Create a new {@link MongoDbFactorySupport} object given {@code mongoClient}, {@code databaseName},
* {@code mongoInstanceCreated} and {@link PersistenceExceptionTranslator}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated {@literal true} if the client instance was created by a subclass of
* {@link MongoDbFactorySupport} to close the client on {@link #destroy()}.
* @param exceptionTranslator must not be {@literal null}.
*/
protected MongoDbFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) {
super(mongoClient, databaseName, mongoInstanceCreated, exceptionTranslator);
}
}


@@ -88,6 +88,10 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
this.schemaMap = schemaMap;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public AutoEncryptionSettings getObject() {
@@ -105,6 +109,10 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
return source != null ? source : Collections.emptyMap();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return AutoEncryptionSettings.class;

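A hedged configuration sketch for the factory bean in the hunk above, not taken from the diff: only setSchemaMap appears in the hunk, while setKeyVaultNamespace and setKmsProviders are assumed setters mirroring AutoEncryptionSettings.Builder, and all values are placeholders.

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import org.bson.BsonDocument;
import org.springframework.data.mongodb.core.MongoEncryptionSettingsFactoryBean;

import com.mongodb.AutoEncryptionSettings;

class EncryptionSettingsSketch {

    AutoEncryptionSettings autoEncryptionSettings() {
        MongoEncryptionSettingsFactoryBean factoryBean = new MongoEncryptionSettingsFactoryBean();

        // setSchemaMap is visible in the hunk above; the schema document here is a placeholder.
        factoryBean.setSchemaMap(Collections.singletonMap("db.people",
                BsonDocument.parse("{ 'bsonType' : 'object' }")));

        // Assumed setters; a real setup needs a key vault namespace and at least one KMS provider.
        factoryBean.setKeyVaultNamespace("encryption.__keyVault");
        Map<String, Object> localKey = new HashMap<>();
        localKey.put("key", new byte[96]);
        factoryBean.setKmsProviders(Collections.singletonMap("local", localKey));

        return factoryBean.getObject();
    }
}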

@@ -68,6 +68,10 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {

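For context, a small sketch (not from this diff) of how the translator is consumed; the MongoException argument is only an example of a driver exception being mapped.

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

import com.mongodb.MongoException;

class TranslationSketch {

    DataAccessException translate(MongoException cause) {
        // Returns a Spring DataAccessException, e.g. a DataIntegrityViolationException for the
        // exception names listed in DATA_INTEGRITY_EXCEPTIONS above, or null if the exception
        // cannot be translated.
        return new MongoExceptionTranslator().translateExceptionIfPossible(cause);
    }
}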

@@ -15,24 +15,7 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.function.Predicate;
import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.NoOpDbRefResolver;
import org.springframework.data.mongodb.core.mapping.Encrypted;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.mapping.Unwrapped.Nullable;
import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.util.Assert;
@@ -41,7 +24,6 @@ import org.springframework.util.Assert;
* following mapping rules.
* <p>
* <strong>Required Properties</strong>
* </p>
* <ul>
* <li>Properties of primitive type</li>
* </ul>
@@ -63,7 +45,7 @@ import org.springframework.util.Assert;
* {@link org.springframework.data.annotation.Id _id} properties using types that can be converted into
* {@link org.bson.types.ObjectId} like {@link String} will be mapped to {@code type : 'object'} unless there is more
* specific information available via the {@link org.springframework.data.mongodb.core.mapping.MongoId} annotation.
* {@link Encrypted} properties will contain {@literal encrypt} information.
* </p>
*
* @author Christoph Strobl
* @since 2.2
@@ -78,111 +60,6 @@ public interface MongoJsonSchemaCreator {
*/
MongoJsonSchema createSchemaFor(Class<?> type);
/**
* Create a merged {@link MongoJsonSchema} out of the individual schemas of the given types by merging their
* properties into one large {@link MongoJsonSchema schema}.
*
* @param types must not be {@literal null} nor contain {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema mergedSchemaFor(Class<?>... types) {
MongoJsonSchema[] schemas = Arrays.stream(types).map(this::createSchemaFor).toArray(MongoJsonSchema[]::new);
return MongoJsonSchema.merge(schemas);
}
/**
* Filter matching {@link JsonSchemaProperty properties}.
*
* @param filter the {@link Predicate} to evaluate for inclusion. Must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.3
*/
MongoJsonSchemaCreator filter(Predicate<JsonSchemaPropertyContext> filter);
/**
* Entry point to specify additional behavior for a given path.
*
* @param path the path using {@literal dot '.'} notation.
* @return new instance of {@link PropertySpecifier}.
* @since 3.4
*/
PropertySpecifier property(String path);
/**
* The context in which a specific {@link #getProperty()} is encountered during schema creation.
*
* @since 3.3
*/
interface JsonSchemaPropertyContext {
/**
* The path to a given field/property in dot notation.
*
* @return never {@literal null}.
*/
String getPath();
/**
* The current property.
*
* @return never {@literal null}.
*/
MongoPersistentProperty getProperty();
/**
* Obtain the {@link MongoPersistentEntity} for a given property.
*
* @param property must not be {@literal null}.
* @param <T>
* @return {@literal null} if the property is not an entity. It is nevertheless recommended to check
* {@link PersistentProperty#isEntity()} first.
*/
@Nullable
<T> MongoPersistentEntity<T> resolveEntity(MongoPersistentProperty property);
}
/**
* A filter {@link Predicate} that matches {@link Encrypted encrypted properties} and those having nested ones.
*
* @return new instance of {@link Predicate}.
* @since 3.3
*/
static Predicate<JsonSchemaPropertyContext> encryptedOnly() {
return new Predicate<JsonSchemaPropertyContext>() {
// cycle guard
private final Set<MongoPersistentProperty> seen = new HashSet<>();
@Override
public boolean test(JsonSchemaPropertyContext context) {
return extracted(context.getProperty(), context);
}
private boolean extracted(MongoPersistentProperty property, JsonSchemaPropertyContext context) {
if (property.isAnnotationPresent(Encrypted.class)) {
return true;
}
if (!property.isEntity() || seen.contains(property)) {
return false;
}
seen.add(property);
for (MongoPersistentProperty nested : context.resolveEntity(property)) {
if (extracted(nested, context)) {
return true;
}
}
return false;
}
};
}
/**
* Creates a new {@link MongoJsonSchemaCreator} that is aware of conversions applied by the given
* {@link MongoConverter}.
@@ -192,59 +69,7 @@ public interface MongoJsonSchemaCreator {
*/
static MongoJsonSchemaCreator create(MongoConverter mongoConverter) {
Assert.notNull(mongoConverter, "MongoConverter must not be null");
Assert.notNull(mongoConverter, "MongoConverter must not be null!");
return new MappingMongoJsonSchemaCreator(mongoConverter);
}
/**
* Creates a new {@link MongoJsonSchemaCreator} that is aware of type mappings and potential
* {@link org.springframework.data.spel.spi.EvaluationContextExtension extensions}.
*
* @param mappingContext must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.3
*/
static MongoJsonSchemaCreator create(MappingContext mappingContext) {
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(config -> {}));
converter.afterPropertiesSet();
return create(converter);
}
/**
* Creates a new {@link MongoJsonSchemaCreator} that does not consider potential extensions - suitable for testing. We
* recommend to use {@link #create(MappingContext)}.
*
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.3
*/
static MongoJsonSchemaCreator create() {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setSimpleTypeHolder(MongoSimpleTypes.HOLDER);
mappingContext.afterPropertiesSet();
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(config -> {}));
converter.afterPropertiesSet();
return create(converter);
}
/**
* @author Christoph Strobl
* @since 3.4
*/
interface PropertySpecifier {
/**
* Set additional type parameters for polymorphic ones.
*
* @param types must not be {@literal null}.
* @return the source
*/
MongoJsonSchemaCreator withTypes(Class<?>... types);
}
}

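A brief sketch of the schema-creator entry points shown above, not part of the diff; the Person type is a hypothetical domain class.

import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class SchemaCreationSketch {

    // Hypothetical domain type used only for illustration.
    static class Person {
        String name;
        Integer age;
    }

    MongoJsonSchema personSchema() {
        // create() wires up a default converter as shown in the factory methods above.
        return MongoJsonSchemaCreator.create().createSchemaFor(Person.class);
    }
}

On the side of the diff that still declares them, mergedSchemaFor(Class<?>...) and filter(MongoJsonSchemaCreator.encryptedOnly()) compose with createSchemaFor(...) in the same way.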

@@ -1,92 +0,0 @@
/*
* Copyright 2021-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import com.mongodb.ServerApi;
import com.mongodb.ServerApi.Builder;
import com.mongodb.ServerApiVersion;
/**
* {@link FactoryBean} for creating {@link ServerApi} using the {@link ServerApi.Builder}.
*
* @author Christoph Strobl
* @since 3.3
*/
public class MongoServerApiFactoryBean implements FactoryBean<ServerApi> {
private String version;
private @Nullable Boolean deprecationErrors;
private @Nullable Boolean strict;
/**
* @param version the version string either as the enum name or the server version value.
* @see ServerApiVersion
*/
public void setVersion(String version) {
this.version = version;
}
/**
* @param deprecationErrors
* @see ServerApi.Builder#deprecationErrors(boolean)
*/
public void setDeprecationErrors(@Nullable Boolean deprecationErrors) {
this.deprecationErrors = deprecationErrors;
}
/**
* @param strict
* @see ServerApi.Builder#strict(boolean)
*/
public void setStrict(@Nullable Boolean strict) {
this.strict = strict;
}
@Nullable
@Override
public ServerApi getObject() throws Exception {
Builder builder = ServerApi.builder().version(version());
if (deprecationErrors != null) {
builder = builder.deprecationErrors(deprecationErrors);
}
if (strict != null) {
builder = builder.strict(strict);
}
return builder.build();
}
@Nullable
@Override
public Class<?> getObjectType() {
return ServerApi.class;
}
private ServerApiVersion version() {
try {
// lookup by name eg. 'V1'
return ObjectUtils.caseInsensitiveValueOf(ServerApiVersion.values(), version);
} catch (IllegalArgumentException e) {
// or just the version number, eg. just '1'
return ServerApiVersion.findByValue(version);
}
}
}
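A usage sketch for the factory bean above (present only on one side of this diff); applying the resulting ServerApi through MongoClientSettings is an assumption about typical use, not part of the file contents.

import org.springframework.data.mongodb.core.MongoServerApiFactoryBean;

import com.mongodb.MongoClientSettings;
import com.mongodb.ServerApi;

class ServerApiSketch {

    MongoClientSettings clientSettings() throws Exception {
        MongoServerApiFactoryBean factoryBean = new MongoServerApiFactoryBean();
        factoryBean.setVersion("V1"); // enum name; the plain value "1" is resolved by version() as well
        factoryBean.setStrict(true);
        factoryBean.setDeprecationErrors(false);

        ServerApi serverApi = factoryBean.getObject();
        return MongoClientSettings.builder().serverApi(serverApi).build();
    }
}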

Some files were not shown because too many files have changed in this diff.