Compare commits


190 Commits

Author SHA1 Message Date
Mark Paluch
fa94c22c2a DATAMONGO-2333 - Release version 2.1.11 (Lovelace SR11). 2019-09-30 11:04:23 +02:00
Mark Paluch
e0f88a8b84 DATAMONGO-2333 - Prepare 2.1.11 (Lovelace SR11). 2019-09-30 11:04:00 +02:00
Mark Paluch
59aa8051d3 DATAMONGO-2333 - Updated changelog. 2019-09-30 11:03:56 +02:00
Mark Paluch
205a06e79a DATAMONGO-2377 - Polishing.
Reformat code.

Original pull request: #792.
2019-09-25 13:36:26 +02:00
Christoph Strobl
899b43a29b DATAMONGO-2377 - Fix handling of $$value and $$this in field exposing aggregation.
Internal field references to $$this and $$value are no longer mapped against exposed fields, which had previously caused errors.

Original pull request: #792.
2019-09-25 13:36:26 +02:00
Christoph Strobl
0f0a4ed31b DATAMONGO-1731 - Improve update & upsert documentation. 2019-09-23 13:36:03 +02:00
Mark Paluch
9acc8d5268 DATAMONGO-2360 - Polishing.
Apply index hints in ReactiveMongoTemplate.count(…).

Original pull request: #788.
2019-09-19 14:55:06 +02:00
Christoph Strobl
313ffb5426 DATAMONGO-2360 - Apply query hint to count queries.
Original pull request: #788.
2019-09-19 14:41:50 +02:00
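
A minimal sketch of the behavior above (assuming a hypothetical Person entity, a MongoTemplate named template, an index named lastname_idx, and static imports of Query.query and Criteria.where):

// The hint attached to the query now also applies to the count execution.
Query query = query(where("lastname").is("Smith")).withHint("lastname_idx");
long matches = template.count(query, Person.class);
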
Mark Paluch
dc859953f4 DATAMONGO-2374 - Polishing.
Reformat code.

Original pull request: #791.
2019-09-19 14:17:30 +02:00
Christoph Strobl
bc29f2b24e DATAMONGO-2374 - Fix simple type result handling for repository query methods.
Original pull request: #791.
2019-09-19 14:17:09 +02:00
Christoph Strobl
686cdac73f DATAMONGO-2366 - Polishing.
Fix typo in reference documentation and add note on error handling.

Original Pull Request: #790
2019-09-19 11:47:21 +02:00
Mark Paluch
b7b339577b DATAMONGO-2366 - Consistently handle exceptions in CursorReadingTask.
Exceptions during CursorReadingTask startup and during polling now go through the same exception handling, so each exception is handled only once and the ErrorHandler is notified exactly once per exception.

Previously, startup exceptions relied on exception handling in the execute closure and notified ErrorHandler potentially multiple times.

Original Pull Request: #790
2019-09-19 11:34:58 +02:00
Franz van Betteraey
2166a6e953 DATAMONGO-2361 - Fix @Document reference documentation.
Original pull request: #787.
2019-09-13 10:53:15 +02:00
Mark Paluch
3c601a699a DATAMONGO-2335 - Updated changelog. 2019-09-06 10:22:55 +02:00
Christoph Strobl
37211fc6d7 DATAMONGO-2310 - Update documentation for TypedAggregation. 2019-09-05 13:02:30 +02:00
Christoph Strobl
a45c9040c4 DATAMONGO-2348 - Update documentation of version property handling. 2019-09-05 10:47:24 +02:00
Christoph Strobl
23c0a07b93 DATAMONGO-2354 - Polishing.
As with FindPublisherPreparer, the CursorPreparer needs to be public because it is used in one of the protected methods of MongoTemplate.

Original Pull Request: #784
2019-09-04 13:30:05 +02:00
kostya05983
f3a7d6a20e DATAMONGO-2354 - Change visibility of FindPublisherPreparer.
The FindPublisherPreparer is used in a protected method of ReactiveMongoTemplate and needs to be public to allow overriding.

Original Pull Request: #784
2019-09-04 13:17:33 +02:00
Mark Paluch
0d22d831f8 DATAMONGO-2352 - Polishing.
Apply typo fixes also to ReactiveMongoOperations.

Original pull request: #782.
2019-09-03 11:28:37 +02:00
Ryan Cloherty
6b0e2ab5de DATAMONGO-2352 - Fix documentation typos.
Original pull request: #782.
2019-09-03 11:28:37 +02:00
Mark Paluch
5d02b84856 DATAMONGO-2349 - Polishing.
Reformat code. Remove duplicate simple types.

Original pull request: #783.
2019-09-03 08:58:10 +02:00
Christoph Strobl
93e911985e DATAMONGO-2349 - Fix converter registration for java.time types.
The MongoDB Java Driver does not handle java.time types. Therefore those must not be considered simple types.
The behavior was changed by DATACMNS-1294 forcing usage of Reading & WritingConverter annotations to disambiguate converter direction.
This commit restores the converter registration to the state before the change in Spring Data Commons.

Original pull request: #783.
2019-09-03 08:58:05 +02:00
Christoph Strobl
e7faa1a1ec DATAMONGO-2351 - Fix test setup for indices with collations.
This avoids errors in test execution when switching back from a newer version that may leave incompatible data (collation option) in one of the collections.

Original Pull Request: #781
2019-08-23 10:37:58 +02:00
Christoph Strobl
631714941a DATAMONGO-2351 - Polishing.
Fix broken tests and favor StepVerifier over block() for reactive ones.

Original Pull Request: #781
2019-08-23 07:44:44 +02:00
Artyom Gabeev
db9428cebe DATAMONGO-2351 - Return zero deleted count for unacknowledged deleteBy.
Original Pull Request: #781
2019-08-23 07:41:16 +02:00
Christoph Strobl
4be53ac952 DATAMONGO-2339 - Fix QueryMapper field name resolution for properties containing underscore.
We now prevent splitting of paths that contain underscores if the entity contains a property that matches; see the sketch below.

Original pull request: #777.
2019-08-13 10:42:05 +02:00
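
To illustrate with an invented entity: a declared property containing an underscore now wins over path splitting.

class Book {
  String user_name; // queried as-is: { "user_name" : ... }, no longer split into "user.name"
}

Query query = query(where("user_name").is("alice"));
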
Mark Paluch
564acd75d5 DATAMONGO-2338 - Open RepositoryFactoryBeans for extension.
createRepositoryFactory() is no longer final, allowing the method to be overridden. This change aligns with the remaining store modules.
2019-08-06 15:57:49 +02:00
Mark Paluch
95ccdf4c20 DATAMONGO-2337 - Add HTTPS entries into spring.schemas.
To resolve XSD files properly from the classpath, their HTTPS references must be present in spring.schemas to avoid internet access when resolving an XSD file.
2019-08-06 15:57:47 +02:00
Greg Turnquist
291ef4bb75 DATAMONGO-2280 - Force check for updates. 2019-08-05 11:07:40 -05:00
Mark Paluch
c7461928f4 DATAMONGO-2303 - Updated changelog. 2019-08-05 15:57:31 +02:00
Mark Paluch
f5a5d3e96b DATAMONGO-2302 - After release cleanups. 2019-08-05 11:33:32 +02:00
Mark Paluch
b213aada80 DATAMONGO-2302 - Prepare next development iteration. 2019-08-05 11:33:31 +02:00
Mark Paluch
403e5043cb DATAMONGO-2302 - Release version 2.1.10 (Lovelace SR10). 2019-08-05 11:21:19 +02:00
Mark Paluch
bdbda459c0 DATAMONGO-2302 - Prepare 2.1.10 (Lovelace SR10). 2019-08-05 11:20:57 +02:00
Mark Paluch
0bf6d5f7fa DATAMONGO-2302 - Updated changelog. 2019-08-05 11:20:51 +02:00
Mark Paluch
f2ae14206a DATAMONGO-2272 - Updated changelog. 2019-08-05 11:09:02 +02:00
Mark Paluch
049159374d DATAMONGO-2320 - Polishing.
Use for-loops instead of Stream API and Collectors. Reformat code. Invert condition for smoother readability.

Original pull request: #776.
2019-08-02 11:06:38 +02:00
Christoph Strobl
79f8e06fc1 DATAMONGO-2320 - Simplify test code.
Original pull request: #776.
2019-08-02 11:06:38 +02:00
Christoph Strobl
370db2dce5 DATAMONGO-2320 - Fix aggregation field reference for $filter operator.
We now render field and local variable references correctly when using the $filter aggregation operator.
Prior to this commit field references had been rendered with an additional $ prefix.

Original pull request: #776.
2019-08-02 11:06:38 +02:00
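
A hedged sketch of the fixed rendering, using invented items/price fields:

// $filter keeps array elements matching the condition; references to the "items" field and
// the "item" variable are now rendered without a spurious extra "$" prefix.
Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.project()
        .and(ArrayOperators.Filter.filter("items")
            .as("item")
            .by(ComparisonOperators.valueOf("item.price").greaterThanEqualToValue(100)))
        .as("expensiveItems"));
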
Christoph Strobl
74325d5193 DATAMONGO-2330 - Polishing.
Added tests.

Original Pull Request: #775
2019-07-31 14:47:59 +02:00
nkey
e6ea2e1379 DATAMONGO-2330 - Apply defaultWriteConcern for bulk operations.
Fixed regression introduced in DATAMONGO-1880.

Original Pull Request: #775
2019-07-31 14:27:24 +02:00
Mark Paluch
cb85f3cfa6 DATAMONGO-2322 - Polishing.
Suspend change streams tests relying heavily on timing.
2019-07-18 10:09:19 +02:00
Mark Paluch
aff8b89006 DATAMONGO-2322 - Handle exceptions thrown by MessageListeners.
ErrorHandlers associated with a CursorReadingTask (Change Streams, imperative Tailable Cursors) now handle exceptions raised by the listener callback.

Exceptions are now caught and the callback continues with the next message.
2019-07-18 09:57:38 +02:00
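
A minimal registration sketch, assuming a MongoTemplate named template plus invented process(…) and LOGGER helpers; the ErrorHandler passed on registration now also sees listener exceptions:

MessageListenerContainer container = new DefaultMessageListenerContainer(template);
container.start();

ChangeStreamRequest<Document> request = ChangeStreamRequest.builder()
    .collection("person")
    .publishTo(message -> process(message)) // an exception here no longer stops the task
    .build();

container.register(request, Document.class, error -> LOGGER.error("Listener failure", error));
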
Mark Paluch
0ad8857368 DATAMONGO-2323 - Polishing.
Use .as(StepVerifier::create) syntax where possible.
2019-07-17 14:40:49 +02:00
Mark Paluch
46de82fe0b DATAMONGO-2323 - Fix query(…) and stream(…) to be used with BSON Document.
MongoTemplate.stream(…), MongoTemplate.query(…) and ReactiveMongoTemplate.query(…) now no longer fail when used with BSON Document.class.

Previously, field mapping failed because it required an entity type. Now we gracefully back off when find methods are used with a simple Document entity type.
2019-07-17 14:35:56 +02:00
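
A short sketch of the now-working usage, assuming a MongoTemplate named template and static imports of query/where:

// Fluent find against raw BSON documents; field mapping gracefully backs off.
List<Document> raw = template.query(Document.class)
    .inCollection("person")
    .matching(query(where("lastname").is("Smith")))
    .all();
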
Mark Paluch
387348b615 DATAMONGO-2318 - Fix typo. 2019-07-10 09:46:15 +02:00
Mark Paluch
8fd41faac6 DATAMONGO-2280 - Cleanup release profile.
Reuse inherited configuration from parent pom.
2019-07-09 11:03:43 +02:00
Mark Paluch
8a15e1086b DATAMONGO-2318 - Revise readme for a consistent structure. 2019-07-09 11:03:42 +02:00
Mark Paluch
8502786648 DATAMONGO-2314 - Polishing.
Reformat code. Remove unnecessary warning suppressions. Switch to diamond syntax.

Original pull request: #771.
2019-07-04 16:27:41 +02:00
Christoph Strobl
d7107d49bf DATAMONGO-2314 - Fix query by example on nested properties.
This fix allows using alike(…) on nested properties:

new Criteria("nested").alike(Example.of(probe, matching().withIgnorePaths("_class")));

Switch tests to AssertJ.

Original pull request: #771.
2019-07-04 16:27:15 +02:00
Greg Turnquist
f42cb1e2f0 DATAMONGO-2280 - Use parent 'artifactory' profile for snapshot releases. 2019-07-03 17:12:27 -05:00
Mark Paluch
a9403b526f DATAMONGO-2296 - Polishing.
Use getCollectionName() in MongoTemplate.insert/save. Consistently use getCollectionName(Class) from ReactiveMongoTemplate and fluent API implementations.

Original pull request: #768.
2019-07-01 16:48:26 +02:00
Christoph Strobl
5f6291ed32 DATAMONGO-2296 - Consistent use of getCollectionName(Class) throughout MongoTemplate.
Original pull request: #768.
2019-07-01 16:44:33 +02:00
Greg Turnquist
676ee80434 DATAMONGO-2280 - Only test main branch for upstream triggers. 2019-06-28 19:31:04 -05:00
Greg Turnquist
b54641ff86 DATAMONGO-2280 - Set user.name and user.home for CI jobs. 2019-06-25 13:35:49 -05:00
Oliver Drotbohm
6930c720ca DATAMONGO-2309 - Fix NoHTTP errors. 2019-06-24 11:44:50 +02:00
Christoph Strobl
611cfe9c11 DATAMONGO-2256 - Updated changelog. 2019-06-14 15:18:14 +02:00
Christoph Strobl
507a1fbf34 DATAMONGO-2271 - After release cleanups. 2019-06-14 13:14:51 +02:00
Christoph Strobl
087649de35 DATAMONGO-2271 - Prepare next development iteration. 2019-06-14 13:14:50 +02:00
Christoph Strobl
1f01f34377 DATAMONGO-2271 - Release version 2.1.9 (Lovelace SR9). 2019-06-14 12:39:34 +02:00
Christoph Strobl
295c43c6ff DATAMONGO-2271 - Prepare 2.1.9 (Lovelace SR9). 2019-06-14 12:38:59 +02:00
Christoph Strobl
5a62d449bf DATAMONGO-2271 - Updated changelog. 2019-06-14 12:38:52 +02:00
Greg Turnquist
1cbbe692b5 DATAMONGO-2280 - Add maven wrapper. 2019-06-11 15:15:39 -05:00
Greg Turnquist
5bfe125160 DATAMONGO-2280 - Introduce Jenkins. 2019-06-11 15:03:15 -05:00
Christoph Strobl
1b6722324e DATAMONGO-2293 - Fix EntityOperations id population nulling out entire entity.
We now no longer null the entire entity if a given id value is actually null.

Original Pull Request: #742
2019-06-11 12:02:55 +02:00
Mark Paluch
a212f5f79d DATAMONGO-2231 - URL Cleanup. 2019-06-05 11:19:54 +02:00
Hippolyte Durix
2879348d4b DATAMONGO-2288 - Fix wrong indentation on documentation code sample.
Original pull request: #758.
2019-05-29 14:36:38 +02:00
Mark Paluch
10097311c7 DATAMONGO-2278 - Polishing.
Update method comment. Switch to diamond syntax.

Original pull request: #755.
2019-05-29 14:31:45 +02:00
owen.qqq
b8303a56b6 DATAMONGO-2278 - Update Querydsl base package names in MongoAnnotationProcessor.
Current MongoAnnotationProcessor still uses 3.x.x Querydsl package names.
Update package names to com.querydsl.core.annotations.* to use Querydsl annotations for code-generation.

Original pull request: #755.
2019-05-29 14:31:45 +02:00
Christoph Strobl
f9e468aebb DATAMONGO-2252 - Update Javadoc for Reactive/MongoOperations#getCollection(String).
Original pull request: #747.
2019-05-27 11:11:53 +02:00
Christoph Strobl
b900dc6c09 DATAMONGO-2275 - Fix NPE when mapping MongoJsonSchema used in query.
We fixed an NPE when reading a raw Document from a collection using a query that matches against a JSON schema.

Original pull request: #752.
2019-05-17 14:15:58 +02:00
Mark Paluch
bede55714c DATAMONGO-2267 - Polishing.
Reuse collection name for index creation instead of resolving the collection for every index. Switch lambda to method reference.

Original pull request: #746.
2019-05-15 10:43:11 +02:00
Christoph Strobl
3ec426352f DATAMONGO-2267 - Fix eager collection resolution in Object path.
We now lazily resolve the collection of an entity, as doing so potentially requires a more expensive SpEL evaluation that might not be needed in the first place.

Original pull request: #746.
2019-05-15 10:39:45 +02:00
Mark Paluch
c6293e0ebd DATAMONGO-2270 - Updated changelog. 2019-05-13 18:19:09 +02:00
Mark Paluch
74e49a2326 DATAMONGO-2269 - After release cleanups. 2019-05-13 14:58:46 +02:00
Mark Paluch
69c451f69f DATAMONGO-2269 - Prepare next development iteration. 2019-05-13 14:58:44 +02:00
Mark Paluch
9af8160e05 DATAMONGO-2269 - Release version 2.1.8 (Lovelace SR8). 2019-05-13 13:25:52 +02:00
Mark Paluch
fdf4ea1e60 DATAMONGO-2269 - Prepare 2.1.8 (Lovelace SR8). 2019-05-13 13:25:06 +02:00
Mark Paluch
8c7afe012f DATAMONGO-2269 - Updated changelog. 2019-05-13 13:24:57 +02:00
Mark Paluch
6ba258a1f3 DATAMONGO-2260 - Updated changelog. 2019-05-13 12:37:41 +02:00
Oliver Drotbohm
059c8cf1dd DATAMONGO-2244 - Updated changelog. 2019-05-10 14:18:11 +02:00
Oliver Drotbohm
2b8955f583 DATAMONGO-2246 - After release cleanups. 2019-05-10 12:55:59 +02:00
Oliver Drotbohm
23fde167f6 DATAMONGO-2246 - Prepare next development iteration. 2019-05-10 12:55:57 +02:00
Oliver Drotbohm
9470f82e9b DATAMONGO-2246 - Release version 2.1.7 (Lovelace SR7). 2019-05-10 12:18:52 +02:00
Oliver Drotbohm
1e88e241d4 DATAMONGO-2246 - Prepare 2.1.7 (Lovelace SR7). 2019-05-10 12:18:10 +02:00
Oliver Drotbohm
0b8396c43c DATAMONGO-2246 - Updated changelog. 2019-05-10 12:18:01 +02:00
Christoph Strobl
b602e4cb26 DATAMONGO-2222 - Updated changelog. 2019-04-11 12:28:51 +02:00
Oliver Drotbohm
500393e596 DATAMONGO-2204 - After release cleanups. 2019-04-01 20:55:13 +02:00
Oliver Drotbohm
7e4cbdb8b0 DATAMONGO-2204 - Prepare next development iteration. 2019-04-01 20:55:12 +02:00
Oliver Drotbohm
1d6d8ff8e6 DATAMONGO-2204 - Release version 2.1.6 (Lovelace SR6). 2019-04-01 20:04:13 +02:00
Oliver Drotbohm
8ea4cbe9ea DATAMONGO-2204 - Prepare 2.1.6 (Lovelace SR6). 2019-04-01 20:03:19 +02:00
Oliver Drotbohm
45a0c36184 DATAMONGO-2204 - Updated changelog. 2019-04-01 20:03:15 +02:00
Oliver Drotbohm
599c79bce2 DATAMONGO-2186 - Updated changelog. 2019-04-01 19:37:01 +02:00
Oliver Drotbohm
eda6d40aa7 DATAMONGO-2243 - Updated changelog. 2019-04-01 18:52:21 +02:00
Oliver Drotbohm
22b844c87f DATAMONGO-2185 - Updated changelog. 2019-04-01 13:54:18 +02:00
Christoph Strobl
bdf7ec7c9b DATAMONGO-2241 - Polishing.
Ensure the DbRefResolver operates within the session while reusing the MappingContext.

Original Pull Request: #734
2019-04-01 12:54:50 +02:00
Mark Paluch
13db06d345 DATAMONGO-2241 - Retain MongoConverter using MongoTemplate.withSession(…).
We now reuse the MongoConverter instance that is associated with a MongoTemplate when using withSession(…) instead of creating a new converter instance.
In consequence, we're reusing PersistentEntity instances, EntityInstantiators and generated accessor classes.

Original Pull Request: #734
2019-04-01 10:47:52 +02:00
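
A compact sketch of the session API involved, assuming a MongoTemplate named template and a Person entity; the session-bound operations now share the template's converter and mapping metadata:

template.withSession(ClientSessionOptions.builder().build())
    .execute(operations -> operations.findAll(Person.class));
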
Mark Paluch
365ecd53c4 DATAMONGO-2221 - Polishing.
Reformat imports.

Original pull request: #732.
2019-03-26 14:46:23 +01:00
Christoph Strobl
dc40c42815 DATAMONGO-2221 - Fix mapping of Strings matching a valid ObjectId for unresolvable paths.
We now make sure not to convert Strings representing valid ObjectIds into ObjectIds for paths that cannot be resolved to a property.

Original pull request: #732.
2019-03-26 14:46:02 +01:00
Mark Paluch
49415efb8c DATAMONGO-2223 - Polishing.
Replace DBRef annotation using FQCN with import.

Original pull request: #660.
2019-03-25 10:07:04 +01:00
Christoph Strobl
dc234906f4 DATAMONGO-2223 - Add test for DBRef resolution in a different database.
Original pull request: #660.
2019-03-25 10:06:43 +01:00
Christoph Strobl
a7f51a7c85 DATAMONGO-2224 - Add trace logging to DBRef resolution.
We added trace logging to DefaultDbRefResolver.

<logger name="org.springframework.data.mongodb.core.convert.DefaultDbRefResolver" level="trace"/>

Original pull request: #659.
2019-03-25 10:00:39 +01:00
Spring Operator
9b0bd11d09 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* [ ] http://www.apache.org/licenses/ with 1 occurrences migrated to:
  https://www.apache.org/licenses/ ([https](https://www.apache.org/licenses/) result 200).
* [ ] http://www.apache.org/licenses/LICENSE-2.0 with 815 occurrences migrated to:
  https://www.apache.org/licenses/LICENSE-2.0 ([https](https://www.apache.org/licenses/LICENSE-2.0) result 200).

Original Pull Request: #700
2019-03-22 09:56:26 +01:00
Spring Operator
d7ad883f69 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* http://maven.apache.org/xsd/maven-4.0.0.xsd with 3 occurrences migrated to:
  https://maven.apache.org/xsd/maven-4.0.0.xsd ([https](https://maven.apache.org/xsd/maven-4.0.0.xsd) result 200).
* http://www.gopivotal.com (302) with 6 occurrences migrated to:
  https://pivotal.io ([https](https://www.gopivotal.com) result 200).
* http://maven.apache.org/maven-v4_0_0.xsd with 2 occurrences migrated to:
  https://maven.apache.org/maven-v4_0_0.xsd ([https](https://maven.apache.org/maven-v4_0_0.xsd) result 301).
* http://projects.spring.io/spring-data-mongodb with 1 occurrences migrated to:
  https://projects.spring.io/spring-data-mongodb ([https](https://projects.spring.io/spring-data-mongodb) result 301).
* http://www.pivotal.io with 1 occurrences migrated to:
  https://www.pivotal.io ([https](https://www.pivotal.io) result 301).

# Ignored
These URLs were intentionally ignored.

* http://maven.apache.org/POM/4.0.0 with 10 occurrences
* http://www.w3.org/2001/XMLSchema-instance with 5 occurrences

Original Pull Request: #666
2019-03-19 12:55:41 +01:00
Christoph Strobl
44308bfbe1 DATAMONGO-2225 - Fix potential NPE in MongoExampleMapper. 2019-03-18 13:51:37 +01:00
Christoph Strobl
9b673d342f DATAMONGO-2228 - Polishing.
Favor AssertJ over hamcrest.

Original Pull Request: #661
2019-03-18 10:50:16 +01:00
Mikhail Kaduchka
5517198310 DATAMONGO-2228 - Fixed losing branches in AND expressions in MongodbDocumentSerializer.
Original Pull Request: #661
2019-03-18 10:32:01 +01:00
Christoph Strobl
819a04f3db DATAMONGO-2164 - Updated changelog. 2019-03-07 10:30:07 +01:00
Mark Paluch
f7202067a5 DATAMONGO-2219 - Fix ReactiveMongoTemplate.findAllAndRemove(…) if the query yields no results.
ReactiveMongoTemplate.findAllAndRemove(…) now completes successfully without emitting a result if the find query yields no hits. We no longer issue the subsequent remove query when the preceding find returned no results.

Original Pull Request: #657
2019-03-05 12:57:59 +01:00
Mark Paluch
f20a0f20c9 DATAMONGO-2206 - Polishing.
Reformat code. Convert spaces to tabs. Use mockk version property to define mockk version. Author tags.

Original pull request: #646.
2019-02-20 11:34:12 +01:00
Sebastien Deleuze
02216d5941 DATAMONGO-2206 - Migrate Kotlin tests to Mockk.
Original pull request: #646.
2019-02-20 11:34:12 +01:00
Mark Paluch
79f2094322 DATAMONGO-2187 - After release cleanups. 2019-02-13 11:24:22 +01:00
Mark Paluch
afbc5cfa25 DATAMONGO-2187 - Prepare next development iteration. 2019-02-13 11:24:20 +01:00
Mark Paluch
a3882a5e5c DATAMONGO-2187 - Release version 2.1.5 (Lovelace SR5). 2019-02-13 09:56:38 +01:00
Mark Paluch
8194772388 DATAMONGO-2187 - Prepare 2.1.5 (Lovelace SR5). 2019-02-13 09:55:35 +01:00
Mark Paluch
12f18850dc DATAMONGO-2187 - Updated changelog. 2019-02-13 09:55:29 +01:00
Mark Paluch
816c1da248 DATAMONGO-2196 - Polishing.
Fix stubbing in a test that sneaked in through a backport that did not consider which method was used for entity removal.

Original pull request: #641.
2019-02-12 10:47:12 +01:00
Christoph Strobl
5a78f19781 DATAMONGO-2196 - Remove now applies WriteConcern to single Document delete operations.
We now make sure to apply the WriteConcern correctly when calling deleteOne on MongoCollection.

Original pull request: #641.
2019-02-07 15:20:36 +01:00
Mark Paluch
698837921b DATAMONGO-2193 - Polishing.
Reformat code.

Original pull request: #640.
2019-02-05 11:44:03 +01:00
Christoph Strobl
0f7fc7880b DATAMONGO-2193 - Fix String <> ObjectId conversion for non-Id properties.
We now make sure to only convert valid ObjectId Strings if the property can be considered an id property.

Original pull request: #640.
2019-02-05 11:43:55 +01:00
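
To illustrate the distinction with an invented entity:

class Person {
  @Id String id;      // a 24-character hex String is still converted to ObjectId
  String couponCode;  // left as a String, even if it happens to parse as an ObjectId
}
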
Christoph Strobl
6e42f49b08 DATAMONGO-2189 - Polishing.
Assert returned object is not the same as the saved one and move helper method.

Original Pull Request: #638
2019-01-28 13:41:27 +01:00
Mark Paluch
bdfe4e99ed DATAMONGO-2189 - Fix AfterSaveEvent to contain the saved entity in ReactiveMongoTemplate.insert(…).
ReactiveMongoTemplate.insert(…) now uses the saved entity when emitting AfterSaveEvent. This change affects usage of immutable objects that use Id generation. Previously, the to-be-saved entity instance was used, which left the Id unpopulated.

Original Pull Request: #638
2019-01-28 11:50:43 +01:00
Mark Paluch
85aa3927a6 DATAMONGO-2145 - After release cleanups. 2019-01-10 13:48:12 +01:00
Mark Paluch
33c4e4294f DATAMONGO-2145 - Prepare next development iteration. 2019-01-10 13:48:10 +01:00
Mark Paluch
a89ab387cc DATAMONGO-2145 - Release version 2.1.4 (Lovelace SR4). 2019-01-10 12:35:56 +01:00
Mark Paluch
e52b8c9d38 DATAMONGO-2145 - Prepare 2.1.4 (Lovelace SR4). 2019-01-10 12:34:54 +01:00
Mark Paluch
4dbf4795db DATAMONGO-2145 - Updated changelog. 2019-01-10 12:34:48 +01:00
Mark Paluch
8e4c6f68ae DATAMONGO-2144 - Updated changelog. 2019-01-10 12:26:36 +01:00
Mark Paluch
fddbd126ea DATAMONGO-2143 - Updated changelog. 2019-01-10 11:01:21 +01:00
Christoph Strobl
ee5b26ab1c DATAMONGO-2168 - Polishing.
MetadataBackedField no longer fails when the Path detects a reference to a field within java.lang.Class. This can happen when splitting a property name via camel case: the first part matches "class", which resolves to the getClass() call on java.lang.Object, and when the second part then also maps to a method on Class (like getName()), an error would be thrown.

Original Pull Request: #631
2019-01-09 17:06:29 +01:00
Mark Paluch
01e9a2ed67 DATAMONGO-2168 - Convert assertions to AssertJ.
Original Pull Request: #631
2019-01-09 17:06:29 +01:00
Mark Paluch
10107c7b81 DATAMONGO-2168 - Do not map type key in QueryMapper.
QueryMapper now excludes the type key during mapping and retains the value as-is. This change fixes an issue in which type keys were mapped using the entity model: type key resolution for _class failed silently, while other type keys such as className failed in property path resolution and the failure became visible.

Original Pull Request: #631
2019-01-09 17:06:29 +01:00
Mark Paluch
abe7876086 DATAMONGO-2155 - Polishing.
Reduce visibility of MappedUpdate to package-protected to avoid exposure. Rename UpdateDefinition.incVersion() to inc(). Reintroduce doUpdate() methods accepting Update and delegating to the new doUpdate() methods to preserve binary compatibility.

Original pull request: #625.
2019-01-09 16:35:19 +01:00
Christoph Strobl
a759dff5fd DATAMONGO-2155 - Introduce UpdateDefinition.
Original pull request: #625.
2019-01-09 16:35:19 +01:00
Oliver Drotbohm
9f8d081ef3 DATAMONGO-2155 - Polishing.
Original pull request: #625.
2019-01-09 16:35:19 +01:00
Christoph Strobl
b8f6030441 DATAMONGO-2155 - Bypass mapping for already mapped updates.
We now make sure that updates (as in doSaveVersioned and doUpdate) are mapped only once, since mapping an already mapped query/update object comes with undesired side effects such as following invalid property paths or losing type information.

We now also make sure to map key/value pairs of Map-like properties to the value's domain type and to apply potentially registered custom converters to the keys.
Fixed an invalid test for DATAMONGO-1423, as it did not check the application of the registered converter.

Original pull request: #625.
2019-01-09 16:33:31 +01:00
Mark Paluch
267decf189 DATAMONGO-2173 - Polishing.
Set interrupted thread state after catching InterruptedException. Fix potential NPE by checking the cursor. Streamline generics to not hide class-level generic types.

Original pull request: #634.
2019-01-09 13:00:10 +01:00
Christoph Strobl
3a7492c68d DATAMONGO-2173 - Translate and forward exceptions during CursorReadingTask#start() to ErrorHandler.
We now make sure to translate and pass on errors during the cursor initialization procedure to the configured error handler.

Original pull request: #634.
2019-01-09 13:00:10 +01:00
Christoph Strobl
273088b6a8 DATAMONGO-2174 - Fix InvalidPersistentPropertyPath exception when updating documents.
MetadataBackedField.getPath() now returns null instead of throwing an error for fields that are not part of the domain model. This allows adding any field when updating an entity.

Original pull request: #633.
2019-01-09 10:34:48 +01:00
Mark Paluch
723b481f82 DATAMONGO-2179 - Fixed broken auditing for entities using optimistic locking via batch save.
The previous implementation of (Reactive)MongoTemplate.doInsertBatch(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Related tickets: DATAMONGO-2139, DATAMONGO-2150.

Original Pull Request: #632
2019-01-08 08:22:38 +01:00
Mark Paluch
8a34bc46a2 DATAMONGO-2181 - Consider repository collection name in ReactiveMongoRepository.saveAll(…).
We now consider the collection name that is bound to the repository when inserting entities of which all are new. Previously, the collection name was derived from the entity.

Original Pull Request: #632
2019-01-08 08:15:41 +01:00
Mark Paluch
bb4c16f4cd DATAMONGO-2170 - Polishing.
Use ObjectUtils to compute the hash code, as the hash code implementation contained artifacts that do not belong there. Extract test method.

Original pull request: #629.
2019-01-07 13:11:02 +01:00
Christoph Strobl
cf5b7c9763 DATAMONGO-2170 - Return null instead of empty string for IndexInfo#getPartialFilterExpression when not set.
We now return null instead of an empty string when calling IndexInfo#getPartialFilterExpression. The method had been marked as returning null values before, and we are now complying with that contract and return value expectation.

Original pull request: #629.
2019-01-07 13:11:02 +01:00
Mark Paluch
f4414e98a2 DATAMONGO-2175 - Update copyright years to 2019. 2019-01-02 14:11:51 +01:00
Christoph Strobl
a97bfd2a37 DATAMONGO-2160 - Updated changelog. 2018-12-11 11:43:12 +01:00
Mark Paluch
9fe0f5c984 DATAMONGO-2150 - Polishing.
Fix imperative auditing test to use intended persist mechanism. Remove final keywords from method args and local variables in ReactiveMongoTemplate. Rename DBObject to Document.

Original Pull Request: #627
2018-12-07 14:39:10 +01:00
Mark Paluch
718a7ffe8c DATAMONGO-2150 - Fixed broken auditing for entities using optimistic locking.
The previous implementation of ReactiveMongoTemplate.doSaveVersioned(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Added integration tests for the auditing via ReactiveMongoTemplate and repositories.

Related ticket: DATAMONGO-2139.

Original Pull Request: #627
2018-12-07 14:38:59 +01:00
Mark Paluch
f7106dc425 DATAMONGO-2156 - Polishing.
Original pull request: #626.
2018-12-05 14:40:13 +01:00
Mark Paluch
0698f8bcb8 DATAMONGO-2156 - Remove dependency to javax.xml.bind.
We now no longer use DatatypeConverter to convert byte[] to its Base64 representation but use Spring Framework's Base64 utilities instead.

Original pull request: #626.
2018-12-05 14:40:13 +01:00
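
A one-line sketch of the replacement, assuming some byte[] bytes:

// Spring's utility replaces javax.xml.bind.DatatypeConverter for Base64 encoding.
String base64 = org.springframework.util.Base64Utils.encodeToString(bytes);
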
Mark Paluch
3effd9ae6f DATAMONGO-2149 - Polishing.
Add ticket reference to follow-up ticket regarding array matching on partial DBRef expressions.

Related ticket: DATAMONGO-2154

Original pull request: #623.
2018-11-30 14:53:56 +01:00
Christoph Strobl
7002cd1456 DATAMONGO-2149 - Fix $slice in fields projection when pointing to array of DBRefs.
We now no longer try to convert the actual slice parameters into a DBRef.

Original pull request: #623.
2018-11-30 14:52:58 +01:00
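
A minimal sketch, with an invented referencedBooks array-of-DBRefs property and static imports of query/where:

// The slice arguments are plain numbers and are no longer converted into a DBRef.
Query query = query(where("id").is(id));
query.fields().slice("referencedBooks", 2);
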
Mark Paluch
a15d488657 DATAMONGO-2148 - Polishing.
Add author tag. Add logging for ReactiveMongoTemplate.count(…) and findDistinct(…) operations. Fix variable names.

Original pull request: #620.
2018-11-28 17:24:52 +01:00
Cimon Lucas (LCM)
44651581b1 DATAMONGO-2148 - Add query logging for MongoTemplate.count(…).
Original pull request: #620.
2018-11-28 17:24:46 +01:00
Mark Paluch
6d64f5b2b2 DATAMONGO-2121 - After release cleanups. 2018-11-27 14:23:35 +01:00
Mark Paluch
0c52a29ba8 DATAMONGO-2121 - Prepare next development iteration. 2018-11-27 14:23:33 +01:00
Mark Paluch
bd8bd4f568 DATAMONGO-2121 - Release version 2.1.3 (Lovelace SR3). 2018-11-27 13:43:24 +01:00
Mark Paluch
c75f29dc42 DATAMONGO-2121 - Prepare 2.1.3 (Lovelace SR3). 2018-11-27 13:42:17 +01:00
Mark Paluch
e493af7266 DATAMONGO-2121 - Updated changelog. 2018-11-27 13:42:07 +01:00
Mark Paluch
8d892e5924 DATAMONGO-2109 - Updated changelog. 2018-11-27 12:36:47 +01:00
Mark Paluch
053299f243 DATAMONGO-2110 - Updated changelog. 2018-11-27 11:27:21 +01:00
Mark Paluch
872659cc00 DATAMONGO-2119 - Polishing.
Convert anonymous JSON callback class into a private static one. Use an expressive Pattern constant.

Original pull request: #621.
2018-11-23 09:48:26 +01:00
Christoph Strobl
96978a6194 DATAMONGO-2119 - Allow SpEL usage for annotated $regex query.
Original pull request: #621.
2018-11-23 09:48:26 +01:00
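
A repository sketch of the now-supported usage, with invented entity and method names; ?#{[0]} is the SpEL binding for the first method argument:

public interface PersonRepository extends MongoRepository<Person, String> {

  @Query("{ 'firstname': { '$regex': ?#{[0]} } }")
  List<Person> findByFirstnameRegex(String regex);
}
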
Oliver Drotbohm
2253d3e301 DATAMONGO-2108 - Fixed broken auditing for entities using optimistic locking.
The previous implementation of MongoTemplate.doSaveVersioned(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Added integration tests for the auditing via MongoOperations and repositories.
2018-11-22 15:05:31 +01:00
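
A sketch of the affected entity shape, with invented names; with the fix, a primitive version is written as 1 on the first insert, so the entity is considered new exactly once:

class Account {

  @Id String id;
  @Version long version;          // stays 0 until after BeforeConvertEvent; first save writes 1
  @CreatedDate Instant createdAt; // auditing now populates this on the initial insert only
}
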
Mark Paluch
5982ee84f7 DATAMONGO-2130 - Polishing.
Replace duplicate checks to ClientSession.hasActiveTransaction() with MongoResourceHolder.hasActiveTransaction(). Introduce MongoResourceHolder.getRequiredSession() to avoid nullability warnings.

Original pull request: #618.
2018-11-16 12:58:40 +01:00
Christoph Strobl
dd2af6462d DATAMONGO-2130 - Polishing.
Set timeout for InetAddress host lookup to reduce test execution time.

Original pull request: #618.
2018-11-16 12:58:40 +01:00
Christoph Strobl
622643bf24 DATAMONGO-2130 - Fix Repository count & exists inside transaction.
We now make sure invocations on repository count and exists methods delegate to countDocuments when inside a transaction.

Original pull request: #618.
2018-11-16 12:58:40 +01:00
Oliver Drotbohm
51cc55baac DATAMONGO-2135 - Default to intermediate List for properties typed to Collection.
We now defensively create a List rather than a LinkedHashSet (which Spring's CollectionFactory.createCollection(…) defaults to) to make sure we're not accidentally dropping values that are considered equal according to their Java class definition.
2018-11-15 15:26:36 +01:00
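
To illustrate with an invented property:

class Survey {
  Collection<String> answers; // reading ["yes", "yes", "no"] keeps both "yes" values,
                              // which a LinkedHashSet would have collapsed into one
}
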
Mark Paluch
0b106e5649 DATAMONGO-2107 - After release cleanups. 2018-10-29 13:59:17 +01:00
Mark Paluch
8975d93ab3 DATAMONGO-2107 - Prepare next development iteration. 2018-10-29 13:59:15 +01:00
Mark Paluch
e25b6c49f5 DATAMONGO-2107 - Release version 2.1.2 (Lovelace SR2). 2018-10-29 12:53:51 +01:00
Mark Paluch
7a70c205de DATAMONGO-2107 - Prepare 2.1.2 (Lovelace SR2). 2018-10-29 12:52:54 +01:00
Mark Paluch
6045efa450 DATAMONGO-2107 - Updated changelog. 2018-10-29 12:52:45 +01:00
Mark Paluch
7b0816b3ee DATAMONGO-2118 - Polishing.
Fix typo in reactive repositories reference documentation.

Original pull request: #611.
2018-10-26 10:08:03 +02:00
Mona Mohamadinia
14e4ea736d DATAMONGO-2118 - Fix typo in repositories reference documentation.
Original pull request: #611.
2018-10-26 10:08:03 +02:00
Mark Paluch
32e7d9ab7f DATAMONGO-2098 - Polishing.
Annotate methods and parameters with Nullable. Use diamond syntax where appropriate.

Original pull request: #612.
2018-10-25 15:35:26 +02:00
Zied Yaich
7f35ad9e45 DATAMONGO-2098 - Fix typo in MappingMongoConverterParser method.
Original pull request: #612.
2018-10-25 15:35:26 +02:00
Mark Paluch
60228f6e5a DATAMONGO-2113 - Polishing.
Increase subscription await timeout to allow for slow system processing such as on TravisCI.

Original pull request: #615.
2018-10-25 14:33:28 +02:00
Christoph Strobl
7604492b7f DATAMONGO-2113 - Polishing.
Use AssertJ in tests.

Original pull request: #615.
2018-10-25 14:33:28 +02:00
Christoph Strobl
4680fe0e77 DATAMONGO-2113 - Fix resumeTimestamp conversion for change streams.
We now use the first 32 bits of the timestamp to create the instant and ignore the ordinal value.

Original pull request: #615.
2018-10-25 14:33:28 +02:00
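
A sketch of the described conversion, using the MongoDB driver's BsonTimestamp:

// getTime() yields the seconds component (the high 32 bits); the ordinal is ignored.
Instant resumeInstant = Instant.ofEpochSecond(new BsonTimestamp(1540467208, 7).getTime());
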
Mark Paluch
b4228c88d3 DATAMONGO-2083 - Updated changelog. 2018-10-15 14:19:03 +02:00
Mark Paluch
f6ef8c94c8 DATAMONGO-2084 - Updated changelog. 2018-10-15 12:46:24 +02:00
Mark Paluch
0d0dafa85e DATAMONGO-2094 - After release cleanups. 2018-10-15 11:12:14 +02:00
Mark Paluch
29aa34619f DATAMONGO-2094 - Prepare next development iteration. 2018-10-15 11:12:12 +02:00
Mark Paluch
7f19f769c4 DATAMONGO-2094 - Release version 2.1.1 (Lovelace SR1). 2018-10-15 10:42:04 +02:00
Mark Paluch
a40e89d90a DATAMONGO-2094 - Prepare 2.1.1 (Lovelace SR1). 2018-10-15 10:40:57 +02:00
Mark Paluch
6b2350200a DATAMONGO-2094 - Updated changelog. 2018-10-15 10:40:53 +02:00
Mark Paluch
fb50b0f6e7 DATAMONGO-2096 - Polishing.
Migrate assertions to AssertJ.

Original pull request: #613.
2018-10-05 15:02:38 +02:00
Christoph Strobl
ab568229b5 DATAMONGO-2096 - Fix target field name for GraphLookup aggregation operation.
We now make sure to use the target field name instead of the alias when processing GraphLookupOperation.

Original pull request: #613.
2018-10-05 15:02:38 +02:00
Mark Paluch
7f9c1bd774 DATAMONGO-2061 - After release cleanups. 2018-09-21 07:46:17 -04:00
Mark Paluch
670a0978da DATAMONGO-2061 - Prepare next development iteration. 2018-09-21 07:46:16 -04:00
319 changed files with 4803 additions and 13661 deletions

MavenWrapperDownloader.java (deleted file)

@@ -1,110 +0,0 @@
/*
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

    https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
*/

import java.net.*;
import java.io.*;
import java.nio.channels.*;
import java.util.Properties;

public class MavenWrapperDownloader {

    /**
     * Default URL to download the maven-wrapper.jar from, if no 'downloadUrl' is provided.
     */
    private static final String DEFAULT_DOWNLOAD_URL =
            "https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar";

    /**
     * Path to the maven-wrapper.properties file, which might contain a downloadUrl property to
     * use instead of the default one.
     */
    private static final String MAVEN_WRAPPER_PROPERTIES_PATH =
            ".mvn/wrapper/maven-wrapper.properties";

    /**
     * Path where the maven-wrapper.jar will be saved to.
     */
    private static final String MAVEN_WRAPPER_JAR_PATH =
            ".mvn/wrapper/maven-wrapper.jar";

    /**
     * Name of the property which should be used to override the default download url for the wrapper.
     */
    private static final String PROPERTY_NAME_WRAPPER_URL = "wrapperUrl";

    public static void main(String[] args) {
        System.out.println("- Downloader started");
        File baseDirectory = new File(args[0]);
        System.out.println("- Using base directory: " + baseDirectory.getAbsolutePath());

        // If the maven-wrapper.properties exists, read it and check if it contains a custom
        // wrapperUrl parameter.
        File mavenWrapperPropertyFile = new File(baseDirectory, MAVEN_WRAPPER_PROPERTIES_PATH);
        String url = DEFAULT_DOWNLOAD_URL;
        if (mavenWrapperPropertyFile.exists()) {
            FileInputStream mavenWrapperPropertyFileInputStream = null;
            try {
                mavenWrapperPropertyFileInputStream = new FileInputStream(mavenWrapperPropertyFile);
                Properties mavenWrapperProperties = new Properties();
                mavenWrapperProperties.load(mavenWrapperPropertyFileInputStream);
                url = mavenWrapperProperties.getProperty(PROPERTY_NAME_WRAPPER_URL, url);
            } catch (IOException e) {
                System.out.println("- ERROR loading '" + MAVEN_WRAPPER_PROPERTIES_PATH + "'");
            } finally {
                try {
                    if (mavenWrapperPropertyFileInputStream != null) {
                        mavenWrapperPropertyFileInputStream.close();
                    }
                } catch (IOException e) {
                    // Ignore ...
                }
            }
        }
        System.out.println("- Downloading from: " + url);

        File outputFile = new File(baseDirectory.getAbsolutePath(), MAVEN_WRAPPER_JAR_PATH);
        if (!outputFile.getParentFile().exists()) {
            if (!outputFile.getParentFile().mkdirs()) {
                System.out.println(
                        "- ERROR creating output directory '" + outputFile.getParentFile().getAbsolutePath() + "'");
            }
        }
        System.out.println("- Downloading to: " + outputFile.getAbsolutePath());
        try {
            downloadFileFromURL(url, outputFile);
            System.out.println("Done");
            System.exit(0);
        } catch (Throwable e) {
            System.out.println("- Error downloading");
            e.printStackTrace();
            System.exit(1);
        }
    }

    private static void downloadFileFromURL(String urlString, File destination) throws Exception {
        URL website = new URL(urlString);
        ReadableByteChannel rbc = Channels.newChannel(website.openStream());
        FileOutputStream fos = new FileOutputStream(destination);
        fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE);
        fos.close();
        rbc.close();
    }
}

CI.adoc (new file, 43 lines)

@@ -0,0 +1,43 @@
= Continuous Integration

image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]

== Running CI tasks locally

Since this pipeline is purely Docker-based, it's easy to:

* Debug what went wrong on your local machine.
* Test out a tweak to your test routine before sending it out.
* Experiment against a new image before submitting your pull request.

All of these use cases are great reasons to essentially run what the CI server does on your local machine.

IMPORTANT: To do this you must have Docker installed on your machine.

1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, run the tests from inside the container:
+
3. `./mvnw clean dependency:list test -Dsort -Dbundlor.enabled=false -B` (or with whatever profile you need to test out)

Since the container is binding to your source, you can make edits from your IDE and continue to run build jobs.

If you need to package things up, do this:

1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, package things from inside the container doing this:
+
3. `./mvnw clean dependency:list package -Dsort -Dbundlor.enabled=false -B`

NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

CODE_OF_CONDUCT.adoc (changed file)

@@ -24,4 +24,4 @@ Instances of abusive, harassing, or otherwise unacceptable behavior may be repor
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].
This Code of Conduct is adapted from the https://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at https://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

Jenkinsfile (new vendored file, 123 lines)

@@ -0,0 +1,123 @@
pipeline {
    agent none

    triggers {
        pollSCM 'H/10 * * * *'
        upstream(upstreamProjects: "spring-data-commons/2.1.x", threshold: hudson.model.Result.SUCCESS)
    }

    options {
        disableConcurrentBuilds()
        buildDiscarder(logRotator(numToKeepStr: '14'))
    }

    stages {
        stage("Test") {
            when {
                anyOf {
                    branch '2.1.x'
                    not { triggeredBy 'UpstreamCause' }
                }
            }
            parallel {
                stage("test: baseline") {
                    agent {
                        docker {
                            image 'springci/spring-data-openjdk8-with-mongodb-4.0:latest'
                            label 'data'
                            args '-v $HOME:/tmp/jenkins-home'
                        }
                    }
                    options { timeout(time: 30, unit: 'MINUTES') }
                    steps {
                        sh 'rm -rf ?'
                        sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
                        sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
                        sh 'sleep 10'
                        sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
                        sh 'sleep 15'
                        sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -U -B'
                    }
                }
            }
        }

        stage('Release to artifactory') {
            when {
                branch 'issue/*'
                not { triggeredBy 'UpstreamCause' }
            }
            agent {
                docker {
                    image 'adoptopenjdk/openjdk8:latest'
                    label 'data'
                    args '-v $HOME:/tmp/jenkins-home'
                }
            }
            options { timeout(time: 20, unit: 'MINUTES') }

            environment {
                ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
            }

            steps {
                sh 'rm -rf ?'
                sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
                        '-Dartifactory.server=https://repo.spring.io ' +
                        "-Dartifactory.username=${ARTIFACTORY_USR} " +
                        "-Dartifactory.password=${ARTIFACTORY_PSW} " +
                        "-Dartifactory.staging-repository=libs-snapshot-local " +
                        "-Dartifactory.build-name=spring-data-mongodb-2.1 " +
                        "-Dartifactory.build-number=${BUILD_NUMBER} " +
                        '-Dmaven.test.skip=true clean deploy -U -B'
            }
        }

        stage('Release to artifactory with docs') {
            when {
                branch '2.1.x'
            }
            agent {
                docker {
                    image 'adoptopenjdk/openjdk8:latest'
                    label 'data'
                    args '-v $HOME:/tmp/jenkins-home'
                }
            }
            options { timeout(time: 20, unit: 'MINUTES') }

            environment {
                ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
            }

            steps {
                sh 'rm -rf ?'
                sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
                        '-Dartifactory.server=https://repo.spring.io ' +
                        "-Dartifactory.username=${ARTIFACTORY_USR} " +
                        "-Dartifactory.password=${ARTIFACTORY_PSW} " +
                        "-Dartifactory.staging-repository=libs-snapshot-local " +
                        "-Dartifactory.build-name=spring-data-mongodb-2.1 " +
                        "-Dartifactory.build-number=${BUILD_NUMBER} " +
                        '-Dmaven.test.skip=true clean deploy -U -B'
            }
        }
    }

    post {
        changed {
            script {
                slackSend(
                        color: (currentBuild.currentResult == 'SUCCESS') ? 'good' : 'danger',
                        channel: '#spring-data-dev',
                        message: "${currentBuild.fullDisplayName} - `${currentBuild.currentResult}`\n${env.BUILD_URL}")
                emailext(
                        subject: "[${currentBuild.fullDisplayName}] ${currentBuild.currentResult}",
                        mimeType: 'text/html',
                        recipientProviders: [[$class: 'CulpritsRecipientProvider'], [$class: 'RequesterRecipientProvider']],
                        body: "<a href=\"${env.BUILD_URL}\">${currentBuild.fullDisplayName} is reported as ${currentBuild.currentResult}</a>")
            }
        }
    }
}

README.adoc (new file, 159 lines)

@@ -0,0 +1,159 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
== Code of Conduct
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
== Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java:
[source,java]
----
public interface PersonRepository extends CrudRepository<Person, Long> {

  List<Person> findByLastname(String lastname);

  List<Person> findByFirstnameLike(String firstname);
}

@Service
public class MyService {

  private final PersonRepository repository;

  public MyService(PersonRepository repository) {
    this.repository = repository;
  }

  public void doWork() {

    repository.deleteAll();

    Person person = new Person();
    person.setFirstname("Oliver");
    person.setLastname("Gierke");
    repository.save(person);

    List<Person> lastNameResults = repository.findByLastname("Gierke");
    List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
  }
}

@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {

  @Override
  public MongoClient mongoClient() {
    return new MongoClient();
  }

  @Override
  protected String getDatabaseName() {
    return "springdata";
  }
}
----
=== Maven configuration
Add the Maven dependency:
[source,xml]
----
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-mongodb</artifactId>
  <version>${version}.RELEASE</version>
</dependency>
----
If you'd rather use the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
[source,xml]
----
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-mongodb</artifactId>
  <version>${version}.BUILD-SNAPSHOT</version>
</dependency>

<repository>
  <id>spring-libs-snapshot</id>
  <name>Spring Snapshot Repository</name>
  <url>https://repo.spring.io/libs-snapshot</url>
</repository>
----
== Getting Help
Having trouble with Spring Data? We'd love to help!
* Check the
https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/[reference documentation], and https://docs.spring.io/spring-data/mongodb/docs/current/api/[Javadocs].
* Learn the Spring basics: Spring Data builds on Spring Framework; check the https://spring.io[spring.io] web site for a wealth of reference documentation.
If you are just starting out with Spring, try one of the https://spring.io/guides[guides].
* If you are upgrading, check out the https://docs.spring.io/spring-data/mongodb/docs/current/changelog.txt[changelog] for "`new and noteworthy`" features.
* Ask a question - we monitor https://stackoverflow.com[stackoverflow.com] for questions tagged with https://stackoverflow.com/tags/spring-data[`spring-data-mongodb`].
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://jira.spring.io/browse/DATAMONGO[jira.spring.io/browse/DATAMONGO].
== Reporting Issues
Spring Data uses JIRA as its issue tracking system to record bugs and feature requests. If you want to raise an issue, please follow the recommendations below:
* Before you log a bug, please search the
https://jira.spring.io/browse/DATAMONGO[issue tracker] to see if someone has already reported the problem.
* If the issue doesn't already exist, https://jira.spring.io/browse/DATAMONGO[create a new issue].
* Please provide as much information as possible with the issue report; we like to know the version of Spring Data that you are using and the JVM version.
* If you need to paste code or include a stack trace, use JIRA `{code}…{code}` escapes before and after your text.
* If possible try to create a test-case or project that replicates the issue. Attach a link to your code or a compressed file containing your code.
== Building from Source
You dont need to build from source to use Spring Data (binaries in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
You also need JDK 1.8.
[source,bash]
----
$ ./mvnw clean install
----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular please sign the https://cla.pivotal.io/sign/spring[Contributors Agreement] before your first non-trivial change._
=== Building reference documentation
Building the documentation also builds the project without running tests.
[source,bash]
----
$ ./mvnw clean install -Pdistribute
----
The generated documentation is available from `target/site/reference/html/index.html`.
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
== License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].

README.md (deleted file, 186 lines)

@@ -1,186 +0,0 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
## Getting Help
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
## Quick Start
### Maven configuration
Add the Maven dependency:
```xml
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-mongodb</artifactId>
  <version>${version}.RELEASE</version>
</dependency>
```
If you'd rather use the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
```xml
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-mongodb</artifactId>
  <version>${version}.BUILD-SNAPSHOT</version>
</dependency>

<repository>
  <id>spring-libs-snapshot</id>
  <name>Spring Snapshot Repository</name>
  <url>http://repo.spring.io/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides:
* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
### Spring Data repositories
To simplify the creation of data repositories, Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
```java
public interface PersonRepository extends CrudRepository<Person, Long> {

  List<Person> findByLastname(String lastname);

  List<Person> findByFirstnameLike(String firstname);
}
```
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods to be pulled into the interface so that you can easily save and find single entities and collections of them.
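Roughly, those two methods translate into the following MongoDB queries (a sketch; the exact regular expression produced for the like expression is an implementation detail):
```java
// findByLastname("Gierke")     -> { "lastname" : "Gierke" }
// findByFirstnameLike("Oli*")  -> { "firstname" : { "$regex" : "Oli.*" } }
```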
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {

  @Override
  public MongoClient mongoClient() throws Exception {
    return new MongoClient();
  }

  @Override
  protected String getDatabaseName() {
    return "springdata";
  }
}
```
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
```xml
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
  <constructor-arg>
    <bean class="com.mongodb.MongoClient">
      <constructor-arg value="localhost" />
      <constructor-arg value="27017" />
    </bean>
  </constructor-arg>
  <constructor-arg value="database" />
</bean>

<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
```java
@Service
public class MyService {

  private final PersonRepository repository;

  @Autowired
  public MyService(PersonRepository repository) {
    this.repository = repository;
  }

  public void doWork() {

    repository.deleteAll();

    Person person = new Person();
    person.setFirstname("Oliver");
    person.setLastname("Gierke");
    person = repository.save(person);

    List<Person> lastNameResults = repository.findByLastname("Gierke");
    List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
  }
}
```
### MongoDB 4.0 Transactions
As of version 4, MongoDB supports [Transactions](https://www.mongodb.com/transactions). Transactions are built on top of
`ClientSessions` and therefore require an active session.
`MongoTransactionManager` is the gateway to the well-known Spring transaction support. It allows applications to use
[managed transaction features of Spring](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html).
The `MongoTransactionManager` binds a `ClientSession` to the thread. `MongoTemplate` automatically detects those and operates on them accordingly.
```java
@Configuration
static class Config extends AbstractMongoConfiguration {

  @Bean
  MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
    return new MongoTransactionManager(dbFactory);
  }

  // ...
}

@Component
public class StateService {

  @Transactional
  void someBusinessFunction(Step step) {

    template.insert(step);

    process(step);

    template.update(Step.class).apply(Update.set("state", // ...
  }
}
```
## Contributing to Spring Data
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stack Overflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features, and comment and vote on the ones that you are interested in.
* GitHub is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket that covers the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request, we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team and given the ability to merge pull requests.

View File

@@ -1,14 +0,0 @@
FROM openjdk:11-jdk
RUN apt-get update && apt-get install -y apt-transport-https
RUN apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb https://repo.mongodb.org/apt/debian stretch/mongodb-org/4.0 main" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.3 mongodb-org-server=4.0.3 mongodb-org-shell=4.0.3 mongodb-org-mongos=4.0.3 mongodb-org-tools=4.0.3
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -1,14 +0,0 @@
FROM openjdk:8-jdk
RUN apt-get update && apt-get install -y apt-transport-https
RUN apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb https://repo.mongodb.org/apt/debian stretch/mongodb-org/4.0 main" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.3 mongodb-org-server=4.0.3 mongodb-org-shell=4.0.3 mongodb-org-mongos=4.0.3 mongodb-org-tools=4.0.3
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -1,39 +0,0 @@
== Running CI tasks locally
Since Concourse is built on top of Docker, it's easy to:
* Debug what went wrong on your local machine.
* Test out a tweak to your `test.sh` script before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what Concourse does on your local machine.
IMPORTANT: To do this you must have Docker installed on your machine.
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-8-jdk-with-mongodb /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
Next, run the `test.sh` script from inside the container:
+
2. `PROFILE=none spring-data-mongodb-github/ci/test.sh`
Since the container is bound to your source, you can make edits from your IDE and continue to run build jobs.
If you need to test the `build.sh` script, do this:
1. `mkdir /tmp/spring-data-mongodb-artifactory`
2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-8-jdk-with-mongodb /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github` and the temporary
Artifactory output directory at `spring-data-mongodb-artifactory`.
+
Next, run the `build.sh` script from inside the container:
+
3. `spring-data-mongodb-github/ci/build.sh`
IMPORTANT: `build.sh` doesn't actually push to Artifactory, so don't worry about accidentally deploying anything.
It just deploys to a local folder. That way, the `artifactory-resource` later in the pipeline can pick up these artifacts
and deliver them to Artifactory.
NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

View File

@@ -1,15 +0,0 @@
#!/bin/bash
set -euo pipefail
[[ -d $PWD/maven && ! -d $HOME/.m2 ]] && ln -s $PWD/maven $HOME/.m2
spring_data_mongodb_artifactory=$(pwd)/spring-data-mongodb-artifactory
rm -rf $HOME/.m2/repository/org/springframework/data 2> /dev/null || :
cd spring-data-mongodb-github
./mvnw deploy \
-Dmaven.test.skip=true \
-DaltDeploymentRepository=distribution::default::file://${spring_data_mongodb_artifactory} \

View File

@@ -1,19 +0,0 @@
---
platform: linux
image_resource:
type: docker-image
source:
repository: springci/spring-data-8-jdk-with-mongodb
inputs:
- name: spring-data-mongodb-github
outputs:
- name: spring-data-mongodb-artifactory
caches:
- path: maven
run:
path: spring-data-mongodb-github/ci/build.sh

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.9 mongodb-org-server=4.0.9 mongodb-org-shell=4.0.9 mongodb-org-mongos=4.0.9 mongodb-org-tools=4.0.9
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 4B7C549A058F8B6B
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.1 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.1.list
RUN apt-get update
RUN apt-get install -y mongodb-org-unstable=4.1.13 mongodb-org-unstable-server=4.1.13 mongodb-org-unstable-shell=4.1.13 mongodb-org-unstable-mongos=4.1.13 mongodb-org-unstable-tools=4.1.13
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -1,14 +0,0 @@
#!/bin/bash
set -euo pipefail
mkdir -p /data/db
mongod &
[[ -d $PWD/maven && ! -d $HOME/.m2 ]] && ln -s $PWD/maven $HOME/.m2
rm -rf $HOME/.m2/repository/org/springframework/data/mongodb 2> /dev/null || :
cd spring-data-mongodb-github
./mvnw clean dependency:list test -P${PROFILE} -Dsort

View File

@@ -1,16 +0,0 @@
---
platform: linux
image_resource:
type: docker-image
source:
repository: springci/spring-data-8-jdk-with-mongodb
inputs:
- name: spring-data-mongodb-github
caches:
- path: maven
run:
path: spring-data-mongodb-github/ci/test.sh

47
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.DATAMONGO-REACTIVE-SKIP-TAKE-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.2.0.BUILD-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
</parent>
<modules>
@@ -27,9 +27,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.2.0.BUILD-SNAPSHOT</springdata.commons>
<mongo>3.10.1</mongo>
<mongo.reactivestreams>1.11.0</mongo.reactivestreams>
<springdata.commons>2.1.11.RELEASE</springdata.commons>
<mongo>3.8.2</mongo>
<mongo.reactivestreams>1.9.2</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -114,20 +114,6 @@
</developers>
<profiles>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>benchmarks</id>
<modules>
@@ -137,6 +123,25 @@
<module>spring-data-mongodb-benchmarks</module>
</modules>
</profile>
<profile>
<id>distribute</id>
<build>
<plugins>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>
@@ -150,8 +155,8 @@
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>

View File

@@ -1,6 +1,6 @@
# Benchmarks
Benchmarks are based on [JMH](http://openjdk.java.net/projects/code-tools/jmh/).
Benchmarks are based on [JMH](https://openjdk.java.net/projects/code-tools/jmh/).
# Running Benchmarks

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.DATAMONGO-REACTIVE-SKIP-TAKE-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -87,7 +87,6 @@
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
<testSourceDirectory>${project.build.sourceDirectory}</testSourceDirectory>
<testClassesDirectory>${project.build.outputDirectory}</testClassesDirectory>
<excludes>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.DATAMONGO-REACTIVE-SKIP-TAKE-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -50,7 +50,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.2.0.DATAMONGO-REACTIVE-SKIP-TAKE-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
</dependency>
<!-- reactive -->

View File

@@ -5,11 +5,11 @@
xmlns:jdbc="http://www.springframework.org/schema/jdbc"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/jdbc http://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo https://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/jdbc https://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd
http://www.springframework.org/schema/beans https://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/tx https://www.springframework.org/schema/tx/spring-tx-3.0.xsd
http://www.springframework.org/schema/context https://www.springframework.org/schema/context/spring-context-3.0.xsd">
<context:spring-configured/>

View File

@@ -1,6 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -14,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.DATAMONGO-REACTIVE-SKIP-TAKE-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -29,22 +28,11 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.2.0.DATAMONGO-REACTIVE-SKIP-TAKE-SNAPSHOT</version>
<version>2.1.11.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -83,14 +83,14 @@
<!-- reactive -->
<dependency>
<groupId>org.mongodb</groupId>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-reactivestreams</artifactId>
<version>${mongo.reactivestreams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>${mongo}</version>
<optional>true</optional>
@@ -107,7 +107,7 @@
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<optional>true</optional>
</dependency>
@@ -119,14 +119,14 @@
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
@@ -264,27 +264,20 @@
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib</artifactId>
<version>${kotlin}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
<version>${kotlin}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-core</artifactId>
<version>${kotlin-coroutines}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-reactor</artifactId>
<version>${kotlin-coroutines}</version>
<optional>true</optional>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-test</artifactId>
<version>${kotlin}</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -329,7 +322,6 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>

View File

@@ -0,0 +1,69 @@
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.lang.Nullable;
/**
* Exception being thrown in case we cannot connect to a MongoDB instance.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class CannotGetMongoDbConnectionException extends DataAccessResourceFailureException {
private final UserCredentials credentials;
private final @Nullable String database;
private static final long serialVersionUID = 1172099106475265589L;
public CannotGetMongoDbConnectionException(String msg, Throwable cause) {
super(msg, cause);
this.database = null;
this.credentials = UserCredentials.NO_CREDENTIALS;
}
public CannotGetMongoDbConnectionException(String msg) {
this(msg, null, UserCredentials.NO_CREDENTIALS);
}
public CannotGetMongoDbConnectionException(String msg, @Nullable String database, UserCredentials credentials) {
super(msg);
this.database = database;
this.credentials = credentials;
}
/**
* Returns the {@link UserCredentials} that were used when trying to connect to the MongoDB instance.
*
* @return
*/
public UserCredentials getCredentials() {
return this.credentials;
}
/**
* Returns the name of the database that was being accessed.
*
* @return
*/
@Nullable
public String getDatabase() {
return database;
}
}

View File

@@ -51,6 +51,7 @@ import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
@@ -101,6 +102,8 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition, id);
createIsNewStrategyFactoryBeanDefinition(ctxRef, parserContext, element);
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {
@@ -345,6 +348,20 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
public static String createIsNewStrategyFactoryBeanDefinition(String mappingContextRef, ParserContext context,
Element element) {
BeanDefinitionBuilder mappingContextStrategyFactoryBuilder = BeanDefinitionBuilder
.rootBeanDefinition(MappingContextIsNewStrategyFactory.class);
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
context.registerBeanComponent(
builder.getComponent(mappingContextStrategyFactoryBuilder, IS_NEW_STRATEGY_FACTORY_BEAN_NAME));
return IS_NEW_STRATEGY_FACTORY_BEAN_NAME;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
@@ -27,12 +28,17 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -81,11 +87,23 @@ public abstract class MongoConfigurationSupport {
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(
new PersistentEntities(Arrays.<MappingContext<?, ?>> asList(new MappingContext[] { mongoMappingContext() }))));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
@@ -172,16 +190,4 @@ public abstract class MongoConfigurationSupport {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
/**
* Configure whether to automatically create indices for domain types by deriving the
* {@link org.springframework.data.mongodb.core.index.IndexDefinition} from the entity or not.
*
* @return {@literal true} by default. <br />
* <strong>INFO</strong>: As of 3.x the default will be set to {@literal false}.
* @since 2.2
*/
protected boolean autoIndexCreation() {
return true;
}
}
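
The hunk above removes the 2.2-era `autoIndexCreation()` hook from `MongoConfigurationSupport`. For orientation, overriding it would have looked roughly like this (a sketch against the removed 2.2 API, not part of 2.1.11):
```java
@Configuration
class MongoConfig extends AbstractMongoConfiguration {

  @Override
  public MongoClient mongoClient() {
    return new MongoClient();
  }

  @Override
  protected String getDatabaseName() {
    return "springdata";
  }

  @Override
  protected boolean autoIndexCreation() {
    return false; // do not derive IndexDefinitions from the entity model
  }
}
```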

View File

@@ -17,7 +17,6 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.io.UnsupportedEncodingException;
import java.lang.reflect.Method;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.Arrays;
@@ -27,7 +26,6 @@ import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.lang.Nullable;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
@@ -80,23 +78,12 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if ("MONGODB-CR".equals(authMechanism)) {
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
Method createCRCredentialMethod = ReflectionUtils.findMethod(MongoCredential.class,
"createMongoCRCredential", String.class, String.class, char[].class);
if (createCRCredentialMethod == null) {
throw new IllegalArgumentException("MONGODB-CR is no longer supported.");
}
MongoCredential credential = MongoCredential.class
.cast(ReflectionUtils.invokeMethod(createCRCredentialMethod, null, userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
credentials.add(credential);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);

View File

@@ -20,7 +20,6 @@ import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -85,19 +84,8 @@ public class ChangeStreamEvent<T> {
@Nullable
public Instant getTimestamp() {
return getBsonTimestamp() != null ? converter.getConversionService().convert(raw.getClusterTime(), Instant.class)
: null;
}
/**
* Get the {@link ChangeStreamDocument#getClusterTime() cluster time}.
*
* @return can be {@literal null}.
* @since 2.2
*/
@Nullable
public BsonTimestamp getBsonTimestamp() {
return raw != null ? raw.getClusterTime() : null;
return raw != null && raw.getClusterTime() != null
? converter.getConversionService().convert(raw.getClusterTime(), Instant.class) : null;
}
/**

View File

@@ -21,15 +21,12 @@ import java.time.Instant;
import java.util.Arrays;
import java.util.Optional;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
@@ -50,7 +47,7 @@ public class ChangeStreamOptions {
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private @Nullable Instant resumeTimestamp;
protected ChangeStreamOptions() {}
@@ -86,15 +83,7 @@ public class ChangeStreamOptions {
* @return {@link Optional#empty()} if not set.
*/
public Optional<Instant> getResumeTimestamp() {
return Optional.ofNullable(resumeTimestamp).map(timestamp -> asTimestampOfType(timestamp, Instant.class));
}
/**
* @return {@link Optional#empty()} if not set.
* @since 2.2
*/
public Optional<BsonTimestamp> getResumeBsonTimestamp() {
return Optional.ofNullable(resumeTimestamp).map(timestamp -> asTimestampOfType(timestamp, BsonTimestamp.class));
return Optional.ofNullable(resumeTimestamp);
}
/**
@@ -114,29 +103,6 @@ public class ChangeStreamOptions {
return new ChangeStreamOptionsBuilder();
}
private static <T> T asTimestampOfType(Object timestamp, Class<T> targetType) {
return targetType.cast(doGetTimestamp(timestamp, targetType));
}
private static <T> Object doGetTimestamp(Object timestamp, Class<T> targetType) {
if (ClassUtils.isAssignableValue(targetType, timestamp)) {
return timestamp;
}
if (timestamp instanceof Instant) {
return new BsonTimestamp((int) ((Instant) timestamp).getEpochSecond(), 0);
}
if (timestamp instanceof BsonTimestamp) {
return Instant.ofEpochSecond(((BsonTimestamp) timestamp).getTime());
}
throw new IllegalArgumentException(
"o_O that should actually not happen. The timestamp should be an Instant or a BsonTimestamp but was "
+ ObjectUtils.nullSafeClassName(timestamp));
}
/**
* Builder for creating {@link ChangeStreamOptions}.
*
@@ -149,7 +115,7 @@ public class ChangeStreamOptions {
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private @Nullable Instant resumeTimestamp;
private ChangeStreamOptionsBuilder() {}
@@ -258,21 +224,6 @@ public class ChangeStreamOptions {
return this;
}
/**
* Set the cluster time to resume from.
*
* @param resumeTimestamp must not be {@literal null}.
* @return this.
* @since 2.2
*/
public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp;
return this;
}
/**
* @return the built {@link ChangeStreamOptions}
*/
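
With the `BsonTimestamp` overload gone, resuming is expressed via `Instant` only; a minimal sketch (the one-minute offset is arbitrary):
```java
import java.time.Instant;

// resume the change stream roughly one minute in the past
ChangeStreamOptions options = ChangeStreamOptions.builder()
    .resumeAt(Instant.now().minusSeconds(60))
    .build();
```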

View File

@@ -25,7 +25,7 @@ import com.mongodb.client.FindIterable;
* @author Oliver Gierke
* @author Christoph Strobl
*/
interface CursorPreparer {
public interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
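
Since `CursorPreparer` declares a single method over `FindIterable`, a custom preparer can be written as a lambda; a sketch (the batch size and time limit are arbitrary assumptions):
```java
import java.util.concurrent.TimeUnit;

// apply a batch size and a server-side time limit to every cursor
CursorPreparer preparer = iterable -> iterable.batchSize(100).maxTime(10, TimeUnit.SECONDS);
```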

View File

@@ -58,6 +58,7 @@ import com.mongodb.client.model.WriteModel;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @author Michail Nikolaev
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
@@ -274,8 +275,12 @@ class DefaultBulkOperations implements BulkOperations {
public com.mongodb.bulk.BulkWriteResult execute() {
try {
return mongoOperations.execute(collectionName, collection -> {
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
return collection.bulkWrite(models.stream().map(this::mapWriteModel).collect(Collectors.toList()), bulkOptions);
});
} finally {
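
For reference, a sketch of driving these bulk writes through the template API (the entity and values are assumptions; `execute()` now runs with the configured default `WriteConcern`, if any):
```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

BulkOperations bulk = template.bulkOps(BulkMode.UNORDERED, Person.class);
bulk.insert(new Person("Oliver", "Gierke"));
bulk.updateOne(query(where("lastname").is("Gierke")), Update.update("firstname", "Oliver August"));
com.mongodb.bulk.BulkWriteResult result = bulk.execute();
```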

View File

@@ -38,15 +38,17 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import com.mongodb.util.JSONParseException;
/**
* Common operations performed on an entity in the context of its mapping metadata.
*
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.1
* @see MongoTemplate
* @see ReactiveMongoTemplate
@@ -114,17 +116,6 @@ class EntityOperations {
return context.getRequiredPersistentEntity(entityClass).getCollection();
}
/**
* Returns the collection name to be used for the given entity.
*
* @param obj can be {@literal null}.
* @return
*/
@Nullable
public String determineEntityCollectionName(@Nullable Object obj) {
return null == obj ? null : determineCollectionName(obj.getClass());
}
public Query getByIdInQuery(Collection<?> entities) {
MultiValueMap<String, Object> byIds = new LinkedMultiValueMap<>();
@@ -164,15 +155,8 @@ class EntityOperations {
try {
return Document.parse(source);
} catch (org.bson.json.JsonParseException o_O) {
} catch (JSONParseException | org.bson.json.JsonParseException o_O) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
} catch (RuntimeException o_O) {
// legacy 3.x exception
if (ClassUtils.matchesTypeName(o_O.getClass(), "JSONParseException")) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
}
throw o_O;
}
}
@@ -205,16 +189,6 @@ class EntityOperations {
*/
Query getByIdQuery();
/**
* Returns the {@link Query} to remove an entity by its {@literal id} and if applicable {@literal version}.
*
* @return the {@link Query} to use for removing the entity. Never {@literal null}.
* @since 2.2
*/
default Query getRemoveByQuery() {
return isVersionedEntity() ? getQueryForVersion() : getByIdQuery();
}
/**
* Returns the {@link Query} to find the entity in its current version.
*
@@ -245,11 +219,9 @@ class EntityOperations {
}
/**
* Returns the value of the version if the entity {@link #isVersionedEntity() has a version property}.
* Returns the value of the version if the entity has a version property, {@literal null} otherwise.
*
* @return the entity version. Can be {@literal null}.
* @throws IllegalStateException if the entity does not define a {@literal version} property. Make sure to check
* {@link #isVersionedEntity()}.
* @return
*/
@Nullable
Object getVersion();
@@ -305,8 +277,8 @@ class EntityOperations {
/**
* Returns the current version value if the entity has a version property.
*
* @return the current version or {@literal null} in case it's uninitialized.
* @throws IllegalStateException if the entity does not define a {@literal version} property.
* @return the current version or {@literal null} in case it's uninitialized or the entity doesn't expose a version
* property.
*/
@Nullable
Number getVersion();
@@ -508,10 +480,10 @@ class EntityOperations {
public Query getQueryForVersion() {
MongoPersistentProperty idProperty = entity.getRequiredIdProperty();
MongoPersistentProperty versionProperty = entity.getRequiredVersionProperty();
MongoPersistentProperty property = entity.getRequiredVersionProperty();
return new Query(Criteria.where(idProperty.getName()).is(getId())//
.and(versionProperty.getName()).is(getVersion()));
.and(property.getName()).is(getVersion()));
}
/*
@@ -632,22 +604,19 @@ class EntityOperations {
public T populateIdIfNecessary(@Nullable Object id) {
if (id == null) {
return null;
return propertyAccessor.getBean();
}
T bean = propertyAccessor.getBean();
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty == null) {
return bean;
return propertyAccessor.getBean();
}
if (identifierAccessor.getIdentifier() != null) {
return bean;
return propertyAccessor.getBean();
}
propertyAccessor.setProperty(idProperty, id);
return propertyAccessor.getBean();
}

View File

@@ -21,8 +21,9 @@ import com.mongodb.reactivestreams.client.FindPublisher;
* Simple callback interface to allow customization of a {@link FindPublisher}.
*
* @author Mark Paluch
* @author Konstantin Volivach
*/
interface FindPublisherPreparer {
public interface FindPublisherPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
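
As with `CursorPreparer`, the interface has a single method, so a preparer is expressible as a lambda; a sketch (the limit is an arbitrary assumption):
```java
// cap every reactive find at ten documents before subscription
FindPublisherPreparer preparer = publisher -> publisher.limit(10);
```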

View File

@@ -92,7 +92,7 @@ public class MappedDocument {
* mapped to the specific domain type.
*
* @author Christoph Strobl
* @since 2.2
* @since 2.1.4
*/
class MappedUpdate implements UpdateDefinition {
@@ -137,14 +137,5 @@ public class MappedDocument {
public Boolean isIsolated() {
return delegate.isIsolated();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters();
}
}
}

View File

@@ -41,8 +41,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
// TODO: Mongo Driver 4 - use application name instead of description if not available
private @Nullable String description = DEFAULT_MONGO_OPTIONS.getApplicationName();
private @Nullable String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
@@ -52,8 +51,6 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
// TODO: Mongo Driver 4 - check if available
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private @Nullable ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
@@ -61,8 +58,6 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private @Nullable WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private @Nullable SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
// TODO: Mongo Driver 4 - remove this option
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
@@ -79,7 +74,6 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
*
* @param description
*/
// TODO: Mongo Driver 4 - deprecate that one and add application name
public void setDescription(@Nullable String description) {
this.description = description;
}
@@ -241,7 +235,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
}
/**
* This controls if the driver should use an SSL connection. Defaults to {@literal false}.
* This controls if the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
@@ -291,7 +285,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.applicationName(description) // TODO: Mongo Driver 4 - use application name if description not available
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
@@ -303,9 +297,8 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.serverSelectionTimeout(serverSelectionTimeout) //
.sslEnabled(ssl) //
.socketFactory(socketFactoryToUse) // TODO: Mongo Driver 4 - remove if not available
.socketKeepAlive(socketKeepAlive) // TODO: Mongo Driver 4 - remove if not available
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();

View File

@@ -57,6 +57,10 @@ import com.mongodb.client.result.UpdateResult;
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
* proxy).
* <p />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
*
* @author Thomas Risberg
* @author Mark Pollack
@@ -289,12 +293,15 @@ public interface MongoOperations extends FluentMongoOperations {
Set<String> getCollectionNames();
/**
* Get a collection by name, creating it if it doesn't exist.
* Get a {@link MongoCollection} by its name. The returned collection may not exist yet (except in local memory) and
* is created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
* @return an existing collection or a newly created one.
* @return an existing collection or one created on first server interaction.
*/
MongoCollection<Document> getCollection(String collectionName);
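
A minimal sketch of the documented pattern (the collection name is an assumption):
```java
// check first, create explicitly if needed, then obtain the collection
if (!template.collectionExists("orders")) {
    template.createCollection("orders");
}
MongoCollection<Document> orders = template.getCollection("orders");
```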
@@ -1131,10 +1138,10 @@ public interface MongoOperations extends FluentMongoOperations {
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
* <p/>
* <p/>
@@ -1193,10 +1200,10 @@ public interface MongoOperations extends FluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1211,10 +1218,10 @@ public interface MongoOperations extends FluentMongoOperations {
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1225,7 +1232,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, FindAndModifyOptions, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
@@ -1241,6 +1250,9 @@ public interface MongoOperations extends FluentMongoOperations {
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
* <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, FindAndModifyOptions, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
@@ -1253,8 +1265,10 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, FindAndModifyOptions, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
* @param update the update document that contains the updated object or $ operators to manipulate the existing
@@ -1267,7 +1281,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
* the provided update document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* {@literal null}.
@@ -1283,6 +1299,9 @@ public interface MongoOperations extends FluentMongoOperations {
* the provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
* <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* {@literal null}.
@@ -1296,6 +1315,8 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* {@literal null}.
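
A sketch of the recommended alternative when the updated document must be picked by sort order (`Step` and its fields are assumptions carried over from the README example):
```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

Query next = query(where("state").is("NEW")).with(Sort.by(Sort.Direction.ASC, "createdAt"));

// findAndModify honors the sort; updateFirst would silently ignore it
Step step = template.findAndModify(next, Update.update("state", "PROCESSING"), Step.class);
```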
@@ -1350,10 +1371,7 @@ public interface MongoOperations extends FluentMongoOperations {
UpdateResult updateMulti(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Remove the given object from the collection by {@literal id} and (if applicable) its
* {@link org.springframework.data.annotation.Version}. <br />
* Use {@link DeleteResult#getDeletedCount()} for insight whether an {@link DeleteResult#wasAcknowledged()
* acknowledged} remove operation was successful or not.
* Remove the given object from the collection by id.
*
* @param object must not be {@literal null}.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
@@ -1361,10 +1379,7 @@ public interface MongoOperations extends FluentMongoOperations {
DeleteResult remove(Object object);
/**
* Removes the given object from the given collection by {@literal id} and (if applicable) its
* {@link org.springframework.data.annotation.Version}. <br />
* Use {@link DeleteResult#getDeletedCount()} for insight whether an {@link DeleteResult#wasAcknowledged()
* acknowledged} remove operation was successful or not.
* Removes the given object from the given collection.
*
* @param object must not be {@literal null}.
* @param collectionName name of the collection where the objects will be removed, must not be {@literal null} or empty.
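
Either way, callers can inspect the outcome through the returned `DeleteResult`; a sketch:
```java
DeleteResult result = template.remove(person);

// an acknowledged write with zero deletions means nothing matched
if (result.wasAcknowledged() && result.getDeletedCount() == 0) {
    // handle the missing document (application-specific)
}
```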

View File

@@ -68,16 +68,7 @@ import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.JsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.convert.*;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
@@ -104,7 +95,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.util.CloseableIterator;
@@ -256,14 +246,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
mappingContext = this.mongoConverter.getMappingContext();
// We create indexes based on mapping events
if (mappingContext instanceof MongoMappingContext) {
MongoMappingContext mappingContext = (MongoMappingContext) this.mappingContext;
if (mappingContext.isAutoIndexCreation()) {
indexCreator = new MongoPersistentEntityIndexCreator(mappingContext, this);
eventPublisher = new MongoMappingEventPublisher(indexCreator);
mappingContext.setApplicationEventPublisher(eventPublisher);
indexCreator = new MongoPersistentEntityIndexCreator((MongoMappingContext) mappingContext, this);
eventPublisher = new MongoMappingEventPublisher(indexCreator);
if (mappingContext instanceof ApplicationEventPublisherAware) {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
}
}
}
@@ -273,8 +259,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
this.mongoDbFactory = dbFactory;
this.exceptionTranslator = that.exceptionTranslator;
this.sessionSynchronization = that.sessionSynchronization;
this.mongoConverter = that.mongoConverter instanceof MappingMongoConverter ? getDefaultMongoConverter(dbFactory)
: that.mongoConverter;
// we need to (re)create the MappingMongoConverter as we need to have it use a DbRefResolver that operates within
// the sames session. Otherwise loading referenced objects would happen outside of it.
if (that.mongoConverter instanceof MappingMongoConverter) {
this.mongoConverter = ((MappingMongoConverter) that.mongoConverter).with(dbFactory);
} else {
this.mongoConverter = that.mongoConverter;
}
this.queryMapper = that.queryMapper;
this.updateMapper = that.updateMapper;
this.schemaMapper = that.schemaMapper;
@@ -383,9 +376,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.MongoOperations#executeAsStream(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
@Override
public <T> CloseableIterator<T> stream(final Query query, final Class<T> entityType) {
return stream(query, entityType, operations.determineCollectionName(entityType));
public <T> CloseableIterator<T> stream(Query query, Class<T> entityType) {
return stream(query, entityType, getCollectionName(entityType));
}
/*
@@ -393,11 +385,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.MongoOperations#stream(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
@Override
public <T> CloseableIterator<T> stream(final Query query, final Class<T> entityType, final String collectionName) {
public <T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName) {
return doStream(query, entityType, collectionName, entityType);
}
protected <T> CloseableIterator<T> doStream(final Query query, final Class<?> entityType, final String collectionName,
protected <T> CloseableIterator<T> doStream(Query query, final Class<?> entityType, String collectionName,
Class<T> returnType) {
Assert.notNull(query, "Query must not be null!");
@@ -411,7 +403,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public CloseableIterator<T> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
MongoPersistentEntity<?> persistentEntity = mappingContext.getRequiredPersistentEntity(entityType);
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(entityType);
Document mappedFields = getMappedFieldsObject(query.getFieldsObject(), persistentEntity, returnType);
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), persistentEntity);
@@ -535,7 +527,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public <T> T execute(Class<?> entityClass, CollectionCallback<T> callback) {
Assert.notNull(entityClass, "EntityClass must not be null!");
return execute(operations.determineCollectionName(entityClass), callback);
return execute(getCollectionName(entityClass), callback);
}
/*
@@ -606,7 +598,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Nullable CollectionOptions collectionOptions) {
Assert.notNull(entityClass, "EntityClass must not be null!");
return doCreateCollection(operations.determineCollectionName(entityClass),
return doCreateCollection(getCollectionName(entityClass),
convertToDocument(collectionOptions, entityClass));
}
@@ -652,7 +644,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#getCollection(java.lang.Class)
*/
public <T> boolean collectionExists(Class<T> entityClass) {
return collectionExists(operations.determineCollectionName(entityClass));
return collectionExists(getCollectionName(entityClass));
}
/*
@@ -681,7 +673,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#dropCollection(java.lang.Class)
*/
public <T> void dropCollection(Class<T> entityClass) {
dropCollection(operations.determineCollectionName(entityClass));
dropCollection(getCollectionName(entityClass));
}
/*
@@ -717,7 +709,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#indexOps(java.lang.Class)
*/
public IndexOperations indexOps(Class<?> entityClass) {
return new DefaultIndexOperations(this, operations.determineCollectionName(entityClass), entityClass);
return new DefaultIndexOperations(this, getCollectionName(entityClass), entityClass);
}
/*
@@ -733,7 +725,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#bulkOps(org.springframework.data.mongodb.core.BulkMode, java.lang.Class)
*/
public BulkOperations bulkOps(BulkMode bulkMode, Class<?> entityClass) {
return bulkOps(bulkMode, entityClass, operations.determineCollectionName(entityClass));
return bulkOps(bulkMode, entityClass, getCollectionName(entityClass));
}
/*
@@ -768,7 +760,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Nullable
@Override
public <T> T findOne(Query query, Class<T> entityClass) {
return findOne(query, entityClass, operations.determineCollectionName(entityClass));
return findOne(query, entityClass, getCollectionName(entityClass));
}
@Nullable
@@ -790,7 +782,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public boolean exists(Query query, Class<?> entityClass) {
return exists(query, entityClass, operations.determineCollectionName(entityClass));
return exists(query, entityClass, getCollectionName(entityClass));
}
@Override
@@ -820,7 +812,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
@Override
public <T> List<T> find(Query query, Class<T> entityClass) {
return find(query, entityClass, operations.determineCollectionName(entityClass));
return find(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -841,7 +833,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Nullable
@Override
public <T> T findById(Object id, Class<T> entityClass) {
return findById(id, entityClass, operations.determineCollectionName(entityClass));
return findById(id, entityClass, getCollectionName(entityClass));
}
@Nullable
@@ -863,7 +855,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
@Override
public <T> List<T> findDistinct(Query query, String field, Class<?> entityClass, Class<T> resultClass) {
return findDistinct(query, field, operations.determineCollectionName(entityClass), entityClass, resultClass);
return findDistinct(query, field, getCollectionName(entityClass), entityClass, resultClass);
}
/*
@@ -946,7 +938,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public <T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass) {
return geoNear(near, entityClass, operations.determineCollectionName(entityClass));
return geoNear(near, entityClass, getCollectionName(entityClass));
}
@Override
@@ -969,11 +961,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(returnType, "ReturnType must not be null!");
String collection = StringUtils.hasText(collectionName) ? collectionName
: operations.determineCollectionName(domainType);
: getCollectionName(domainType);
Document nearDocument = near.toDocument();
Document command = new Document("geoNear", collection);
command.putAll(queryMapper.getMappedObject(nearDocument, Optional.empty()));
command.putAll(nearDocument);
if (nearDocument.containsKey("query")) {
Document query = (Document) nearDocument.get("query");
@@ -1023,7 +1015,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass,
operations.determineCollectionName(entityClass));
getCollectionName(entityClass));
}
@Nullable
@@ -1035,7 +1027,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Nullable
@Override
public <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass) {
return findAndModify(query, update, options, entityClass, operations.determineCollectionName(entityClass));
return findAndModify(query, update, options, entityClass, getCollectionName(entityClass));
}
@Nullable
@@ -1099,7 +1091,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Nullable
@Override
public <T> T findAndRemove(Query query, Class<T> entityClass) {
return findAndRemove(query, entityClass, operations.determineCollectionName(entityClass));
return findAndRemove(query, entityClass, getCollectionName(entityClass));
}
@Nullable
@@ -1118,7 +1110,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public long count(Query query, Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return count(query, entityClass, operations.determineCollectionName(entityClass));
return count(query, entityClass, getCollectionName(entityClass));
}
@Override
@@ -1138,6 +1130,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
CountOptions options = new CountOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
if (StringUtils.hasText(query.getHint())) {
options.hint(Document.parse(query.getHint()));
}
Document document = queryMapper.getMappedObject(query.getQueryObject(),
Optional.ofNullable(entityClass).map(it -> mappingContext.getPersistentEntity(entityClass)));
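
The hunk above is the imperative half of DATAMONGO-2360: a hint placed on the Query now also reaches the count command. A sketch of what that enables, assuming an Order document with an index on status (names are illustrative):

// the hint string is parsed via Document.parse(...) and handed to CountOptions.hint(...)
Query query = new Query(Criteria.where("status").is("ACTIVE")).withHint("{ status: 1 }");
long activeOrders = template.count(query, Order.class);
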
@@ -1167,7 +1163,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(objectToSave, "ObjectToSave must not be null!");
ensureNotIterable(objectToSave);
return insert(objectToSave, operations.determineEntityCollectionName(objectToSave));
return insert(objectToSave, getCollectionName(ClassUtils.getUserClass(objectToSave)));
}
/*
@@ -1262,7 +1258,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(batchToSave, "BatchToSave must not be null!");
return (Collection<T>) doInsertBatch(operations.determineCollectionName(entityClass), batchToSave,
return (Collection<T>) doInsertBatch(getCollectionName(entityClass), batchToSave,
this.mongoConverter);
}
@@ -1296,9 +1292,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
continue;
}
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(element.getClass());
String collection = entity.getCollection();
String collection = getCollectionName(ClassUtils.getUserClass(element));
List<T> collectionElements = elementsByCollection.get(collection);
if (null == collectionElements) {
@@ -1362,7 +1356,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public <T> T save(T objectToSave) {
Assert.notNull(objectToSave, "Object to save must not be null!");
return save(objectToSave, operations.determineEntityCollectionName(objectToSave));
return save(objectToSave, getCollectionName(ClassUtils.getUserClass(objectToSave)));
}
@Override
@@ -1518,7 +1512,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public UpdateResult upsert(Query query, Update update, Class<?> entityClass) {
return doUpdate(operations.determineCollectionName(entityClass), query, update, entityClass, true, false);
return doUpdate(getCollectionName(entityClass), query, update, entityClass, true, false);
}
@Override
@@ -1536,7 +1530,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public UpdateResult updateFirst(Query query, Update update, Class<?> entityClass) {
return doUpdate(operations.determineCollectionName(entityClass), query, update, entityClass, false, false);
return doUpdate(getCollectionName(entityClass), query, update, entityClass, false, false);
}
@Override
@@ -1554,7 +1548,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public UpdateResult updateMulti(Query query, Update update, Class<?> entityClass) {
return doUpdate(operations.determineCollectionName(entityClass), query, update, entityClass, false, true);
return doUpdate(getCollectionName(entityClass), query, update, entityClass, false, true);
}
@Override
@@ -1570,13 +1564,24 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return doUpdate(collectionName, query, update, entityClass, false, true);
}
protected UpdateResult doUpdate(final String collectionName, final Query query, final UpdateDefinition update,
protected UpdateResult doUpdate(final String collectionName, final Query query, final Update update,
@Nullable final Class<?> entityClass, final boolean upsert, final boolean multi) {
return doUpdate(collectionName, query, (UpdateDefinition) update, entityClass, upsert, multi);
}
private UpdateResult doUpdate(final String collectionName, final Query query, final UpdateDefinition update,
@Nullable final Class<?> entityClass, final boolean upsert, final boolean multi) {
Assert.notNull(collectionName, "CollectionName must not be null!");
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
if (query.isSorted() && LOGGER.isWarnEnabled()) {
LOGGER.warn("{} does not support sort ('{}'). Please use findAndModify() instead.",
upsert ? "Upsert" : "UpdateFirst", serializeToJsonSafely(query.getSortObject()));
}
return execute(collectionName, new CollectionCallback<UpdateResult>() {
public UpdateResult doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
@@ -1588,11 +1593,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
UpdateOptions opts = new UpdateOptions();
opts.upsert(upsert);
if (update.hasArrayFilters()) {
opts.arrayFilters(
update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()));
}
Document queryObj = new Document();
if (query != null) {
@@ -1601,8 +1601,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
query.getCollation().map(Collation::toMongoCollation).ifPresent(opts::collation);
}
Document updateObj = update instanceof MappedUpdate ? update.getUpdateObject()
: updateMapper.getMappedObject(update.getUpdateObject(), entity);
Document updateObj = update instanceof MappedUpdate ? update.getUpdateObject() : updateMapper.getMappedObject(update.getUpdateObject(), entity);
if (multi && update.isIsolated() && !queryObj.containsKey("$isolated")) {
queryObj.put("$isolated", 1);
@@ -1637,8 +1636,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
});
}
private void increaseVersionForUpdateIfNecessary(@Nullable MongoPersistentEntity<?> persistentEntity,
UpdateDefinition update) {
private void increaseVersionForUpdateIfNecessary(@Nullable MongoPersistentEntity<?> persistentEntity, UpdateDefinition update) {
if (persistentEntity != null && persistentEntity.hasVersionProperty()) {
String versionFieldName = persistentEntity.getRequiredVersionProperty().getFieldName();
@@ -1653,7 +1651,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(object, "Object must not be null!");
return remove(object, operations.determineCollectionName(object.getClass()));
Query query = operations.forEntity(object).getByIdQuery();
return remove(query, object.getClass());
}
@Override
@@ -1662,7 +1662,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(object, "Object must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Query query = operations.forEntity(object).getRemoveByQuery();
Query query = operations.forEntity(object).getByIdQuery();
return doRemove(collectionName, query, object.getClass(), false);
}
@@ -1674,7 +1674,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public DeleteResult remove(Query query, Class<?> entityClass) {
return remove(query, entityClass, operations.determineCollectionName(entityClass));
return remove(query, entityClass, getCollectionName(entityClass));
}
@Override
@@ -1745,7 +1745,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public <T> List<T> findAll(Class<T> entityClass) {
return findAll(entityClass, operations.determineCollectionName(entityClass));
return findAll(entityClass, getCollectionName(entityClass));
}
@Override
@@ -1942,7 +1942,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
@Override
public <O> AggregationResults<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType) {
return aggregate(aggregation, operations.determineCollectionName(aggregation.getInputType()), outputType);
return aggregate(aggregation, getCollectionName(aggregation.getInputType()), outputType);
}
/* (non-Javadoc)
@@ -1965,7 +1965,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public <O> AggregationResults<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType) {
return aggregate(aggregation, operations.determineCollectionName(inputType), outputType,
return aggregate(aggregation, getCollectionName(inputType), outputType,
new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper));
}
@@ -1996,7 +1996,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
@Override
public <O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType) {
return aggregateStream(aggregation, operations.determineCollectionName(aggregation.getInputType()), outputType);
return aggregateStream(aggregation, getCollectionName(aggregation.getInputType()), outputType);
}
/* (non-Javadoc)
@@ -2005,7 +2005,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public <O> CloseableIterator<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType) {
return aggregateStream(aggregation, operations.determineCollectionName(inputType), outputType,
return aggregateStream(aggregation, getCollectionName(inputType), outputType,
new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper));
}
@@ -2031,7 +2031,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
@Override
public <T> List<T> findAllAndRemove(Query query, Class<T> entityClass) {
return findAllAndRemove(query, entityClass, operations.determineCollectionName(entityClass));
return findAllAndRemove(query, entityClass, getCollectionName(entityClass));
}
/* (non-Javadoc)
@@ -2428,7 +2428,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
<S, T> List<T> doFind(String collectionName, Document query, Document fields, Class<S> sourceClass,
Class<T> targetClass, CursorPreparer preparer) {
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(sourceClass);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(sourceClass);
Document mappedFields = getMappedFieldsObject(fields, entity, targetClass);
Document mappedQuery = queryMapper.getMappedObject(query, entity);
@@ -2557,9 +2557,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
collectionName);
}
return executeFindOneInternal(
new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate,
update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()), options),
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDocumentCallback<>(readerToUse, entityClass, collectionName), collectionName);
}
@@ -2765,7 +2763,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
}
private Document getMappedFieldsObject(Document fields, MongoPersistentEntity<?> entity, Class<?> targetType) {
private Document getMappedFieldsObject(Document fields, @Nullable MongoPersistentEntity<?> entity,
Class<?> targetType) {
if (entity == null) {
return fields;
}
Document projectedFields = propertyOperations.computeFieldsForProjection(projectionFactory, fields,
entity.getType(), targetType);
@@ -2916,16 +2919,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Document fields;
private final Document sort;
private final Document update;
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
public FindAndModifyCallback(Document query, Document fields, Document sort, Document update,
List<Document> arrayFilters, FindAndModifyOptions options) {
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.arrayFilters = arrayFilters;
this.options = options;
}
@@ -2943,10 +2944,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
options.getCollation().map(Collation::toMongoCollation).ifPresent(opts::collation);
if (!arrayFilters.isEmpty()) {
opts.arrayFilters(arrayFilters);
}
return collection.findOneAndUpdate(query, update, opts);
}
}

View File

@@ -66,8 +66,12 @@ class PropertyOperations {
projectionInformation.getInputProperties().forEach(it -> projectedFields.append(it.getName(), 1));
}
} else {
mappingContext.getRequiredPersistentEntity(targetType).doWithProperties(
(SimplePropertyHandler) persistentProperty -> projectedFields.append(persistentProperty.getName(), 1));
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(targetType);
if (entity != null) {
entity.doWithProperties(
(SimplePropertyHandler) persistentProperty -> projectedFields.append(persistentProperty.getName(), 1));
}
}
return projectedFields;
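
The null-tolerant entity lookup above feeds the computation of projected fields for closed interface projections. A hedged sketch of the feature this code serves, assuming a Person document; the interface and property names are illustrative:

// closed projection: only the firstname field is requested from the server
interface PersonSummary { String getFirstname(); }

List<PersonSummary> summaries = template.query(Person.class)
        .as(PersonSummary.class)
        .all();
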

View File

@@ -116,11 +116,11 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
TypedAggregation<?> typedAggregation = (TypedAggregation<?>) aggregation;
if (typedAggregation.getInputType() != null) {
return template.determineCollectionName(typedAggregation.getInputType());
return template.getCollectionName(typedAggregation.getInputType());
}
}
return template.determineCollectionName(domainType);
return template.getCollectionName(domainType);
}
}
}

View File

@@ -238,7 +238,7 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
return StringUtils.hasText(collection) ? collection : template.getCollectionName(domainType);
}
private String asString() {

View File

@@ -96,7 +96,7 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
return StringUtils.hasText(collection) ? collection : template.getCollectionName(domainType);
}
}
}

View File

@@ -171,7 +171,7 @@ class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
return StringUtils.hasText(collection) ? collection : template.getCollectionName(domainType);
}
}
}

View File

@@ -56,13 +56,17 @@ import com.mongodb.reactivestreams.client.MongoCollection;
* Implemented by {@link ReactiveMongoTemplate}. Not often used but a useful option for extensibility and testability
* (as it can be easily mocked, stubbed, or be the target of a JDK proxy). Command execution using
 * {@link ReactiveMongoOperations} is deferred until a subscriber subscribes to the {@link Publisher}.
* <p />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
* @see Flux
* @see Mono
* @see <a href="http://projectreactor.io/docs/">Project Reactor</a>
* @see <a href="https://projectreactor.io/docs/">Project Reactor</a>
*/
public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
@@ -277,12 +281,15 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Flux<String> getCollectionNames();
/**
* Get a collection by name, creating it if it doesn't exist.
 * Get a {@link MongoCollection} by name. The returned collection may not exist yet (except in local memory) and is
* created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection.
* @return an existing collection or a newly created one.
* @return an existing collection or one created on first server interaction.
*/
MongoCollection<Document> getCollection(String collectionName);
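
As the reworded Javadoc points out, the returned collection may exist only in local memory until the first server round trip. A minimal sketch of creating it explicitly up front, assuming a ReactiveMongoTemplate named template and an "orders" collection (both assumptions):

// create the collection eagerly instead of relying on implicit creation at first interaction
Mono<MongoCollection<Document>> ready = template.collectionExists("orders")
        .flatMap(exists -> exists
                ? Mono.just(template.getCollection("orders"))
                : template.createCollection("orders"));
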
@@ -917,10 +924,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/>
 * The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
* <p/>
* <p/>
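
A short sketch of the id population described above, assuming a Person document with a String id (class and getter are illustrative):

// a null String id is filled in from the generated ObjectId once the insert completes
Person person = new Person(); // id still null here
template.insert(person)
        .map(Person::getId)
        .subscribe(id -> System.out.println("assigned id: " + id));
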
@@ -977,10 +984,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/>
 * The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
* <p/>
* <p/>
@@ -1025,10 +1032,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
 * The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1043,10 +1050,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
 * The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
 * href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1062,10 +1069,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
 * The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1080,10 +1087,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
 * The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
 * href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1094,7 +1101,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
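
A sketch of the suggested alternative: findAndModify honors the sort and can still create the document when nothing matches. Counter and its fields are assumptions, not part of this diff:

// unlike upsert(...), the sort is applied when selecting the document to modify
Query query = new Query(Criteria.where("key").is("pageViews"))
        .with(Sort.by(Sort.Direction.DESC, "modifiedAt"));
Mono<Counter> result = template.findAndModify(query, new Update().inc("value", 1),
        FindAndModifyOptions.options().upsert(true).returnNew(true), Counter.class);
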
@@ -1110,6 +1119,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
* <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
@@ -1122,7 +1134,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
@@ -1136,7 +1150,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
* the provided update document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* {@literal null}.
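
For updateFirst the same caveat applies: the document selected is effectively arbitrary when a sort is present. A sketch of the deterministic variant via findAndModify, assuming a Task document (illustrative):

// picks the oldest OPEN task instead of an arbitrary one
Query oldestOpen = new Query(Criteria.where("state").is("OPEN"))
        .with(Sort.by(Sort.Direction.ASC, "createdAt"));
Mono<Task> claimed = template.findAndModify(oldestOpen, new Update().set("state", "CLAIMED"), Task.class);
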
@@ -1152,6 +1168,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* the provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
* <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* {@literal null}.
@@ -1165,6 +1184,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br />
* <strong>NOTE:</strong> {@link Query#getSortObject() sorting} is not supported by {@code db.collection.updateOne}.
* Use {@link #findAndModify(Query, Update, Class, String)} instead.
*
* @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* {@literal null}.

View File

@@ -20,20 +20,18 @@ import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.CoreSubscriber;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.context.Context;
import reactor.util.function.Tuple2;
import reactor.util.function.Tuples;
import java.lang.reflect.Field;
import java.util.*;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.stream.Collectors;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
@@ -73,16 +71,7 @@ import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.PrefixingDelegatingAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.JsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.NoOpDbRefResolver;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.convert.*;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.index.ReactiveMongoPersistentEntityIndexCreator;
@@ -105,7 +94,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.util.Optionals;
@@ -114,7 +102,6 @@ import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
@@ -126,29 +113,11 @@ import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndReplaceOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.ReturnDocument;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.ValidationOptions;
import com.mongodb.client.model.*;
import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.reactivestreams.client.AggregatePublisher;
import com.mongodb.reactivestreams.client.ChangeStreamPublisher;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.DistinctPublisher;
import com.mongodb.reactivestreams.client.FindPublisher;
import com.mongodb.reactivestreams.client.MapReducePublisher;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.reactivestreams.client.Success;
import com.mongodb.reactivestreams.client.*;
/**
 * Primary implementation of {@link ReactiveMongoOperations}. It simplifies the use of Reactive MongoDB and helps
@@ -265,15 +234,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (this.mappingContext instanceof MongoMappingContext) {
MongoMappingContext mongoMappingContext = (MongoMappingContext) this.mappingContext;
this.indexCreator = new ReactiveMongoPersistentEntityIndexCreator(mongoMappingContext, this::indexOps);
this.eventPublisher = new MongoMappingEventPublisher(this.indexCreatorListener);
if (mongoMappingContext.isAutoIndexCreation()) {
this.indexCreator = new ReactiveMongoPersistentEntityIndexCreator(mongoMappingContext, this::indexOps);
this.eventPublisher = new MongoMappingEventPublisher(this.indexCreatorListener);
mongoMappingContext.setApplicationEventPublisher(this.eventPublisher);
this.mappingContext.getPersistentEntities()
.forEach(entity -> onCheckForIndexes(entity, subscriptionExceptionHandler));
}
mongoMappingContext.setApplicationEventPublisher(this.eventPublisher);
this.mappingContext.getPersistentEntities()
.forEach(entity -> onCheckForIndexes(entity, subscriptionExceptionHandler));
}
}
@@ -409,12 +375,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#reactiveIndexOps(java.lang.Class)
*/
public ReactiveIndexOperations indexOps(Class<?> entityClass) {
return new DefaultReactiveIndexOperations(this, determineCollectionName(entityClass), this.queryMapper,
entityClass);
return new DefaultReactiveIndexOperations(this, getCollectionName(entityClass), this.queryMapper, entityClass);
}
public String getCollectionName(Class<?> entityClass) {
return this.determineCollectionName(entityClass);
return operations.determineCollectionName(entityClass);
}
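
getCollectionName(…) is now the public entry point for collection resolution, delegating to the shared EntityOperations. A trivial usage sketch (Person is an assumption):

// resolves @Document(collection = "...") or falls back to the class-derived default name
String collection = template.getCollectionName(Person.class);
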
/*
@@ -454,7 +419,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
@Override
public <T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action) {
return createFlux(determineCollectionName(entityClass), action);
return createFlux(getCollectionName(entityClass), action);
}
/*
@@ -612,98 +577,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Mono<MongoCollection<Document>> collectionPublisher = Mono
.fromCallable(() -> getAndPrepareCollection(doGetDatabase(), collectionName));
Flux<T> source = collectionPublisher.flatMapMany(callback::doInCollection).onErrorMap(translateException());
return new Flux<T>() {
@Override
public void subscribe(CoreSubscriber actual) {
Long skip = extractSkip(actual);
Long take = extractLimit(actual);
System.out.println(String.format("Setting offset %s and limit: %s", skip, take));
Context context = Context.empty();
// and here we use the original Flux and evaluate skip / take in the template
if (skip != null && skip > 0L) {
context = context.put("skip", skip);
}
if (take != null && take > 0L) {
context = context.put("take", take);
}
source.subscriberContext(context).subscribe(actual);
}
};
return collectionPublisher.flatMapMany(callback::doInCollection).onErrorMap(translateException());
}
// --> HACKING
@Nullable
static Long extractSkip(Subscriber subscriber) {
if (subscriber == null || !ClassUtils.getShortName(subscriber.getClass()).endsWith("SkipSubscriber")) {
return null;
}
java.lang.reflect.Field field = ReflectionUtils.findField(subscriber.getClass(), "remaining");
if (field == null) {
return null;
}
ReflectionUtils.makeAccessible(field);
Long skip = (Long) ReflectionUtils.getField(field, subscriber);
if (skip != null && skip > 0L) {
// reset the field, otherwise we'd skip stuff in the code.
ReflectionUtils.setField(field, subscriber, 0L);
}
return skip;
}
@Nullable
static Long extractLimit(Subscriber subscriber) {
if (subscriber == null) {
return null;
}
if (!ClassUtils.getShortName(subscriber.getClass()).endsWith("TakeSubscriber")) {
return extractLimit(extractPotentialTakeSubscriber(subscriber));
}
java.lang.reflect.Field field = ReflectionUtils.findField(subscriber.getClass(), "n");
if (field == null) {
return null;
}
ReflectionUtils.makeAccessible(field);
return (Long) ReflectionUtils.getField(field, subscriber);
}
@Nullable
static Subscriber extractPotentialTakeSubscriber(Subscriber subscriber) {
if (!ClassUtils.getShortName(subscriber.getClass()).endsWith("SkipSubscriber")) {
return null;
}
Field field = ReflectionUtils.findField(subscriber.getClass(), "actual");
if (field == null) {
return null;
}
ReflectionUtils.makeAccessible(field);
return (Subscriber) ReflectionUtils.getField(field, subscriber);
}
// <--- HACKING
/**
* Create a reusable {@link Mono} for the {@code collectionName} and {@link ReactiveCollectionCallback}.
*
@@ -729,7 +605,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#createCollection(java.lang.Class)
*/
public <T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass) {
return createCollection(determineCollectionName(entityClass));
return createCollection(getCollectionName(entityClass));
}
/*
@@ -738,8 +614,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
public <T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass,
@Nullable CollectionOptions collectionOptions) {
return doCreateCollection(determineCollectionName(entityClass),
convertToCreateCollectionOptions(collectionOptions, entityClass));
return doCreateCollection(getCollectionName(entityClass), convertToCreateCollectionOptions(collectionOptions, entityClass));
}
/*
@@ -772,7 +647,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#collectionExists(java.lang.Class)
*/
public <T> Mono<Boolean> collectionExists(Class<T> entityClass) {
return collectionExists(determineCollectionName(entityClass));
return collectionExists(getCollectionName(entityClass));
}
/*
@@ -791,7 +666,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#dropCollection(java.lang.Class)
*/
public <T> Mono<Void> dropCollection(Class<T> entityClass) {
return dropCollection(determineCollectionName(entityClass));
return dropCollection(getCollectionName(entityClass));
}
/*
@@ -828,7 +703,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findOne(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
public <T> Mono<T> findOne(Query query, Class<T> entityClass) {
return findOne(query, entityClass, determineCollectionName(entityClass));
return findOne(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -851,7 +726,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#exists(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
public Mono<Boolean> exists(Query query, Class<?> entityClass) {
return exists(query, entityClass, determineCollectionName(entityClass));
return exists(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -894,7 +769,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#find(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
public <T> Flux<T> find(Query query, Class<T> entityClass) {
return find(query, entityClass, determineCollectionName(entityClass));
return find(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -916,7 +791,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findById(java.lang.Object, java.lang.Class)
*/
public <T> Mono<T> findById(Object id, Class<T> entityClass) {
return findById(id, entityClass, determineCollectionName(entityClass));
return findById(id, entityClass, getCollectionName(entityClass));
}
/*
@@ -935,7 +810,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findDistinct(org.springframework.data.mongodb.core.query.Query, java.lang.String, java.lang.Class, java.lang.Class)
*/
public <T> Flux<T> findDistinct(Query query, String field, Class<?> entityClass, Class<T> resultClass) {
return findDistinct(query, field, determineCollectionName(entityClass), entityClass, resultClass);
return findDistinct(query, field, getCollectionName(entityClass), entityClass, resultClass);
}
/*
@@ -1030,7 +905,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
@Override
public <O> Flux<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType) {
return aggregate(aggregation, determineCollectionName(aggregation.getInputType()), outputType);
return aggregate(aggregation, getCollectionName(aggregation.getInputType()), outputType);
}
/*
@@ -1040,7 +915,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
@Override
public <O> Flux<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType) {
return aggregate(aggregation, determineCollectionName(inputType), outputType,
return aggregate(aggregation, getCollectionName(inputType), outputType,
new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper));
}
@@ -1106,7 +981,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
@Override
public <T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass) {
return geoNear(near, entityClass, determineCollectionName(entityClass));
return geoNear(near, entityClass, getCollectionName(entityClass));
}
/*
@@ -1130,7 +1005,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
throw new InvalidDataAccessApiUsageException("Entity class must not be null!");
}
String collection = StringUtils.hasText(collectionName) ? collectionName : determineCollectionName(entityClass);
String collection = StringUtils.hasText(collectionName) ? collectionName : getCollectionName(entityClass);
Document nearDocument = near.toDocument();
Document command = new Document("geoNear", collection);
@@ -1166,7 +1041,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findAndModify(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update, java.lang.Class)
*/
public <T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass) {
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, determineCollectionName(entityClass));
return findAndModify(query, update, new FindAndModifyOptions(), entityClass, getCollectionName(entityClass));
}
/*
@@ -1182,7 +1057,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findAndModify(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update, org.springframework.data.mongodb.core.FindAndModifyOptions, java.lang.Class)
*/
public <T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass) {
return findAndModify(query, update, options, entityClass, determineCollectionName(entityClass));
return findAndModify(query, update, options, entityClass, getCollectionName(entityClass));
}
/*
@@ -1241,7 +1116,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findAndRemove(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
public <T> Mono<T> findAndRemove(Query query, Class<T> entityClass) {
return findAndRemove(query, entityClass, determineCollectionName(entityClass));
return findAndRemove(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -1262,7 +1137,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Assert.notNull(entityClass, "Entity class must not be null!");
return count(query, entityClass, determineCollectionName(entityClass));
return count(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -1292,6 +1167,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (query != null) {
query.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
}
if (StringUtils.hasText(query.getHint())) {
options.hint(Document.parse(query.getHint()));
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing count: {} in collection: {}", serializeToJsonSafely(filter), collectionName);
@@ -1319,7 +1197,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
@Override
public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass) {
return insertAll(batchToSave, determineCollectionName(entityClass));
return insertAll(batchToSave, getCollectionName(entityClass));
}
/*
@@ -1343,7 +1221,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Assert.notNull(objectToSave, "Object to insert must not be null!");
ensureNotIterable(objectToSave);
return insert(objectToSave, determineEntityCollectionName(objectToSave));
return insert(objectToSave, getCollectionName(ClassUtils.getUserClass(objectToSave)));
}
/*
@@ -1389,7 +1267,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(java.util.Collection, java.lang.Class)
*/
public <T> Flux<T> insert(Collection<? extends T> batchToSave, Class<?> entityClass) {
return doInsertBatch(determineCollectionName(entityClass), batchToSave, this.mongoConverter);
return doInsertBatch(getCollectionName(entityClass), batchToSave, this.mongoConverter);
}
/*
@@ -1423,9 +1301,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
listToSave.forEach(element -> {
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(element.getClass());
String collection = entity.getCollection();
String collection = getCollectionName(element.getClass());
List<T> collectionElements = elementsByCollection.computeIfAbsent(collection, k -> new ArrayList<>());
collectionElements.add(element);
@@ -1505,7 +1381,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Mono<T> save(T objectToSave) {
Assert.notNull(objectToSave, "Object to save must not be null!");
return save(objectToSave, determineEntityCollectionName(objectToSave));
return save(objectToSave, getCollectionName(ClassUtils.getUserClass(objectToSave)));
}
/*
@@ -1669,7 +1545,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update, java.lang.Class)
*/
public Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, true, false);
return doUpdate(getCollectionName(entityClass), query, update, entityClass, true, false);
}
/*
@@ -1693,7 +1569,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#updateFirst(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update, java.lang.Class)
*/
public Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, false);
return doUpdate(getCollectionName(entityClass), query, update, entityClass, false, false);
}
/*
@@ -1717,7 +1593,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update, java.lang.Class)
*/
public Mono<UpdateResult> updateMulti(Query query, Update update, Class<?> entityClass) {
return doUpdate(determineCollectionName(entityClass), query, update, entityClass, false, true);
return doUpdate(getCollectionName(entityClass), query, update, entityClass, false, true);
}
/*
@@ -1736,8 +1612,19 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return doUpdate(collectionName, query, update, entityClass, false, true);
}
protected Mono<UpdateResult> doUpdate(String collectionName, Query query, @Nullable UpdateDefinition update,
protected Mono<UpdateResult> doUpdate(String collectionName, Query query, @Nullable Update update,
@Nullable Class<?> entityClass, boolean upsert, boolean multi) {
return doUpdate(collectionName, query, (UpdateDefinition) update, entityClass, upsert, multi);
}
private Mono<UpdateResult> doUpdate(String collectionName, Query query, @Nullable UpdateDefinition update,
@Nullable Class<?> entityClass, boolean upsert, boolean multi) {
if (query.isSorted() && LOGGER.isWarnEnabled()) {
LOGGER.warn("{} does not support sort ('{}'). Please use findAndModify() instead.",
upsert ? "Upsert" : "UpdateFirst", serializeToJsonSafely(query.getSortObject()));
}
MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass);
@@ -1762,11 +1649,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
UpdateOptions updateOptions = new UpdateOptions().upsert(upsert);
query.getCollation().map(Collation::toMongoCollation).ifPresent(updateOptions::collation);
if (update.hasArrayFilters()) {
updateOptions.arrayFilters(update.getArrayFilters().stream().map(ArrayFilter::asDocument)
.map(it -> queryMapper.getMappedObject(it, entity)).collect(Collectors.toList()));
}
if (!UpdateMapper.isUpdateObject(updateObj)) {
ReplaceOptions replaceOptions = new ReplaceOptions();
@@ -1815,7 +1697,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return false;
}
return document.containsKey(persistentEntity.getRequiredVersionProperty().getFieldName());
return document.containsKey(persistentEntity.getRequiredIdProperty().getFieldName());
}
/*
@@ -1844,7 +1726,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Assert.notNull(object, "Object must not be null!");
return remove(operations.forEntity(object).getRemoveByQuery(), object.getClass());
return remove(operations.forEntity(object).getByIdQuery(), object.getClass());
}
/*
@@ -1856,7 +1738,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Assert.notNull(object, "Object must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
return doRemove(collectionName, operations.forEntity(object).getRemoveByQuery(), object.getClass());
return doRemove(collectionName, operations.forEntity(object).getByIdQuery(), object.getClass());
}
private void assertUpdateableIdIfNotSet(Object value) {
@@ -1893,7 +1775,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
public Mono<DeleteResult> remove(Query query, Class<?> entityClass) {
return remove(query, entityClass, determineCollectionName(entityClass));
return remove(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -1914,14 +1796,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
Document queryObject = query.getQueryObject();
MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
Document removeQuery = queryMapper.getMappedObject(queryObject, entity);
return execute(collectionName, collection -> {
maybeEmitEvent(new BeforeDeleteEvent<>(removeQuery, entityClass, collectionName));
Document removeQuey = queryMapper.getMappedObject(queryObject, entity);
maybeEmitEvent(new BeforeDeleteEvent<>(removeQuey, entityClass, collectionName));
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName, entityClass,
null, removeQuery);
null, removeQuey);
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
@@ -1931,13 +1814,13 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(removeQuery), collectionName });
new Object[] { serializeToJsonSafely(removeQuey), collectionName });
}
if (query.getLimit() > 0 || query.getSkip() > 0) {
FindPublisher<Document> cursor = new QueryFindPublisherPreparer(query, entityClass)
.prepare(collection.find(removeQuery)) //
.prepare(collection.find(removeQuey)) //
.projection(MappedDocument.getIdOnlyProjection());
return Flux.from(cursor) //
@@ -1948,10 +1831,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return collectionToUse.deleteMany(MappedDocument.getIdIn(val), deleteOptions);
});
} else {
return collectionToUse.deleteMany(removeQuery, deleteOptions);
return collectionToUse.deleteMany(removeQuey, deleteOptions);
}
}).doOnNext(it -> maybeEmitEvent(new AfterDeleteEvent<>(queryObject, entityClass, collectionName))) //
}).doOnNext(deleteResult -> maybeEmitEvent(new AfterDeleteEvent<>(queryObject, entityClass, collectionName)))
.next();
}
@@ -1960,7 +1843,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#findAll(java.lang.Class)
*/
public <T> Flux<T> findAll(Class<T> entityClass) {
return findAll(entityClass, determineCollectionName(entityClass));
return findAll(entityClass, getCollectionName(entityClass));
}
/*
@@ -1988,7 +1871,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
@Override
public <T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass) {
return findAllAndRemove(query, entityClass, determineCollectionName(entityClass));
return findAllAndRemove(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -2006,7 +1889,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
@Override
public <T> Flux<T> tail(Query query, Class<T> entityClass) {
return tail(query, entityClass, determineCollectionName(entityClass));
return tail(query, entityClass, getCollectionName(entityClass));
}
/*
@@ -2052,7 +1935,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
publisher = options.getResumeToken().map(BsonValue::asDocument).map(publisher::resumeAfter).orElse(publisher);
publisher = options.getCollation().map(Collation::toMongoCollation).map(publisher::collation).orElse(publisher);
publisher = options.getResumeBsonTimestamp().map(publisher::startAtOperationTime).orElse(publisher);
publisher = options.getResumeTimestamp().map(it -> new BsonTimestamp((int) it.getEpochSecond(), 0))
.map(publisher::startAtOperationTime).orElse(publisher);
publisher = publisher.fullDocument(options.getFullDocumentLookup().orElse(fullDocument));
return Flux.from(publisher).map(document -> new ChangeStreamEvent<>(document, targetType, getConverter()));
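For orientation, a minimal sketch of driving the options handled above through the public API — assuming the ChangeStreamOptions builder methods shown here, a ReactiveMongoTemplate named template, and a placeholder Person type:

ChangeStreamOptions options = ChangeStreamOptions.builder()
        .resumeAt(Instant.now().minusSeconds(60)) // surfaces as startAtOperationTime above
        .returnFullDocumentOnUpdate()             // surfaces via getFullDocumentLookup()
        .build();

template.changeStream("persons", options, Person.class) // assumed overload (collection, options, type)
        .map(ChangeStreamEvent::getBody)
        .subscribe(System.out::println);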
@@ -2088,8 +1972,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, Class<T> resultType, String mapFunction,
String reduceFunction, MapReduceOptions options) {
return mapReduce(filterQuery, domainType, determineCollectionName(domainType), resultType, mapFunction,
reduceFunction, options);
return mapReduce(filterQuery, domainType, getCollectionName(domainType), resultType, mapFunction, reduceFunction,
options);
}
/*
@@ -2378,7 +2262,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
<S, T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<S> sourceClass,
Class<T> targetClass, FindPublisherPreparer preparer) {
MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(sourceClass);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(sourceClass);
Document mappedFields = getMappedFieldsObject(fields, entity, targetClass);
Document mappedQuery = queryMapper.getMappedObject(query, entity);
@@ -2392,7 +2276,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
new ProjectingReadCallback<>(mongoConverter, sourceClass, targetClass, collectionName), collectionName);
}
private Document getMappedFieldsObject(Document fields, MongoPersistentEntity<?> entity, Class<?> targetType) {
private Document getMappedFieldsObject(Document fields, @Nullable MongoPersistentEntity<?> entity,
Class<?> targetType) {
if (entity == null) {
return fields;
}
Document projectedFields = propertyOperations.computeFieldsForProjection(projectionFactory, fields,
entity.getType(), targetType);
@@ -2494,7 +2383,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
collectionName));
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, update.getArrayFilters().stream().map(ArrayFilter::asDocument).collect(Collectors.toList()), options),
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDocumentCallback<>(this.mongoConverter, entityClass, collectionName), collectionName);
});
}
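The arguments dropped from the callback here carry mapped array filters into findOneAndUpdate. A hedged sketch of the user-facing flow on the branch that keeps them — assuming Update.filterArray(...), the usual static imports for query(...)/where(...), and a placeholder Student type:

Update update = new Update()
        .set("grades.$[g].mean", 100)
        .filterArray(Criteria.where("g.grade").gte(85)); // binds the $[g] placeholder

template.findAndModify(query(where("_id").is(studentId)), update,
        FindAndModifyOptions.options().returnNew(true), Student.class);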
@@ -2659,33 +2548,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return createFlux(collectionName, collection -> {
return Mono.subscriberContext().flatMapMany(context -> {
FindPublisher<Document> findPublisher = collectionCallback.doInCollection(collection);
if (preparer != null) {
findPublisher = preparer.prepare(findPublisher);
}
Long skip = context.getOrDefault("skip", null);
Long take = context.getOrDefault("take", null);
System.out.println(String.format("Using offset: %s and limit: %s", skip, take));
if(skip != null && skip > 0L) {
findPublisher = findPublisher.skip(skip.intValue());
}
if(take != null && take > 0L) {
findPublisher = findPublisher.limit(take.intValue());
}
return Flux.from(findPublisher).doOnNext(System.out::println).map(objectCallback::doWith);
});
FindPublisher<Document> findPublisher = collectionCallback.doInCollection(collection);
if (preparer != null) {
findPublisher = preparer.prepare(findPublisher);
}
return Flux.from(findPublisher).map(objectCallback::doWith);
});
}
@@ -2737,25 +2605,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return type == null ? null : mappingContext.getPersistentEntity(type);
}
private <T> String determineEntityCollectionName(@Nullable T obj) {
if (null != obj) {
return determineCollectionName(obj.getClass());
}
return null;
}
String determineCollectionName(@Nullable Class<?> entityClass) {
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException(
"No class parameter provided, entity collection can't be determined!");
}
return mappingContext.getRequiredPersistentEntity(entityClass).getCollection();
}
private static MappingMongoConverter getDefaultMongoConverter() {
MongoCustomConversions conversions = new MongoCustomConversions(Collections.emptyList());
@@ -2899,7 +2748,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final Document fields;
private final Document sort;
private final Document update;
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
@Override
@@ -2915,12 +2763,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return collection.findOneAndDelete(query, findOneAndDeleteOptions);
}
FindOneAndUpdateOptions findOneAndUpdateOptions = convertToFindOneAndUpdateOptions(options, fields, sort, arrayFilters);
FindOneAndUpdateOptions findOneAndUpdateOptions = convertToFindOneAndUpdateOptions(options, fields, sort);
return collection.findOneAndUpdate(query, update, findOneAndUpdateOptions);
}
private static FindOneAndUpdateOptions convertToFindOneAndUpdateOptions(FindAndModifyOptions options, Document fields,
Document sort, List<Document> arrayFilters) {
private FindOneAndUpdateOptions convertToFindOneAndUpdateOptions(FindAndModifyOptions options, Document fields,
Document sort) {
FindOneAndUpdateOptions result = new FindOneAndUpdateOptions();
@@ -2933,7 +2781,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
result = options.getCollation().map(Collation::toMongoCollation).map(result::collation).orElse(result);
result.arrayFilters(arrayFilters);
return result;
}
@@ -3334,7 +3181,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
onCheckForIndexes((MongoPersistentEntity<?>) entity, subscriptionExceptionHandler);
}
}


@@ -112,7 +112,7 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
return StringUtils.hasText(collection) ? collection : template.getCollectionName(domainType);
}
}


@@ -126,7 +126,7 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
return template.findAndModify(query, update, findAndModifyOptions, targetType, collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndReplace#findAndReplace()
*/
@@ -172,7 +172,7 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#replaceWith(java.lang.Object)
*/
@@ -185,7 +185,7 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@@ -216,7 +216,7 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
return StringUtils.hasText(collection) ? collection : template.getCollectionName(domainType);
}
}
}


@@ -20,7 +20,7 @@ import org.springframework.util.Assert;
/**
* An {@link AggregationExpression} that renders a MongoDB Aggregation Framework expression from the AST of a
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#expressions">SpEL
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#expressions">SpEL
* expression</a>. <br />
* <br />
* <strong>Samples:</strong> <br />


@@ -496,7 +496,7 @@ public class ArrayOperators {
}
NestedDelegatingExpressionAggregationOperationContext nea = new NestedDelegatingExpressionAggregationOperationContext(
context);
context, Collections.singleton(as));
return ((AggregationExpression) condition).toDocument(nea);
}


@@ -69,6 +69,11 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(Field field) {
if (field.isInternal()) {
return new DirectFieldReference(new ExposedField(field, true));
}
return getReference(field, field.getTarget());
}


@@ -43,4 +43,12 @@ public interface Field {
* @return
*/
boolean isAliased();
/**
* @return true if the field name references a local value such as {@code $$this}.
* @since 2.1.11
*/
default boolean isInternal() {
return false;
}
}


@@ -283,6 +283,11 @@ public final class Fields implements Iterable<Field> {
return !getName().equals(getTarget());
}
@Override
public boolean isInternal() {
return getRaw().endsWith("$$this") || getRaw().endsWith("$$value");
}
/**
* @return {@literal true} in case the field name starts with {@code $$}.
* @since 1.10


@@ -15,9 +15,12 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collection;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExpressionFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
/**
@@ -26,21 +29,25 @@ import org.springframework.util.Assert;
* variable.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.10
*/
class NestedDelegatingExpressionAggregationOperationContext implements AggregationOperationContext {
private final AggregationOperationContext delegate;
private final Collection<Field> inners;
/**
* Creates new {@link NestedDelegatingExpressionAggregationOperationContext}.
*
* @param referenceContext must not be {@literal null}.
*/
public NestedDelegatingExpressionAggregationOperationContext(AggregationOperationContext referenceContext) {
NestedDelegatingExpressionAggregationOperationContext(AggregationOperationContext referenceContext,
Collection<Field> inners) {
Assert.notNull(referenceContext, "Reference context must not be null!");
this.delegate = referenceContext;
this.inners = inners;
}
/*
@@ -58,7 +65,25 @@ class NestedDelegatingExpressionAggregationOperationContext implements Aggregati
*/
@Override
public FieldReference getReference(Field field) {
return new ExpressionFieldReference(delegate.getReference(field));
FieldReference reference = delegate.getReference(field);
return isInnerVariableReference(field) ? new ExpressionFieldReference(delegate.getReference(field)) : reference;
}
private boolean isInnerVariableReference(Field field) {
if (inners.isEmpty()) {
return false;
}
for (Field inner : inners) {
if (inner.getName().equals(field.getName())
|| (field.getTarget().contains(".") && field.getTarget().startsWith(inner.getName()))) {
return true;
}
}
return false;
}
/*

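The inner-variable check above is what lets aggregation-local variables such as $$this and $$value pass through unmapped. A minimal sketch, assuming a numeric array property named scores:

AggregationExpression sum = ArrayOperators.Reduce.arrayOf("scores")
        .withInitialValue(0)
        .reduce(ArithmeticOperators.Add.valueOf("$$value").add("$$this")); // kept as-is, not mapped

Aggregation aggregation = Aggregation.newAggregation(
        Aggregation.project().and(sum).as("totalScore"));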

@@ -18,7 +18,9 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
@@ -185,7 +187,8 @@ public class VariableOperators {
map.putAll(context.getMappedObject(input));
map.put("as", itemVariableName);
map.put("in",
functionToApply.toDocument(new NestedDelegatingExpressionAggregationOperationContext(operationContext)));
functionToApply.toDocument(new NestedDelegatingExpressionAggregationOperationContext(operationContext,
Collections.singleton(Fields.field(itemVariableName)))));
return new Document("$map", map);
}
@@ -322,12 +325,14 @@ public class VariableOperators {
private Document getMappedVariable(ExpressionVariable var, AggregationOperationContext context) {
return new Document(var.variableName, var.expression instanceof AggregationExpression
? ((AggregationExpression) var.expression).toDocument(context) : var.expression);
return new Document(var.variableName,
var.expression instanceof AggregationExpression ? ((AggregationExpression) var.expression).toDocument(context)
: var.expression);
}
private Object getMappedIn(AggregationOperationContext context) {
return expression.toDocument(new NestedDelegatingExpressionAggregationOperationContext(context));
return expression.toDocument(new NestedDelegatingExpressionAggregationOperationContext(context,
this.vars.stream().map(var -> Fields.field(var.variableName)).collect(Collectors.toList())));
}
/**

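Registering the $map item variable as an inner field keeps references to it from being rewritten against the exposed fields of the enclosing stage. A sketch, assuming an array property named quizzes:

AggregationExpression adjusted = VariableOperators.Map
        .itemsOf("quizzes")
        .as("grade")
        .andApply(ArithmeticOperators.Add.valueOf("grade").add(2)); // "grade" renders as $$grade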

@@ -32,6 +32,8 @@ import java.util.stream.Stream;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.cglib.proxy.Callback;
import org.springframework.cglib.proxy.Enhancer;
@@ -66,6 +68,8 @@ import com.mongodb.client.model.Filters;
*/
public class DefaultDbRefResolver implements DbRefResolver {
private static final Logger LOGGER = LoggerFactory.getLogger(DefaultDbRefResolver.class);
private final MongoDbFactory mongoDbFactory;
private final PersistenceExceptionTranslator exceptionTranslator;
private final ObjenesisStd objenesis;
@@ -110,6 +114,12 @@ public class DefaultDbRefResolver implements DbRefResolver {
@Override
public Document fetch(DBRef dbRef) {
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("Fetching DBRef '{}' from {}.{}.", dbRef.getId(),
StringUtils.hasText(dbRef.getDatabaseName()) ? dbRef.getDatabaseName() : mongoDbFactory.getDb().getName(),
dbRef.getCollectionName());
}
return getCollection(dbRef).find(Filters.eq("_id", dbRef.getId())).first();
}
@@ -141,7 +151,16 @@ public class DefaultDbRefResolver implements DbRefResolver {
ids.add(ref.getId());
}
List<Document> result = getCollection(refs.iterator().next()) //
DBRef databaseSource = refs.iterator().next();
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("Bulk fetching DBRefs {} from {}.{}.", ids,
StringUtils.hasText(databaseSource.getDatabaseName()) ? databaseSource.getDatabaseName()
: mongoDbFactory.getDb().getName(),
databaseSource.getCollectionName());
}
List<Document> result = getCollection(databaseSource) //
.find(new Document("_id", new Document("$in", ids))) //
.into(new ArrayList<>());
@@ -438,26 +457,34 @@ public class DefaultDbRefResolver implements DbRefResolver {
@Nullable
private synchronized Object resolve() {
if (!resolved) {
if (resolved) {
try {
return callback.resolve(property);
} catch (RuntimeException ex) {
DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
if (translatedException instanceof ClientSessionException) {
throw new LazyLoadingException("Unable to lazily resolve DBRef! Invalid session state.", ex);
}
throw new LazyLoadingException("Unable to lazily resolve DBRef!",
translatedException != null ? translatedException : ex);
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("Accessing already resolved lazy loading property {}.{}",
property.getOwner() != null ? property.getOwner().getName() : "unknown", property.getName());
}
return result;
}
return result;
try {
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("Resolving lazy loading property {}.{}",
property.getOwner() != null ? property.getOwner().getName() : "unknown", property.getName());
}
return callback.resolve(property);
} catch (RuntimeException ex) {
DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
if (translatedException instanceof ClientSessionException) {
throw new LazyLoadingException("Unable to lazily resolve DBRef! Invalid session state.", ex);
}
throw new LazyLoadingException("Unable to lazily resolve DBRef!",
translatedException != null ? translatedException : ex);
}
}
}
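The resolution path above is typically entered through a lazy DBRef proxy. A hedged sketch of a mapping that exercises it — Account and Person are hypothetical types:

@Document
class Account {
    @Id String id;
    @DBRef(lazy = true) Person owner; // proxied; not fetched when the Account loads
}

The first method call on owner goes through the LazyLoadingInterceptor, which invokes resolve(), logs at TRACE as shown, and caches the fetched value for subsequent access.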


@@ -20,7 +20,6 @@ import java.util.Map.Entry;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
@@ -29,7 +28,9 @@ import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.MappingException;
@@ -515,7 +516,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (idProperty != null && !dbObjectAccessor.hasValue(idProperty)) {
Object value = idMapper.convertId(accessor.getProperty(idProperty), idProperty.getFieldType());
Object value = idMapper.convertId(accessor.getProperty(idProperty));
if (value != null) {
dbObjectAccessor.put(idProperty, value);
@@ -619,7 +620,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
MongoPersistentEntity<?> entity = valueType.isSubTypeOf(prop.getType())
MongoPersistentEntity<?> entity = isSubTypeOf(obj.getClass(), prop.getType())
? mappingContext.getRequiredPersistentEntity(obj.getClass())
: mappingContext.getRequiredPersistentEntity(type);
@@ -976,8 +977,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
throw new MappingException("Cannot create a reference to an object with a NULL id.");
}
return dbRefResolver.createDbRef(property == null ? null : property.getDBRef(), entity,
idMapper.convertId(id, idProperty != null ? idProperty.getFieldType() : ObjectId.class));
return dbRefResolver.createDbRef(property == null ? null : property.getDBRef(), entity, idMapper.convertId(id));
}
throw new MappingException("No id property found on class " + entity.getType());
@@ -1007,8 +1007,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.isSubTypeOf(Collection.class) //
? targetType.getType() //
Class<?> collectionType = targetType.getType();
collectionType = isSubTypeOf(collectionType, Collection.class) //
? collectionType //
: List.class;
TypeInformation<?> componentType = targetType.getComponentType() != null //
@@ -1597,6 +1598,26 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbRefResolver.bulkFetch(references);
}
/**
* Create a new {@link MappingMongoConverter} using the given {@link MongoDbFactory} when loading {@link DBRef}.
*
* @return new instance of {@link MappingMongoConverter}. Never {@literal null}.
* @since 2.1.6
*/
public MappingMongoConverter with(MongoDbFactory dbFactory) {
MappingMongoConverter target = new MappingMongoConverter(new DefaultDbRefResolver(dbFactory), mappingContext);
target.applicationContext = applicationContext;
target.conversions = conversions;
target.spELContext = spELContext;
target.setInstantiators(instantiators);
target.typeMapper = typeMapper;
target.afterPropertiesSet();
return target;
}
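In practice, with(...) keeps the configured conversions, instantiators and type mapper while swapping only the DbRef resolution target — useful for per-tenant setups. A sketch, with tenantDbFactory as a placeholder MongoDbFactory:

MappingMongoConverter tenantConverter = converter.with(tenantDbFactory);
MongoTemplate tenantTemplate = new MongoTemplate(tenantDbFactory, tenantConverter);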
/**
* Returns whether the given {@link Iterable} contains {@link DBRef} instances all pointing to the same collection.
*
@@ -1625,6 +1646,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return true;
}
/**
* Returns whether the given type is a sub type of the given reference, i.e. assignable but not the exact same type.
*
* @param type must not be {@literal null}.
* @param reference must not be {@literal null}.
* @return
*/
private static boolean isSubTypeOf(Class<?> type, Class<?> reference) {
return !type.equals(reference) && reference.isAssignableFrom(type);
}
/**
* Marker class used to indicate we have a non root document object here that might be used within an update - so we
* need to preserve type hints for potential nested elements but need to remove it on top level.


@@ -18,8 +18,6 @@ package org.springframework.data.mongodb.core.convert;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.convert.TypeMapper;
@@ -85,18 +83,7 @@ public interface MongoConverter
if (sourceDocument.containsKey("$ref") && sourceDocument.containsKey("$id")) {
Object id = sourceDocument.get("$id");
String collection = sourceDocument.getString("$ref");
MongoPersistentEntity<?> entity = getMappingContext().getPersistentEntity(targetType);
if (entity != null && entity.hasIdProperty()) {
id = convertId(id, entity.getIdProperty().getFieldType());
}
DBRef ref = sourceDocument.containsKey("$db") ? new DBRef(sourceDocument.getString("$db"), collection, id)
: new DBRef(collection, id);
sourceDocument = dbRefResolver.fetch(ref);
sourceDocument = dbRefResolver.fetch(new DBRef(sourceDocument.getString("$ref"), sourceDocument.get("$id")));
if (sourceDocument == null) {
return null;
}
@@ -115,38 +102,4 @@ public interface MongoConverter
}
return getConversionService().convert(source, targetType);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link String}.
*
* @param id
* @param targetType
* @return {@literal null} if source {@literal id} is already {@literal null}.
* @since 2.2
*/
@Nullable
default Object convertId(@Nullable Object id, Class<?> targetType) {
if (id == null) {
return null;
}
if (ClassUtils.isAssignable(ObjectId.class, targetType)) {
if (id instanceof String) {
if (ObjectId.isValid(id.toString())) {
return new ObjectId(id.toString());
}
}
}
try {
return getConversionService().canConvert(id.getClass(), targetType)
? getConversionService().convert(id, targetType)
: convertToMongoType(id, null);
} catch (ConversionException o_O) {
return convertToMongoType(id, null);
}
}
}


@@ -15,12 +15,9 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.convert.ConverterBuilder.*;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URL;
import java.time.Instant;
import java.util.ArrayList;
@@ -93,8 +90,6 @@ abstract class MongoConverters {
converters.add(BinaryToByteArrayConverter.INSTANCE);
converters.add(BsonTimestampToInstantConverter.INSTANCE);
converters.add(reading(String.class, URI.class, URI::create).andWriting(URI::toString));
return converters;
}


@@ -17,8 +17,10 @@ package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -46,14 +48,14 @@ class ObjectPath {
private final @Nullable ObjectPath parent;
private final @Nullable Object object;
private final @Nullable Object idValue;
private final String collection;
private final Lazy<String> collection;
private ObjectPath() {
this.parent = null;
this.object = null;
this.idValue = null;
this.collection = "";
this.collection = Lazy.empty();
}
/**
@@ -64,7 +66,7 @@ class ObjectPath {
* @param idValue
* @param collection
*/
private ObjectPath(ObjectPath parent, Object object, @Nullable Object idValue, String collection) {
private ObjectPath(ObjectPath parent, Object object, @Nullable Object idValue, Lazy<String> collection) {
this.parent = parent;
this.object = object;
@@ -85,7 +87,7 @@ class ObjectPath {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
return new ObjectPath(this, object, id, entity.getCollection());
return new ObjectPath(this, object, id, Lazy.of(entity::getCollection));
}
/**
@@ -175,7 +177,7 @@ class ObjectPath {
}
private String getCollection() {
return collection;
return collection.get();
}
/*

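Lazy both defers and memoizes the lookup, so constructing an ObjectPath no longer evaluates entity.getCollection() eagerly. A sketch of the semantics relied on above:

Lazy<String> collection = Lazy.of(entity::getCollection); // supplier stored, not yet invoked
String name = collection.get();                           // supplier runs exactly once here
String again = collection.get();                          // cached value, no re-evaluation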

@@ -26,7 +26,6 @@ import java.util.Map;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.bson.BsonValue;
@@ -34,6 +33,7 @@ import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Example;
@@ -57,7 +57,6 @@ import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -132,10 +131,8 @@ public class QueryMapper {
// TODO: remove one once QueryMapper can work with Query instances directly
if (Query.isRestrictedTypeKey(key)) {
@SuppressWarnings("unchecked")
Set<Class<?>> restrictedTypes = BsonUtils.get(query, key);
this.converter.getTypeMapper().writeTypeRestrictions(result, restrictedTypes);
continue;
}
@@ -257,16 +254,7 @@ public class QueryMapper {
*/
protected Field createPropertyField(@Nullable MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
if (entity == null) {
return new Field(key);
}
if (Field.ID_KEY.equals(key)) {
return new MetadataBackedField(key, entity, mappingContext, entity.getIdProperty());
}
return new MetadataBackedField(key, entity, mappingContext);
return entity == null ? new Field(key) : new MetadataBackedField(key, entity, mappingContext);
}
/**
@@ -282,7 +270,7 @@ public class QueryMapper {
if (keyword.isOrOrNor() || (keyword.hasIterableValue() && !keyword.isGeometry())) {
Iterable<?> conditions = keyword.getValue();
List<Object> newConditions = new ArrayList<Object>();
List<Object> newConditions = new ArrayList<>();
for (Object condition : conditions) {
newConditions.add(isDocument(condition) ? getMappedObject((Document) condition, entity)
@@ -293,11 +281,12 @@ public class QueryMapper {
}
if (keyword.isSample()) {
return exampleMapper.getMappedExample(keyword.<Example<?>> getValue(), entity);
return exampleMapper.getMappedExample(keyword.getValue(), entity);
}
if (keyword.isJsonSchema()) {
return schemaMapper.mapSchema(new Document(keyword.getKey(), keyword.getValue()), entity.getType());
return schemaMapper.mapSchema(new Document(keyword.getKey(), keyword.getValue()),
entity != null ? entity.getType() : Object.class);
}
return new Document(keyword.getKey(), convertSimpleOrDocument(keyword.getValue(), entity));
@@ -318,6 +307,10 @@ public class QueryMapper {
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property)
: getMappedValue(property.with(keyword.getKey()), value);
if (keyword.isSample() && convertedValue instanceof Document) {
return (Document) convertedValue;
}
return new Document(keyword.key, convertedValue);
}
@@ -325,9 +318,8 @@ public class QueryMapper {
* Returns the mapped value for the given source object assuming it's a value for the given
* {@link MongoPersistentProperty}.
*
* @param documentField the key the value will be bound to eventually
* @param value the source object to be mapped
* @param property the property the value is a value for
* @param newKey the key the value will be bound to eventually
* @return
*/
@Nullable
@@ -342,13 +334,13 @@ public class QueryMapper {
if (valueDbo.containsField("$in") || valueDbo.containsField("$nin")) {
String inKey = valueDbo.containsField("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
List<Object> ids = new ArrayList<>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id, getIdTypeForField(documentField)));
ids.add(convertId(id));
}
resultDbo.put(inKey, ids);
} else if (valueDbo.containsField("$ne")) {
resultDbo.put("$ne", convertId(valueDbo.get("$ne"), getIdTypeForField(documentField)));
resultDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject(resultDbo, Optional.empty());
}
@@ -361,20 +353,20 @@ public class QueryMapper {
if (valueDbo.containsKey("$in") || valueDbo.containsKey("$nin")) {
String inKey = valueDbo.containsKey("$in") ? "$in" : "$nin";
List<Object> ids = new ArrayList<Object>();
List<Object> ids = new ArrayList<>();
for (Object id : (Iterable<?>) valueDbo.get(inKey)) {
ids.add(convertId(id, getIdTypeForField(documentField)));
ids.add(convertId(id));
}
resultDbo.put(inKey, ids);
} else if (valueDbo.containsKey("$ne")) {
resultDbo.put("$ne", convertId(valueDbo.get("$ne"), getIdTypeForField(documentField)));
resultDbo.put("$ne", convertId(valueDbo.get("$ne")));
} else {
return getMappedObject(resultDbo, Optional.empty());
}
return resultDbo;
} else {
return convertId(value, getIdTypeForField(documentField));
return convertId(value);
}
}
@@ -389,14 +381,6 @@ public class QueryMapper {
return convertSimpleOrDocument(value, documentField.getPropertyEntity());
}
private boolean isIdField(Field documentField) {
return documentField.getProperty() != null && documentField.getProperty().isIdProperty();
}
private Class<?> getIdTypeForField(Field documentField) {
return isIdField(documentField) ? documentField.getProperty().getFieldType() : ObjectId.class;
}
/**
* Returns whether the given {@link Field} represents an association reference that together with the given value
* requires conversion to a {@link org.springframework.data.mongodb.core.mapping.DBRef} object. We check whether the
@@ -442,6 +426,10 @@ public class QueryMapper {
@SuppressWarnings("unchecked")
protected Object convertSimpleOrDocument(Object source, @Nullable MongoPersistentEntity<?> entity) {
if (source instanceof Example) {
return exampleMapper.getMappedExample((Example) source, entity);
}
if (source instanceof List) {
return delegateConvertToMongoType(source, entity);
}
@@ -517,14 +505,7 @@ public class QueryMapper {
if (source instanceof DBRef) {
DBRef ref = (DBRef) source;
Object id = convertId(ref.getId(),
property != null && property.isIdProperty() ? property.getFieldType() : ObjectId.class);
if (StringUtils.hasText(ref.getDatabaseName())) {
return new DBRef(ref.getDatabaseName(), ref.getCollectionName(), id);
} else {
return new DBRef(ref.getCollectionName(), id);
}
return new DBRef(ref.getCollectionName(), convertId(ref.getId()));
}
if (source instanceof Iterable) {
@@ -605,24 +586,24 @@ public class QueryMapper {
*
* @param id
* @return
* @since 2.2
*/
@Nullable
public Object convertId(@Nullable Object id) {
return convertId(id, ObjectId.class);
}
/**
* Converts the given raw id value into either {@link ObjectId} or {@link Class targetType}.
*
* @param id can be {@literal null}.
* @param targetType
* @return the converted {@literal id} or {@literal null} if the source was already {@literal null}.
* @since 2.2
*/
@Nullable
public Object convertId(@Nullable Object id, Class<?> targetType) {
return converter.convertId(id, targetType);
if (id == null) {
return null;
}
if (id instanceof String) {
return ObjectId.isValid(id.toString()) ? conversionService.convert(id, ObjectId.class) : id;
}
try {
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService.convert(id, ObjectId.class)
: delegateConvertToMongoType(id, null);
} catch (ConversionException o_O) {
return delegateConvertToMongoType(id, null);
}
}
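Illustrative behavior of the reworked convertId(Object), using hypothetical values:

Object a = queryMapper.convertId("5ca4a34fa264a01fa4ca6540"); // valid 24-char hex -> ObjectId
Object b = queryMapper.convertId("not-an-object-id");        // invalid hex -> String kept as-is
Object c = queryMapper.convertId(null);                      // null in, null out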
/**
@@ -770,8 +751,6 @@ public class QueryMapper {
*/
protected static class Field {
protected static final Pattern POSITIONAL_OPERATOR = Pattern.compile("\\$\\[.*\\]");
private static final String ID_KEY = "_id";
protected final String name;
@@ -952,9 +931,7 @@ public class QueryMapper {
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty != null) {
return name.equals(idProperty.getName()) || name.equals(idProperty.getFieldName())
|| name.endsWith("." + idProperty.getName()) || name.endsWith("." + idProperty.getFieldName());
return name.equals(idProperty.getName()) || name.equals(idProperty.getFieldName());
}
return DEFAULT_ID_NAMES.contains(name);
@@ -1042,17 +1019,15 @@ public class QueryMapper {
@Nullable
private PersistentPropertyPath<MongoPersistentProperty> getPath(String pathExpression) {
String rawPath = pathExpression.replaceAll("\\.\\d+", "");
PropertyPath path = forName(rawPath);
if (path == null || isPathToJavaLangClassProperty(path)) {
return null;
}
try {
String rawPath = pathExpression.replaceAll("\\.\\d+", "") //
.replaceAll(POSITIONAL_OPERATOR.pattern(), "");
PropertyPath path = PropertyPath.from(rawPath, entity.getTypeInformation());
if (isPathToJavaLangClassProperty(path)) {
return null;
}
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
@@ -1073,7 +1048,35 @@ public class QueryMapper {
}
return propertyPath;
} catch (InvalidPersistentPropertyPath e) {
return null;
}
}
/**
* Querydsl happens to map id fields directly to {@literal _id} which breaks {@link PropertyPath} resolution. So if
* the first attempt fails we try to replace {@literal _id} with just {@literal id} and see if we can resolve it
* then.
*
* @param path
* @return the path or {@literal null}
*/
@Nullable
private PropertyPath forName(String path) {
try {
if (entity.getPersistentProperty(path) != null) {
return PropertyPath.from(Pattern.quote(path), entity.getTypeInformation());
}
return PropertyPath.from(path, entity.getTypeInformation());
} catch (PropertyReferenceException | InvalidPersistentPropertyPath e) {
if (path.endsWith("_id")) {
return forName(path.substring(0, path.length() - 3) + "id");
}
return null;
}
}
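Illustrated with a hypothetical Person type exposing an id property — the direct form fails, the retried form resolves, mirroring the fallback above:

// PropertyPath.from("_id", ClassTypeInformation.from(Person.class)) -> PropertyReferenceException
PropertyPath path = PropertyPath.from("id", ClassTypeInformation.from(Person.class)); // resolves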
@@ -1196,11 +1199,6 @@ public class QueryMapper {
return true;
}
Matcher matcher = POSITIONAL_OPERATOR.matcher(partial);
if (matcher.find()) {
return true;
}
try {
Long.valueOf(partial);
return true;


@@ -289,7 +289,7 @@ public class UpdateMapper extends QueryMapper {
public MetadataBackedUpdateField(MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(key.replaceAll("\\.\\$(\\[.*\\])?", ""), entity, mappingContext);
super(key.replaceAll("\\.\\$", ""), entity, mappingContext);
this.key = key;
}
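The widened pattern additionally strips filtered positional operators, so update keys reduce to resolvable property paths. Illustrative:

String filtered = "grades.$[element].mean".replaceAll("\\.\\$(\\[.*\\])?", ""); // -> "grades.mean"
String plain = "grades.$.mean".replaceAll("\\.\\$(\\[.*\\])?", "");             // -> "grades.mean"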


@@ -16,7 +16,7 @@
package org.springframework.data.mongodb.core.geo;
/**
* Interface definition for structures defined in GeoJSON ({@link http://geojson.org/}) format.
* Interface definition for structures defined in GeoJSON ({@link https://geojson.org/}) format.
*
* @author Christoph Strobl
* @since 1.7
@@ -27,7 +27,7 @@ public interface GeoJson<T extends Iterable<?>> {
* String value representing the type of the {@link GeoJson} object.
*
* @return will never be {@literal null}.
* @see <a href="http://geojson.org/geojson-spec.html#geojson-objects">http://geojson.org/geojson-spec.html#geojson-objects</a>
* @see <a href="https://geojson.org/geojson-spec.html#geojson-objects">https://geojson.org/geojson-spec.html#geojson-objects</a>
*/
String getType();
@@ -36,7 +36,7 @@ public interface GeoJson<T extends Iterable<?>> {
* determined by {@link #getType()} of geometry.
*
* @return will never be {@literal null}.
* @see <a href="http://geojson.org/geojson-spec.html#geometry-objects">http://geojson.org/geojson-spec.html#geometry-objects</a>
* @see <a href="https://geojson.org/geojson-spec.html#geometry-objects">https://geojson.org/geojson-spec.html#geometry-objects</a>
*/
T getCoordinates();
}


@@ -27,7 +27,7 @@ import org.springframework.util.ObjectUtils;
*
* @author Christoph Strobl
* @since 1.7
* @see <a href="http://geojson.org/geojson-spec.html#geometry-collection">http://geojson.org/geojson-spec.html#geometry-collection</a>
* @see <a href="https://geojson.org/geojson-spec.html#geometry-collection">https://geojson.org/geojson-spec.html#geometry-collection</a>
*/
public class GeoJsonGeometryCollection implements GeoJson<Iterable<GeoJson<?>>> {


@@ -24,7 +24,7 @@ import org.springframework.data.geo.Point;
*
* @author Christoph Strobl
* @since 1.7
* @see <a href="http://geojson.org/geojson-spec.html#linestring">http://geojson.org/geojson-spec.html#linestring</a>
* @see <a href="https://geojson.org/geojson-spec.html#linestring">https://geojson.org/geojson-spec.html#linestring</a>
*/
public class GeoJsonLineString extends GeoJsonMultiPoint {


@@ -28,7 +28,7 @@ import org.springframework.util.ObjectUtils;
*
* @author Christoph Strobl
* @since 1.7
* @see <a href="http://geojson.org/geojson-spec.html#multilinestring">http://geojson.org/geojson-spec.html#multilinestring</a>
* @see <a href="https://geojson.org/geojson-spec.html#multilinestring">https://geojson.org/geojson-spec.html#multilinestring</a>
*/
public class GeoJsonMultiLineString implements GeoJson<Iterable<GeoJsonLineString>> {


@@ -29,7 +29,7 @@ import org.springframework.util.ObjectUtils;
*
* @author Christoph Strobl
* @since 1.7
* @see <a href="http://geojson.org/geojson-spec.html#multipoint">http://geojson.org/geojson-spec.html#multipoint</a>
* @see <a href="https://geojson.org/geojson-spec.html#multipoint">https://geojson.org/geojson-spec.html#multipoint</a>
*/
public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {


@@ -25,7 +25,7 @@ import org.springframework.data.geo.Point;
*
* @author Christoph Strobl
* @since 1.7
* @see <a href="http://geojson.org/geojson-spec.html#point">http://geojson.org/geojson-spec.html#point</a>
* @see <a href="https://geojson.org/geojson-spec.html#point">https://geojson.org/geojson-spec.html#point</a>
*/
public class GeoJsonPoint extends Point implements GeoJson<List<Double>> {


@@ -32,7 +32,7 @@ import org.springframework.util.Assert;
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.7
* @see <a href="http://geojson.org/geojson-spec.html#polygon">http://geojson.org/geojson-spec.html#polygon</a>
* @see <a href="https://geojson.org/geojson-spec.html#polygon">https://geojson.org/geojson-spec.html#polygon</a>
*/
public class GeoJsonPolygon extends Polygon implements GeoJson<List<GeoJsonLineString>> {


@@ -55,8 +55,7 @@ public @interface CompoundIndex {
/**
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
*/
boolean unique() default false;
@@ -64,15 +63,13 @@ public @interface CompoundIndex {
* If set to true index will skip over any document that is missing the indexed field.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
*/
boolean sparse() default false;
/**
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @deprecated since 2.1. No longer supported by MongoDB as of server version 3.0.
*/
@Deprecated
@@ -134,8 +131,7 @@ public @interface CompoundIndex {
* If {@literal true} the index will be created in the background.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
* @see <a href="https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
*/
boolean background() default false;


@@ -15,54 +15,25 @@
*/
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
/**
* {@link IndexResolver} finds those {@link IndexDefinition}s to be created for a given class.
*
* @author Christoph Strobl
* @author Thomas Darimont
* @author Mark Paluch
* @since 1.5
*/
public interface IndexResolver {
interface IndexResolver {
/**
* Creates a new {@link IndexResolver} given {@link MongoMappingContext}.
*
* @param mappingContext must not be {@literal null}.
* @return the new {@link IndexResolver}.
* @since 2.2
*/
static IndexResolver create(MongoMappingContext mappingContext) {
Assert.notNull(mappingContext, "MongoMappingContext must not be null!");
return new MongoPersistentEntityIndexResolver(mappingContext);
}
/**
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s
* are created for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s are created
* for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param typeInformation
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
*/
Iterable<? extends IndexDefinition> resolveIndexFor(TypeInformation<?> typeInformation);
/**
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s
* are created for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param entityType
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
* @since 2.2
*/
default Iterable<? extends IndexDefinition> resolveIndexFor(Class<?> entityType) {
return resolveIndexFor(ClassTypeInformation.from(entityType));
}
Iterable<? extends IndexDefinitionHolder> resolveIndexFor(TypeInformation<?> typeInformation);
}
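A sketch of the programmatic index setup the static factory enables — mongoTemplate, mappingContext and Person are placeholders:

IndexOperations indexOps = mongoTemplate.indexOps(Person.class);
IndexResolver resolver = IndexResolver.create(mappingContext);
resolver.resolveIndexFor(Person.class).forEach(indexOps::ensureIndex);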


@@ -31,7 +31,7 @@ import java.lang.annotation.Target;
* @author Christoph Strobl
* @author Jordi Llach
*/
@Target({ ElementType.ANNOTATION_TYPE, ElementType.FIELD })
@Target({ElementType.ANNOTATION_TYPE, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
public @interface Indexed {
@@ -39,8 +39,7 @@ public @interface Indexed {
* If set to true reject all documents that contain a duplicate value for the indexed field.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-unique/">https://docs.mongodb.org/manual/core/index-unique/</a>
*/
boolean unique() default false;
@@ -50,15 +49,13 @@ public @interface Indexed {
* If set to true index will skip over any document that is missing the indexed field.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-sparse/">https://docs.mongodb.org/manual/core/index-sparse/</a>
*/
boolean sparse() default false;
/**
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @see <a href="https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping">https://docs.mongodb.org/manual/core/index-creation/#index-creation-duplicate-dropping</a>
* @deprecated since 2.1. No longer supported by MongoDB as of server version 3.0.
*/
@Deprecated
@@ -118,8 +115,7 @@ public @interface Indexed {
* If {@literal true} the index will be created in the background.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
* @see <a href="https://docs.mongodb.org/manual/core/indexes/#background-construction">https://docs.mongodb.org/manual/core/indexes/#background-construction</a>
*/
boolean background() default false;
@@ -127,8 +123,7 @@ public @interface Indexed {
* Configures the number of seconds after which the collection should expire. Defaults to -1 for no expiry.
*
* @return
* @see <a href=
* "https://docs.mongodb.org/manual/tutorial/expire-data/">https://docs.mongodb.org/manual/tutorial/expire-data/</a>
* @see <a href="https://docs.mongodb.org/manual/tutorial/expire-data/">https://docs.mongodb.org/manual/tutorial/expire-data/</a>
*/
int expireAfterSeconds() default -1;
}
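A sketch of the TTL attribute in use — Session is a hypothetical type; MongoDB removes documents roughly an hour after the indexed date:

@Document
class Session {
    @Id String id;
    @Indexed(expireAfterSeconds = 3600)
    Date createdAt;
}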


@@ -1,84 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.util.Collections;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentSkipListSet;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* @author Christoph Strobl
* @since 2.2
*/
class JustOnceLogger {
private static final Map<String, Set<String>> KNOWN_LOGS = new ConcurrentHashMap<>();
private static final String AUTO_INDEX_CREATION_CONFIG_CHANGE;
static {
AUTO_INDEX_CREATION_CONFIG_CHANGE = "Automatic index creation will be disabled by default as of Spring Data MongoDB 3.x."
+ System.lineSeparator()
+ "\tPlease use 'MongoMappingContext#setAutoIndexCreation(boolean)' or override 'MongoConfigurationSupport#autoIndexCreation()' to be explicit."
+ System.lineSeparator()
+ "\tHowever, we recommend setting up indices manually in an application ready block. You may use index derivation there as well."
+ System.lineSeparator() + System.lineSeparator() //
+ "\t> -----------------------------------------------------------------------------------------"
+ System.lineSeparator() //
+ "\t> @EventListener(ApplicationReadyEvent.class)" + System.lineSeparator() //
+ "\t> public void initIndicesAfterStartup() {" + System.lineSeparator() //
+ "\t>" + System.lineSeparator() //
+ "\t> IndexOperations indexOps = mongoTemplate.indexOps(DomainType.class);" + System.lineSeparator()//
+ "\t>" + System.lineSeparator() //
+ "\t> IndexResolver resolver = new MongoPersistentEntityIndexResolver(mongoMappingContext);"
+ System.lineSeparator() //
+ "\t> resolver.resolveIndexFor(DomainType.class).forEach(indexOps::ensureIndex);" + System.lineSeparator() //
+ "\t> }" + System.lineSeparator() //
+ "\t> -----------------------------------------------------------------------------------------"
+ System.lineSeparator();
}
static void logWarnIndexCreationConfigurationChange(String loggerName) {
warnOnce(loggerName, AUTO_INDEX_CREATION_CONFIG_CHANGE);
}
static void warnOnce(String loggerName, String message) {
Logger logger = LoggerFactory.getLogger(loggerName);
if (!logger.isWarnEnabled()) {
return;
}
if (!KNOWN_LOGS.containsKey(loggerName)) {
KNOWN_LOGS.put(loggerName, new ConcurrentSkipListSet<>(Collections.singleton(message)));
logger.warn(message);
} else {
Set<String> messages = KNOWN_LOGS.get(loggerName);
if (messages.contains(message)) {
return;
}
messages.add(message);
logger.warn(message);
}
}
}


@@ -63,13 +63,11 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
/**
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
* @param indexOperationsProvider must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
IndexOperationsProvider indexOperationsProvider) {
this(mappingContext, indexOperationsProvider, IndexResolver.create(mappingContext));
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, IndexOperationsProvider indexOperationsProvider) {
this(mappingContext, indexOperationsProvider, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
@@ -80,8 +78,8 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
* @param mongoDbFactory must not be {@literal null}.
* @param indexResolver must not be {@literal null}.
*/
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
IndexOperationsProvider indexOperationsProvider, IndexResolver indexResolver) {
public MongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, IndexOperationsProvider indexOperationsProvider,
IndexResolver indexResolver) {
Assert.notNull(mappingContext, "MongoMappingContext must not be null!");
Assert.notNull(indexOperationsProvider, "IndexOperationsProvider must not be null!");
@@ -110,7 +108,6 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes((MongoPersistentEntity<?>) entity);
}
}
@@ -134,16 +131,8 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
private void checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
if (entity.isAnnotationPresent(Document.class)) {
for (IndexDefinition indexDefinition : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
JustOnceLogger.logWarnIndexCreationConfigurationChange(this.getClass().getName());
IndexDefinitionHolder indexToCreate = indexDefinition instanceof IndexDefinitionHolder
? (IndexDefinitionHolder) indexDefinition
: new IndexDefinitionHolder("", indexDefinition, entity.getCollection());
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
createIndex(indexToCreate);
}
}
}
@@ -157,8 +146,8 @@ public class MongoPersistentEntityIndexCreator implements ApplicationListener<Ma
} catch (UncategorizedMongoDbException ex) {
if (ex.getCause() instanceof MongoException
&& MongoDbErrorCodes.isDataIntegrityViolationCode(((MongoException) ex.getCause()).getCode())) {
if (ex.getCause() instanceof MongoException &&
MongoDbErrorCodes.isDataIntegrityViolationCode(((MongoException) ex.getCause()).getCode())) {
IndexInfo existingIndex = fetchIndexInformation(indexDefinition);
String message = "Cannot create index for '%s' in collection '%s' with keys '%s' and options '%s'.";


@@ -105,14 +105,15 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
Document document = root.findAnnotation(Document.class);
Assert.notNull(document, "Given entity is not collection root.");
final List<IndexDefinitionHolder> indexInformation = new ArrayList<>();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", root.getCollection(), root));
indexInformation.addAll(potentiallyCreateTextIndexDefinition(root));
List<IndexDefinitionHolder> indexInformation = new ArrayList<>();
String collection = root.getCollection();
indexInformation.addAll(potentiallyCreateCompoundIndexDefinitions("", collection, root));
indexInformation.addAll(potentiallyCreateTextIndexDefinition(root, collection));
root.doWithProperties((PropertyHandler<MongoPersistentProperty>) property -> this
.potentiallyAddIndexForProperty(root, property, indexInformation, new CycleGuard()));
indexInformation.addAll(resolveIndexesForDbrefs("", root.getCollection(), root));
indexInformation.addAll(resolveIndexesForDbrefs("", collection, root));
return indexInformation;
}
@@ -121,13 +122,15 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
List<IndexDefinitionHolder> indexes, CycleGuard guard) {
try {
String collection = root.getCollection();
if (persistentProperty.isEntity()) {
indexes.addAll(resolveIndexForClass(persistentProperty.getTypeInformation().getActualType(),
persistentProperty.getFieldName(), Path.of(persistentProperty), root.getCollection(), guard));
persistentProperty.getFieldName(), Path.of(persistentProperty), collection, guard));
}
IndexDefinitionHolder indexDefinitionHolder = createIndexDefinitionHolderForProperty(
persistentProperty.getFieldName(), root.getCollection(), persistentProperty);
persistentProperty.getFieldName(), collection, persistentProperty);
if (indexDefinitionHolder != null) {
indexes.add(indexDefinitionHolder);
}
@@ -212,7 +215,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
private Collection<? extends IndexDefinitionHolder> potentiallyCreateTextIndexDefinition(
MongoPersistentEntity<?> root) {
MongoPersistentEntity<?> root, String collection) {
String name = root.getType().getSimpleName() + "_TextIndex";
if (name.getBytes().length > 127) {
@@ -248,7 +251,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return Collections.emptyList();
}
IndexDefinitionHolder holder = new IndexDefinitionHolder("", indexDefinition, root.getCollection());
IndexDefinitionHolder holder = new IndexDefinitionHolder("", indexDefinition, collection);
return Collections.singletonList(holder);
}

View File

@@ -63,7 +63,7 @@ public class ReactiveMongoPersistentEntityIndexCreator {
*/
public ReactiveMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
ReactiveIndexOperationsProvider operationsProvider) {
this(mappingContext, operationsProvider, IndexResolver.create(mappingContext));
this(mappingContext, operationsProvider, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
@@ -125,12 +125,7 @@ public class ReactiveMongoPersistentEntityIndexCreator {
List<Mono<?>> publishers = new ArrayList<>();
if (entity.isAnnotationPresent(Document.class)) {
for (IndexDefinition indexDefinition : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
IndexDefinitionHolder indexToCreate = indexDefinition instanceof IndexDefinitionHolder
? (IndexDefinitionHolder) indexDefinition
: new IndexDefinitionHolder("", indexDefinition, entity.getCollection());
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
publishers.add(createIndex(indexToCreate));
}
}
@@ -140,8 +135,6 @@ public class ReactiveMongoPersistentEntityIndexCreator {
Mono<String> createIndex(IndexDefinitionHolder indexDefinition) {
JustOnceLogger.logWarnIndexCreationConfigurationChange(this.getClass().getName());
return operationsProvider.indexOps(indexDefinition.getCollection()).ensureIndex(indexDefinition) //
.onErrorResume(ReactiveMongoPersistentEntityIndexCreator::isDataIntegrityViolation,
e -> translateException(e, indexDefinition));
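The reactive variant routes only matching errors into translation via Reactor's onErrorResume(predicate, fallback). A hedged sketch of that shape; the predicate and the translation below are stand-ins, not the project's actual isDataIntegrityViolation/translateException logic:

import reactor.core.publisher.Mono;

class ReactiveEnsureIndexSketch {

	// Only errors matching the predicate are resumed; everything else
	// propagates to the subscriber unchanged.
	static Mono<String> ensureIndexWithTranslation(Mono<String> ensureIndex) {
		return ensureIndex.onErrorResume(
				e -> e instanceof IllegalStateException, // stand-in predicate
				e -> Mono.error(new RuntimeException("translated", e))); // stand-in translation
	}
}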

View File

@@ -67,7 +67,8 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
/**
* Creates a new {@link BasicMongoPersistentProperty}.
*
* @param property
* @param field
* @param propertyDescriptor
* @param owner
* @param simpleTypeHolder
* @param fieldNamingStrategy
@@ -143,32 +144,6 @@ public class BasicMongoPersistentProperty extends AnnotationBasedPersistentPrope
return fieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentProperty#getFieldType()
*/
@Override
public Class<?> getFieldType() {
if (!isIdProperty()) {
return getType();
}
MongoId idAnnotation = findAnnotation(MongoId.class);
if (idAnnotation == null) {
return FieldType.OBJECT_ID.getJavaClass();
}
FieldType fieldType = idAnnotation.targetType();
if (fieldType == FieldType.IMPLICIT) {
return getType();
}
return fieldType.getJavaClass();
}
/**
* @return {@literal true} if the {@link org.springframework.data.mongodb.core.mapping.Field} annotation is present
* with a non-blank {@link org.springframework.data.mongodb.core.mapping.Field#value()}.
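The removed getFieldType() (a 2.2 feature) derives the persisted field type for id properties: non-id properties keep their declared type, ids default to ObjectId unless @MongoId pins a target type. A simplified standalone sketch of that decision chain; method and parameter names are hypothetical:

import org.bson.types.ObjectId;

class FieldTypeResolution {

	// 1. non-id property        -> declared type
	// 2. id without @MongoId    -> ObjectId
	// 3. @MongoId(IMPLICIT)     -> declared type
	// 4. @MongoId(targetType)   -> the target type's Java class
	static Class<?> resolveFieldType(boolean isId, Class<?> declaredType, Class<?> mongoIdTarget) {
		if (!isId) {
			return declaredType;
		}
		if (mongoIdTarget == null) { // no @MongoId present
			return ObjectId.class;
		}
		if (mongoIdTarget == Object.class) { // stands in for FieldType.IMPLICIT
			return declaredType;
		}
		return mongoIdTarget;
	}
}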

View File

@@ -33,7 +33,6 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
private @Nullable boolean dbRefResolved;
private @Nullable DBRef dbref;
private @Nullable String fieldName;
private @Nullable Class<?> fieldType;
private @Nullable Boolean usePropertyAccess;
private @Nullable Boolean isTransient;
@@ -90,20 +89,6 @@ public class CachingMongoPersistentProperty extends BasicMongoPersistentProperty
return this.fieldName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.BasicMongoPersistentProperty#getFieldType()
*/
@Override
public Class<?> getFieldType() {
if (this.fieldType == null) {
this.fieldType = super.getFieldType();
}
return this.fieldType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.AnnotationBasedPersistentProperty#usePropertyAccess()
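CachingMongoPersistentProperty memoizes expensive metadata lookups in nullable fields: compute once on first access, reuse afterwards. A minimal sketch of that lazy-caching idiom (plain field writes, not thread-safe, matching the original):

class CachingPropertySketch {

	private String fieldName; // computed lazily, cached across calls

	String getFieldName() {
		if (this.fieldName == null) {
			this.fieldName = computeFieldName(); // expensive lookup happens once
		}
		return this.fieldName;
	}

	private String computeFieldName() {
		return "field"; // stands in for the annotation-based resolution
	}
}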

View File

@@ -1,65 +0,0 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import org.bson.types.ObjectId;
/**
* Enumeration of field value types that can be used to represent a {@link org.bson.Document} field value. This
* enumeration contains a subset of {@link org.bson.BsonType} that is supported by the mapping and conversion
* components.
* <p/>
* Bson types are identified by a {@code byte} {@link #getBsonType() value}. This enumeration typically returns the
* according bson type value except for {@link #IMPLICIT} which is a marker to derive the field type from a property.
*
* @author Mark Paluch
* @since 2.2
* @see org.bson.BsonType
*/
public enum FieldType {
/**
* Implicit type that is derived from the property value.
*/
IMPLICIT(-1, Object.class), STRING(2, String.class), OBJECT_ID(7, ObjectId.class);
private final int bsonType;
private final Class<?> javaClass;
FieldType(int bsonType, Class<?> javaClass) {
this.bsonType = bsonType;
this.javaClass = javaClass;
}
/**
* Returns the BSON type identifier. Can be {@code -1} if {@link FieldType} maps to a synthetic Bson type.
*
* @return the BSON type identifier. Can be {@code -1} if {@link FieldType} maps to a synthetic Bson type.
*/
public int getBsonType() {
return bsonType;
}
/**
* Returns the Java class used to represent the type.
*
* @return the Java class used to represent the type.
*/
public Class<?> getJavaClass() {
return javaClass;
}
}
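A short usage sketch of the enum shown above, assuming the 2.2 artifact is on the classpath:

import org.bson.types.ObjectId;
import org.springframework.data.mongodb.core.mapping.FieldType;

class FieldTypeUsage {
	public static void main(String[] args) {
		System.out.println(FieldType.OBJECT_ID.getBsonType());                    // 7
		System.out.println(FieldType.OBJECT_ID.getJavaClass() == ObjectId.class); // true
		System.out.println(FieldType.IMPLICIT.getBsonType());                     // -1 (synthetic marker)
	}
}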

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.core.annotation.AliasFor;
import org.springframework.data.annotation.Id;
/**
* {@link MongoId} represents a MongoDB specific {@link Id} annotation that allows customizing {@literal id} conversion.
* Id properties use {@link org.springframework.data.mongodb.core.mapping.FieldType#IMPLICIT} as the default
* {@literal id's} target type. This means that the actual property value is used. No conversion attempts to any other
* type are made. <br />
* In contrast to {@link Id &#64;Id}, {@link String} {@literal id's} are stored as such even when the actual value
* represents a valid {@link org.bson.types.ObjectId#isValid(String) ObjectId hex String}. To trigger {@link String} to
* {@link org.bson.types.ObjectId} conversion use {@link MongoId#targetType() &#64;MongoId(FieldType.OBJECT_ID)}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
*/
@Id
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD, ElementType.METHOD, ElementType.ANNOTATION_TYPE })
public @interface MongoId {
/**
* @return the preferred id type.
* @see #targetType()
*/
@AliasFor("targetType")
FieldType value() default FieldType.IMPLICIT;
/**
* Get the preferred {@literal _id} type to be used. Defaults to {@link FieldType#IMPLICIT} which uses the property's
* type. If defined differently, the given value will be converted into the desired target type via
* {@link org.springframework.data.mongodb.core.convert.MongoConverter#convertId(Object, Class)}.
*
* @return the preferred {@literal id} type. {@link FieldType#IMPLICIT} by default.
*/
@AliasFor("value")
FieldType targetType() default FieldType.IMPLICIT;
}
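Usage following the javadoc above, assuming the 2.2 annotation is available: keep a String id as-is, or force ObjectId conversion via the target type:

import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.FieldType;
import org.springframework.data.mongodb.core.mapping.MongoId;

@Document
class Account {

	// Stored exactly as the String value, even if it looks like an ObjectId.
	@MongoId String id;
}

@Document
class Order {

	// Valid 24-character hex Strings are converted to ObjectId on write.
	@MongoId(FieldType.OBJECT_ID) String id;
}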

View File

@@ -43,7 +43,6 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
private FieldNamingStrategy fieldNamingStrategy = DEFAULT_NAMING_STRATEGY;
private @Nullable ApplicationContext context;
private boolean autoIndexCreation = true;
/**
* Creates a new {@link MongoMappingContext}.
@@ -102,30 +101,4 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
this.context = applicationContext;
}
/**
* Returns whether auto-index creation is enabled or disabled. <br />
* <strong>NOTE:</strong> Index creation should happen at a well-defined time that is ideally controlled by the
* application itself.
*
* @return {@literal true} when auto-index creation is enabled; {@literal false} otherwise.
* @since 2.2
* @see org.springframework.data.mongodb.core.index.Indexed
*/
public boolean isAutoIndexCreation() {
return autoIndexCreation;
}
/**
* Enables/disables auto-index creation. <br />
* <strong>NOTE:</strong> Index creation should happen at a well-defined time that is ideally controlled by the
* application itself.
*
* @param autoCreateIndexes set to {@literal false} to disable auto-index creation.
* @since 2.2
* @see org.springframework.data.mongodb.core.index.Indexed
*/
public void setAutoIndexCreation(boolean autoCreateIndexes) {
this.autoIndexCreation = autoCreateIndexes;
}
}
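Usage of the removed toggle against the 2.2 API shown above:

import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class MappingContextConfig {

	static MongoMappingContext mappingContext() {
		MongoMappingContext context = new MongoMappingContext();
		// Disable automatic index creation so indexes are created at a
		// well-defined point controlled by the application.
		context.setAutoIndexCreation(false);
		return context;
	}
}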

View File

@@ -38,15 +38,6 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
*/
String getFieldName();
/**
* Returns the {@link Class Java FieldType} of the field a property is persisted to.
*
* @return
* @since 2.2
* @see FieldType
*/
Class<?> getFieldType();
/**
* Returns the order of the field if defined. Will return -1 if undefined.
*

View File

@@ -43,16 +43,15 @@ public abstract class MongoSimpleTypes {
public static final Set<Class<?>> AUTOGENERATED_ID_TYPES;
static {
Set<Class<?>> classes = new HashSet<Class<?>>();
Set<Class<?>> classes = new HashSet<>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
AUTOGENERATED_ID_TYPES = Collections.unmodifiableSet(classes);
Set<Class<?>> simpleTypes = new HashSet<Class<?>>();
Set<Class<?>> simpleTypes = new HashSet<>();
simpleTypes.add(DBRef.class);
simpleTypes.add(ObjectId.class);
simpleTypes.add(BsonObjectId.class);
simpleTypes.add(CodeWScope.class);
simpleTypes.add(CodeWithScope.class);
simpleTypes.add(org.bson.Document.class);
@@ -67,7 +66,6 @@ public abstract class MongoSimpleTypes {
simpleTypes.add(BsonDbPointer.class);
simpleTypes.add(BsonDecimal128.class);
simpleTypes.add(BsonDocument.class);
simpleTypes.add(BsonDocument.class);
simpleTypes.add(BsonDouble.class);
simpleTypes.add(BsonInt32.class);
simpleTypes.add(BsonInt64.class);
@@ -82,7 +80,19 @@ public abstract class MongoSimpleTypes {
}
private static final Set<Class<?>> MONGO_SIMPLE_TYPES;
public static final SimpleTypeHolder HOLDER = new SimpleTypeHolder(MONGO_SIMPLE_TYPES, true);
public static final SimpleTypeHolder HOLDER = new SimpleTypeHolder(MONGO_SIMPLE_TYPES, true) {
@Override
public boolean isSimpleType(Class<?> type) {
if (type.getName().startsWith("java.time")) {
return false;
}
return super.isSimpleType(type);
}
};
private MongoSimpleTypes() {}
}
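Effect of the customized holder above: JSR-310 types are routed through Spring Data's converters instead of being handed to the driver as-is, while default simple types stay simple. A small check, assuming this variant of MongoSimpleTypes:

import java.time.Instant;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;

class SimpleTypeCheck {
	public static void main(String[] args) {
		System.out.println(MongoSimpleTypes.HOLDER.isSimpleType(Instant.class)); // false
		System.out.println(MongoSimpleTypes.HOLDER.isSimpleType(String.class));  // true
	}
}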

View File

@@ -115,7 +115,8 @@ class ChangeStreamTask extends CursorReadingTask<ChangeStreamDocument<Document>,
.orElseGet(() -> ClassUtils.isAssignable(Document.class, targetType) ? FullDocument.DEFAULT
: FullDocument.UPDATE_LOOKUP);
startAt = changeStreamOptions.getResumeBsonTimestamp().orElse(null);
startAt = changeStreamOptions.getResumeTimestamp().map(it -> new BsonTimestamp((int) it.getEpochSecond(), 0))
.orElse(null);
}
MongoDatabase db = StringUtils.hasText(options.getDatabaseName())
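This variant converts the configured resume Instant into a driver BsonTimestamp, whose high 32 bits hold epoch seconds and low 32 bits an ordinal within that second. A minimal sketch of the conversion used above:

import java.time.Instant;

import org.bson.BsonTimestamp;

class ResumeTimestampSketch {

	static BsonTimestamp toBsonTimestamp(Instant instant) {
		// BsonTimestamp(seconds, increment): seconds since epoch in the high
		// 32 bits, ordinal within that second in the low 32 bits.
		return new BsonTimestamp((int) instant.getEpochSecond(), 0);
	}
}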

View File

@@ -73,26 +73,37 @@ abstract class CursorReadingTask<T, R> implements Task {
@Override
public void run() {
start();
try {
while (isRunning()) {
start();
try {
while (isRunning()) {
T next = execute(this::getNext);
try {
if (next != null) {
emitMessage(createMessage(next, targetType, request.getRequestOptions()));
} else {
Thread.sleep(10);
T next = execute(this::getNext);
if (next != null) {
emitMessage(createMessage(next, targetType, request.getRequestOptions()));
} else {
Thread.sleep(10);
}
} catch (InterruptedException e) {
synchronized (lifecycleMonitor) {
state = State.CANCELLED;
}
Thread.currentThread().interrupt();
break;
}
} catch (InterruptedException e) {
synchronized (lifecycleMonitor) {
state = State.CANCELLED;
}
Thread.currentThread().interrupt();
}
} catch (RuntimeException e) {
synchronized (lifecycleMonitor) {
state = State.CANCELLED;
}
errorHandler.handleError(e);
}
}
@@ -126,7 +137,7 @@ abstract class CursorReadingTask<T, R> implements Task {
if (valid) {
this.cursor = cursor;
state = State.RUNNING;
} else if(cursor != null){
} else if (cursor != null) {
cursor.close();
}
}
@@ -219,7 +230,11 @@ abstract class CursorReadingTask<T, R> implements Task {
@SuppressWarnings("unchecked")
private void emitMessage(Message<T, R> message) {
request.getMessageListener().onMessage((Message) message);
try {
request.getMessageListener().onMessage((Message) message);
} catch (Exception e) {
errorHandler.handleError(e);
}
}
@Nullable
@@ -249,8 +264,8 @@ abstract class CursorReadingTask<T, R> implements Task {
/**
* Execute an operation and take care of translating exceptions using the {@link MongoTemplate templates}
* {@link org.springframework.data.mongodb.core.MongoExceptionTranslator} and passing those on to the
* {@link #errorHandler}.
* {@link org.springframework.data.mongodb.core.MongoExceptionTranslator} rethrowing the potentially translated
* exception.
*
* @param callback must not be {@literal null}.
* @param <T>
@@ -265,10 +280,7 @@ abstract class CursorReadingTask<T, R> implements Task {
} catch (RuntimeException e) {
RuntimeException translated = template.getExceptionTranslator().translateExceptionIfPossible(e);
RuntimeException toHandle = translated != null ? translated : e;
errorHandler.handleError(toHandle);
throw toHandle;
throw translated != null ? translated : e;
}
}
}
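The reworked run() wraps the whole polling loop in a single try/catch so startup and polling failures reach the ErrorHandler exactly once, while interruption cancels the task and restores the interrupt flag. A standalone sketch of that structure; getNext/emit are stand-ins for the cursor read and listener dispatch:

class PollingLoopSketch {

	interface ErrorHandler {
		void handleError(Throwable t);
	}

	private volatile boolean running = true;

	void run(ErrorHandler errorHandler) {
		try {
			while (running) {
				try {
					Object next = getNext(); // may throw a RuntimeException
					if (next != null) {
						emit(next);
					} else {
						Thread.sleep(10); // back off when the cursor is idle
					}
				} catch (InterruptedException e) {
					running = false;
					Thread.currentThread().interrupt(); // restore interrupt flag
					break;
				}
			}
		} catch (RuntimeException e) {
			running = false;
			errorHandler.handleError(e); // notified exactly once per failure
		}
	}

	private Object getNext() { return null; } // stand-in for the cursor read

	private void emit(Object message) {} // stand-in for the listener callback
}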

View File

@@ -148,6 +148,15 @@ public class BasicQuery extends Query {
this.sortObject = sortObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#isSorted()
*/
@Override
public boolean isSorted() {
return super.isSorted() || !sortObject.isEmpty();
}
/**
* Set the fields (projection) {@link Document}.
*
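The isSorted() override above reports a sort even when it was supplied as a raw sort Document rather than through Sort. A hedged usage sketch:

import org.bson.Document;
import org.springframework.data.mongodb.core.query.BasicQuery;

class SortDetection {
	public static void main(String[] args) {
		BasicQuery query = new BasicQuery("{ \"name\" : \"mongo\" }");
		query.setSortObject(new Document("name", -1));
		// No Sort was supplied, but the raw sort document alone makes the
		// overridden isSorted() return true.
		System.out.println(query.isSorted()); // true
	}
}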

View File

@@ -721,7 +721,7 @@ public class Collation {
/**
* ICU locale abstraction for usage with MongoDB {@link Collation}.
*
* @see <a href="http://site.icu-project.org">ICU - International Components for Unicode</a>
* @see <a href="https://site.icu-project.org">ICU - International Components for Unicode</a>
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)

View File

@@ -26,10 +26,10 @@ import java.util.Map.Entry;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import org.bson.BSON;
import org.bson.BsonRegularExpression;
import org.bson.Document;
import org.bson.types.Binary;
import org.springframework.data.domain.Example;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Point;
@@ -67,20 +67,6 @@ public class Criteria implements CriteriaDefinition {
*/
private static final Object NOT_SET = new Object();
private static final int[] FLAG_LOOKUP = new int[Character.MAX_VALUE];
static {
FLAG_LOOKUP['g'] = 256;
FLAG_LOOKUP['i'] = Pattern.CASE_INSENSITIVE;
FLAG_LOOKUP['m'] = Pattern.MULTILINE;
FLAG_LOOKUP['s'] = Pattern.DOTALL;
FLAG_LOOKUP['c'] = Pattern.CANON_EQ;
FLAG_LOOKUP['x'] = Pattern.COMMENTS;
FLAG_LOOKUP['d'] = Pattern.UNIX_LINES;
FLAG_LOOKUP['t'] = Pattern.LITERAL;
FLAG_LOOKUP['u'] = Pattern.UNICODE_CASE;
}
private @Nullable String key;
private List<Criteria> criteriaChain;
private LinkedHashMap<String, Object> criteria = new LinkedHashMap<String, Object>();
@@ -464,7 +450,7 @@ public class Criteria implements CriteriaDefinition {
Assert.notNull(regex, "Regex string must not be null!");
return Pattern.compile(regex, regexFlags(options));
return Pattern.compile(regex, options == null ? 0 : BSON.regexFlags(options));
}
/**
@@ -611,7 +597,6 @@ public class Criteria implements CriteriaDefinition {
public Criteria alike(Example<?> sample) {
criteria.put("$example", sample);
this.criteriaChain.add(this);
return this;
}
@@ -919,47 +904,6 @@ public class Criteria implements CriteriaDefinition {
|| (value instanceof GeoCommand && ((GeoCommand) value).getShape() instanceof GeoJson);
}
/**
* Lookup the MongoDB specific flags for a given regex option string.
*
* @param s the Regex option/flag to look up. Can be {@literal null}.
* @return zero if given {@link String} is {@literal null} or empty.
* @since 2.2
*/
private static int regexFlags(@Nullable String s) {
int flags = 0;
if (s == null) {
return flags;
}
for (final char f : s.toLowerCase().toCharArray()) {
flags |= regexFlag(f);
}
return flags;
}
/**
* Lookup the MongoDB specific flags for a given character.
*
* @param c the Regex option/flag to look up.
* @return
* @throws IllegalArgumentException for unknown flags
* @since 2.2
*/
private static int regexFlag(char c) {
int flag = FLAG_LOOKUP[c];
if (flag == 0) {
throw new IllegalArgumentException(String.format("Unrecognized flag [%c]", c));
}
return flag;
}
/**
* MongoDB specific <a href="https://docs.mongodb.com/manual/reference/operator/query-bitwise/">bitwise query
* operators</a> like {@code $bitsAllClear, $bitsAllSet,...} for usage with {@link Criteria#bits()} and {@link Query}.
@@ -1136,7 +1080,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Default implementation of {@link BitwiseCriteriaOperators}.
*
*
* @author Christoph Strobl
* @currentRead Beyond the Shadows - Brent Weeks
*/
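The added regexFlags helpers replace the deprecated BSON.regexFlags with a char-indexed lookup table: each option character maps to a java.util.regex.Pattern flag, and unknown characters are rejected. A self-contained sketch of that table (a subset of the flags registered above):

import java.util.regex.Pattern;

class RegexFlagsSketch {

	private static final int[] FLAG_LOOKUP = new int[Character.MAX_VALUE];

	static {
		FLAG_LOOKUP['i'] = Pattern.CASE_INSENSITIVE;
		FLAG_LOOKUP['m'] = Pattern.MULTILINE;
		FLAG_LOOKUP['s'] = Pattern.DOTALL;
		FLAG_LOOKUP['x'] = Pattern.COMMENTS;
	}

	// Fold an option string such as "im" into Pattern flags.
	static int regexFlags(String options) {
		int flags = 0;
		if (options == null) {
			return flags;
		}
		for (char c : options.toLowerCase().toCharArray()) {
			int flag = FLAG_LOOKUP[c];
			if (flag == 0) {
				throw new IllegalArgumentException(String.format("Unrecognized flag [%c]", c));
			}
			flags |= flag;
		}
		return flags;
	}

	public static void main(String[] args) {
		System.out.println(regexFlags("im") == (Pattern.CASE_INSENSITIVE | Pattern.MULTILINE)); // true
	}
}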

View File

@@ -1,159 +0,0 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.math.BigDecimal;
import java.math.RoundingMode;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* {@link Metric} and {@link Distance} conversions using the metric system.
*
* @author Mark Paluch
* @since 2.2
*/
class MetricConversion {
private static final BigDecimal METERS_MULTIPLIER = new BigDecimal(Metrics.KILOMETERS.getMultiplier())
.multiply(new BigDecimal(1000));
// to achieve a calculation that is accurate to 0.3 meters
private static final int PRECISION = 8;
/**
* Return meters to {@code metric} multiplier.
*
* @param metric
* @return
*/
protected static double getMetersToMetricMultiplier(Metric metric) {
ConversionMultiplier conversionMultiplier = ConversionMultiplier.builder().from(METERS_MULTIPLIER).to(metric)
.build();
return conversionMultiplier.multiplier().doubleValue();
}
/**
* Return {@code distance} in meters.
*
* @param distance
* @return
*/
protected static double getDistanceInMeters(Distance distance) {
return new BigDecimal(distance.getValue()).multiply(getMetricToMetersMultiplier(distance.getMetric()))
.doubleValue();
}
/**
* Return {@code metric} to meters multiplier.
*
* @param metric
* @return
*/
private static BigDecimal getMetricToMetersMultiplier(Metric metric) {
ConversionMultiplier conversionMultiplier = ConversionMultiplier.builder().from(metric).to(METERS_MULTIPLIER)
.build();
return conversionMultiplier.multiplier();
}
/**
* Provides a multiplier to convert between various metrics. Metrics must share the same base scale and provide a
* multiplier to convert between the base scale and its own metric.
*
* @author Mark Paluch
*/
private static class ConversionMultiplier {
private final BigDecimal source;
private final BigDecimal target;
ConversionMultiplier(Number source, Number target) {
if (source instanceof BigDecimal) {
this.source = (BigDecimal) source;
} else {
this.source = new BigDecimal(source.doubleValue());
}
if (target instanceof BigDecimal) {
this.target = (BigDecimal) target;
} else {
this.target = new BigDecimal(target.doubleValue());
}
}
/**
* Returns the multiplier to convert a number from the {@code source} metric to the {@code target} metric.
*
* @return
*/
BigDecimal multiplier() {
return target.divide(source, PRECISION, RoundingMode.HALF_UP);
}
/**
* Creates a new {@link ConversionMultiplierBuilder}.
*
* @return
*/
static ConversionMultiplierBuilder builder() {
return new ConversionMultiplierBuilder();
}
}
/**
* Builder for {@link ConversionMultiplier}.
*
* @author Mark Paluch
*/
private static class ConversionMultiplierBuilder {
private Number from;
private Number to;
ConversionMultiplierBuilder() {}
ConversionMultiplierBuilder from(Number from) {
this.from = from;
return this;
}
ConversionMultiplierBuilder from(Metric from) {
this.from = from.getMultiplier();
return this;
}
ConversionMultiplierBuilder to(Number to) {
this.to = to;
return this;
}
ConversionMultiplierBuilder to(Metric to) {
this.to = to.getMultiplier();
return this;
}
ConversionMultiplier build() {
return new ConversionMultiplier(this.from, this.to);
}
}
}
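A worked example of the arithmetic in the deleted class: the meters-to-kilometers multiplier is target/source at scale 8 with HALF_UP rounding, using the Earth-radius multiplier 6378.137 for kilometers and 6378137 for meters:

import java.math.BigDecimal;
import java.math.RoundingMode;

class MetricConversionSketch {
	public static void main(String[] args) {
		BigDecimal kilometers = new BigDecimal("6378.137");
		BigDecimal meters = kilometers.multiply(new BigDecimal(1000));

		// multiplier() from the deleted class: target / source at scale 8,
		// HALF_UP, accurate to roughly 0.3 meters.
		BigDecimal metersToKilometers = kilometers.divide(meters, 8, RoundingMode.HALF_UP);
		System.out.println(metersToKilometers); // 0.00100000

		// 400 meters expressed in kilometers:
		System.out.println(new BigDecimal(400).multiply(metersToKilometers)); // 0.4 km
	}
}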

View File

@@ -24,147 +24,12 @@ import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Builder class to build near-queries. <br />
* MongoDB {@code $geoNear} operator allows usage of a {@literal GeoJSON Point} or legacy coordinate pair. Though
* syntactically different, there's no difference between {@code near: [-73.99171, 40.738868]} and {@code near: { type:
* "Point", coordinates: [-73.99171, 40.738868] } } for the MongoDB server<br />
* <br />
* Please note that there is a huge difference in the distance calculation. Using the legacy format (for near) operates
* upon {@literal Radians} on an Earth like sphere, whereas the {@literal GeoJSON} format uses {@literal Meters}. The
* actual type within the document is of no concern at this point.<br />
* To avoid a serious headache make sure to set the {@link Metric} to the desired unit of measure which ensures the
* distance to be calculated correctly.<br />
* <p />
* In other words: <br />
* Assume you've got 5 Documents like the ones below <br />
*
* <pre>
* <code>
* {
* "_id" : ObjectId("5c10f3735d38908db52796a5"),
* "name" : "Penn Station",
* "location" : { "type" : "Point", "coordinates" : [ -73.99408, 40.75057 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a6"),
* "name" : "10gen Office",
* "location" : { "type" : "Point", "coordinates" : [ -73.99171, 40.738868 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a9"),
* "name" : "City Bakery ",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796aa"),
* "name" : "Splash Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796ab"),
* "name" : "Momofuku Milk Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.985839, 40.731698 ] }
* }
* </code>
* </pre>
*
* Fetching all Documents within a 400 Meter radius from {@code [-73.99171, 40.738868] } would look like this using
* {@literal GeoJSON}:
*
* <pre>
* <code>
* {
* $geoNear: {
* maxDistance: 400,
* num: 10,
* near: { type: "Point", coordinates: [-73.99171, 40.738868] },
* spherical:true,
* key: "location",
* distanceField: "distance"
* }
* }
*
* </code>
* </pre>
*
* resulting in the following 3 Documents.
*
* <pre>
* <code>
* {
* "_id" : ObjectId("5c10f3735d38908db52796a6"),
* "name" : "10gen Office",
* "location" : { "type" : "Point", "coordinates" : [ -73.99171, 40.738868 ] }
* "distance" : 0.0 // Meters
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a9"),
* "name" : "City Bakery ",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 69.3582262492474 // Meters
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796aa"),
* "name" : "Splash Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 69.3582262492474 // Meters
* }
* </code>
* </pre>
*
* Using legacy coordinate pairs one operates upon radians as discussed before. Assume we use {@link Metrics#KILOMETERS}
* when constructing the geoNear command. The {@link Metric} will make sure the distance multiplier is set correctly, so
* the command is rendered like
*
* <pre>
* <code>
* {
* $geoNear: {
* maxDistance: 0.0000627142377, // 400 Meters
* distanceMultiplier: 6378.137,
* num: 10,
* near: [-73.99171, 40.738868],
* spherical:true,
* key: "location",
* distanceField: "distance"
* }
* }
* </code>
* </pre>
*
* Please note the calculated distance now uses {@literal Kilometers} instead of {@literal Meters} as unit of measure,
* so we need to take it times 1000 to match up to {@literal Meters} as in the {@literal GeoJSON} variant. <br />
* Still as we've been requesting the {@link Distance} in {@link Metrics#KILOMETERS} the {@link Distance#getValue()}
* reflects exactly this.
*
* <pre>
* <code>
* {
* "_id" : ObjectId("5c10f3735d38908db52796a6"),
* "name" : "10gen Office",
* "location" : { "type" : "Point", "coordinates" : [ -73.99171, 40.738868 ] }
* "distance" : 0.0 // Kilometers
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796a9"),
* "name" : "City Bakery ",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 0.0693586286032982 // Kilometers
* }
* {
* "_id" : ObjectId("5c10f3735d38908db52796aa"),
* "name" : "Splash Bar",
* "location" : { "type" : "Point", "coordinates" : [ -73.992491, 40.738673 ] }
* "distance" : 0.0693586286032982 // Kilometers
* }
* </code>
* </pre>
* Builder class to build near-queries.
*
* @author Oliver Gierke
* @author Thomas Darimont
@@ -224,14 +89,10 @@ public final class NearQuery {
}
/**
* Creates a new {@link NearQuery} starting at the given {@link Point}. <br />
* <strong>NOTE</strong> There is a difference in using {@link Point} versus {@link GeoJsonPoint}. {@link Point}
* values are rendered as coordinate pairs in the legacy format and operate upon radians, whereas the
* {@link GeoJsonPoint} uses according to its specification {@literal meters} as unit of measure. This may lead to
* different results when using a {@link Metrics#NEUTRAL neutral Metric}.
* Creates a new {@link NearQuery} starting at the given {@link Point}.
*
* @param point must not be {@literal null}.
* @return new instance of {@link NearQuery}.
* @return
*/
public static NearQuery near(Point point) {
return near(point, Metrics.NEUTRAL);
@@ -240,15 +101,11 @@ public final class NearQuery {
/**
* Creates a {@link NearQuery} starting near the given {@link Point} using the given {@link Metric} to adapt given
* values to further configuration. E.g. setting a {@link #maxDistance(double)} will be interpreted as a value of the
* initially set {@link Metric}. <br />
* <strong>NOTE</strong> There is a difference in using {@link Point} versus {@link GeoJsonPoint}. {@link Point}
* values are rendered as coordinate pairs in the legacy format and operate upon radians, whereas the
* {@link GeoJsonPoint} uses according to its specification {@literal meters} as unit of measure. This may lead to
* different results when using a {@link Metrics#NEUTRAL neutral Metric}.
* initially set {@link Metric}.
*
* @param point must not be {@literal null}.
* @param metric must not be {@literal null}.
* @return new instance of {@link NearQuery}.
* @return
*/
public static NearQuery near(Point point, Metric metric) {
return new NearQuery(point, metric);
@@ -559,46 +416,25 @@ public final class NearQuery {
}
if (maxDistance != null) {
document.put("maxDistance", getDistanceValueInRadiantsOrMeters(maxDistance));
document.put("maxDistance", maxDistance.getNormalizedValue());
}
if (minDistance != null) {
document.put("minDistance", getDistanceValueInRadiantsOrMeters(minDistance));
document.put("minDistance", minDistance.getNormalizedValue());
}
if (metric != null) {
document.put("distanceMultiplier", getDistanceMultiplier());
document.put("distanceMultiplier", metric.getMultiplier());
}
if (num != null) {
document.put("num", num);
}
if (usesGeoJson()) {
document.put("near", point);
} else {
document.put("near", Arrays.asList(point.getX(), point.getY()));
}
document.put("near", Arrays.asList(point.getX(), point.getY()));
document.put("spherical", spherical ? spherical : usesGeoJson());
document.put("spherical", spherical);
return document;
}
private double getDistanceMultiplier() {
return usesMetricSystem() ? MetricConversion.getMetersToMetricMultiplier(metric) : metric.getMultiplier();
}
private double getDistanceValueInRadiantsOrMeters(Distance distance) {
return usesMetricSystem() ? MetricConversion.getDistanceInMeters(distance) : distance.getNormalizedValue();
}
private boolean usesMetricSystem() {
return usesGeoJson();
}
private boolean usesGeoJson() {
return point instanceof GeoJsonPoint;
}
}
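A usage sketch matching the 400-meter example from the javadoc above: distances are interpreted in the Metric given up front, so 0.4 in KILOMETERS means 400 meters:

import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.NearQuery;

class NearQueryUsage {

	static NearQuery within400Meters() {
		return NearQuery.near(new Point(-73.99171, 40.738868), Metrics.KILOMETERS) //
				.maxDistance(0.4) // 0.4 km = 400 m
				.spherical(true);
	}
}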

View File

@@ -262,6 +262,17 @@ public class Query {
return document;
}
/**
* Returns {@literal true} if the {@link Query} has a sort parameter.
*
* @return {@literal true} if sorted.
* @see Sort#isSorted()
* @since 2.1
*/
public boolean isSorted() {
return sort.isSorted();
}
/**
* Get the number of documents to skip.
*
@@ -372,7 +383,7 @@ public class Query {
* Set the number of documents to return in each response batch. <br />
* Use {@literal 0 (zero)} for no limit. A <strong>negative limit</strong> closes the cursor after returning a single
* batch indicating to the server that the client will not ask for a subsequent one.
*
*
* @param batchSize The number of documents to return per batch.
* @return this.
* @see Meta#setCursorBatchSize(int)

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Iterator;
@@ -23,10 +22,10 @@ import java.util.LinkedHashMap;
import java.util.Map;
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import com.mongodb.util.JSON;
/**
* Utility methods for JSON serialization.
@@ -119,32 +118,19 @@ public abstract class SerializationUtils {
}
try {
String json = value instanceof Document ? ((Document) value).toJson() : serializeValue(value);
return json.replaceAll("\":", "\" :").replaceAll("\\{\"", "{ \"");
return value instanceof Document ? ((Document) value).toJson() : JSON.serialize(value);
} catch (Exception e) {
if (value instanceof Collection) {
return toString((Collection<?>) value);
} else if (value instanceof Map) {
return toString((Map<?, ?>) value);
} else if (ObjectUtils.isArray(value)) {
return toString(Arrays.asList(ObjectUtils.toObjectArray(value)));
} else {
return String.format("{ \"$java\" : %s }", value.toString());
}
}
}
public static String serializeValue(@Nullable Object value) {
if (value == null) {
return "null";
}
String documentJson = new Document("toBeEncoded", value).toJson();
return documentJson.substring(documentJson.indexOf(':') + 1, documentJson.length() - 1).trim();
}
private static String toString(Map<?, ?> source) {
return iterableToDelimitedString(source.entrySet(), "{ ", " }",
entry -> String.format("\"%s\" : %s", entry.getKey(), serializeToJsonSafely(entry.getValue())));
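The serializeValue addition above reuses the Document codec for arbitrary values by wrapping them in a temporary document and slicing the value back out of the JSON. A self-contained sketch of that trick:

import org.bson.Document;

class ValueSerializationSketch {

	// Wrap the value so the Document codec renders it, then cut the value
	// out of {"toBeEncoded": <value>} again.
	static String serializeValue(Object value) {
		if (value == null) {
			return "null";
		}
		String documentJson = new Document("toBeEncoded", value).toJson();
		return documentJson.substring(documentJson.indexOf(':') + 1, documentJson.length() - 1).trim();
	}

	public static void main(String[] args) {
		System.out.println(serializeValue("mongo")); // "mongo"
		System.out.println(serializeValue(42L));     // 42 or { "$numberLong" : "42" }, depending on the driver's JSON mode
	}
}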

View File

@@ -180,4 +180,13 @@ public class TextQuery extends Query {
return sort;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.Query#isSorted()
*/
@Override
public boolean isSorted() {
return super.isSorted() || sortByScore;
}
}

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
@@ -27,7 +26,6 @@ import java.util.Objects;
import java.util.Set;
import org.bson.Document;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
@@ -60,7 +58,6 @@ public class Update implements UpdateDefinition {
private Set<String> keysToUpdate = new HashSet<>();
private Map<String, Object> modifierOps = new LinkedHashMap<>();
private Map<String, PushOperatorBuilder> pushCommandBuilders = new LinkedHashMap<>(1);
private List<ArrayFilter> arrayFilters = new ArrayList<>();
/**
* Static factory method to create an Update using the provided key
@@ -402,35 +399,6 @@ public class Update implements UpdateDefinition {
return this;
}
/**
* Filter elements in an array that match the given criteria for update. {@link CriteriaDefinition} is passed directly
* to the driver without further type or field mapping.
*
* @param criteria must not be {@literal null}.
* @return this.
* @since 2.2
*/
public Update filterArray(CriteriaDefinition criteria) {
this.arrayFilters.add(criteria::getCriteriaObject);
return this;
}
/**
* Filter elements in an array that match the given criteria for update. {@code expression} is used directly with the
* driver without further type or field mapping.
*
* @param identifier the positional operator identifier filter criteria name.
* @param expression the positional operator filter expression.
* @return this.
* @since 2.2
*/
public Update filterArray(String identifier, Object expression) {
this.arrayFilters.add(() -> new Document(identifier, expression));
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
@@ -447,14 +415,6 @@ public class Update implements UpdateDefinition {
return new Document(modifierOps);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
public List<ArrayFilter> getArrayFilters() {
return Collections.unmodifiableList(this.arrayFilters);
}
/**
* This method is not called anymore rather override {@link #addMultiFieldOperation(String, String, Object)}.
*
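Usage of the removed array-filter API (a 2.2 feature), mirroring MongoDB's arrayFilters update option; the field names are illustrative:

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Update;

class ArrayFilterUsage {

	static Update capGrades() {
		// Updates grades.$[element] for every array element matching the
		// filter, i.e. { arrayFilters: [ { "element": { "$gte": 100 } } ] }.
		return new Update() //
				.set("grades.$[element]", 100) //
				.filterArray(Criteria.where("element").gte(100));
	}
}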

View File

@@ -15,8 +15,6 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.List;
import org.bson.Document;
/**
@@ -24,7 +22,7 @@ import org.bson.Document;
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
* @since 2.1.4
*/
public interface UpdateDefinition {
@@ -55,37 +53,4 @@ public interface UpdateDefinition {
* @param key must not be {@literal null}.
*/
void inc(String key);
/**
* Get the specification which elements to modify in an array field. {@link ArrayFilter} are passed directly to the
* driver without further type or field mapping.
*
* @return never {@literal null}.
* @since 2.2
*/
List<ArrayFilter> getArrayFilters();
/**
* @return {@literal true} if {@link UpdateDefinition} contains {@link #getArrayFilters() array filters}.
* @since 2.2
*/
default boolean hasArrayFilters() {
return !getArrayFilters().isEmpty();
}
/**
* A filter to specify which elements to modify in an array field.
*
* @since 2.2
*/
interface ArrayFilter {
/**
* Get the {@link Document} representation of the filter to apply. The returned {@link Document} is used directly
* with the driver without further type or field mapping.
*
* @return never {@literal null}.
*/
Document asDocument();
}
}

View File

@@ -45,149 +45,122 @@ public class MethodReferenceNode extends ExpressionNode {
Map<String, AggregationMethodReference> map = new HashMap<String, AggregationMethodReference>();
// BOOLEAN OPERATORS
map.put("and", arrayArgRef().forOperator("$and"));
map.put("or", arrayArgRef().forOperator("$or"));
map.put("not", arrayArgRef().forOperator("$not"));
map.put("and", arrayArgumentAggregationMethodReference().forOperator("$and"));
map.put("or", arrayArgumentAggregationMethodReference().forOperator("$or"));
map.put("not", arrayArgumentAggregationMethodReference().forOperator("$not"));
// SET OPERATORS
map.put("setEquals", arrayArgRef().forOperator("$setEquals"));
map.put("setIntersection", arrayArgRef().forOperator("$setIntersection"));
map.put("setUnion", arrayArgRef().forOperator("$setUnion"));
map.put("setDifference", arrayArgRef().forOperator("$setDifference"));
map.put("setEquals", arrayArgumentAggregationMethodReference().forOperator("$setEquals"));
map.put("setIntersection", arrayArgumentAggregationMethodReference().forOperator("$setIntersection"));
map.put("setUnion", arrayArgumentAggregationMethodReference().forOperator("$setUnion"));
map.put("setDifference", arrayArgumentAggregationMethodReference().forOperator("$setDifference"));
// 2nd.
map.put("setIsSubset", arrayArgRef().forOperator("$setIsSubset"));
map.put("anyElementTrue", arrayArgRef().forOperator("$anyElementTrue"));
map.put("allElementsTrue", arrayArgRef().forOperator("$allElementsTrue"));
map.put("setIsSubset", arrayArgumentAggregationMethodReference().forOperator("$setIsSubset"));
map.put("anyElementTrue", arrayArgumentAggregationMethodReference().forOperator("$anyElementTrue"));
map.put("allElementsTrue", arrayArgumentAggregationMethodReference().forOperator("$allElementsTrue"));
// COMPARISON OPERATORS
map.put("cmp", arrayArgRef().forOperator("$cmp"));
map.put("eq", arrayArgRef().forOperator("$eq"));
map.put("gt", arrayArgRef().forOperator("$gt"));
map.put("gte", arrayArgRef().forOperator("$gte"));
map.put("lt", arrayArgRef().forOperator("$lt"));
map.put("lte", arrayArgRef().forOperator("$lte"));
map.put("ne", arrayArgRef().forOperator("$ne"));
map.put("cmp", arrayArgumentAggregationMethodReference().forOperator("$cmp"));
map.put("eq", arrayArgumentAggregationMethodReference().forOperator("$eq"));
map.put("gt", arrayArgumentAggregationMethodReference().forOperator("$gt"));
map.put("gte", arrayArgumentAggregationMethodReference().forOperator("$gte"));
map.put("lt", arrayArgumentAggregationMethodReference().forOperator("$lt"));
map.put("lte", arrayArgumentAggregationMethodReference().forOperator("$lte"));
map.put("ne", arrayArgumentAggregationMethodReference().forOperator("$ne"));
// ARITHMETIC OPERATORS
map.put("abs", singleArgRef().forOperator("$abs"));
map.put("add", arrayArgRef().forOperator("$add"));
map.put("ceil", singleArgRef().forOperator("$ceil"));
map.put("divide", arrayArgRef().forOperator("$divide"));
map.put("exp", singleArgRef().forOperator("$exp"));
map.put("floor", singleArgRef().forOperator("$floor"));
map.put("ln", singleArgRef().forOperator("$ln"));
map.put("log", arrayArgRef().forOperator("$log"));
map.put("log10", singleArgRef().forOperator("$log10"));
map.put("mod", arrayArgRef().forOperator("$mod"));
map.put("multiply", arrayArgRef().forOperator("$multiply"));
map.put("pow", arrayArgRef().forOperator("$pow"));
map.put("sqrt", singleArgRef().forOperator("$sqrt"));
map.put("subtract", arrayArgRef().forOperator("$subtract"));
map.put("trunc", singleArgRef().forOperator("$trunc"));
map.put("abs", singleArgumentAggregationMethodReference().forOperator("$abs"));
map.put("add", arrayArgumentAggregationMethodReference().forOperator("$add"));
map.put("ceil", singleArgumentAggregationMethodReference().forOperator("$ceil"));
map.put("divide", arrayArgumentAggregationMethodReference().forOperator("$divide"));
map.put("exp", singleArgumentAggregationMethodReference().forOperator("$exp"));
map.put("floor", singleArgumentAggregationMethodReference().forOperator("$floor"));
map.put("ln", singleArgumentAggregationMethodReference().forOperator("$ln"));
map.put("log", arrayArgumentAggregationMethodReference().forOperator("$log"));
map.put("log10", singleArgumentAggregationMethodReference().forOperator("$log10"));
map.put("mod", arrayArgumentAggregationMethodReference().forOperator("$mod"));
map.put("multiply", arrayArgumentAggregationMethodReference().forOperator("$multiply"));
map.put("pow", arrayArgumentAggregationMethodReference().forOperator("$pow"));
map.put("sqrt", singleArgumentAggregationMethodReference().forOperator("$sqrt"));
map.put("subtract", arrayArgumentAggregationMethodReference().forOperator("$subtract"));
map.put("trunc", singleArgumentAggregationMethodReference().forOperator("$trunc"));
// STRING OPERATORS
map.put("concat", arrayArgRef().forOperator("$concat"));
map.put("strcasecmp", arrayArgRef().forOperator("$strcasecmp"));
map.put("substr", arrayArgRef().forOperator("$substr"));
map.put("toLower", singleArgRef().forOperator("$toLower"));
map.put("toUpper", singleArgRef().forOperator("$toUpper"));
map.put("indexOfBytes", arrayArgRef().forOperator("$indexOfBytes"));
map.put("indexOfCP", arrayArgRef().forOperator("$indexOfCP"));
map.put("split", arrayArgRef().forOperator("$split"));
map.put("strLenBytes", singleArgRef().forOperator("$strLenBytes"));
map.put("strLenCP", singleArgRef().forOperator("$strLenCP"));
map.put("substrCP", arrayArgRef().forOperator("$substrCP"));
map.put("trim", mapArgRef().forOperator("$trim").mappingParametersTo("input", "chars"));
map.put("ltrim", mapArgRef().forOperator("$ltrim").mappingParametersTo("input", "chars"));
map.put("rtrim", mapArgRef().forOperator("$rtrim").mappingParametersTo("input", "chars"));
map.put("concat", arrayArgumentAggregationMethodReference().forOperator("$concat"));
map.put("strcasecmp", arrayArgumentAggregationMethodReference().forOperator("$strcasecmp"));
map.put("substr", arrayArgumentAggregationMethodReference().forOperator("$substr"));
map.put("toLower", singleArgumentAggregationMethodReference().forOperator("$toLower"));
map.put("toUpper", singleArgumentAggregationMethodReference().forOperator("$toUpper"));
map.put("strcasecmp", arrayArgumentAggregationMethodReference().forOperator("$strcasecmp"));
map.put("indexOfBytes", arrayArgumentAggregationMethodReference().forOperator("$indexOfBytes"));
map.put("indexOfCP", arrayArgumentAggregationMethodReference().forOperator("$indexOfCP"));
map.put("split", arrayArgumentAggregationMethodReference().forOperator("$split"));
map.put("strLenBytes", singleArgumentAggregationMethodReference().forOperator("$strLenBytes"));
map.put("strLenCP", singleArgumentAggregationMethodReference().forOperator("$strLenCP"));
map.put("substrCP", arrayArgumentAggregationMethodReference().forOperator("$substrCP"));
// TEXT SEARCH OPERATORS
map.put("meta", singleArgRef().forOperator("$meta"));
map.put("meta", singleArgumentAggregationMethodReference().forOperator("$meta"));
// ARRAY OPERATORS
map.put("arrayElemAt", arrayArgRef().forOperator("$arrayElemAt"));
map.put("concatArrays", arrayArgRef().forOperator("$concatArrays"));
map.put("filter", mapArgRef().forOperator("$filter") //
map.put("arrayElemAt", arrayArgumentAggregationMethodReference().forOperator("$arrayElemAt"));
map.put("concatArrays", arrayArgumentAggregationMethodReference().forOperator("$concatArrays"));
map.put("filter", mapArgumentAggregationMethodReference().forOperator("$filter") //
.mappingParametersTo("input", "as", "cond"));
map.put("isArray", singleArgRef().forOperator("$isArray"));
map.put("size", singleArgRef().forOperator("$size"));
map.put("slice", arrayArgRef().forOperator("$slice"));
map.put("reverseArray", singleArgRef().forOperator("$reverseArray"));
map.put("reduce", mapArgRef().forOperator("$reduce").mappingParametersTo("input", "initialValue", "in"));
map.put("zip", mapArgRef().forOperator("$zip").mappingParametersTo("inputs", "useLongestLength", "defaults"));
map.put("in", arrayArgRef().forOperator("$in"));
map.put("arrayToObject", singleArgRef().forOperator("$arrayToObject"));
map.put("indexOfArray", arrayArgRef().forOperator("$indexOfArray"));
map.put("range", arrayArgRef().forOperator("$range"));
map.put("isArray", singleArgumentAggregationMethodReference().forOperator("$isArray"));
map.put("size", singleArgumentAggregationMethodReference().forOperator("$size"));
map.put("slice", arrayArgumentAggregationMethodReference().forOperator("$slice"));
map.put("reverseArray", singleArgumentAggregationMethodReference().forOperator("$reverseArray"));
map.put("reduce", mapArgumentAggregationMethodReference().forOperator("$reduce").mappingParametersTo("input",
"initialValue", "in"));
map.put("zip", mapArgumentAggregationMethodReference().forOperator("$zip").mappingParametersTo("inputs",
"useLongestLength", "defaults"));
map.put("in", arrayArgumentAggregationMethodReference().forOperator("$in"));
// VARIABLE OPERATORS
map.put("map", mapArgRef().forOperator("$map") //
map.put("map", mapArgumentAggregationMethodReference().forOperator("$map") //
.mappingParametersTo("input", "as", "in"));
map.put("let", mapArgRef().forOperator("$let").mappingParametersTo("vars", "in"));
map.put("let", mapArgumentAggregationMethodReference().forOperator("$let").mappingParametersTo("vars", "in"));
// LITERAL OPERATORS
map.put("literal", singleArgRef().forOperator("$literal"));
map.put("literal", singleArgumentAggregationMethodReference().forOperator("$literal"));
// DATE OPERATORS
map.put("dayOfYear", singleArgRef().forOperator("$dayOfYear"));
map.put("dayOfMonth", singleArgRef().forOperator("$dayOfMonth"));
map.put("dayOfWeek", singleArgRef().forOperator("$dayOfWeek"));
map.put("year", singleArgRef().forOperator("$year"));
map.put("month", singleArgRef().forOperator("$month"));
map.put("week", singleArgRef().forOperator("$week"));
map.put("hour", singleArgRef().forOperator("$hour"));
map.put("minute", singleArgRef().forOperator("$minute"));
map.put("second", singleArgRef().forOperator("$second"));
map.put("millisecond", singleArgRef().forOperator("$millisecond"));
map.put("dateToString", mapArgRef().forOperator("$dateToString") //
map.put("dayOfYear", singleArgumentAggregationMethodReference().forOperator("$dayOfYear"));
map.put("dayOfMonth", singleArgumentAggregationMethodReference().forOperator("$dayOfMonth"));
map.put("dayOfWeek", singleArgumentAggregationMethodReference().forOperator("$dayOfWeek"));
map.put("year", singleArgumentAggregationMethodReference().forOperator("$year"));
map.put("month", singleArgumentAggregationMethodReference().forOperator("$month"));
map.put("week", singleArgumentAggregationMethodReference().forOperator("$week"));
map.put("hour", singleArgumentAggregationMethodReference().forOperator("$hour"));
map.put("minute", singleArgumentAggregationMethodReference().forOperator("$minute"));
map.put("second", singleArgumentAggregationMethodReference().forOperator("$second"));
map.put("millisecond", singleArgumentAggregationMethodReference().forOperator("$millisecond"));
map.put("dateToString", mapArgumentAggregationMethodReference().forOperator("$dateToString") //
.mappingParametersTo("format", "date"));
map.put("dateFromString", mapArgRef().forOperator("$dateFromString") //
.mappingParametersTo("dateString", "format", "timezone", "onError", "onNull"));
map.put("dateFromParts", mapArgRef().forOperator("$dateFromParts").mappingParametersTo("year", "month", "day",
"hour", "minute", "second", "milliseconds", "timezone"));
map.put("isoDateFromParts", mapArgRef().forOperator("$dateFromParts").mappingParametersTo("isoWeekYear", "isoWeek",
"isoDayOfWeek", "hour", "minute", "second", "milliseconds", "timezone"));
map.put("dateToParts", mapArgRef().forOperator("$dateToParts") //
.mappingParametersTo("date", "timezone", "iso8601"));
map.put("isoDayOfWeek", singleArgRef().forOperator("$isoDayOfWeek"));
map.put("isoWeek", singleArgRef().forOperator("$isoWeek"));
map.put("isoWeekYear", singleArgRef().forOperator("$isoWeekYear"));
map.put("isoDayOfWeek", singleArgumentAggregationMethodReference().forOperator("$isoDayOfWeek"));
map.put("isoWeek", singleArgumentAggregationMethodReference().forOperator("$isoWeek"));
map.put("isoWeekYear", singleArgumentAggregationMethodReference().forOperator("$isoWeekYear"));
// CONDITIONAL OPERATORS
map.put("cond", mapArgRef().forOperator("$cond") //
map.put("cond", mapArgumentAggregationMethodReference().forOperator("$cond") //
.mappingParametersTo("if", "then", "else"));
map.put("ifNull", arrayArgRef().forOperator("$ifNull"));
map.put("ifNull", arrayArgumentAggregationMethodReference().forOperator("$ifNull"));
// GROUP OPERATORS
map.put("sum", arrayArgRef().forOperator("$sum"));
map.put("avg", arrayArgRef().forOperator("$avg"));
map.put("first", singleArgRef().forOperator("$first"));
map.put("last", singleArgRef().forOperator("$last"));
map.put("max", arrayArgRef().forOperator("$max"));
map.put("min", arrayArgRef().forOperator("$min"));
map.put("push", singleArgRef().forOperator("$push"));
map.put("addToSet", singleArgRef().forOperator("$addToSet"));
map.put("stdDevPop", arrayArgRef().forOperator("$stdDevPop"));
map.put("stdDevSamp", arrayArgRef().forOperator("$stdDevSamp"));
map.put("sum", arrayArgumentAggregationMethodReference().forOperator("$sum"));
map.put("avg", arrayArgumentAggregationMethodReference().forOperator("$avg"));
map.put("first", singleArgumentAggregationMethodReference().forOperator("$first"));
map.put("last", singleArgumentAggregationMethodReference().forOperator("$last"));
map.put("max", arrayArgumentAggregationMethodReference().forOperator("$max"));
map.put("min", arrayArgumentAggregationMethodReference().forOperator("$min"));
map.put("push", singleArgumentAggregationMethodReference().forOperator("$push"));
map.put("addToSet", singleArgumentAggregationMethodReference().forOperator("$addToSet"));
map.put("stdDevPop", arrayArgumentAggregationMethodReference().forOperator("$stdDevPop"));
map.put("stdDevSamp", arrayArgumentAggregationMethodReference().forOperator("$stdDevSamp"));
// TYPE OPERATORS
map.put("type", singleArgRef().forOperator("$type"));
// OBJECT OPERATORS
map.put("objectToArray", singleArgRef().forOperator("$objectToArray"));
map.put("mergeObjects", arrayArgRef().forOperator("$mergeObjects"));
// CONVERT OPERATORS
map.put("convert", mapArgRef().forOperator("$convert") //
.mappingParametersTo("input", "to", "onError", "onNull"));
map.put("toBool", singleArgRef().forOperator("$toBool"));
map.put("toDate", singleArgRef().forOperator("$toDate"));
map.put("toDecimal", singleArgRef().forOperator("$toDecimal"));
map.put("toDouble", singleArgRef().forOperator("$toDouble"));
map.put("toInt", singleArgRef().forOperator("$toInt"));
map.put("toLong", singleArgRef().forOperator("$toLong"));
map.put("toObjectId", singleArgRef().forOperator("$toObjectId"));
map.put("toString", singleArgRef().forOperator("$toString"));
map.put("type", singleArgumentAggregationMethodReference().forOperator("$type"));
FUNCTIONS = Collections.unmodifiableMap(map);
}
@@ -199,7 +172,7 @@ public class MethodReferenceNode extends ExpressionNode {
/**
* Returns the name of the method.
*
* @deprecated since 1.10. Please use {@link #getMethodReference()}.
* @Deprecated since 1.10. Please use {@link #getMethodReference()}.
*/
@Nullable
@Deprecated
@@ -282,7 +255,7 @@ public class MethodReferenceNode extends ExpressionNode {
*
* @return never {@literal null}.
*/
static AggregationMethodReference singleArgRef() {
static AggregationMethodReference singleArgumentAggregationMethodReference() {
return new AggregationMethodReference(null, ArgumentType.SINGLE, null);
}
@@ -291,7 +264,7 @@ public class MethodReferenceNode extends ExpressionNode {
*
* @return never {@literal null}.
*/
static AggregationMethodReference arrayArgRef() {
static AggregationMethodReference arrayArgumentAggregationMethodReference() {
return new AggregationMethodReference(null, ArgumentType.ARRAY, null);
}
@@ -300,7 +273,7 @@ public class MethodReferenceNode extends ExpressionNode {
*
* @return never {@literal null}.
*/
static AggregationMethodReference mapArgRef() {
static AggregationMethodReference mapArgumentAggregationMethodReference() {
return new AggregationMethodReference(null, ArgumentType.MAP, null);
}
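The factory methods above register SpEL function names against aggregation operators in three argument shapes: single, array, and map. For map-shaped references, positional arguments are zipped with the declared parameter names. A simplified sketch of that rendering, using $dateToString as the example; the render method itself is hypothetical:

import java.util.Arrays;
import java.util.List;

import org.bson.Document;

class MapArgumentRenderingSketch {

	// dateToString(format, date) -> { $dateToString: { format: ..., date: ... } }
	static Document render(String operator, List<String> parameterNames, List<Object> arguments) {
		Document args = new Document();
		for (int i = 0; i < parameterNames.size() && i < arguments.size(); i++) {
			args.append(parameterNames.get(i), arguments.get(i));
		}
		return new Document(operator, args);
	}

	public static void main(String[] args) {
		System.out.println(render("$dateToString", //
				Arrays.asList("format", "date"), //
				Arrays.asList("%Y-%m-%d", "$purchaseDate")));
		// {"$dateToString": {"format": "%Y-%m-%d", "date": "$purchaseDate"}}
	}
}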

View File

@@ -1,266 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import lombok.RequiredArgsConstructor;
import reactor.core.CoreSubscriber;
import reactor.core.publisher.Mono;
import reactor.core.publisher.Operators;
import reactor.util.concurrent.Queues;
import reactor.util.context.Context;
import java.nio.ByteBuffer;
import java.util.Queue;
import java.util.concurrent.atomic.AtomicIntegerFieldUpdater;
import java.util.concurrent.atomic.AtomicLongFieldUpdater;
import java.util.function.BiConsumer;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import com.mongodb.reactivestreams.client.Success;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Adapter accepting a binary stream {@link Publisher} and emitting its content through {@link AsyncInputStream}.
* <p>
* This adapter subscribes to the binary {@link Publisher} as soon as the first chunk gets {@link #read(ByteBuffer)
* requested}. Requests are queued and binary chunks are requested from the {@link Publisher}. As soon as the
* {@link Publisher} emits items, chunks are provided to the read request which completes the number-of-written-bytes
* {@link Publisher}.
* <p>
* {@link AsyncInputStream} is supposed to work as sequential callback API that is called until reaching EOF.
* {@link #close()} is propagated as cancellation signal to the binary {@link Publisher}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
@RequiredArgsConstructor
class AsyncInputStreamAdapter implements AsyncInputStream {
private static final AtomicLongFieldUpdater<AsyncInputStreamAdapter> DEMAND = AtomicLongFieldUpdater
.newUpdater(AsyncInputStreamAdapter.class, "demand");
private static final AtomicIntegerFieldUpdater<AsyncInputStreamAdapter> SUBSCRIBED = AtomicIntegerFieldUpdater
.newUpdater(AsyncInputStreamAdapter.class, "subscribed");
private static final int SUBSCRIPTION_NOT_SUBSCRIBED = 0;
private static final int SUBSCRIPTION_SUBSCRIBED = 1;
private final Publisher<? extends DataBuffer> buffers;
private final Context subscriberContext;
private final DefaultDataBufferFactory factory = new DefaultDataBufferFactory();
private volatile Subscription subscription;
private volatile boolean cancelled;
private volatile boolean complete;
private volatile Throwable error;
private final Queue<BiConsumer<DataBuffer, Integer>> readRequests = Queues.<BiConsumer<DataBuffer, Integer>> small()
.get();
// see DEMAND
volatile long demand;
// see SUBSCRIBED
volatile int subscribed = SUBSCRIPTION_NOT_SUBSCRIBED;
/*
* (non-Javadoc)
* @see com.mongodb.reactivestreams.client.gridfs.AsyncInputStream#read(java.nio.ByteBuffer)
*/
@Override
public Publisher<Integer> read(ByteBuffer dst) {
return Mono.create(sink -> {
readRequests.offer((db, bytecount) -> {
try {
if (error != null) {
sink.error(error);
return;
}
if (bytecount == -1) {
sink.success(-1);
return;
}
ByteBuffer byteBuffer = db.asByteBuffer();
int toWrite = byteBuffer.remaining();
dst.put(byteBuffer);
sink.success(toWrite);
} catch (Exception e) {
sink.error(e);
} finally {
DataBufferUtils.release(db);
}
});
request(1);
});
}
/*
* (non-Javadoc)
* @see com.mongodb.reactivestreams.client.gridfs.AsyncInputStream#skip(long)
*/
@Override
public Publisher<Long> skip(long bytesToSkip) {
throw new UnsupportedOperationException("Skip is currently not implemented");
}
/*
* (non-Javadoc)
* @see com.mongodb.reactivestreams.client.gridfs.AsyncInputStream#close()
*/
@Override
public Publisher<Success> close() {
return Mono.create(sink -> {
cancelled = true;
if (error != null) {
sink.error(error);
return;
}
sink.success(Success.SUCCESS);
});
}
protected void request(int n) {
if (complete) {
terminatePendingReads();
return;
}
Operators.addCap(DEMAND, this, n);
if (SUBSCRIBED.get(this) == SUBSCRIPTION_NOT_SUBSCRIBED) {
if (SUBSCRIBED.compareAndSet(this, SUBSCRIPTION_NOT_SUBSCRIBED, SUBSCRIPTION_SUBSCRIBED)) {
buffers.subscribe(new DataBufferCoreSubscriber());
}
} else {
Subscription subscription = this.subscription;
if (subscription != null) {
requestFromSubscription(subscription);
}
}
}
void requestFromSubscription(Subscription subscription) {
long demand = DEMAND.get(AsyncInputStreamAdapter.this);
if (cancelled) {
subscription.cancel();
}
if (demand > 0 && DEMAND.compareAndSet(AsyncInputStreamAdapter.this, demand, demand - 1)) {
subscription.request(1);
}
}
/**
* Terminates pending reads with empty buffers.
*/
void terminatePendingReads() {
BiConsumer<DataBuffer, Integer> readers;
while ((readers = readRequests.poll()) != null) {
readers.accept(factory.wrap(new byte[0]), -1);
}
}
private class DataBufferCoreSubscriber implements CoreSubscriber<DataBuffer> {
@Override
public Context currentContext() {
return AsyncInputStreamAdapter.this.subscriberContext;
}
@Override
public void onSubscribe(Subscription s) {
AsyncInputStreamAdapter.this.subscription = s;
Operators.addCap(DEMAND, AsyncInputStreamAdapter.this, -1);
s.request(1);
}
@Override
public void onNext(DataBuffer dataBuffer) {
if (cancelled || complete) {
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, AsyncInputStreamAdapter.this.subscriberContext);
return;
}
BiConsumer<DataBuffer, Integer> poll = AsyncInputStreamAdapter.this.readRequests.poll();
if (poll == null) {
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, AsyncInputStreamAdapter.this.subscriberContext);
subscription.cancel();
return;
}
poll.accept(dataBuffer, dataBuffer.readableByteCount());
requestFromSubscription(subscription);
}
@Override
public void onError(Throwable t) {
if (AsyncInputStreamAdapter.this.cancelled || AsyncInputStreamAdapter.this.complete) {
Operators.onErrorDropped(t, AsyncInputStreamAdapter.this.subscriberContext);
return;
}
AsyncInputStreamAdapter.this.error = t;
AsyncInputStreamAdapter.this.complete = true;
terminatePendingReads();
}
@Override
public void onComplete() {
AsyncInputStreamAdapter.this.complete = true;
terminatePendingReads();
}
}
}
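For orientation, here is a minimal sketch of driving the sequential read contract described above: each read(ByteBuffer) is awaited before the next one is issued, and a count of -1 signals EOF. The class name, the helper name readAll, and the 4096-byte chunk size are illustrative assumptions, not part of this diff.

import java.nio.ByteBuffer;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;

class AsyncInputStreamReadLoop {

	// Emits one buffer per successful read and closes the stream on EOF (-1).
	static Flux<ByteBuffer> readAll(AsyncInputStream inputStream) {
		return Flux.defer(() -> {
			ByteBuffer chunk = ByteBuffer.allocate(4096); // illustrative chunk size
			return Mono.from(inputStream.read(chunk)).flatMapMany(bytesRead -> {
				if (bytesRead == -1) {
					return Mono.from(inputStream.close()).thenMany(Flux.<ByteBuffer> empty());
				}
				chunk.flip(); // switch from writing into the buffer to reading from it
				return Flux.just(chunk).concatWith(readAll(inputStream));
			});
		});
	}
}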

View File

@@ -1,78 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.reactivestreams.Publisher;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferFactory;
import org.springframework.core.io.buffer.DataBufferUtils;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Utility methods to create adapters between a {@link Publisher} of {@link DataBuffer} and an {@link AsyncInputStream}.
*
* @author Mark Paluch
* @since 2.2
*/
class BinaryStreamAdapters {
/**
* Creates a {@link Flux} emitting {@link DataBuffer} by reading binary chunks from {@link AsyncInputStream}.
* Publisher termination (completion, error, cancellation) closes the {@link AsyncInputStream}.
* <p/>
* The resulting {@link org.reactivestreams.Publisher} filters empty binary chunks and uses {@link DataBufferFactory}
* settings to determine the chunk size.
*
* @param inputStream must not be {@literal null}.
* @param dataBufferFactory must not be {@literal null}.
* @return {@link Flux} emitting {@link DataBuffer}s.
* @see DataBufferFactory#allocateBuffer()
*/
static Flux<DataBuffer> toPublisher(AsyncInputStream inputStream, DataBufferFactory dataBufferFactory) {
return DataBufferPublisherAdapter.createBinaryStream(inputStream, dataBufferFactory) //
.filter(it -> {
if (it.readableByteCount() == 0) {
DataBufferUtils.release(it);
return false;
}
return true;
});
}
/**
* Creates a {@link Mono} emitting an {@link AsyncInputStream} to consume a {@link Publisher} emitting
* {@link DataBuffer} and exposing the binary stream through {@link AsyncInputStream}. {@link DataBuffer}s are
* released by the adapter during consumption.
* <p/>
* This method returns a {@link Mono} to retain the {@link reactor.util.context.Context subscriber context}.
*
* @param dataBuffers must not be {@literal null}.
* @return {@link Mono} emitting {@link AsyncInputStream}.
* @see DataBufferUtils#release(DataBuffer)
*/
static Mono<AsyncInputStream> toAsyncInputStream(Publisher<? extends DataBuffer> dataBuffers) {
return Mono.create(sink -> {
sink.success(new AsyncInputStreamAdapter(dataBuffers, sink.currentContext()));
});
}
}
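The two factory methods above compose naturally. A round-trip sketch, assuming same-package access since the class is package-private; the class and helper names are hypothetical:

import org.reactivestreams.Publisher;

import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;

import reactor.core.publisher.Flux;

class BinaryStreamRoundTrip {

	// Wraps the source as an AsyncInputStream and reads it back as DataBuffers.
	static Flux<DataBuffer> roundTrip(Publisher<DataBuffer> source) {
		return BinaryStreamAdapters.toAsyncInputStream(source)
				.flatMapMany(inputStream -> BinaryStreamAdapters.toPublisher(inputStream, new DefaultDataBufferFactory()));
	}
}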

View File

@@ -1,219 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import lombok.RequiredArgsConstructor;
import reactor.core.CoreSubscriber;
import reactor.core.publisher.Flux;
import reactor.core.publisher.FluxSink;
import reactor.core.publisher.Mono;
import reactor.core.publisher.Operators;
import reactor.util.context.Context;
import java.nio.ByteBuffer;
import java.util.concurrent.atomic.AtomicIntegerFieldUpdater;
import java.util.concurrent.atomic.AtomicLongFieldUpdater;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferFactory;
import org.springframework.core.io.buffer.DataBufferUtils;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Utility to adapt an {@link AsyncInputStream} to a {@link Publisher} emitting {@link DataBuffer}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
class DataBufferPublisherAdapter {
/**
* Creates a {@link Publisher} emitting {@link DataBuffer}s by reading binary chunks from {@link AsyncInputStream}.
* Closes the {@link AsyncInputStream} once the {@link Publisher} terminates.
*
* @param inputStream must not be {@literal null}.
* @param dataBufferFactory must not be {@literal null}.
* @return the resulting {@link Publisher}.
*/
static Flux<DataBuffer> createBinaryStream(AsyncInputStream inputStream, DataBufferFactory dataBufferFactory) {
State state = new State(inputStream, dataBufferFactory);
return Flux.usingWhen(Mono.just(inputStream), it -> {
return Flux.<DataBuffer> create((sink) -> {
sink.onDispose(state::close);
sink.onCancel(state::close);
sink.onRequest(n -> {
state.request(sink, n);
});
});
}, AsyncInputStream::close, AsyncInputStream::close, AsyncInputStream::close) //
.concatMap(Flux::just, 1);
}
@RequiredArgsConstructor
static class State {
private static final AtomicLongFieldUpdater<State> DEMAND = AtomicLongFieldUpdater.newUpdater(State.class, "demand");
private static final AtomicIntegerFieldUpdater<State> STATE = AtomicIntegerFieldUpdater.newUpdater(State.class, "state");
private static final AtomicIntegerFieldUpdater<State> READ = AtomicIntegerFieldUpdater.newUpdater(State.class, "read");
private static final int STATE_OPEN = 0;
private static final int STATE_CLOSED = 1;
private static final int READ_NONE = 0;
private static final int READ_IN_PROGRESS = 1;
final AsyncInputStream inputStream;
final DataBufferFactory dataBufferFactory;
// see DEMAND
volatile long demand;
// see STATE
volatile int state = STATE_OPEN;
// see READ_IN_PROGRESS
volatile int read = READ_NONE;
void request(FluxSink<DataBuffer> sink, long n) {
Operators.addCap(DEMAND, this, n);
if (onShouldRead()) {
emitNext(sink);
}
}
boolean onShouldRead() {
return !isClosed() && getDemand() > 0 && onWantRead();
}
boolean onWantRead() {
return READ.compareAndSet(this, READ_NONE, READ_IN_PROGRESS);
}
boolean onReadDone() {
return READ.compareAndSet(this, READ_IN_PROGRESS, READ_NONE);
}
long getDemand() {
return DEMAND.get(this);
}
void close() {
STATE.compareAndSet(this, STATE_OPEN, STATE_CLOSED);
}
boolean isClosed() {
return STATE.get(this) == STATE_CLOSED;
}
/**
* Emit the next {@link DataBuffer}.
*
* @param sink the {@link FluxSink} to emit the next {@link DataBuffer} to.
*/
void emitNext(FluxSink<DataBuffer> sink) {
DataBuffer dataBuffer = dataBufferFactory.allocateBuffer();
ByteBuffer intermediate = ByteBuffer.allocate(dataBuffer.capacity());
Mono.from(inputStream.read(intermediate)).subscribe(new BufferCoreSubscriber(sink, dataBuffer, intermediate));
}
private class BufferCoreSubscriber implements CoreSubscriber<Integer> {
private final FluxSink<DataBuffer> sink;
private final DataBuffer dataBuffer;
private final ByteBuffer intermediate;
BufferCoreSubscriber(FluxSink<DataBuffer> sink, DataBuffer dataBuffer, ByteBuffer intermediate) {
this.sink = sink;
this.dataBuffer = dataBuffer;
this.intermediate = intermediate;
}
@Override
public Context currentContext() {
return sink.currentContext();
}
@Override
public void onSubscribe(Subscription s) {
s.request(1);
}
@Override
public void onNext(Integer bytes) {
if (isClosed()) {
onReadDone();
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, sink.currentContext());
return;
}
intermediate.flip();
dataBuffer.write(intermediate);
sink.next(dataBuffer);
try {
if (bytes == -1) {
sink.complete();
}
} finally {
onReadDone();
}
}
@Override
public void onError(Throwable t) {
if (isClosed()) {
Operators.onErrorDropped(t, sink.currentContext());
return;
}
onReadDone();
DataBufferUtils.release(dataBuffer);
Operators.onNextDropped(dataBuffer, sink.currentContext());
sink.error(t);
}
@Override
public void onComplete() {
if (onShouldRead()) {
emitNext(sink);
}
}
}
}
}
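The READ field above implements a single-reader gate: at most one outstanding read at a time, enforced by CAS transitions between READ_NONE and READ_IN_PROGRESS (see onWantRead() and onReadDone()). Stripped down to the bare pattern, with illustrative class and method names:

import java.util.concurrent.atomic.AtomicIntegerFieldUpdater;

class SingleReaderGate {

	private static final AtomicIntegerFieldUpdater<SingleReaderGate> READ = AtomicIntegerFieldUpdater
			.newUpdater(SingleReaderGate.class, "read");

	private static final int READ_NONE = 0;
	private static final int READ_IN_PROGRESS = 1;

	volatile int read = READ_NONE;

	// Succeeds for exactly one caller until done() is called; mirrors onWantRead().
	boolean tryEnter() {
		return READ.compareAndSet(this, READ_NONE, READ_IN_PROGRESS);
	}

	// Releases the gate; mirrors onReadDone().
	void done() {
		READ.compareAndSet(this, READ_IN_PROGRESS, READ_NONE);
	}
}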

View File

@@ -1,104 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import java.util.Optional;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.client.gridfs.model.GridFSUploadOptions;
/**
* Base class offering common tasks like query mapping and {@link GridFSUploadOptions} computation to be shared across
* imperative and reactive implementations.
*
* @author Christoph Strobl
* @since 2.2
*/
class GridFsOperationsSupport {
private final QueryMapper queryMapper;
private final MongoConverter converter;
/**
* @param converter must not be {@literal null}.
*/
GridFsOperationsSupport(MongoConverter converter) {
Assert.notNull(converter, "MongoConverter must not be null!");
this.converter = converter;
this.queryMapper = new QueryMapper(converter);
}
/**
* @param query the query to pass through a {@link QueryMapper} to apply type conversion.
* @return never {@literal null}.
*/
protected Document getMappedQuery(Document query) {
return queryMapper.getMappedObject(query, Optional.empty());
}
/**
* Compute the {@link GridFSUploadOptions} to be used from the given {@literal contentType} and {@literal metadata}
* {@link Document}.
*
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return never {@literal null}.
*/
protected GridFSUploadOptions computeUploadOptionsFor(@Nullable String contentType, @Nullable Document metadata) {
Document targetMetadata = new Document();
if (StringUtils.hasText(contentType)) {
targetMetadata.put(GridFsResource.CONTENT_TYPE_FIELD, contentType);
}
if (metadata != null) {
targetMetadata.putAll(metadata);
}
GridFSUploadOptions options = new GridFSUploadOptions();
options.metadata(targetMetadata);
return options;
}
/**
* Convert a given {@literal value} into a {@link Document}.
*
* @param value can be {@literal null}.
* @return an empty {@link Document} if the source value is {@literal null}.
*/
protected Document toDocument(@Nullable Object value) {
if (value instanceof Document) {
return (Document) value;
}
Document document = new Document();
if (value != null) {
converter.write(value, document);
}
return document;
}
}
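A hypothetical same-package subclass illustrating the two protected helpers: toDocument(...) marshals arbitrary metadata through the MongoConverter, and computeUploadOptionsFor(...) merges the result with a content type. The class name and content type value are assumptions.

import org.bson.Document;

import org.springframework.data.mongodb.core.convert.MongoConverter;

import com.mongodb.client.gridfs.model.GridFSUploadOptions;

class UploadOptionsSupport extends GridFsOperationsSupport {

	UploadOptionsSupport(MongoConverter converter) {
		super(converter);
	}

	// Marshals metadata to a Document and attaches the content type.
	GridFSUploadOptions jsonUploadOptions(Object metadata) {
		Document document = toDocument(metadata);
		return computeUploadOptionsFor("application/json", document);
	}
}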

View File

@@ -29,6 +29,7 @@ import org.bson.types.ObjectId;
import org.springframework.core.io.support.ResourcePatternResolver;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -39,6 +40,7 @@ import com.mongodb.client.gridfs.GridFSBucket;
import com.mongodb.client.gridfs.GridFSBuckets;
import com.mongodb.client.gridfs.GridFSFindIterable;
import com.mongodb.client.gridfs.model.GridFSFile;
import com.mongodb.client.gridfs.model.GridFSUploadOptions;
/**
* {@link GridFsOperations} implementation to store content into MongoDB GridFS.
@@ -52,11 +54,13 @@ import com.mongodb.client.gridfs.model.GridFSFile;
* @author Hartmut Lang
* @author Niklas Helge Hanft
*/
public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOperations, ResourcePatternResolver {
public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver {
private final MongoDbFactory dbFactory;
private final @Nullable String bucket;
private final MongoConverter converter;
private final QueryMapper queryMapper;
/**
* Creates a new {@link GridFsTemplate} using the given {@link MongoDbFactory} and {@link MongoConverter}.
@@ -77,12 +81,14 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
*/
public GridFsTemplate(MongoDbFactory dbFactory, MongoConverter converter, @Nullable String bucket) {
super(converter);
Assert.notNull(dbFactory, "MongoDbFactory must not be null!");
Assert.notNull(converter, "MongoConverter must not be null!");
this.dbFactory = dbFactory;
this.converter = converter;
this.bucket = bucket;
this.queryMapper = new QueryMapper(converter);
}
/*
@@ -131,9 +137,16 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, java.lang.String, java.lang.Object)
*/
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata) {
return store(content, filename, contentType, toDocument(metadata));
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType, @Nullable Object metadata) {
Document document = null;
if (metadata != null) {
document = new Document();
converter.write(metadata, document);
}
return store(content, filename, contentType, document);
}
/*
@@ -148,11 +161,25 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
* (non-Javadoc)
* @see org.springframework.data.mongodb.gridfs.GridFsOperations#store(java.io.InputStream, java.lang.String, com.mongodb.Document)
*/
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata) {
public ObjectId store(InputStream content, @Nullable String filename, @Nullable String contentType, @Nullable Document metadata) {
Assert.notNull(content, "InputStream must not be null!");
return getGridFs().uploadFromStream(filename, content, computeUploadOptionsFor(contentType, metadata));
GridFSUploadOptions options = new GridFSUploadOptions();
Document mData = new Document();
if (StringUtils.hasText(contentType)) {
mData.put(GridFsResource.CONTENT_TYPE_FIELD, contentType);
}
if (metadata != null) {
mData.putAll(metadata);
}
options.metadata(mData);
return getGridFs().uploadFromStream(filename, content, options);
}
/*
@@ -183,8 +210,8 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
*/
public void delete(Query query) {
for (GridFSFile gridFSFile : find(query)) {
getGridFs().delete(((BsonObjectId) gridFSFile.getId()).getValue());
for (GridFSFile x : find(query)) {
getGridFs().delete(((BsonObjectId) x.getId()).getValue());
}
}
@@ -219,9 +246,9 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.support.ResourcePatternResolver#getResources(java.lang.String)
*/
* (non-Javadoc)
* @see org.springframework.core.io.support.ResourcePatternResolver#getResources(java.lang.String)
*/
public GridFsResource[] getResources(String locationPattern) {
if (!StringUtils.hasText(locationPattern)) {
@@ -245,6 +272,10 @@ public class GridFsTemplate extends GridFsOperationsSupport implements GridFsOpe
return new GridFsResource[] { getResource(locationPattern) };
}
private Document getMappedQuery(Document query) {
return queryMapper.getMappedObject(query, Optional.empty());
}
private GridFSBucket getGridFs() {
MongoDatabase db = dbFactory.getDb();
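A minimal usage sketch for the store(InputStream, filename, contentType, metadata) overload shown earlier in this file's diff, assuming a fully configured GridFsTemplate; the class name and stored values are illustrative:

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import org.bson.Document;
import org.bson.types.ObjectId;

import org.springframework.data.mongodb.gridfs.GridFsTemplate;

class GridFsStoreSketch {

	// Stores a small text file with an explicit content type and custom metadata.
	static ObjectId storeGreeting(GridFsTemplate template) {
		InputStream content = new ByteArrayInputStream("hello".getBytes(StandardCharsets.UTF_8));
		return template.store(content, "hello.txt", "text/plain", new Document("category", "greeting"));
	}
}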

View File

@@ -1,233 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import com.mongodb.client.gridfs.model.GridFSFile;
import com.mongodb.reactivestreams.client.gridfs.AsyncInputStream;
/**
* Collection of operations to store and read files from MongoDB GridFS using reactive infrastructure.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
public interface ReactiveGridFsOperations {
/**
* Stores the given content into a file with the given name.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, String filename) {
return store(content, filename, (Object) null);
}
/**
* Stores the given content into a file applying the given metadata.
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable Object metadata) {
return store(content, null, metadata);
}
/**
* Stores the given content into a file applying the given metadata.
*
* @param content must not be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable Document metadata) {
return store(content, null, metadata);
}
/**
* Stores the given content into a file with the given name and content type.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType) {
return store(content, filename, contentType, (Object) null);
}
/**
* Stores the given content into a file with the given name using the given metadata. The metadata object will be
* marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename can be {@literal null} or empty.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable Object metadata) {
return store(content, filename, null, metadata);
}
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(AsyncInputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata. The metadata
* object will be marshalled before writing.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType,
@Nullable Object metadata);
/**
* Stores the given content into a file with the given name using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
default Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable Document metadata) {
return store(content, filename, null, metadata);
}
/**
* Stores the given content into a file with the given name and content type using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(AsyncInputStream content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata);
/**
* Stores the given content into a file with the given name and content type using the given metadata.
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
*/
Mono<ObjectId> store(Publisher<DataBuffer> content, @Nullable String filename, @Nullable String contentType,
@Nullable Document metadata);
/**
* Returns a {@link Flux} emitting all files matching the given query. <br />
* <strong>Note:</strong> {@link Sort} criteria defined on the {@link Query} are currently ignored, as MongoDB
* does not support ordering for GridFS file access.
*
* @see <a href="https://jira.mongodb.org/browse/JAVA-431">MongoDB Jira: JAVA-431</a>
* @param query must not be {@literal null}.
* @return {@link Flux#empty()} if no match found.
*/
Flux<GridFSFile> find(Query query);
/**
* Returns a {@link Mono} emitting a single {@link com.mongodb.client.gridfs.model.GridFSFile} matching the given
* query or {@link Mono#empty()} in case no file matches. <br />
* <strong>Note:</strong> If more than one file matches the given query, the resulting {@link Mono} emits an error. If
* you want to obtain the first matching file, use {@link #findFirst(Query)}.
*
* @param query must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<GridFSFile> findOne(Query query);
/**
* Returns a {@link Mono} emitting the first {@link com.mongodb.client.gridfs.model.GridFSFile} matching the given
* query or {@link Mono#empty()} in case no file matches.
*
* @param query must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<GridFSFile> findFirst(Query query);
/**
* Deletes all files matching the given {@link Query}.
*
* @param query must not be {@literal null}.
* @return a {@link Mono} signalling operation completion.
*/
Mono<Void> delete(Query query);
/**
* Returns a {@link Mono} emitting the {@link ReactiveGridFsResource} with the given file name.
*
* @param filename must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<ReactiveGridFsResource> getResource(String filename);
/**
* Returns a {@link Mono} emitting the {@link ReactiveGridFsResource} for a {@link GridFSFile}.
*
* @param file must not be {@literal null}.
* @return {@link Mono#empty()} if no match found.
*/
Mono<ReactiveGridFsResource> getResource(GridFSFile file);
/**
* Returns a {@link Flux} emitting all {@link ReactiveGridFsResource}s matching the given file name pattern.
*
* @param filenamePattern must not be {@literal null}.
* @return {@link Flux#empty()} if no match found.
*/
Flux<ReactiveGridFsResource> getResources(String filenamePattern);
}
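To make the contract concrete, a short sketch combining two of the operations above: store content under a filename with custom metadata, then look the file up again by its id. The class name, filename, and metadata values are illustrative assumptions.

import org.bson.Document;

import org.reactivestreams.Publisher;

import org.springframework.core.io.buffer.DataBuffer;

import reactor.core.publisher.Mono;

import com.mongodb.client.gridfs.model.GridFSFile;

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

class ReactiveGridFsSketch {

	// Stores the content, then re-reads the GridFSFile entry by its ObjectId.
	static Mono<GridFSFile> storeAndFind(ReactiveGridFsOperations operations, Publisher<DataBuffer> content) {
		return operations.store(content, "data.bin", new Document("source", "sketch"))
				.flatMap(id -> operations.findOne(query(where("_id").is(id))));
	}
}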

View File

@@ -1,180 +0,0 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.gridfs;
import reactor.core.publisher.Flux;
import java.io.ByteArrayInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import org.reactivestreams.Publisher;
import org.springframework.core.io.AbstractResource;
import org.springframework.core.io.Resource;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.gridfs.model.GridFSFile;
/**
* Reactive {@link GridFSFile} based {@link Resource} implementation.
*
* @author Mark Paluch
* @since 2.2
*/
public class ReactiveGridFsResource extends AbstractResource {
static final String CONTENT_TYPE_FIELD = "_contentType";
private static final ByteArrayInputStream EMPTY_INPUT_STREAM = new ByteArrayInputStream(new byte[0]);
private final @Nullable GridFSFile file;
private final String filename;
private final Flux<DataBuffer> content;
/**
* Creates a new, absent {@link ReactiveGridFsResource}.
*
* @param filename filename of the absent resource.
* @param content the content {@link Publisher} backing the resource.
* @since 2.1
*/
private ReactiveGridFsResource(String filename, Publisher<DataBuffer> content) {
this.file = null;
this.filename = filename;
this.content = Flux.from(content);
}
/**
* Creates a new {@link ReactiveGridFsResource} from the given {@link GridFSFile}.
*
* @param file must not be {@literal null}.
* @param content the content {@link Publisher} of the file, must not be {@literal null}.
*/
public ReactiveGridFsResource(GridFSFile file, Publisher<DataBuffer> content) {
this.file = file;
this.filename = file.getFilename();
this.content = Flux.from(content);
}
/**
* Obtain an absent {@link ReactiveGridFsResource}.
*
* @param filename filename of the absent resource, must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
public static ReactiveGridFsResource absent(String filename) {
Assert.notNull(filename, "Filename must not be null");
return new ReactiveGridFsResource(filename, Flux.empty());
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.InputStreamResource#getInputStream()
*/
@Override
public InputStream getInputStream() throws IllegalStateException {
throw new UnsupportedOperationException();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#contentLength()
*/
@Override
public long contentLength() throws IOException {
verifyExists();
return file.getLength();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#getFilename()
*/
@Override
public String getFilename() throws IllegalStateException {
return filename;
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#exists()
*/
@Override
public boolean exists() {
return file != null;
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#lastModified()
*/
@Override
public long lastModified() throws IOException {
verifyExists();
return file.getUploadDate().getTime();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#getDescription()
*/
@Override
public String getDescription() {
return String.format("GridFs resource [%s]", this.getFilename());
}
/**
* Returns the {@link Resource}'s id.
*
* @return never {@literal null}.
* @throws IllegalStateException if the file does not {@link #exists()}.
*/
public Object getId() {
Assert.state(exists(), () -> String.format("%s does not exist.", getDescription()));
return file.getId();
}
/**
* Retrieve the download stream.
*
* @return the download stream of the file content; emits a {@link FileNotFoundException} error if this resource does not {@link #exists() exist}.
*/
public Flux<DataBuffer> getDownloadStream() {
if (!exists()) {
return Flux.error(new FileNotFoundException(String.format("%s does not exist.", getDescription())));
}
return content;
}
private void verifyExists() throws FileNotFoundException {
if (!exists()) {
throw new FileNotFoundException(String.format("%s does not exist.", getDescription()));
}
}
}
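As a consumption sketch: getDownloadStream() hands out the raw DataBuffer stream, and the consumer is responsible for releasing each buffer. The class name and the helper name contentLengthOf are hypothetical.

import org.springframework.core.io.buffer.DataBufferUtils;

import reactor.core.publisher.Mono;

class ReactiveGridFsResourceSketch {

	// Totals the readable bytes of the download stream, releasing every buffer.
	static Mono<Long> contentLengthOf(ReactiveGridFsResource resource) {
		return resource.getDownloadStream()
				.map(buffer -> {
					long bytes = buffer.readableByteCount();
					DataBufferUtils.release(buffer);
					return bytes;
				})
				.reduce(0L, Long::sum);
	}
}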

Some files were not shown because too many files have changed in this diff.