Compare commits

...

473 Commits

Author SHA1 Message Date
Mark Paluch
63dfb59a3f DATAMONGO-2335 - Release version 2.2 RC3 (Moore). 2019-09-06 10:10:42 +02:00
Mark Paluch
d33ee2ffac DATAMONGO-2335 - Prepare 2.2 RC3 (Moore). 2019-09-06 10:10:12 +02:00
Mark Paluch
59388d99cc DATAMONGO-2335 - Updated changelog. 2019-09-06 10:10:07 +02:00
Christoph Strobl
ee6048e289 DATAMONGO-2357 - Fix read/write for MongoDB client.model GeoJSON types.
We now consider native GeoJSON types of the MongoDB client during conversion passing on the raw values to the driver when writing and using the configured MongoDB codecs on read.

Original pull request: #786.
2019-09-05 15:46:19 +02:00
Christoph Strobl
9a062d53f3 DATAMONGO-2310 - Update documentation for TypedAggregation. 2019-09-05 13:02:01 +02:00
Christoph Strobl
a3c5b07eb7 DATAMONGO-2348 - Update documentation of version property handling. 2019-09-05 10:29:53 +02:00
Christoph Strobl
40d30a230d DATAMONGO-2354 - Polishing.
Same as with FindPublisherPreparer the CursorPreparer needs to be public because it is used in one of the protected methods of MongoTemplate.

Original Pull Request: #784
2019-09-04 13:03:52 +02:00
kostya05983
10116f7c93 DATAMONGO-2354 - Change visibility of FindPublisherPreparer.
The FindPublisherPreparer is used in a protected method of ReactiveMongoTemplate and needs to be public to allow overriding.

Original Pull Request: #784
2019-09-04 13:03:44 +02:00
Mark Paluch
705203c898 DATAMONGO-2358 - Polishing.
Inherit dependency-management for Kotlin Coroutines.

Original pull request: #785.
2019-09-04 11:52:39 +02:00
Sebastien Deleuze
8fb9d9e5f4 DATAMONGO-2358 - Upgrade to Coroutines 1.3.0 and fix warnings.
Original pull request: #785.
2019-09-04 11:40:10 +02:00
Mark Paluch
c23c5ae6c6 DATAMONGO-2356 - Move off deprecated Flux/Mono.usingWhen to their replacement overrides. 2019-09-04 09:37:59 +02:00
Mark Paluch
e67bacf66c DATAMONGO-2352 - Polishing.
Apply typo fixes also to ReactiveMongoOperations.

Original pull request: #782.
2019-09-03 11:28:20 +02:00
Ryan Cloherty
617dbdac3f DATAMONGO-2352 - Fix documentation typos.
Original pull request: #782.
2019-09-03 11:26:47 +02:00
Mark Paluch
8ad4f4b71b DATAMONGO-2344 - Polishing.
Remove generics from FindPublisherPreparer. Rename ReadPreferenceAware.hasReadPreferences to hasReadPreference.

Original pull request: #779.
2019-09-03 11:23:37 +02:00
Christoph Strobl
9048ec83af DATAMONGO-2344 - Fix slaveOK query option not applied correctly.
Since MongoDB 3.6 the slaveOk option translates to the primaryPreferred ReadPreference that is now again applied when executing a find operation.

Original pull request: #779.
2019-09-03 11:23:31 +02:00
Christoph Strobl
b2e3e3fb8e DATAMONGO-2346 - Fix (reactive)auditing of immutable versioned entities.
We now check if the source is still the same object after potentially applying auditing modifications and make sure to pass the audited object on to the mapping layer.

Limited bean inspection scope of event listener tests to avoid side effects in index creation.

Original pull request: #780.
2019-09-03 10:43:29 +02:00
Mark Paluch
e9a2b84af5 DATAMONGO-2349 - Polishing.
Reformat code. Remove duplicate simple types.

Original pull request: #783.
2019-09-03 08:57:55 +02:00
Christoph Strobl
5b8be281fb DATAMONGO-2349 - Fix converter registration for java.time types.
The MongoDB Java Driver does not handle java.time types. Therefore those must not be considered simple types.
The behavior was changed by DATACMNS-1294 forcing usage of Reading & WritingConverter annotations to disambiguate converter direction.
This commit restores the converter registration to the state before the change in Spring Data Commons.

Original pull request: #783.
2019-09-03 08:57:41 +02:00
Christoph Strobl
66b318dabe DATAMONGO-2351 - Polishing.
Fix broken tests and favor StepVerifier over block() for reactive ones.

Original Pull Request: #781
2019-08-23 07:22:09 +02:00
Artyom Gabeev
b78569374a DATAMONGO-2351 - Return zero deleted count for unacknowledged deleteBy.
Original Pull Request: #781
2019-08-23 07:21:55 +02:00
Mark Paluch
79921a7260 DATAMONGO-2342 - Add build job against MongoDB 4.0 2019-08-16 13:19:27 +02:00
Christoph Strobl
bcd8f242e5 DATAMONGO-2342 - Upgrade ci job to MongoDB 4.2
Original pull request: #778.
2019-08-16 11:52:28 +02:00
Christoph Strobl
73fab14b21 DATAMONGO-2342 - Upgrade MongoDB Java & Reactive Streams drivers to 3.11 and 1.12.
Original pull request: #778.
2019-08-16 11:52:23 +02:00
Christoph Strobl
d4505880c7 DATAMONGO-2339 - Fix QueryMapper field name resolution for properties containing underscore.
We now prevent splitting of paths that contain underscores if the entity contains a property that matches.

Original pull request: #777.
2019-08-13 10:26:43 +02:00
Mark Paluch
a54b91392e DATAMONGO-2338 - Open RepositoryFactoryBeans for extension.
createRepositoryFactory() is no longer final, allowing the method to be overridden. This change aligns with the remaining store modules.
2019-08-06 15:57:08 +02:00
Mark Paluch
5dc7e7c65f DATAMONGO-2337 - Add HTTPS entries into spring.schemas.
To resolve XSD files properly from the classpath, their HTTPS reference must be present in the spring.schemas to avoid internet interaction for resolving an XSD file.
2019-08-06 15:55:37 +02:00
Greg Turnquist
8802e2c36f DATAMONGO-2280 - Force check for updates. 2019-08-05 11:07:18 -05:00
Mark Paluch
d8189620d2 DATAMONGO-2303 - After release cleanups. 2019-08-05 15:53:02 +02:00
Mark Paluch
fd6411f0ed DATAMONGO-2303 - Prepare next development iteration. 2019-08-05 15:53:01 +02:00
Mark Paluch
445b9c83de DATAMONGO-2303 - Release version 2.2 RC2 (Moore). 2019-08-05 15:35:35 +02:00
Mark Paluch
4e13bda302 DATAMONGO-2303 - Prepare 2.2 RC2 (Moore). 2019-08-05 15:35:05 +02:00
Mark Paluch
e634f2f7c0 DATAMONGO-2303 - Updated changelog. 2019-08-05 15:34:56 +02:00
Mark Paluch
0bb2b8785d DATAMONGO-2302 - Updated changelog. 2019-08-05 11:34:29 +02:00
Mark Paluch
9505112670 DATAMONGO-2272 - Updated changelog. 2019-08-05 11:09:00 +02:00
Mark Paluch
8167f9d199 DATAMONGO-2320 - Polishing.
Use for-loops instead of Stream API and Collectors. Reformat code. Invert condition for smoother readability.

Original pull request: #776.
2019-08-02 10:55:46 +02:00
Christoph Strobl
e6bd8b3ee3 DATAMONGO-2320 - Simplify test code.
Original pull request: #776.
2019-08-02 10:55:46 +02:00
Christoph Strobl
e33c8a3b76 DATAMONGO-2320 - Fix aggregation field reference for $filter operator.
We now render field and local variable references correctly when using the $filter aggregation operator.
Prior to this commit field references had been rendered with an additional $ prefix.

Original pull request: #776.
2019-08-02 10:55:46 +02:00
Mark Paluch
34425c54db DATAMONGO-2327 - Polishing.
Add support for DBRef encoding. Update Javadoc.

Original pull request: #774.
2019-08-01 14:14:43 +02:00
Christoph Strobl
7d9c08409b DATAMONGO-2327 - Add toJson method to Querydsl query support.
This allows obtaining the raw JSON representation of the query, e.g. for debugging.
We also updated the toString method to return a full Mongo Shell-compatible representation of the query including projections, order, skip and limit.

Original pull request: #774.
2019-08-01 14:14:33 +02:00
Mark Paluch
69a4217c4b DATAMONGO-2328 - Polishing.
Replace static imports with qualified access.

Original pull request: #773.
2019-08-01 11:50:20 +02:00
Christoph Strobl
0a0ee417ac DATAMONGO-2328 - Add missing target type conversions for field level type hints.
We now support Date to Long, Date to ObjectId, Script to String and other conversions for both reading and writing scenarios.

Original pull request: #773.
2019-08-01 11:49:54 +02:00
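A minimal sketch of a field-level type hint that the DATAMONGO-2328 conversions above apply to. The entity, its properties, and the chosen FieldType constants are illustrative assumptions, not taken from the commit.

    import java.util.Date;

    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.mongodb.core.mapping.Field;
    import org.springframework.data.mongodb.core.mapping.FieldType;

    @Document
    class AuditEntry {

        // hypothetical: store the Date as an ObjectId (one of the Date -> ObjectId conversions mentioned above)
        @Field(targetType = FieldType.OBJECT_ID)
        Date createdAt;

        // hypothetical: store the Date as a plain 64-bit timestamp (Date -> Long)
        @Field(targetType = FieldType.INT64)
        Date lastModified;
    }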
Christoph Strobl
2365dba8d9 DATAMONGO-2330 - Polishing.
Added tests.

Original Pull Request: #775
2019-07-31 14:24:19 +02:00
nkey
4b22558fe7 DATAMONGO-2330 - Apply defaultWriteConcern for bulk operations.
Fixed regression introduced in DATAMONGO-1880.

Original Pull Request: #775
2019-07-31 14:23:48 +02:00
Mark Paluch
75847981c3 DATAMONGO-2312 - Polishing.
Add Javadoc comments.

Original pull request: #770.
2019-07-31 09:52:39 +02:00
Christoph Strobl
0e32b7356c DATAMONGO-2312 - Polishing.
Fix spelling and add missing operator to Sum aggregation operation.

Original pull request: #770.
2019-07-31 09:52:35 +02:00
Christoph Strobl
c2436bcdfc DATAMONGO-2312 - Add support for array projections.
Original pull request: #770.
2019-07-31 09:52:30 +02:00
Mark Paluch
7c25675cbb DATAMONGO-2326 - Upgrade to MongoDB Reactive Streams Driver 1.12.0-rc0. 2019-07-26 10:01:13 +02:00
Mark Paluch
0145b68f61 DATAMONGO-2325 - Upgrade to MongoDB Driver 3.11.0-rc0.
Adapt tests to added overloads.
2019-07-26 09:42:54 +02:00
Mark Paluch
c7ea0782df DATAMONGO-2324 - Adapt to API changes in Commons. 2019-07-26 08:48:33 +02:00
Greg Turnquist
c3ef62a7be DATAMONGO-2280 - Polishing. 2019-07-19 11:46:45 -05:00
Greg Turnquist
0eebf64ec3 DATAMONGO-2280 - Publish documentation for main branch. 2019-07-19 11:46:44 -05:00
Mark Paluch
bdd2b236e8 DATAMONGO-2322 - Polishing.
Suspend change streams tests relying heavily on timing.
2019-07-18 10:06:56 +02:00
Mark Paluch
804106ec9a DATAMONGO-2322 - Handle exceptions thrown by MessageListeners.
ErrorHandlers associated with a CursorReadingTask (Change Streams, imperative Tailable Cursors) now handle exceptions raised by the listener callback.

Exceptions are now caught and the callback continues with the next message.
2019-07-18 09:57:32 +02:00
Mark Paluch
b3e70475eb DATAMONGO-2323 - Polishing.
Use .as(StepVerifier::create) syntax where possible.
2019-07-17 14:35:18 +02:00
Mark Paluch
1f146c3b5c DATAMONGO-2323 - Fix query(…) and stream(…) to be used with BSON Document.
MongoTemplate.stream(…), MongoTemplate.query(…) and ReactiveMongoTemplate.query(…) now no longer fail when used with BSON Document.class.

Previously, field mapping failed because it required an entity type. Now we gracefully back off when find methods are used with a simple Document entity type.
2019-07-17 14:28:11 +02:00
Mark Paluch
d49b1c33b7 DATAMONGO-2306 - Polishing.
Add Nullable annotation to nullable method args. Remove IV from JSON Schema as it is not listed in Mongo specs.

Tweak wording in docs. Parse encryption-settings-ref for MongoClientOptions. Add support for KeyIds in encrypted JSON schema properties.

Original pull request: #766.
2019-07-17 14:03:44 +02:00
Christoph Strobl
da0cb6d766 DATAMONGO-2306 - Add field level encryption to JSON Schema.
We now support encrypted fields via MongoJsonSchema and allow XML configuration of com.mongodb.AutoEncryptionSettings via a dedicated factory bean.

Original pull request: #766.
2019-07-17 14:03:36 +02:00
Mark Paluch
56ac8397aa DATAMONGO-2319 - Deprecate Query.withHint(String) and introduce withHint(Document).
The $hint operator is deprecated since MongoDB 3.2, so we are now deprecating Query.withHint() accepting a String, as the String is expected to be a valid document. Therefore, we are introducing withHint(Document) to accept a type-safe representation of query hints.
2019-07-11 15:16:14 +02:00
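A minimal sketch of the withHint(Document) overload introduced by DATAMONGO-2319 above; the criteria and index key pattern are illustrative assumptions.

    import static org.springframework.data.mongodb.core.query.Criteria.where;

    import org.bson.Document;
    import org.springframework.data.mongodb.core.query.Query;

    class HintedQuery {

        Query byLastname(String lastname) {
            // pass the index key pattern as a Document instead of the deprecated String-based hint
            return new Query(where("lastname").is(lastname))
                    .withHint(new Document("lastname", 1));
        }
    }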
Mark Paluch
945d3b0085 DATAMONGO-2321 - Polishing.
Replace @Test(expected = …) and ExpectedException with the corresponding AssertJ assertThatExceptionOfType(…) and assertThatIllegalArgumentException().isThrownBy(…).
2019-07-11 12:06:27 +02:00
Mark Paluch
fad18341fa DATAMONGO-2321 - Migrate tests to AssertJ. 2019-07-11 11:07:15 +02:00
Mark Paluch
394efa8b82 DATAMONGO-2318 - Fix typo. 2019-07-10 09:46:05 +02:00
Mark Paluch
880b3c6ee0 DATAMONGO-2089 - Polishing.
Loosen assertion to prevent false positives due to non-deterministic ordering.
2019-07-09 15:59:57 +02:00
Mark Paluch
4307c46619 DATAMONGO-2280 - Cleanup release profile.
Reuse inherited configuration from parent pom.
2019-07-09 11:01:38 +02:00
Mark Paluch
86bcfde568 DATAMONGO-2318 - Revise readme for a consistent structure. 2019-07-09 11:01:38 +02:00
Mark Paluch
5b3a9434f1 DATAMONGO-2316 - Configure simple collation on resolved text-indexes for non-simple collation Collections.
We now configure resolved text-indexes with a simple collation when the text index is created in a collection that uses a non-simple collation.

This explicit setting is required by MongoDB to properly create a text index.

Previously, index creation failed.
2019-07-08 13:53:24 +02:00
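A minimal sketch of the scenario DATAMONGO-2316 above addresses: a text index resolved for a collection that uses a non-simple default collation. The domain type is an illustrative assumption; the entity-level collation attribute stems from DATAMONGO-1854 further down this log.

    import org.springframework.data.mongodb.core.index.TextIndexed;
    import org.springframework.data.mongodb.core.mapping.Document;

    // the collection uses a non-simple default collation
    @Document(collation = "{ 'locale' : 'en_US' }")
    class Article {

        // the resolved text index is now created with the simple collation MongoDB requires
        @TextIndexed
        String body;
    }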
Mark Paluch
f2ac413c61 DATAMONGO-2188 - Polishing.
Reduce MappingContext requirements in IndexResolver to require only a properly parametrized MappingContext instead of MongoMappingContext.
2019-07-08 13:50:44 +02:00
Christoph Strobl
4e44d07402 DATAMONGO-2307 - Polishing.
Remove unused import and turn static variable into method reference.

Original Pull Request: #765
2019-07-05 12:50:14 +02:00
Michael Simons
1dc6a04d74 DATAMONGO-2307 - Consistently use result of BeforeSaveCallback.
Original Pull Request: #765
2019-07-05 12:50:14 +02:00
Mark Paluch
513e6a0111 DATAMONGO-2315 - Polishing.
Tweak method name. Javadoc.

Original pull request: #772.
2019-07-05 12:15:17 +02:00
Christoph Strobl
8ca16e1cb4 DATAMONGO-2315 - Fix $date parameter binding for string based queries.
Original pull request: #772.
2019-07-05 12:15:10 +02:00
Mark Paluch
296e97903d DATAMONGO-2314 - Polishing.
Reformat code. Remove unnecessary warning suppressions. Switch to diamond syntax.

Original pull request: #771.
2019-07-04 16:26:46 +02:00
Christoph Strobl
617828533e DATAMONGO-2314 - Fix query by example on nested properties.
This fix allows using alike on nested properties.

new Criteria("nested").alike(Example.of(probe, matching().withIgnorePaths("_class"))));

Switch tests to AssertJ.

Original pull request: #771.
2019-07-04 16:26:32 +02:00
Greg Turnquist
c0a22667b9 DATAMONGO-2280 - Use parent 'artifactory' profile for snapshot releases. 2019-07-03 17:10:56 -05:00
Mark Paluch
ada6eb814e DATAMONGO-2089 - Polishing.
Add watchCollection(…) accepting an entity class. Use static import for assertions. Tweak javadoc.

Original pull request: #751.
2019-07-03 11:21:28 +02:00
Christoph Strobl
4a17048ec6 DATAMONGO-2089 - Add fluent change stream API to ReactiveMongoTemplate.
We now offer a fluent API for more intuitive change stream interaction.

Flux<ChangeStreamEvent<User>> flux = reactiveTemplate.changeStream(User.class)
    .watchCollection("people")
    .filter(where("age").gte(38))
    .listen();

Original pull request: #751.
2019-07-03 11:07:03 +02:00
Mark Paluch
06018fa3de DATAMONGO-2287 - Polishing.
Add since tags. Remove final modifier from method args. Switch to lambdas.

Original pull request: #760.
2019-07-02 14:57:37 +02:00
Christoph Strobl
8b406b23ff DATAMONGO-2287 - Polishing.
Add new factory method for ArrayOperators that deals with a collection of values.

Original pull request: #760.
2019-07-02 14:57:37 +02:00
Shashank Sharma
a3ef9b5856 DATAMONGO-2287 - Add support for $in aggregation pipeline operator.
Original pull request: #760.
2019-07-02 14:57:10 +02:00
Mark Paluch
bb280bd59b DATAMONGO-2200 - Polishing.
Tweak Javadoc. Simplify Fields creation from Stream. Remove final modifier from private static method. Iterate with loop over PersistentEntity.

Original pull request: #748.
2019-07-02 14:24:20 +02:00
Christoph Strobl
e24c5e0846 DATAMONGO-2200 - Use mapping context where available.
Use the mapping context to get the required fields or fall back to property descriptors if no mapping context is available.

Original pull request: #748.
2019-07-02 14:24:17 +02:00
Christoph Strobl
839aa1b1e6 DATAMONGO-2200 - Derive fields for aggregation $project stage from a given type.
We now allow deriving field names for a $project stage from a given type by including all top-level fields.

    // $project : { title : 1, author : 1 }
    Aggregation.project(Book.class)

Original pull request: #748.
2019-07-02 14:24:09 +02:00
Mark Paluch
ff7a189c98 DATAMONGO-2296 - Polishing.
Use getCollectionName() in MongoTemplate.insert/save. Consistently use getCollectionName(Class) from ReactiveMongoTemplate and fluent API implementations.

Original pull request: #768.
2019-07-01 16:36:46 +02:00
Christoph Strobl
de144a6b0d DATAMONGO-2296 - Consistent use of getCollectionName(Class) throughout MongoTemplate.
Original pull request: #768.
2019-07-01 16:36:38 +02:00
Mark Paluch
bfe514acd7 DATAMONGO-2304 - Polishing.
Align anchor naming. Add code fences to EntityCallbacks.

Original pull request: #767.
2019-07-01 15:10:21 +02:00
Christoph Strobl
87343d2ae5 DATAMONGO-2304 - Fix documentation anchor for entity callback API.
Original pull request: #767.
2019-07-01 15:10:18 +02:00
Christoph Strobl
5e452e1be0 DATAMONGO-2305 - Upgrade to MongoDB Java Driver 3.11.0-beta4.
Tested against 4.0.9 and 4.2.0-rc1 servers.
Added a delay and left a todo in one of the tests where the 4.2.0-rc1 server takes a bit longer than its predecessor when creating indexes, which can lead to BackgroundOperationInProgressForNamespace errors.

Original pull request: #764.
2019-07-01 13:31:59 +02:00
Greg Turnquist
8ce8a8307a DATAMONGO-2280 - Only test main branch for upstream triggers. 2019-06-28 19:29:51 -05:00
Greg Turnquist
df94214527 DATAMONGO-2280 - Set user.name and user.home for CI jobs. 2019-06-25 13:33:11 -05:00
Mark Paluch
31f8a63a17 DATAMONGO-2240 - Polishing.
Consistently use this for field access. Access GridFSFile through getter.

Original Pull Request: #741
2019-06-17 14:01:03 +02:00
Mark Paluch
888054bb2a DATAMONGO-2240 - Expose GridFSFile through GridFsResource and ReactiveGridFsResource.
Original Pull Request: #741
2019-06-17 14:00:08 +02:00
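A minimal sketch of accessing the GridFSFile now exposed by DATAMONGO-2240 above; the template lookup and filename are illustrative assumptions.

    import com.mongodb.client.gridfs.model.GridFSFile;
    import org.springframework.data.mongodb.gridfs.GridFsResource;
    import org.springframework.data.mongodb.gridfs.GridFsTemplate;

    class GridFsLookup {

        long contentLength(GridFsTemplate gridFsTemplate, String filename) {

            GridFsResource resource = gridFsTemplate.getResource(filename);

            // the underlying GridFSFile is now available directly on the resource
            GridFSFile file = resource.getGridFSFile();
            return file.getLength();
        }
    }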
Christoph Strobl
29bf74c24c DATAMONGO-2256 - After release cleanups. 2019-06-14 15:12:52 +02:00
Christoph Strobl
1798678e4d DATAMONGO-2256 - Prepare next development iteration. 2019-06-14 15:12:50 +02:00
Christoph Strobl
bd78ccede9 DATAMONGO-2256 - Release version 2.2 RC1 (Moore). 2019-06-14 14:44:25 +02:00
Christoph Strobl
349b87a2c4 DATAMONGO-2256 - Prepare 2.2 RC1 (Moore). 2019-06-14 14:43:17 +02:00
Christoph Strobl
63ed62b988 DATAMONGO-2256 - Updated changelog. 2019-06-14 14:43:12 +02:00
Christoph Strobl
8c1a7cc163 DATAMONGO-2271 - Updated changelog. 2019-06-14 13:27:15 +02:00
Jens Schauder
597354ea7e DATAMONGO-2290 - Polishing.
Extracted complex lambdas into methods.
Added null check.

Original pull request: #762.
2019-06-13 15:18:04 +02:00
Christoph Strobl
c09188561c DATAMONGO-2290 - Trigger (convert & save) lifecycle events for bulk operations.
Original pull request: #762.
2019-06-13 14:39:33 +02:00
Greg Turnquist
c2ea595b8c DATAMONGO-2280 - Introduce Jenkins. 2019-06-11 14:55:33 -05:00
Christoph Strobl
f7731c7cf1 DATAMONGO-2293 - Fix EntityOperations id population nulling out entire entity.
We no longer null out the entire entity if a given id value is actually null.

Original Pull Request: #742
2019-06-11 11:55:44 +02:00
Christoph Strobl
a7e6b26796 DATAMONGO-2261 - Adapt to changes in DATACMNS-1467.
Use the reworked version of the EntityCallback method lookup.
Also fix issues with callbacks not being invoked when intended and rework the reactive flow by removing deeply nested constructs.
Update documentation and add EntityCallbacks to BulkOperations.

Original Pull Request: #742
2019-06-11 11:37:32 +02:00
Mark Paluch
45bd6d544d DATAMONGO-2261 - Use Entity Callback API for auditing.
We now use EntityCallback to invoke callback actions on entities before saving/before conversion to provide hooks that potentially modify an entity before persisting it.

We also provide a reactive variant of entity callbacks allowing to consume Reactor Context and to defer the actual activity.

Original Pull Request: #742
2019-06-11 11:36:41 +02:00
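A minimal sketch of registering an entity callback as described by DATAMONGO-2261 above, assuming the BeforeConvertCallback contract (entity plus collection name, returning the possibly modified entity); the Person type and its setter are hypothetical.

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;

    @Configuration
    class EntityCallbackConfig {

        @Bean
        BeforeConvertCallback<Person> beforeConvertCallback() {
            // invoked before the entity is converted and persisted; the returned instance is what gets saved
            return (entity, collection) -> {
                entity.setLastTouched(System.currentTimeMillis());
                return entity;
            };
        }
    }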
Mark Paluch
d819028e46 DATAMONGO-2295 - Adapt to renamed TransactionSynchronizationManager.forCurrentTransaction(). 2019-06-05 14:42:57 +02:00
Mark Paluch
b353bb6165 DATAMONGO-2231 - URL Cleanup. 2019-06-05 11:19:31 +02:00
Mark Paluch
843ed64b87 DATAMONGO-2292 - Create security policy readme. 2019-05-31 15:46:46 +02:00
Hippolyte Durix
f5c00b6978 DATAMONGO-2288 - Fix wrong indentation on documentation code sample.
Original pull request: #758.
2019-05-29 14:36:14 +02:00
Mark Paluch
05d585896c DATAMONGO-2278 - Polishing.
Update method comment. Switch to diamond syntax.

Original pull request: #755.
2019-05-29 14:30:45 +02:00
owen.qqq
d2999b0918 DATAMONGO-2278 - Update Querydsl base package names in MongoAnnotationProcessor.
Current MongoAnnotationProcessor still uses 3.x.x Querydsl package names.
Update package names to com.querydsl.core.annotations.* to use Querydsl annotations for code-generation.

Original pull request: #755.
2019-05-29 14:29:36 +02:00
Christoph Strobl
2a6107fcb6 DATAMONGO-2282 - Add shortcut to create JsonSchemaProperty for ObjectId.
Original pull request: #757.
2019-05-28 13:21:34 +02:00
Mark Paluch
d937460351 DATAMONGO-1183 - Polishing.
Tweak Javadoc and reference docs. Migrate tests to AssertJ. Remove not needed warning suppressions.

Original pull request: #750.
2019-05-28 11:43:34 +02:00
Christoph Strobl
89843a1488 DATAMONGO-1183 - Add support for Hashed Indexes.
We now support hashed index definitions via IndexOperations. Reading index information back allows identifying a hashed index via isHashed().

Original pull request: #750.
2019-05-28 10:20:28 +02:00
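A minimal sketch of the hashed index support from DATAMONGO-1183 above, assuming the HashedIndex definition type and an illustrative Person entity and field.

    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.index.HashedIndex;

    class HashedIndexSetup {

        void createIndex(MongoTemplate template) {
            // creates { 'username' : 'hashed' } on the collection mapped to Person;
            // reading the index back via getIndexInfo() reports it through isHashed()
            template.indexOps(Person.class).ensureIndex(HashedIndex.hashed("username"));
        }
    }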
Christoph Strobl
e06f326de3 DATAMONGO-1183 - Fix field name mapping in IndexOperations.
We now consider the Field annotation when creating indexes via IndexOperations.

Original pull request: #750.
2019-05-28 10:20:11 +02:00
Christoph Strobl
684e24aff5 DATAMONGO-2252 - Update Javadoc for Reactive/MongoOperations#getCollection(String).
Original pull request: #747.
2019-05-27 11:11:36 +02:00
Mark Paluch
7a22d697cf DATAMONGO-2067 - Polishing.
Tweak Javadoc and reference docs. Use pre/class=code instead of nested code tag.

Original pull request: #756.
2019-05-27 10:57:13 +02:00
Christoph Strobl
505ca4d2c4 DATAMONGO-2067 - Allow repeated usage of CompoundIndex annotation.
Original pull request: #756.
2019-05-27 10:57:06 +02:00
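A minimal sketch of the repeatable CompoundIndex usage enabled by DATAMONGO-2067 above; index names and key definitions are illustrative.

    import org.springframework.data.mongodb.core.index.CompoundIndex;
    import org.springframework.data.mongodb.core.mapping.Document;

    @Document
    @CompoundIndex(name = "lastname_firstname", def = "{ 'lastname' : 1, 'firstname' : 1 }")
    @CompoundIndex(name = "lastname_age", def = "{ 'lastname' : 1, 'age' : -1 }")
    class Person {
        String firstname;
        String lastname;
        int age;
    }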
Mark Paluch
b01b25ef93 DATAMONGO-2153 - Polishing.
Relax name ordering in PersonAggregate as we're using unordered MongoDB set operations to assemble results.
2019-05-21 12:11:28 +02:00
Mark Paluch
60cb8c7de7 DATAMONGO-2259 - Polishing.
Tweak Javadoc. Refactor tests for improved readability.

Original pull request: #740.
2019-05-21 11:58:33 +02:00
Christoph Strobl
b4bc95ce5f DATAMONGO-2259 - Add MongoDB 4.2 expanded format to 'out' aggregation operation.
OutOperation now supports the expanded format for the $out aggregation operation if additional parameters beyond the target collection are given.

    Aggregation.out("out-col").insertDocuments().in("database-2").uniqueKey("field-1“);

    {
        $out : {
            to : "out-col",
            mode : "insertDocuments",
            db : "database-2",
            uniqueKey : "field-1"
        }
    }

We’ll stick to the 2.6 format if only a collection name has been set.

   Aggregation.out("out-col“);

    { $out : "out-col" }

Original pull request: #740.
2019-05-21 11:58:12 +02:00
Christoph Strobl
b48ff3c38b DATAMONGO-2153 - Fix domain type field mapping for change stream aggregations.
We now make sure to call the delegate AggregationOperationContext without potentially overriding arguments. Without this change potentially registered target types would be overridden with null.
2019-05-20 14:26:08 +02:00
Mark Paluch
4ab61bd4d4 DATAMONGO-2081 - Polishing.
Tweak Javadoc. Refactor conditional assignment to if style for improved readability.

Original pull request: #749.
2019-05-17 14:40:42 +02:00
Christoph Strobl
24f23c5365 DATAMONGO-2081 - Expose expireAfterSeconds via IndexInfo.
Original pull request: #749.
2019-05-17 14:40:39 +02:00
Christoph Strobl
8d2a68118b DATAMONGO-2275 - Fix NPE when mapping MongoJsonSchema used in query.
We fixed an NPE when reading a raw Document from a collection using a query matching against a JSON schema.

Original pull request: #752.
2019-05-17 14:15:19 +02:00
Mark Paluch
8ce3e45d60 DATAMONGO-2153 - Apply Meta comment and cursor size to AggregationOptions.
We now propagate the cursor size to the aggregation options. We introduced AggregationOptions.comment to propagate the meta comment to aggregation execution.

Original pull request: #743.
2019-05-17 11:49:15 +02:00
Mark Paluch
f456851791 DATAMONGO-2153 - Polishing.
Use MongoQueryMethod.getDomainClass() instead of getRepositoryDomainType(). Simplify annotation presence indicator methods hasAnnotatedSort() and hasAnnotatedCollation(). Refactor getAnnotatedAggregation() to non-nullable method throwing IllegalStateException to be consistent with other getXxx() methods.

Simplify aggregation execution and consider collection/single element declaration for reactive execution.

Tweak docs.

Original pull request: #743.
2019-05-17 11:48:37 +02:00
Christoph Strobl
221ffb1947 DATAMONGO-2153 - Annotated aggregation support.
The repository layer offers means to interact with the aggregation framework via annotated repository finder methods. Similar to JSON-based queries, a pipeline can be defined via the Aggregation annotation. The definition may contain simple placeholders like `?0` as well as SpEL expression markers `?#{ ... }`.

public interface PersonRepository extends CrudRepository<Person, String> {

  @Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $?0 } } }")
  List<PersonAggregate> groupByLastnameAnd(String property);

  @Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $firstname } } }")
  List<PersonAggregate> groupByLastnameAndFirstnames(Sort sort);

  @Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $?0 } } }")
  List<PersonAggregate> groupByLastnameAnd(String property, Pageable page);

  @Aggregation("{ $group : { _id : null, total : { $sum : $age } } }")
  SumValue sumAgeUsingValueWrapper();

  @Aggregation("{ $group : { _id : null, total : { $sum : $age } } }")
  Long sumAge();

  @Aggregation("{ $group : { _id : null, total : { $sum : $age } } }")
  AggregationResults<SumValue> sumAgeRaw();

  @Aggregation("{ '$project': { '_id' : '$lastname' } }")
  List<String> findAllLastnames();
}

public interface ReactivePersonRepository extends ReactiveCrudRepository<Person, String> {

  @Aggregation("{ $group: { _id : $lastname, names : { $addToSet : $?0 } } }")
  Flux<PersonAggregate> groupByLastnameAnd(String property);

  @Aggregation("{ $group : { _id : null, total : { $sum : $age } } }")
  Mono<Long> sumAge();

  @Aggregation("{ '$project': { '_id' : '$lastname' } }")
  Flux<String> findAllLastnames();
}

Original pull request: #743.
2019-05-17 11:48:25 +02:00
Mark Paluch
c3a5454d31 DATAMONGO-2267 - Polishing.
Reuse collection name for index creation instead of resolving the collection for every index. Switch lambda to method reference.

Original pull request: #746.
2019-05-15 10:39:31 +02:00
Christoph Strobl
207c82a50b DATAMONGO-2267 - Fix eager collection resolution in ObjectPath.
We now lazily read the collection of an entity as it potentially requires a more expensive SpEL evaluation that might not have been required in the first place.

Original pull request: #746.
2019-05-15 10:39:23 +02:00
Mark Paluch
7596881a06 DATAMONGO-2270 - Updated changelog. 2019-05-13 18:19:07 +02:00
Mark Paluch
3ea670fb3b DATAMONGO-2269 - Updated changelog. 2019-05-13 14:59:27 +02:00
Mark Paluch
8df1c88db1 DATAMONGO-2260 - After release cleanups. 2019-05-13 12:17:54 +02:00
Mark Paluch
a34e8591de DATAMONGO-2260 - Prepare next development iteration. 2019-05-13 12:17:52 +02:00
Mark Paluch
ebc9e3ba3a DATAMONGO-2260 - Release version 2.2 M4 (Moore). 2019-05-13 11:59:50 +02:00
Mark Paluch
74368dd0e8 DATAMONGO-2260 - Prepare 2.2 M4 (Moore). 2019-05-13 11:59:04 +02:00
Mark Paluch
fff971e469 DATAMONGO-2260 - Updated changelog. 2019-05-13 11:58:56 +02:00
Christoph Strobl
352f9ad372 DATAMONGO-2268 - Upgrade to MongoDB Java Driver 3.11.0-beta3. 2019-05-10 18:05:13 +02:00
Oliver Drotbohm
89624518c9 DATAMONGO-2244 - Updated changelog. 2019-05-10 14:18:09 +02:00
Oliver Drotbohm
8e02fd0329 DATAMONGO-2246 - Updated changelog. 2019-05-10 12:57:20 +02:00
Christoph Strobl
0b49f47230 DATAMONGO-2265 - Polishing.
Fix count operation inside transaction and avoid superfluous client session instantiation.
Default to MongoDatabase emission in case of no active transaction, update documentation, move test to another package.
Delay reactive collection re/creation in test to cope with issues in server version 4.1.10.

Original Pull Request: #745
2019-05-07 20:03:04 +02:00
Mark Paluch
5c10a5821b DATAMONGO-2265 - Add initial ReactiveMongoTransactionManager.
Support declarative reactive transactions via the Transactional annotation, backed by a MongoDB-specific ReactiveTransactionManager implementation.

@Bean
ReactiveMongoTransactionManager transactionManager(ReactiveMongoDatabaseFactory factory) {
    return new ReactiveMongoTransactionManager(factory);
}

@Component
public class StateService {

    @Transactional
    Mono<UpdateResult> someBusinessFunction(Step step) {

        return template.insert(step)
                   .then(process(step))
                   .then(template.update(Step.class).apply(Update.set("state", …)));
    }
}

Original Pull Request: #745
2019-05-07 19:59:43 +02:00
Mark Paluch
c15ab4c946 DATAMONGO-2264 - Polishing.
Store result of getPersistentEntity() in a local variable to reduce invocation count. Use numbered postfix instead of object instantiation to generate a unique distance field name.

Deprecate geoNear methods with hint to aggregations. Fix distance field usage in ReactiveMongoTemplate.

Tweak docs.

Original pull request: #744.
2019-05-06 14:38:31 +02:00
Mark Paluch
c3769df07f DATAMONGO-2264 - Migrate tests to AssertJ.
Original pull request: #744.
2019-05-06 14:38:28 +02:00
Christoph Strobl
33d69abd53 DATAMONGO-2264 - Support upcoming MongoDB 4.2 Server.
Deprecate and guard tests for removed commands:
    - eval
    - group
    - maxScan
    - geoNear

Transition from removed geoNear command to $geoNear aggregation pipeline stage.
This allows users to still use NearQuery along with the template, which now transforms the query into a GeoNearOperation applying skip (previously in-memory skipping) and limit (previously the num parameter) as additional aggregation pipeline stages.

Tested against:
    - 4.1.10
    - 4.0.4
    - 3.6.12

Original pull request: #744.
2019-05-06 14:38:06 +02:00
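A minimal sketch of running a NearQuery through the template after the $geoNear transition described by DATAMONGO-2264 above; the coordinates, distances and the Venue type are illustrative assumptions.

    import org.springframework.data.geo.Distance;
    import org.springframework.data.geo.GeoResults;
    import org.springframework.data.geo.Metrics;
    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.query.NearQuery;
    import org.springframework.data.mongodb.core.query.Query;

    class NearbyVenues {

        GeoResults<Venue> findNearby(MongoTemplate template) {

            NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73), Metrics.KILOMETERS)
                    .maxDistance(new Distance(10, Metrics.KILOMETERS))
                    .query(new Query().skip(5).limit(10)); // skip/limit become additional pipeline stages

            // executed as a $geoNear aggregation rather than the removed geoNear command
            return template.geoNear(nearQuery, Venue.class);
        }
    }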
Mark Paluch
e683c8f08f DATAMONGO-1854 - Polishing.
Extract common collation resolution code into EntityOperations (TypedOperations).

Original pull request: #644.
2019-04-23 13:46:04 +02:00
Christoph Strobl
ac1873a163 DATAMONGO-1854 - Allow Collation to be configured on entity level.
The collation can now also be configured on entity level and gets applied via MongoTemplate. However, one can always override a default collation by adding the collation explicitly to either the Query or one of the Options available for various operations.
When it comes to the repository level the hierarchy is method parameter over query annotation over entity metadata.

Remove collation annotation in favor of attributes of Query / Document.

Original pull request: #644.
2019-04-23 13:46:04 +02:00
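A minimal sketch of an entity-level default collation as introduced by DATAMONGO-1854 above; the Person type and locale are illustrative.

    import org.springframework.data.mongodb.core.mapping.Document;

    // applied by MongoTemplate as the default collation unless overridden on the Query or operation options
    @Document(collation = "{ 'locale' : 'en_US' }")
    class Person {
        String firstname;
    }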
Mark Paluch
8b7071e6ae DATAMONGO-1854 - Polishing.
Use ParameterBindingDocumentCodec to parse collation expressions.

Original pull request: #644.
2019-04-23 13:46:04 +02:00
Christoph Strobl
b8368a677d DATAMONGO-1854 - Add collation option to @Document and @Query annotation.
We now allow specifying the collation via the @Query annotation.

public interface PersonRepository extends MongoRepository<Person, String> {

	@Query(collation = "en_US")
	List<Person> findByFirstname(String firstname);

	@Query(collation = "{ 'locale' : 'en_US' }")
	List<Person> findPersonByFirstname(String firstname);

	@Query(collation = "?1")
	List<Person> findByFirstname(String firstname, Object collation);

	@Query(collation = "{ 'locale' : '?1' }")
	List<Person> findByFirstname(String firstname, String collation);

	List<Person> findByFirstname(String firstname, Collation collation);
}

We now make sure to include collation information derived from the Query method if the collation is a fixed value.

Original pull request: #644.
2019-04-23 13:46:04 +02:00
Mark Paluch
5c333d6159 DATAMONGO-2258 - Polishing.
Consider undefined state in ChangeStreamOptions.isResumeAfter()/isStartAfter() when resume token is not set.

Original pull request: #739.
2019-04-16 11:25:36 +02:00
Christoph Strobl
7f4d3f27e6 DATAMONGO-2258 - Add startAfter option to change stream support.
Original pull request: #739.
2019-04-16 11:25:27 +02:00
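A minimal sketch of the new startAfter option from DATAMONGO-2258 above, assuming the ChangeStreamOptions builder accepts a previously captured resume token as a BSON value.

    import org.bson.BsonDocument;
    import org.springframework.data.mongodb.core.ChangeStreamOptions;

    class ResumeAfterInvalidate {

        ChangeStreamOptions optionsFor(BsonDocument resumeToken) {
            // unlike resumeAfter, startAfter allows resuming a change stream after an invalidate event
            return ChangeStreamOptions.builder()
                    .startAfter(resumeToken)
                    .build();
        }
    }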
Christoph Strobl
a30ee07b4a DATAMONGO-2257 - Polishing.
Enable tests for issues already resolved.
2019-04-15 09:47:20 +02:00
Christoph Strobl
cc45b4e081 DATAMONGO-2257 - Upgrade to MongoDB Java Driver 3.11.0-beta2 2019-04-15 09:38:33 +02:00
Christoph Strobl
0d0d17ebf6 DATAMONGO-2222 - After release cleanups. 2019-04-11 12:00:20 +02:00
Christoph Strobl
b53caaf69d DATAMONGO-2222 - Prepare next development iteration. 2019-04-11 12:00:19 +02:00
Christoph Strobl
98091f9096 DATAMONGO-2222 - Release version 2.2 M3 (Moore). 2019-04-11 11:24:31 +02:00
Christoph Strobl
010f182c2b DATAMONGO-2222 - Prepare 2.2 M3 (Moore). 2019-04-11 11:23:56 +02:00
Christoph Strobl
e8553468e7 DATAMONGO-2222 - Updated changelog. 2019-04-11 11:23:50 +02:00
Mark Paluch
8741b85806 DATAMONGO-2112 - Polishing.
Align Indexed(expireAfter) default to empty string according to the documentation. Slightly reword Javadoc.

Import DurationStyle to reuse duration parsing until Spring Framework provides a similar utility. Document ISO-8601 duration style. Prevent null index name evaluation to render "null" as String.

Convert assertions from Hamcrest to AssertJ.

Original pull request: #647.
2019-04-10 12:45:18 +02:00
Christoph Strobl
8fbaba0a7c DATAMONGO-2112 - Allow SpEL expression to be used for annotated index & geoIndex names as well as compound index definition.
We now also evaluate SpEL expressions for the name of indices as well as the def attribute of the compound index definition.

    @CompoundIndex(name = "#{'cmp' + 2 + 'name'}", def = "#{T(org.bson.Document).parse(\"{ 'foo': 1, 'bar': -1 }\")}")
    class WithCompoundIndexFromExpression {
      // …
    }

An expression used for Indexed.expireAfter may now not only return a plain String value with the timeout but also a java.time.Duration.

Original pull request: #647.
2019-04-10 12:45:12 +02:00
Christoph Strobl
82da8027f5 DATAMONGO-2112 - Allow usage of SpEL expression for index timeout.
We added expireAfter which accepts numeric values followed by the unit of measure (d(ays), h(ours), m(inutes), s(econds)) or a Spring template expression to the Indexed annotation.

   @Indexed(expireAfter = "10s")
    String expireAfterTenSeconds;

    @Indexed(expireAfter = "1d")
    String expireAfterOneDay;

    @Indexed(expireAfter = "#{@mySpringBean.timeout}")
    String expireAfterTimeoutObtainedFromSpringBean;

Original pull request: #647.
2019-04-10 12:45:01 +02:00
Mark Paluch
9c26859c04 DATAMONGO-2255 - Polishing.
Add ticket references. Rename allAsFlow() to flow() to simplify naming.

Original pull request: #736.
2019-04-10 08:38:14 +02:00
Sebastien Deleuze
02931ecb3a DATAMONGO-2255 - Polishing.
Fix test naming for insert extension.

Original pull request: #736.
2019-04-09 16:25:36 +02:00
Sebastien Deleuze
32399f63bb DATAMONGO-2255 - Add Coroutines Flow extensions.
Original pull request: #736.
2019-04-09 16:25:36 +02:00
Mark Paluch
0f3c90bea9 DATAMONGO-1849 - Polishing.
Fix generics usage in MappingMongoJsonSchemaCreator. Make fields final. Rename MappingMongoConverter.computeWriteTarget to getWriteTarget and expose it publicly for reuse in custom DefaultTypeMapper setups without the need to subclass MappingMongoConverter.

Remove Nullability functionality for required fields as nullability indicators should originate from PersistentProperty and PreferredConstructor. Update documentation.

Related ticket: DATACMNS-1513
Original pull request: #733.
2019-04-09 14:16:00 +02:00
Christoph Strobl
7f88889984 DATAMONGO-1849 - Schema derivation from domain type.
MongoJsonSchemaCreator extracts the MongoJsonSchema for a given Class by applying the following mapping rules:

Required Properties:
- Properties of primitive type.

Ignored Properties:
- All properties annotated with Transient.

Property Type Mapping:
- java.lang.Object -> { type : 'object' }
- java.util.Arrays -> { type : 'array' }
- java.util.Collection -> { type : 'array'}
- java.util.Map -> { type : 'object'}
- java.lang.Enum -> { type : 'string', enum : [ ... ] }
- Simple Types -> { type : 'the corresponding bson type' }
- Domain Types -> { type : 'object', properties : { ... } }

_id properties using types that can be converted into
ObjectId like String will be mapped to { type : 'object' } unless there is more specific information available via the MongoId annotation.

By using Field#targetType it is now possible to pass down a type hint to the conversion subsystem. This allows specifying the desired target type for a property so that e.g. a plain String can be stored as Code.

Original pull request: #733.
2019-04-09 14:14:56 +02:00
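A minimal sketch of deriving a schema via the MongoJsonSchemaCreator described by DATAMONGO-1849 above, assuming the creator is set up from the template's converter and applied to an illustrative Person type.

    import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

    class SchemaDerivation {

        MongoJsonSchema schemaFor(MongoTemplate template) {
            // applies the mapping rules listed above (required primitives, ignored @Transient properties, ...)
            return MongoJsonSchemaCreator.create(template.getConverter())
                    .createSchemaFor(Person.class);
        }
    }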
Christoph Strobl
f869ea4bcd DATAMONGO-2254 - Upgrade to MongoDB Java driver 3.10.2 2019-04-09 10:24:37 +02:00
Mark Paluch
20fab74965 DATAMONGO-1783 - Polishing.
Move Meta cloning to Meta class. Consider all Meta properties.

Original pull request: #650.
2019-04-08 11:45:41 +02:00
Christoph Strobl
ed9173c384 DATAMONGO-1783 - Apply query limit and offset to CountOptions for MongoOperations.count.
We now pass on the range defined by Query.skip and Query.limit to MongoDB. This allows counting documents within a certain range so that it is possible to find the number of matches within e.g. the first 10,000 documents.

Original pull request: #650.
2019-04-08 11:45:41 +02:00
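A minimal sketch of a range-restricted count as enabled by DATAMONGO-1783 above; the criteria and the Order type are illustrative assumptions.

    import static org.springframework.data.mongodb.core.query.Criteria.where;

    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.query.Query;

    class BoundedCount {

        long countWithinFirstTenThousand(MongoTemplate template) {
            // skip/limit are now passed on via CountOptions, so the count is restricted to this range
            Query query = new Query(where("state").is("active")).limit(10_000);
            return template.count(query, Order.class);
        }
    }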
Oliver Drotbohm
38d8529b81 DATAMONGO-2248 - Removed cross-store module.
Related tickets: DATAMONGO-1705.
2019-04-04 13:49:59 +02:00
Mark Paluch
54d1fd20ce DATAMONGO-2247 - Polishing.
Update ticket references in tests.

Original pull request: #735.
2019-04-04 13:00:25 +02:00
Sebastien Deleuze
84ead72680 DATAMONGO-2247 - Add non-nullable variants to Coroutines extensions.
Original pull request: #735.
2019-04-04 13:00:04 +02:00
Oliver Drotbohm
10128d4d38 DATAMONGO-2204 - Updated changelog. 2019-04-01 20:56:29 +02:00
Oliver Drotbohm
3fa8e80764 DATAMONGO-2186 - Updated changelog. 2019-04-01 19:37:00 +02:00
Oliver Drotbohm
54a6a28806 DATAMONGO-2243 - Updated changelog. 2019-04-01 18:52:19 +02:00
Oliver Drotbohm
7676bd28b7 DATAMONGO-2185 - Updated changelog. 2019-04-01 18:49:22 +02:00
Christoph Strobl
d2a9bf0c6c DATAMONGO-2241 - Polishing.
Ensure the DbRefResolver operates within the session while reusing the MappingContext.

Original Pull Request: #734
2019-04-01 12:09:01 +02:00
Mark Paluch
847dfebce0 DATAMONGO-2241 - Retain MongoConverter using MongoTemplate.withSession(…).
We now reuse the MongoConverter instance that is associated with a MongoTemplate when using withSession(…) instead of creating a new converter instance.
In consequence, we're reusing PersistentEntity instances, EntityInstantiators and generated accessor classes.

Original Pull Request: #734
2019-04-01 10:46:57 +02:00
Christoph Strobl
c3857209d5 DATAMONGO-2218 - Polishing.
Original Pull Request: #655
2019-03-28 08:41:50 +01:00
Minsu
f3b7ce8079 DATAMONGO-2218 - Add support for replaceOne operation in BulkOperations.
Original Pull Request: #655
2019-03-28 08:10:38 +01:00
Mark Paluch
24677fabd1 DATAMONGO-2221 - Polishing.
Reformat imports.

Original pull request: #732.
2019-03-26 14:45:05 +01:00
Christoph Strobl
d7cd97ea89 DATAMONGO-2221 - Fix mapping of Strings matching a valid ObjectId for unresolvable paths.
We now make sure not to convert Strings representing valid ObjectIds into ObjectIds for paths that cannot be resolved to a property.

Original pull request: #732.
2019-03-26 14:44:28 +01:00
Mark Paluch
9a64830e95 DATAMONGO-2223 - Polishing.
Replace DBRef annotation using FQCN with import.

Original pull request: #660.
2019-03-25 09:59:43 +01:00
Christoph Strobl
39a163bdbc DATAMONGO-2223 - Add test for DBRef resolution in a different database.
Original pull request: #660.
2019-03-25 09:59:40 +01:00
Christoph Strobl
2a7821ea9c DATAMONGO-2224 - Add trace logging to DBRef resolution.
We added trace logging to DefaultDbRefResolver.

<logger name="org.springframework.data.mongodb.core.convert.DefaultDbRefResolver" level="trace"/>

Original pull request: #659.
2019-03-25 09:48:37 +01:00
Spring Operator
3eb9d790b9 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* [ ] http://www.apache.org/licenses/ with 1 occurrences migrated to:
  https://www.apache.org/licenses/ ([https](https://www.apache.org/licenses/) result 200).
* [ ] http://www.apache.org/licenses/LICENSE-2.0 with 850 occurrences migrated to:
  https://www.apache.org/licenses/LICENSE-2.0 ([https](https://www.apache.org/licenses/LICENSE-2.0) result 200).

Original Pull Request: #727
2019-03-22 10:08:46 +01:00
Mark Paluch
28883f37d9 DATAMONGO-2230 - Decrement demand after emission in DataBufferPublisherAdapter.
We now decrement the demand and propagate exceptions from the adapter. Previously, we did not decrement the demand and kept emitting buffers although the subscriber demand was satisfied.

Original Pull Request: #693
2019-03-21 10:58:14 +01:00
Spring Operator
407f998a13 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# HTTP URLs that Could Not Be Fixed
These URLs were unable to be fixed. Please review them to see if they can be manually resolved.

* [ ] http://geojson.org/ (200) with 4 occurrences could not be migrated:
   ([https](https://geojson.org/) result SSLHandshakeException).
* [ ] http://geojson.org/geojson-spec.html (200) with 16 occurrences could not be migrated:
   ([https](https://geojson.org/geojson-spec.html) result SSLHandshakeException).
* [ ] http://site.icu-project.org (200) with 1 occurrences could not be migrated:
   ([https](https://site.icu-project.org) result ClosedChannelException).
* [ ] http://userguide.icu-project.org/collation/concepts (200) with 1 occurrences could not be migrated:
   ([https](https://userguide.icu-project.org/collation/concepts) result ClosedChannelException).
* [ ] http://www.querydsl.com/ (200) with 2 occurrences could not be migrated:
   ([https](https://www.querydsl.com/) result AnnotatedConnectException).
* [ ] http://www.querydsl.com/static/querydsl/latest/reference/html/ (200) with 2 occurrences could not be migrated:
   ([https](https://www.querydsl.com/static/querydsl/latest/reference/html/) result AnnotatedConnectException).
* [ ] http://www.querydsl.com/team (200) with 8 occurrences could not be migrated:
   ([https](https://www.querydsl.com/team) result AnnotatedConnectException).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* [ ] http://docs.spring.io/spring/docs/ with 18 occurrences migrated to:
  https://docs.spring.io/spring/docs/ ([https](https://docs.spring.io/spring/docs/) result 200).
* [ ] http://maven.apache.org/xsd/maven-4.0.0.xsd with 2 occurrences migrated to:
  https://maven.apache.org/xsd/maven-4.0.0.xsd ([https](https://maven.apache.org/xsd/maven-4.0.0.xsd) result 200).
* [ ] http://media.mongodb.org/zips.json with 1 occurrences migrated to:
  https://media.mongodb.org/zips.json ([https](https://media.mongodb.org/zips.json) result 200).
* [ ] http://openjdk.java.net/projects/code-tools/jmh/ with 1 occurrences migrated to:
  https://openjdk.java.net/projects/code-tools/jmh/ ([https](https://openjdk.java.net/projects/code-tools/jmh/) result 200).
* [ ] http://openmymind.net/mongodb.pdf with 1 occurrences migrated to:
  https://openmymind.net/mongodb.pdf ([https](https://openmymind.net/mongodb.pdf) result 200).
* [ ] http://pivotal.io/ with 1 occurrences migrated to:
  https://pivotal.io/ ([https](https://pivotal.io/) result 200).
* [ ] http://projectreactor.io/docs/ with 1 occurrences migrated to:
  https://projectreactor.io/docs/ ([https](https://projectreactor.io/docs/) result 200).
* [ ] http://projectreactor.io/docs/core/release/reference/ with 1 occurrences migrated to:
  https://projectreactor.io/docs/core/release/reference/ ([https](https://projectreactor.io/docs/core/release/reference/) result 200).
* [ ] http://projects.spring.io/ with 1 occurrences migrated to:
  https://projects.spring.io/ ([https](https://projects.spring.io/) result 200).
* [ ] http://projects.spring.io/spring-data-mongodb/ with 2 occurrences migrated to:
  https://projects.spring.io/spring-data-mongodb/ ([https](https://projects.spring.io/spring-data-mongodb/) result 200).
* [ ] http://repo.spring.io/milestone/org/springframework/data/ with 1 occurrences migrated to:
  https://repo.spring.io/milestone/org/springframework/data/ ([https](https://repo.spring.io/milestone/org/springframework/data/) result 200).
* [ ] http://spring.io with 1 occurrences migrated to:
  https://spring.io ([https](https://spring.io) result 200).
* [ ] http://spring.io/blog with 2 occurrences migrated to:
  https://spring.io/blog ([https](https://spring.io/blog) result 200).
* [ ] http://spring.io/docs with 2 occurrences migrated to:
  https://spring.io/docs ([https](https://spring.io/docs) result 200).
* [ ] http://spring.io/tools/sts with 1 occurrences migrated to:
  https://spring.io/tools/sts ([https](https://spring.io/tools/sts) result 200).
* [ ] http://stackoverflow.com/questions/18653574/spring-data-mongodb-aggregation-framework-invalid-reference-in-group-operati with 2 occurrences migrated to:
  https://stackoverflow.com/questions/18653574/spring-data-mongodb-aggregation-framework-invalid-reference-in-group-operati ([https](https://stackoverflow.com/questions/18653574/spring-data-mongodb-aggregation-framework-invalid-reference-in-group-operati) result 200).
* [ ] http://stackoverflow.com/questions/24185987/using-root-inside-spring-data-mongodb-for-retrieving-whole-document with 1 occurrences migrated to:
  https://stackoverflow.com/questions/24185987/using-root-inside-spring-data-mongodb-for-retrieving-whole-document ([https](https://stackoverflow.com/questions/24185987/using-root-inside-spring-data-mongodb-for-retrieving-whole-document) result 200).
* [ ] http://stackoverflow.com/questions/tagged/spring-data with 2 occurrences migrated to:
  https://stackoverflow.com/questions/tagged/spring-data ([https](https://stackoverflow.com/questions/tagged/spring-data) result 200).
* [ ] http://stackoverflow.com/questions/tagged/spring-data-mongodb with 2 occurrences migrated to:
  https://stackoverflow.com/questions/tagged/spring-data-mongodb ([https](https://stackoverflow.com/questions/tagged/spring-data-mongodb) result 200).
* [ ] http://twitter.com/SpringData with 1 occurrences migrated to:
  https://twitter.com/SpringData ([https](https://twitter.com/SpringData) result 200).
* [ ] http://www.google.com/search?q=nosoql+acronym with 1 occurrences migrated to:
  https://www.google.com/search?q=nosoql+acronym ([https](https://www.google.com/search?q=nosoql+acronym) result 200).
* [ ] http://www.reactive-streams.org/ with 1 occurrences migrated to:
  https://www.reactive-streams.org/ ([https](https://www.reactive-streams.org/) result 200).
* [ ] http://www.springframework.org/schema/beans/spring-beans.xsd with 4 occurrences migrated to:
  https://www.springframework.org/schema/beans/spring-beans.xsd ([https](https://www.springframework.org/schema/beans/spring-beans.xsd) result 200).
* [ ] http://www.springframework.org/schema/context/spring-context.xsd with 1 occurrences migrated to:
  https://www.springframework.org/schema/context/spring-context.xsd ([https](https://www.springframework.org/schema/context/spring-context.xsd) result 200).
* [ ] http://www.springframework.org/schema/data/repository/spring-repository.xsd with 10 occurrences migrated to:
  https://www.springframework.org/schema/data/repository/spring-repository.xsd ([https](https://www.springframework.org/schema/data/repository/spring-repository.xsd) result 200).
* [ ] http://contributor-covenant.org with 1 occurrences migrated to:
  https://contributor-covenant.org ([https](https://contributor-covenant.org) result 301).
* [ ] http://contributor-covenant.org/version/1/3/0/ with 1 occurrences migrated to:
  https://contributor-covenant.org/version/1/3/0/ ([https](https://contributor-covenant.org/version/1/3/0/) result 301).
* [ ] http://docs.mongodb.org/ecosystem/drivers/java/ with 1 occurrences migrated to:
  https://docs.mongodb.org/ecosystem/drivers/java/ ([https](https://docs.mongodb.org/ecosystem/drivers/java/) result 301).
* [ ] http://docs.mongodb.org/manual/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/ ([https](https://docs.mongodb.org/manual/) result 301).
* [ ] http://docs.mongodb.org/manual/aggregation/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/aggregation/ ([https](https://docs.mongodb.org/manual/aggregation/) result 301).
* [ ] http://docs.mongodb.org/manual/core/2dsphere/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/core/2dsphere/ ([https](https://docs.mongodb.org/manual/core/2dsphere/) result 301).
* [ ] http://docs.mongodb.org/manual/core/introduction/ with 2 occurrences migrated to:
  https://docs.mongodb.org/manual/core/introduction/ ([https](https://docs.mongodb.org/manual/core/introduction/) result 301).
* [ ] http://docs.mongodb.org/manual/reference/operator/aggregation/bucket/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/reference/operator/aggregation/bucket/ ([https](https://docs.mongodb.org/manual/reference/operator/aggregation/bucket/) result 301).
* [ ] http://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/ ([https](https://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/) result 301).
* [ ] http://docs.mongodb.org/manual/reference/operator/aggregation/facet/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/reference/operator/aggregation/facet/ ([https](https://docs.mongodb.org/manual/reference/operator/aggregation/facet/) result 301).
* [ ] http://docs.mongodb.org/manual/reference/operator/aggregation/project/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/reference/operator/aggregation/project/ ([https](https://docs.mongodb.org/manual/reference/operator/aggregation/project/) result 301).
* [ ] http://docs.mongodb.org/manual/reference/operator/query/text/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/reference/operator/query/text/ ([https](https://docs.mongodb.org/manual/reference/operator/query/text/) result 301).
* [ ] http://docs.mongodb.org/manual/tutorial/aggregation-examples/ with 2 occurrences migrated to:
  https://docs.mongodb.org/manual/tutorial/aggregation-examples/ ([https](https://docs.mongodb.org/manual/tutorial/aggregation-examples/) result 301).
* [ ] http://docs.mongodb.org/manual/tutorial/getting-started/ with 1 occurrences migrated to:
  https://docs.mongodb.org/manual/tutorial/getting-started/ ([https](https://docs.mongodb.org/manual/tutorial/getting-started/) result 301).
* [ ] http://docs.spring.io/spring-data/mongodb/docs/current/api/ with 1 occurrences migrated to:
  https://docs.spring.io/spring-data/mongodb/docs/current/api/ ([https](https://docs.spring.io/spring-data/mongodb/docs/current/api/) result 301).
* [ ] http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/ with 1 occurrences migrated to:
  https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/ ([https](https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/) result 301).
* [ ] http://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/new-in-3.0.html with 1 occurrences migrated to:
  https://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/new-in-3.0.html ([https](https://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/new-in-3.0.html) result 301).
* [ ] http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html with 10 occurrences migrated to:
  https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html ([https](https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html) result 301).
* [ ] http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html with 1 occurrences migrated to:
  https://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html ([https](https://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html) result 301).
* [ ] http://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html with 1 occurrences migrated to:
  https://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html ([https](https://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html) result 301).
* [ ] http://help.github.com/forking/ with 1 occurrences migrated to:
  https://help.github.com/forking/ ([https](https://help.github.com/forking/) result 301).
* [ ] http://projects.spring.io/spring-data with 1 occurrences migrated to:
  https://projects.spring.io/spring-data ([https](https://projects.spring.io/spring-data) result 301).
* [ ] http://projects.spring.io/spring-data-mongodb with 3 occurrences migrated to:
  https://projects.spring.io/spring-data-mongodb ([https](https://projects.spring.io/spring-data-mongodb) result 301).
* [ ] http://springsource.org with 4 occurrences migrated to:
  https://springsource.org ([https](https://springsource.org) result 301).
* [ ] http://www.mongodb.org/ with 2 occurrences migrated to:
  https://www.mongodb.org/ ([https](https://www.mongodb.org/) result 301).
* [ ] http://www.mongodb.org/books with 1 occurrences migrated to:
  https://www.mongodb.org/books ([https](https://www.mongodb.org/books) result 301).
* [ ] http://www.mongodb.org/display/DOCS/Aggregation with 1 occurrences migrated to:
  https://www.mongodb.org/display/DOCS/Aggregation ([https](https://www.mongodb.org/display/DOCS/Aggregation) result 301).
* [ ] http://www.springframework.org/schema/beans/spring-beans-3.0.xsd with 4 occurrences migrated to:
  https://www.springframework.org/schema/beans/spring-beans-3.0.xsd ([https](https://www.springframework.org/schema/beans/spring-beans-3.0.xsd) result 301).
* [ ] http://www.springframework.org/schema/context/spring-context-3.0.xsd with 2 occurrences migrated to:
  https://www.springframework.org/schema/context/spring-context-3.0.xsd ([https](https://www.springframework.org/schema/context/spring-context-3.0.xsd) result 301).
* [ ] http://www.springframework.org/schema/data/jpa/spring-jpa-1.0.xsd with 1 occurrences migrated to:
  https://www.springframework.org/schema/data/jpa/spring-jpa-1.0.xsd ([https](https://www.springframework.org/schema/data/jpa/spring-jpa-1.0.xsd) result 301).
* [ ] http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd with 3 occurrences migrated to:
  https://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd ([https](https://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd) result 301).
* [ ] http://www.springframework.org/schema/data/mongo/spring-mongo.xsd with 5 occurrences migrated to:
  https://www.springframework.org/schema/data/mongo/spring-mongo.xsd ([https](https://www.springframework.org/schema/data/mongo/spring-mongo.xsd) result 301).
* [ ] http://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd with 1 occurrences migrated to:
  https://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd ([https](https://www.springframework.org/schema/data/repository/spring-repository-1.0.xsd) result 301).
* [ ] http://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd with 1 occurrences migrated to:
  https://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd ([https](https://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd) result 301).
* [ ] http://repo.spring.io/libs-milestone with 1 occurrences migrated to:
  https://repo.spring.io/libs-milestone ([https](https://repo.spring.io/libs-milestone) result 302).
* [ ] http://repo.spring.io/libs-snapshot with 1 occurrences migrated to:
  https://repo.spring.io/libs-snapshot ([https](https://repo.spring.io/libs-snapshot) result 302).
* [ ] http://try.mongodb.org/ with 1 occurrences migrated to:
  https://try.mongodb.org/ ([https](https://try.mongodb.org/) result 302).
* [ ] http://www.springsource.org/download with 1 occurrences migrated to:
  https://www.springsource.org/download ([https](https://www.springsource.org/download) result 302).

# Ignored
These URLs were intentionally ignored.

* http://127.0.0.1:8080/capture-benchmarks with 1 occurrences
* http://maven.apache.org/POM/4.0.0 with 4 occurrences
* http://www.springframework.org/schema/beans with 38 occurrences
* http://www.springframework.org/schema/context with 27 occurrences
* http://www.springframework.org/schema/data/jpa with 2 occurrences
* http://www.springframework.org/schema/data/mongo with 38 occurrences
* http://www.springframework.org/schema/data/repository with 22 occurrences
* http://www.springframework.org/schema/jdbc with 2 occurrences
* http://www.springframework.org/schema/tool with 22 occurrences
* http://www.w3.org/2001/XMLSchema with 11 occurrences
* http://www.w3.org/2001/XMLSchema-instance with 20 occurrences

Original Pull Request: #695
2019-03-21 08:05:42 +01:00
Spring Operator
6e6bf3a87e DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* http://www.springframework.org/schema/beans/spring-beans-3.0.xsd with 5 occurrences migrated to:
  https://www.springframework.org/schema/beans/spring-beans-3.0.xsd ([https](https://www.springframework.org/schema/beans/spring-beans-3.0.xsd) result 200).
* http://www.springframework.org/schema/beans/spring-beans.xsd with 32 occurrences migrated to:
  https://www.springframework.org/schema/beans/spring-beans.xsd ([https](https://www.springframework.org/schema/beans/spring-beans.xsd) result 200).
* http://www.springframework.org/schema/context/spring-context-3.0.xsd with 2 occurrences migrated to:
  https://www.springframework.org/schema/context/spring-context-3.0.xsd ([https](https://www.springframework.org/schema/context/spring-context-3.0.xsd) result 200).
* http://www.springframework.org/schema/context/spring-context.xsd with 5 occurrences migrated to:
  https://www.springframework.org/schema/context/spring-context.xsd ([https](https://www.springframework.org/schema/context/spring-context.xsd) result 200).
* http://www.springframework.org/schema/data/mongo/spring-mongo.xsd with 32 occurrences migrated to:
  https://www.springframework.org/schema/data/mongo/spring-mongo.xsd ([https](https://www.springframework.org/schema/data/mongo/spring-mongo.xsd) result 200).
* http://www.springframework.org/schema/data/repository/spring-repository.xsd with 2 occurrences migrated to:
  https://www.springframework.org/schema/data/repository/spring-repository.xsd ([https](https://www.springframework.org/schema/data/repository/spring-repository.xsd) result 200).
* http://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd with 1 occurrences migrated to:
  https://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd ([https](https://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd) result 200).
* http://www.springframework.org/schema/tx/spring-tx-3.0.xsd with 1 occurrences migrated to:
  https://www.springframework.org/schema/tx/spring-tx-3.0.xsd ([https](https://www.springframework.org/schema/tx/spring-tx-3.0.xsd) result 200).
* http://www.springframework.org/schema/util/spring-util.xsd with 5 occurrences migrated to:
  https://www.springframework.org/schema/util/spring-util.xsd ([https](https://www.springframework.org/schema/util/spring-util.xsd) result 200).
* http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd with 1 occurrences migrated to:
  https://java.sun.com/xml/ns/persistence/persistence_2_0.xsd ([https](https://java.sun.com/xml/ns/persistence/persistence_2_0.xsd) result 302).

# Ignored
These URLs were intentionally ignored.

* http://java.sun.com/xml/ns/persistence with 2 occurrences
* http://maven.apache.org/POM/4.0.0 with 10 occurrences
* http://www.springframework.org/schema/beans with 74 occurrences
* http://www.springframework.org/schema/context with 14 occurrences
* http://www.springframework.org/schema/data/mongo with 64 occurrences
* http://www.springframework.org/schema/data/repository with 4 occurrences
* http://www.springframework.org/schema/jdbc with 2 occurrences
* http://www.springframework.org/schema/p with 1 occurrences
* http://www.springframework.org/schema/tx with 2 occurrences
* http://www.springframework.org/schema/util with 10 occurrences
* http://www.w3.org/2001/XMLSchema-instance with 43 occurrences

Original Pull Request: #694
2019-03-20 10:17:01 +01:00
Christoph Strobl
2eedef5e49 DATAMONGO-2231 - URL Cleanup.
Original Pull Request: #658
2019-03-19 13:35:06 +01:00
Spring Operator
f5d6947d11 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were fixed successfully.

* http://www.gopivotal.com (302) migrated to:
  https://pivotal.io ([https](https://www.gopivotal.com) result 200).
* http://www.apache.org/licenses/LICENSE-2.0 migrated to:
  https://www.apache.org/licenses/LICENSE-2.0 ([https](https://www.apache.org/licenses/LICENSE-2.0) result 200).
* http://projects.spring.io/spring-data-mongodb migrated to:
  https://projects.spring.io/spring-data-mongodb ([https](https://projects.spring.io/spring-data-mongodb) result 301).
* http://www.pivotal.io migrated to:
  https://www.pivotal.io ([https](https://www.pivotal.io) result 301).
* http://repo.mongodb.org/apt/debian migrated to:
  https://repo.mongodb.org/apt/debian ([https](https://repo.mongodb.org/apt/debian) result 302).

# Ignored
These URLs were intentionally ignored.

* http://maven.apache.org/POM/4.0.0
* http://maven.apache.org/maven-v4_0_0.xsd
* http://maven.apache.org/xsd/maven-4.0.0.xsd
* http://www.w3.org/2001/XMLSchema-instance

Original Pull Request: #658
2019-03-19 12:59:26 +01:00
Christoph Strobl
78fc2e0456 DATAMONGO-2225 - Fix potential NPE in MongoExampleMapper. 2019-03-18 13:44:13 +01:00
Christoph Strobl
2dc7c0ecb4 DATAMONGO-2228 - Polishing.
Favor AssertJ over hamcrest.

Original Pull Request: #661
2019-03-18 11:17:48 +01:00
Mikhail Kaduchka
cd930ea0b7 DATAMONGO-2228 - Fixed loosing branches in AND expressions in MongodbDocumentSerializer.
Original Pull Request: #661
2019-03-18 11:17:39 +01:00
Oliver Drotbohm
70098ae426 DATAMONGO-2227 - Moved off Spring Data Commons' deprecations and unused types.
Related ticket: DATACMNS-1496.
2019-03-13 18:51:33 +01:00
Christoph Strobl
4c241b36b6 DATAMONGO-2164 - After release cleanups. 2019-03-07 10:07:29 +01:00
Christoph Strobl
e33f992f2d DATAMONGO-2164 - Prepare next development iteration. 2019-03-07 10:07:28 +01:00
Christoph Strobl
7c8ec578f2 DATAMONGO-2164 - Release version 2.2 M2 (Moore). 2019-03-07 09:42:21 +01:00
Christoph Strobl
5dc5d7b2eb DATAMONGO-2164 - Prepare 2.2 M2 (Moore). 2019-03-07 09:41:17 +01:00
Christoph Strobl
8394a5a184 DATAMONGO-2164 - Updated changelog. 2019-03-07 09:41:11 +01:00
Christoph Strobl
acab0b2976 DATAMONGO-2199 - Correct author name. 2019-03-07 08:06:07 +01:00
Mark Paluch
7f8319785d DATAMONGO-2220 - Upgrade to MongoDB Reactive Streams Driver 1.11.0. 2019-03-06 15:54:54 +01:00
Oliver Drotbohm
ec50d96d23 DATAMONGO-2210 - Add support to persist URI instances out of the box.
We now register Converter implementations that persist a URI as String by default to avoid the reflection inspection of URI instances.
2019-03-05 15:22:27 +01:00
Christoph Strobl
3c8d84439e DATAMONGO-2219 - Polishing.
Fix change stream tests.

Original Pull Request: #657
2019-03-05 11:08:51 +01:00
Mark Paluch
c39ab8a23f DATAMONGO-2219 - Migrate tests to fluent StepVerifier usage.
We now use Publisher.as(StepVerifier::create) instead of StepVerifier.create(publisher).

Original Pull Request: #657
2019-03-05 11:08:51 +01:00
Mark Paluch
e3a81e71ab DATAMONGO-2219 - Fix ReactiveMongoTemplate.findAllAndRemove(…) if the query yields no results.
ReactiveMongoTemplate.findAllAndRemove(…) now completes successfully without emitting a result if the find query yields no hits. We no longer issue the subsequent remove query when the preceding find returned no results.

Original Pull Request: #657
2019-03-05 11:08:51 +01:00
Christoph Strobl
d4562ba3de DATAMONGO-2217 - Polishing.
Favor AssertJ over hamcrest.
2019-03-05 11:08:51 +01:00
Minsu
9d307bd115 DATAMONGO-2217 - Fix zoned DateTime usage in test.
Original Pull Request: #654
2019-03-05 11:08:51 +01:00
Mark Paluch
38fe8d4601 DATAMONGO-2215 - Polishing.
Update Javadoc to reflect that array filters are used in their raw form without domain-type related type- or field mapping.

Original pull request: #656.
2019-03-05 10:45:37 +01:00
Christoph Strobl
1d910da697 DATAMONGO-2215 - Add support for array filters to Update.
We now support the filtered positional $[<identifier>] operator via Update. This allows specifying a filter criteria chain for the elements in an array.

new Update()
    .set("grades.$[element]", 100)
    .filterArray(Criteria.where("element").gte(100));

Array filters can be used for MongoOperations#update* & MongoOperations#findAndModify

Original pull request: #656.
2019-03-05 10:45:34 +01:00
Mark Paluch
461b5f3568 DATAMONGO-2054 - Polishing.
Slight naming tweaks.

Original pull request: #653.
2019-03-05 10:05:17 +01:00
Christoph Strobl
745dae4993 DATAMONGO-2054 - Add support for array update operator $[].
We now support the $[] array update operator when mapping Update.

Original pull request: #653.
2019-03-05 10:04:33 +01:00
Christoph Strobl
6ce23a0d90 DATAMONGO-1593 - Migrate tests from Hamcrest to AssertJ.
Original pull request: #652.
2019-03-04 14:07:10 +01:00
Christoph Strobl
007b6d4f98 DATAMONGO-1593 - Fix JSON parsing for $oid in String based queries.
We now parse String arguments bound to an $oid correctly, so that those end up as valid ObjectId.

@Query("{ 'arg0' : { '$oid' : ?0} }")
List<Person> singleObjectIdArgInQueryString(String arg0);

Original pull request: #652.
2019-03-04 14:06:48 +01:00
Christoph Strobl
5e8d752be1 DATAMONGO-1348 - Update documentation.
Original pull request: #339.
2019-03-04 13:38:25 +01:00
Mark Paluch
722b6eb389 DATAMONGO-1348 - Polishing.
Add tests for aggregation using GeoJsonPoint. Extract GeoJson checks in own methods. Update license headers. Extract multiplier conversion in MetricConversion. Fix distanceMultiplier calculation to TargetUnit/BaseUnit instead of BaseUnit/TargetUnit.

Original pull request: #339.
2019-03-04 13:38:23 +01:00
Christoph Strobl
f8560aac0b DATAMONGO-1348 - Convert GeoJson used in NearQuery.
We now convert a GeoJsonPoint used in NearQuery into the corresponding format. This also requires converting any given min/maxDistance as well as the distanceMultiplier into meters (metric system).

Along the way we fixed an issue where the actual Query used along with NearQuery was not properly mapped to the domain types properties.

Original pull request: #339.
2019-03-04 13:38:19 +01:00
Greg Turnquist
39859b73a3 DATAMONGO-2184 - Add CI README. 2019-03-01 09:56:43 -06:00
Greg Turnquist
09ffa02d1a DATAMONGO-2184 - Use springci Docker image for building. 2019-03-01 07:41:34 -06:00
Greg Turnquist
ec96c2d48e DATAMONGO-2184 - Polishing. 2019-03-01 07:37:17 -06:00
Greg Turnquist
c5e2d4261d DATAMONGO-2184 - Migrate to springci Docker image. 2019-03-01 07:28:09 -06:00
Greg Turnquist
2e106d4f0c DATAMONGO-2184 - Introduce Concourse. 2019-02-28 08:17:00 -06:00
Mark Paluch
88d5e4367f DATAMONGO-2208 - Deprecate Kotlin extensions providing a KClass overload.
We promote reified Kotlin API usage (myMethod<Person>() instead of myMethod(Person::class)) to facilitate a single, more idiomatic approach to the Kotlin API.

Extension methods accepting KClass are deprecated now.

Original Pull Request: #648
2019-02-28 14:36:08 +01:00
Mark Paluch
18fa4deeb5 DATAMONGO-2209 - Polishing.
Convert spaces to tabs. Add ticket references to tests. Reformat code.

Original pull request: #649.
2019-02-22 12:07:11 +01:00
Sebastien Deleuze
717ca19ad1 DATAMONGO-2209 - Add ReactiveFluentMongoOperations Coroutines extensions.
This commit introduces Coroutines support for the ReactiveFluentMongoOperations API via Kotlin extensions that provide suspendable functions prefixed by `await` or suffixed by `AndAwait` for Mono-based APIs.

Extensions for Flux will be added once Kotlin/kotlinx.coroutines#254 is fixed.

Original pull request: #649.
2019-02-22 12:06:55 +01:00
Mark Paluch
dc38369f18 DATAMONGO-2072 - Polishing.
Slightly simplify method signature. Update docs.

Original pull request: #645.
2019-02-22 10:20:26 +01:00
Christoph Strobl
5b47648f49 DATAMONGO-2072 - Support Range in repository between queries.
We now support o.s.d.domain.Range as a method parameter for between queries. This allows more fine-grained control over the inclusion/exclusion of the upper/lower bounds. Until now, between queries required two parameters that were strictly bound to exclusive bounds using $gt and $lt.
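For illustration, a hypothetical repository method using Range (entity and property names are examples, not taken from this change):

public interface PersonRepository extends CrudRepository<Person, String> {
  List<Person> findByAgeBetween(Range<Integer> range);
}

// inclusive lower bound, exclusive upper bound
Range<Integer> ageRange = Range.from(Bound.inclusive(18)).to(Bound.exclusive(40));
List<Person> result = repository.findByAgeBetween(ageRange);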

Original pull request: #645.
2019-02-22 10:20:06 +01:00
Mark Paluch
ddad63f28e DATAMONGO-2207 - Remove version for managed Kotlin dependencies.
Versions are managed by kotlin-bom.
2019-02-21 09:55:46 +01:00
Mark Paluch
aab1649e87 DATAMONGO-2206 - Polishing.
Reformat code. Convert spaces to tabs. Use mockk version property to define mockk version. Author tags.

Original pull request: #646.
2019-02-20 11:47:41 +01:00
Sebastien Deleuze
120b7bab2c DATAMONGO-2206 - Migrate Kotlin tests to Mockk.
Original pull request: #646.
2019-02-20 11:47:41 +01:00
Mark Paluch
cbed57ee2b DATAMONGO-2105 - Remove default MongoMappingContext bean registration through @EnableMongoRepositories.
We now no longer register a MongoMappingContext bean when using @EnableMongoRepositories. Previously, we attempted to find a MongoMappingContext bean by name and if there was no such bean, we registered a new one. Additionally, the MappingContext was registered without considering SimpleTypes.

Bean discovery using an ImportBeanDefinitionRegistrar is error-prone due to its ordering and timing nature because there is no guarantee that all beans are already registered.

Original Pull Request: #642
2019-02-19 09:28:50 +01:00
Mark Paluch
b8f1062ac2 DATAMONGO-2199 - Polishing.
Align copyright years with forked code. Add customization markers to identify code blocks that were altered (in comparison to the original code) for future fork updates.

Original pull request: #643.
2019-02-18 13:51:03 +01:00
Mark Paluch
271b624c56 DATAMONGO-2198 - Polishing.
Remove final keywords from method arguments. Typos, remove commented code. Simplify test.

Original pull request: #643.
2019-02-18 13:38:13 +01:00
Christoph Strobl
8e21cc181e DATAMONGO-2199 - Introduce JSON parser capable of binding parameters.
We moved in and adapted some classes from the MongoDB Java driver in order to bind parameters to placeholders while parsing a JSON string. This allows us to move off the deprecated JSON.parse method that will be removed with the 4.0 version of the driver.

Original pull request: #643.
2019-02-18 13:38:02 +01:00
Christoph Strobl
45f4b5087c DATAMONGO-2199 - Move tests to AssertJ.
Original pull request: #643.
2019-02-18 13:37:59 +01:00
Christoph Strobl
f7f004ec8a DATAMONGO-2199 - Fix deprecation warnings.
Fix those deprecations with alternatives available. Some still have to remain in code as it is unclear which API to use with the 4.x driver.
Leave some TODOs in the code to find those spots when upgrading to the 4.0 MongoDB Java Driver.

Original pull request: #643.
2019-02-18 13:37:41 +01:00
Christoph Strobl
89fde7f8e9 DATAMONGO-2198 - Upgrade to MongoDB Java Driver 3.10
Fix tests failing due to changed String rendering of Document.

Original pull request: #643.
2019-02-18 13:37:34 +01:00
Mark Paluch
9f58c78e43 DATAMONGO-2187 - Updated changelog. 2019-02-13 11:47:53 +01:00
Christoph Strobl
b54cd4e391 DATAMONGO-2195 - Documentation update.
Original pull request: #641.
2019-02-07 15:13:01 +01:00
Mark Paluch
7500ba18fd DATAMONGO-2195 - Throw OptimisticLockingFailureException on delete only in repositories.
OptimisticLockingFailureException is now thrown only when deleting an entity through a repository and no longer when using the Template API.

Original pull request: #641.
2019-02-07 15:12:58 +01:00
Christoph Strobl
4ecf20ce4c DATAMONGO-2195 - Consider version when removing an entity.
We now consider a potential @Version when removing an entity.

MongoOperations#remove(Object) and MongoOperations#remove(Object, String) include the version of the object to remove if the entity is versioned. As opposed to save(Object), remove(Object) does not throw OptimisticLockingFailureException if a versioned entity could not be removed. This behavior is subject to change in a future release. Throwing OptimisticLockingFailureException on a failed delete was not introduced at the Template API level to avoid breaking existing application code.

MongoRepository now also considers the entity's version, following the very same logic as MongoOperations.
To remove an entity without a version check, use MongoOperations#remove(Query,…) or MongoRepository#deleteById(…).
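A minimal sketch of the described behavior, assuming a hypothetical versioned entity:

public class Account {
  @Id String id;
  @Version Long version;
}

Account account = mongoOperations.findById(accountId, Account.class);
mongoOperations.remove(account); // the delete criteria now include both _id and the current version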

Original pull request: #641.
2019-02-07 15:09:58 +01:00
Christoph Strobl
e8a3b6935e DATAMONGO-2196 - Remove applies WriteConcern to single Document delete operations.
We now make sure to apply the WriteConcern correctly when calling deleteOne on MongoCollection.

Original pull request: #641.
2019-02-07 15:09:51 +01:00
Mark Paluch
3a97b3fbf1 DATAMONGO-2193 - Polishing.
Reformat code.

Original pull request: #640.
2019-02-05 11:44:11 +01:00
Christoph Strobl
f4c9cdcacb DATAMONGO-2193 - Fix String <> ObjectId conversion for non-Id properties.
We now make sure to only convert valid ObjectId Strings if the property can be considered an id property.

Original pull request: #640.
2019-02-05 11:44:08 +01:00
Mark Paluch
cd8402f4ba DATAMONGO-2077 - Polishing.
Abbreviate AggregationMethodReference factory methods. Fix deprecation tag. Update documentation.

Original pull request: #639.
2019-02-05 11:09:30 +01:00
Christoph Strobl
3811ddf912 DATAMONGO-2077 - Enhance SpEL aggregation support.
Added aggregation method detection for:
- trim, ltrim, rtrim
- arrayToObject, objectToArray, indexOfArray
- dateFromString, dateFromParts, isoDateFromParts, dateToParts
- mergeObjects
- convert, toBool, toDate, toDecimal, toDouble, toInt, toLong, toObjectId, toString
- range

Original pull request: #639.
2019-02-05 10:59:38 +01:00
Christoph Strobl
9cc7fc2a08 DATAMONGO-1855 - Polishing
Introduce base class to share code between imperative and reactive GridFs.

Original Pull Request: #637
2019-01-29 10:32:56 +01:00
Mark Paluch
f40861beff DATAMONGO-1855 - Initial reactive GridFS support.
We now support reactive GridFS using MongoDB's reactive GridFS API. Files can be consumed and provided as binary stream.

ReactiveGridFsOperations operations = …;

Publisher<DataBuffer> buffers = …;
Mono<ObjectId> id = operations.store(buffers, "foo.xml");

Flux<DataBuffer> download = operations.getResource("foo.xml").flatMapMany(ReactiveGridFsResource::getDownloadStream);

Original Pull Request: #637
2019-01-29 10:32:35 +01:00
Christoph Strobl
33e4e38e4a DATAMONGO-2189 - Polishing.
Assert returned object is not the same as the saved one and move helper method.

Original Pull Request: #638
2019-01-28 11:34:14 +01:00
Mark Paluch
915385b710 DATAMONGO-2189 - Fix AfterSaveEvent to contain the saved entity in ReactiveMongoTemplate.insert(…).
ReactiveMongoTemplate.insert(…) now uses the saved entity when emitting AfterSaveEvent. This change affects usage of immutable objects that are using Id generation. Previously, the to-be-saved entity instance was used which left the Id unpopulated.

Original Pull Request: #638
2019-01-28 11:33:50 +01:00
Christoph Strobl
e0a12d77f7 DATAMONGO-2182 - Polishing.
Introduce base class for common Querydsl query execution tasks that can be used for imperative and reactive implementation.
Fix test issues due to context caching where indices are not recreated when dropping the collection during test setup.
Update reference documentation.

Original Pull Request: #635
2019-01-23 10:31:12 +01:00
Mark Paluch
16051106c0 DATAMONGO-2182 - Add Querydsl support for reactive repositories.
We now support execution of Querydsl Predicates using reactive MongoDB repositories. Reactive repositories are only required to add ReactiveQuerydslPredicateExecutor to their declaration to enable Querydsl.

public interface PersonRepository extends ReactiveMongoRepository<Person, String>, ReactiveQuerydslPredicateExecutor<Person> {

   // additional query methods go here
}

PersonRepository repository = …;

QPerson person = new QPerson("person");

Flux<Person> result = repository.findAll(person.address.zipCode.eq("C0123"));

Original Pull Request: #635
2019-01-22 15:07:03 +01:00
Christoph Strobl
2f60d08019 DATAMONGO-2188 - Polishing.
Undo the deprecation and transition to a configuration option that allows disabling index creation. Index creation will remain in the codebase as it is used in a lot of places to configure required geo index structures in MongoDB without which query execution would fail.

Original Pull Request: #636
2019-01-22 14:49:23 +01:00
Mark Paluch
5c79ff387e DATAMONGO-2188 - Expose IndexResolver.
We now provide IndexResolver to resolve and derive index definitions from Mongo entities.

MongoMappingContext mappingContext = new MongoMappingContext();
IndexResolver indexResolver = IndexResolver.create(mappingContext);
Iterable<? extends IndexDefinitionHolder> definitions = indexResolver.resolveIndexFor(MyEntity.class);

Original Pull Request: #636
2019-01-22 14:48:04 +01:00
Mark Paluch
33fa79b29f DATAMONGO-2188 - Deprecate auto-index creation & introduce configuration to disable it.
Auto-index creation can now be disabled by setting MongoMappingContext.setAutoIndexCreation(false). This configuration prevents automatic index creation on application startup and during access to entities.
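For example, disabling automatic index creation on the mapping context:

MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setAutoIndexCreation(false);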

Original Pull Request: #636
2019-01-22 14:47:12 +01:00
Mark Paluch
8f3dacd55e DATAMONGO-2145 - Updated changelog. 2019-01-10 14:15:38 +01:00
Mark Paluch
4ffd1e10e8 DATAMONGO-2144 - Updated changelog. 2019-01-10 12:26:34 +01:00
Mark Paluch
f1a18040bb DATAMONGO-2143 - Updated changelog. 2019-01-10 11:01:20 +01:00
Christoph Strobl
0f1536f136 DATAMONGO-2168 - Polishing.
MetadataBackedField no longer fails when Path detects a reference to a field within java.lang.Class. This can happen when splitting the property name via camel case: the first part may match "class", which resolves to the getClass() call on java.lang.Object. If the second part then also maps to a method (like getName()) on Class, an error would be thrown.

Original Pull Request: #631
2019-01-09 17:34:49 +01:00
Mark Paluch
d8bfe68b07 DATAMONGO-2168 - Convert assertions to AssertJ.
Original Pull Request: #631
2019-01-09 17:34:49 +01:00
Mark Paluch
2da0ecbe72 DATAMONGO-2168 - Do not map type key in QueryMapper.
QueryMapper now excludes the type key during mapping and retains the value as-is. This change fixes an issue in which type keys were attempted to be mapped using the entity model. Type key resolution for _class failed silently while other type keys such as className failed in property path resolution, and the failure became visible.

Original Pull Request: #631
2019-01-09 17:34:49 +01:00
Mark Paluch
8ad16c1f23 DATAMONGO-2155 - Polishing.
Reduce visibility of MappedUpdated to package-protected to avoid exposure. Rename UpdateDefinition.incVersion() to inc().

Original pull request: #625.
2019-01-09 16:35:38 +01:00
Christoph Strobl
4be3fe006f DATAMONGO-2155 - Introduce UpdateDefinition.
Original pull request: #625.
2019-01-09 16:35:38 +01:00
Oliver Drotbohm
103de476cd DATAMONGO-2155 - Polishing.
Original pull request: #625.
2019-01-09 16:35:38 +01:00
Christoph Strobl
630da43b69 DATAMONGO-2155 - Bypass mapping for already mapped updates.
We now make sure that updates (as in doSaveVersioned and doUpdate) are mapped only once, as mapping is required only once. Mapping already mapped query/update objects comes with undesired side effects such as following invalid property paths or reduced availability of type information.

We now make sure to map key/value pairs of Map-like properties to the value's domain type, and apply potentially registered custom converters to the keys.
Fixed an invalid test for DATAMONGO-1423 as it did not check the application of the registered converter.

Original pull request: #625.
2019-01-09 16:35:34 +01:00
Mark Paluch
5c79e26046 DATAMONGO-2173 - Polishing.
Set interrupted thread state after catching InterruptedException. Fix potential NPE by checking the cursor. Streamline generics to not hide class-level generic types.

Original pull request: #634.
2019-01-09 12:59:46 +01:00
Christoph Strobl
c9d441027a DATAMONGO-2173 - Translate and forward exceptions during CursorReadingTask#start() to ErrorHandler.
We now make sure to translate and pass on errors during the cursor initialization procedure to the configured error handler.

Original pull request: #634.
2019-01-09 12:59:37 +01:00
Christoph Strobl
161a983c5d DATAMONGO-2174 - Fix InvalidPersistentPropertyPath exception when updating documents.
MetadataBackedField.getPath() now returns null instead of throwing an error for fields that are not part of the domain model. This allows adding any field when updating an entity.

Original pull request: #633.
2019-01-09 10:34:18 +01:00
Mark Paluch
9288472fd9 DATAMONGO-2179 - Fixed broken auditing for entities using optimistic locking via batch save.
The previous implementation of (Reactive)MongoTemplate.doInsertBatch(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just-persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Related tickets: DATAMONGO-2139, DATAMONGO-2150.

Original Pull Request: #632
2019-01-07 15:56:25 +01:00
Mark Paluch
7d5d0bbdfc DATAMONGO-2181 - Consider repository collection name in ReactiveMongoRepository.saveAll(…).
We now consider the collection name that is bound to the repository when inserting entities of which all are new. Previously, the collection name was derived from the entity.

Original Pull Request: #632
2019-01-07 15:55:26 +01:00
Mark Paluch
eacc085863 DATAMONGO-2170 - Polishing.
Use ObjectUtils to compute the hash code as the hash code implementation contained artifacts that do not belong there. Extract test method.

Original pull request: #629.
2019-01-07 13:10:49 +01:00
Christoph Strobl
9f6f60c769 DATAMONGO-2170 - Return null instead of empty string for IndexInfo#getPartialFilterExpression when not set.
We now return null instead of an empty string when calling IndexInfo#getPartialFilterExpression. The method has been documented to return null values before, and we now comply with that contract and return value expectation.

Original pull request: #629.
2019-01-07 13:10:39 +01:00
Mark Paluch
2a45695ef3 DATAMONGO-2175 - Update copyright years to 2019. 2019-01-02 14:11:18 +01:00
Christoph Strobl
2c9042a027 DATAMONGO-2160 - After release cleanups. 2018-12-11 11:07:52 +01:00
Christoph Strobl
c339c24b34 DATAMONGO-2160 - Prepare next development iteration. 2018-12-11 11:07:51 +01:00
Christoph Strobl
32ae2e9ddc DATAMONGO-2160 - Release version 2.2 M1 (Moore). 2018-12-11 10:53:53 +01:00
Christoph Strobl
6fa0625167 DATAMONGO-2160 - Prepare 2.2 M1 (Moore). 2018-12-11 10:53:11 +01:00
Christoph Strobl
e88b88ef36 DATAMONGO-2160 - Updated changelog. 2018-12-11 10:53:07 +01:00
Mark Paluch
2ca879f534 DATAMONGO-2138 - Polishing.
Rename NestedProperty to KPropertyPath to reflect the underlying concept in alignment with our own PropertyPath type. Rename nestedFieldName(…) method to asString(…) to align with Kotlin method terminology. Reformat.

Slightly reword documentation. Add Type-safe Queries for Kotlin to What's New section.

Original pull request: #622.
2018-12-10 15:49:49 +01:00
Tjeu Kayim
8d969ef41d DATAMONGO-2138 - Add Type-safe Kotlin query extension.
We now support type-safe queries using Kotlin's DSL capabilities by accepting property references. Property references map to property paths and are translated to Criteria objects.

mongoOperations.find<Book>(
  Query(Book::title isEqualTo "Moby-Dick")
)

mongoOperations.find<Book>(
  Query(Book::title exists true)
)

mongoOperations.find<Book>(
  Query(Criteria().andOperator(
    Book::price gt 5,
    Book::price lt 10)
))

Original pull request: #622.
2018-12-10 15:49:32 +01:00
Mark Paluch
bbd02107b2 DATAMONGO-2161 - Simplify reference documentation setup.
Remove Asciidoctor plugin from all module builds and run asciidoctor only in distribution build.
2018-12-10 10:20:35 +01:00
Mark Paluch
4a1a9a7b0c DATAMONGO-2150 - Polishing.
Fix imperative auditing test to use intended persist mechanism. Remove final keywords from method args and local variables in ReactiveMongoTemplate. Rename DBObject to Document.

Original Pull Request: #627
2018-12-07 14:32:32 +01:00
Mark Paluch
adf16bb31f DATAMONGO-2150 - Fixed broken auditing for entities using optimistic locking.
The previous implementation of ReactiveMongoTemplate.doSaveVersioned(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just-persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Added integration tests for the auditing via ReactiveMongoTemplate and repositories.

Related ticket: DATAMONGO-2139.

Original Pull Request: #627
2018-12-07 14:32:05 +01:00
Mark Paluch
a834f29f32 DATAMONGO-2156 - Polishing.
Original pull request: #626.
2018-12-05 14:35:00 +01:00
Mark Paluch
d8721c9930 DATAMONGO-2156 - Remove dependency to javax.xml.bind.
We now no longer use DatatypeConverter to convert byte[] to its Base64-representation but use Spring Framework's Base64 utils.

Original pull request: #626.
2018-12-05 14:35:00 +01:00
Mark Paluch
8fe734deb1 DATAMONGO-2115 - Polishing.
Simplify asTimestampOfType(…) retrieval and move the cast to outer method. Simplify test.

Original pull request: #624.
2018-12-04 15:46:17 +01:00
Christoph Strobl
9246d3311d DATAMONGO-2115 - Add support for BsonTimestamp when resuming Change Streams.
Original pull request: #624.
2018-12-04 15:46:08 +01:00
Mark Paluch
6e3e3e4a01 DATAMONGO-2149 - Polishing.
Add ticket reference to follow-up ticket regarding array matching on partial DBRef expressions.

Related ticket: DATAMONGO-2154

Original pull request: #623.
2018-11-30 14:54:05 +01:00
Christoph Strobl
37b9eb3580 DATAMONGO-2149 - Fix $slice in fields projection when pointing to array of DBRefs.
We now no longer try to convert the actual slice parameters into a DBRef.

Original pull request: #623.
2018-11-30 14:54:05 +01:00
Mark Paluch
da1384b84d DATAMONGO-2148 - Polishing.
Add author tag. Add logging for ReactiveMongoTemplate.count(…) and findDistinct(…) operations. Fix variable names.

Original pull request: #620.
2018-11-28 17:25:14 +01:00
Cimon Lucas (LCM)
6c0dd2c80e DATAMONGO-2148 - Add query logging for MongoTemplate.count(…).
Original pull request: #620.
2018-11-28 17:25:02 +01:00
Mark Paluch
28d96f5fe2 DATAMONGO-2121 - Updated changelog. 2018-11-27 14:54:01 +01:00
Mark Paluch
5372a5c8c1 DATAMONGO-2109 - Updated changelog. 2018-11-27 12:36:46 +01:00
Mark Paluch
8672dff5bc DATAMONGO-2110 - Updated changelog. 2018-11-27 11:27:20 +01:00
Mark Paluch
e010bee286 DATAMONGO-2119 - Polishing.
Convert anonymous JSON callback class into a private static one. Use an expressive Pattern constant.

Original pull request: #621.
2018-11-23 09:48:09 +01:00
Christoph Strobl
07b8e8278e DATAMONGO-2119 - Allow SpEL usage for annotated $regex query.
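A hypothetical query method illustrating SpEL-based parameter binding inside $regex (names are examples only):

@Query("{ 'lastname' : { '$regex' : ?#{[0]} } }")
List<Person> findByLastnameRegex(String regex);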
Original pull request: #621.
2018-11-23 09:47:50 +01:00
Oliver Drotbohm
8f4db5540d DATAMONGO-2108 - Fixed broken auditing for entities using optimistic locking.
The previous implementation of MongoTemplate.doSaveVersioned(…) prematurely initialized the version property so that the entity wasn't considered new by the auditing subsystem. Even worse, for primitive version properties, the initialization kept the property at a value of 0, so that the just-persisted entity was still considered new. This meant that, via the repository route, inserts were triggered even for subsequent attempts to save an entity, which caused duplicate key exceptions.

We now make sure we fire the BeforeConvertEvent before the version property is initialized or updated. Also, the initialization of the property now sets primitive properties to 1 initially.

Added integration tests for the auditing via MongoOperations and repositories.
2018-11-22 15:05:10 +01:00
Oliver Drotbohm
9a0dad1723 DATAMONGO-2135 - Default to intermediate List for properties typed to Collection.
We now defensively create a List rather than a LinkedHashSet (which Spring's CollectionFactory.createCollection(…) defaults to) to make sure we're not accidentally dropping values that are considered equal according to their Java class definition.
2018-11-19 19:36:22 +01:00
Mark Paluch
a4c5682f4d DATAMONGO-2130 - Polishing.
Replace duplicate checks to ClientSession.hasActiveTransaction() with MongoResourceHolder.hasActiveTransaction(). Introduce MongoResourceHolder.getRequiredSession() to avoid nullability warnings.

Original pull request: #618.
2018-11-16 12:58:26 +01:00
Christoph Strobl
b2d2c12347 DATAMONGO-2130 - Polishing.
Set a timeout for the InetAddress host lookup to reduce test execution time.

Original pull request: #618.
2018-11-16 12:58:23 +01:00
Christoph Strobl
b50b645d1f DATAMONGO-2130 - Fix Repository count & exists inside transaction.
We now make sure invocations on repository count and exists methods delegate to countDocuments when inside a transaction.

Original pull request: #618.
2018-11-16 12:58:16 +01:00
Mark Paluch
4c6f793870 DATAMONGO-1798 - Polishing.
Introduce FieldType to express the desired field type to use for MongoDB Id conversion. Adapt Querydsl Id conversion so Id values are converted in the QueryMapper and no longer in SpringDataMongodbSerializer.

Adapt tests. Move MongoId from o.s.d.mongodb to o.s.d.m.c.core. Javadoc, reference docs.

Original pull request: #617.
2018-11-16 11:37:16 +01:00
Christoph Strobl
5d39191f0b DATAMONGO-1798 - Introduce @MongoId annotation for fine grained id conversion control.
@MongoId allows more fine-grained control over id conversion by specifying the intended id target type. This allows skipping the automatic ObjectId conversion of values that happen to be valid ObjectId hex strings.

public class PlainStringId {
  @MongoId String id; // treated as String no matter what
}

public class PlainObjectId {
  @MongoId ObjectId id; // treated as ObjectId
}

public class StringToObjectId {
  @MongoId(FieldType.OBJECT_ID) String id; // treated as ObjectId if the value is a valid ObjectId hex string
}

Original pull request: #617.
2018-11-16 11:36:36 +01:00
Mark Paluch
75b6dc7a0e DATAMONGO-2107 - Updated changelog. 2018-10-29 14:30:29 +01:00
Mark Paluch
0c8f176e70 DATAMONGO-2118 - Polishing.
Fix typo in reactive repositories reference documentation.

Original pull request: #611.
2018-10-26 10:07:54 +02:00
Mona Mohamadinia
c900ceb0d9 DATAMONGO-2118 - Fix typo in repositories reference documentation.
Original pull request: #611.
2018-10-26 10:07:48 +02:00
Mark Paluch
9072e0a703 DATAMONGO-2098 - Polishing.
Annotate methods and parameters with Nullable. Use diamond syntax where appropriate.

Original pull request: #612.
2018-10-25 15:35:10 +02:00
Zied Yaich
67991568f3 DATAMONGO-2098 - Fix typo in MappingMongoConverterParser method.
Original pull request: #612.
2018-10-25 15:34:49 +02:00
Mark Paluch
bc981be9f3 DATAMONGO-2113 - Polishing.
Increase subscription await timeout to allow for slow system processing such as on TravisCI.

Original pull request: #615.
2018-10-25 14:33:16 +02:00
Christoph Strobl
398eafe615 DATAMONGO-2113 - Polishing.
Use AssertJ in tests.

Original pull request: #615.
2018-10-25 14:33:13 +02:00
Christoph Strobl
4673525462 DATAMONGO-2113 - Fix resumeTimestamp conversion for change streams.
We now use the first 32 bits of the timestamp to create the instant and ignore the ordinal value.

Original pull request: #615.
2018-10-25 14:33:01 +02:00
Mark Paluch
5c009e393f DATAMONGO-2083 - Updated changelog. 2018-10-15 14:28:03 +02:00
Mark Paluch
e8e24632e4 DATAMONGO-2094 - Updated changelog. 2018-10-15 11:37:23 +02:00
Mark Paluch
5c54663adf DATAMONGO-2096 - Polishing.
Migrate assertions to AssertJ.

Original pull request: #613.
2018-10-05 15:02:25 +02:00
Christoph Strobl
f04e67edf0 DATAMONGO-2096 - Fix target field name for GraphLookup aggregation operation.
We now make sure to use the target field name instead of the alias when processing GraphLookupOperation.

Original pull request: #613.
2018-10-05 15:02:10 +02:00
Mark Paluch
74269aa6c0 DATAMONGO-2061 - After release cleanups. 2018-09-21 07:45:28 -04:00
Mark Paluch
bb743f494a DATAMONGO-2061 - Prepare next development iteration. 2018-09-21 07:45:26 -04:00
Mark Paluch
a502ffabc3 DATAMONGO-2061 - Release version 2.1 GA (Lovelace). 2018-09-21 07:08:38 -04:00
Mark Paluch
ffe4e9b914 DATAMONGO-2061 - Prepare 2.1 GA (Lovelace). 2018-09-21 07:07:51 -04:00
Mark Paluch
914bdd9434 DATAMONGO-2061 - Updated changelog. 2018-09-21 07:07:46 -04:00
Christoph Strobl
3cd9542483 DATAMONGO-2091 - Upgrade to MongoDB Java Driver 3.8.2 and Reactive Streams Driver 1.9.2 2018-09-20 10:45:23 +02:00
Khaled Baklouti
586bf858f9 DATAMONGO-2087 - Fix typo in MongoRepository.
Original Pull Request: #610
2018-09-20 08:28:44 +02:00
Mark Paluch
3478fd5ab3 DATAMONGO-2090 - Include documentation about Object Mapping Fundamentals.
Related ticket: DATACMNS-1374.
2018-09-18 13:24:40 +02:00
Christoph Strobl
fa5f523c92 DATAMONGO-2086 - Polishing.
Add fix for bound reified type in fluent MapReduce operations.
Also add missing reified type extension to FindDistinct with projection.

Original Pull Request: #609
2018-09-17 14:02:05 +02:00
Mark Paluch
2191ab3bba DATAMONGO-2086 - Fix Fluent API Kotlin extension generics to allow projections.
We fixed the Kotlin extension generics to properly support projections by ignoring the source type of the fluent API object. Previously, the source and target type were linked, which prevented the use of a different result type.

Original Pull Request: #609
2018-09-17 13:49:17 +02:00
Mark Paluch
a79142931f DATAMONGO-2034 - Updated changelog. 2018-09-10 14:15:49 +02:00
Mark Paluch
1ba210366d DATAMONGO-2035 - Updated changelog. 2018-09-10 10:20:54 +02:00
Christoph Strobl
16aa611007 DATAMONGO-2080 - Polishing.
Remove obsolete classes, update Javadoc and fix tests calling all() instead of tail().

Original Pull Request: #608
2018-09-06 15:10:21 +02:00
Mark Paluch
13e29eb81f DATAMONGO-2080 - Use fluent API for reactive tailable query methods.
Using the fluent API allows DTO projections with properties that are unknown to the actual domain object. Previously, DTO projections attempted to read the domain type and, during property access, missing properties were reported with an IllegalArgumentException. Unknown properties now remain unset.

Original Pull Request: #608
2018-09-06 15:09:53 +02:00
Mark Paluch
fe90950880 DATAMONGO-2080 - Support tailable cursors with the fluent reactive API.
We now support queries to return a tailable cursor using the fluent reactive API.

 query(Human.class)
     .inCollection("star-wars")
     .as(Jedi.class)
     .matching(query(where("firstname").is("luke")))
     .tail();

Original Pull Request: #608
2018-09-06 15:09:14 +02:00
Christoph Strobl
492dec8ecf DATAMONGO-2078 - Update reference documentation.
Move and enhance the tailable cursor documentation. Move it to a separate file, preserve the anchor, and add the imperative way using a MessageListener.

Add additional notes on usage of com.mongodb.client.MongoClient.

Original pull request: #607.
2018-09-03 14:09:15 +02:00
Mark Paluch
a1ac2f7c1d DATAMONGO-2075 - Polishing.
Tweaks to Javadoc and reference docs to align with American English spelling.

Original pull request: #606.
2018-09-03 11:20:45 +02:00
Christoph Strobl
04e53316c6 DATAMONGO-2075 - Open up MongoTransactionManager to allow transaction commit customization and commit retry.
Original pull request: #606.
2018-09-03 11:20:45 +02:00
Oliver Gierke
a991b96518 DATAMONGO-2076 - Fixed attribute substitution in reactive MongoDB section.
We now redeclare the Asciidoctor Maven plugin to register the store-specific attributes. Apparently they must not contain dots, so we replaced them with dashes.
2018-08-30 11:45:01 +02:00
Oliver Gierke
d53c5cf5c4 DATAMONGO-2076 - Fixed attribute substitution in getting started section. 2018-08-30 09:30:38 +02:00
Christoph Strobl
90779bbb27 DATAMONGO-2069 - Replace com.mysema.commons.lang.Assert with o.s.util.Assert. 2018-08-24 08:42:36 +02:00
Oliver Gierke
892cc2e69a DATAMONGO-2065 - Polishing. 2018-08-22 11:16:21 +02:00
Oliver Gierke
a69f1b4d51 DATAMONGO-2065 - Make sure that MongoTemplate.doSave(…) triggers overridable property population.
We now consistently call MongoTemplate.populateIdIfNecessary(…) to allow subclasses to override these calls.
2018-08-22 11:16:21 +02:00
Christoph Strobl
7859ee1013 DATAMONGO-2064 - Upgrade MongoDB Java Driver to 3.8.1. 2018-08-21 10:12:42 +02:00
Oliver Gierke
a58562ba69 DATAMONGO-2033 - After release cleanups. 2018-08-20 10:56:52 +02:00
Oliver Gierke
779b0da358 DATAMONGO-2033 - Prepare next development iteration. 2018-08-20 10:56:51 +02:00
Oliver Gierke
ff1703f7c9 DATAMONGO-2033 - Release version 2.1 RC2 (Lovelace). 2018-08-20 10:40:11 +02:00
Oliver Gierke
7b23f8eee2 DATAMONGO-2033 - Prepare 2.1 RC2 (Lovelace). 2018-08-20 10:39:43 +02:00
Oliver Gierke
cc97c5a961 DATAMONGO-2033 - Updated changelog. 2018-08-20 10:39:34 +02:00
Christoph Strobl
08a57e58fd DATAMONGO-2052 - Add support for $arrayToObject and $objectToArray aggregation operators.
Original pull request: #603.
2018-08-17 17:33:16 +02:00
Mark Paluch
9d27d2ff8e DATAMONGO-2059 - Document count helper restrictions for geo commands inside of transactions. 2018-08-17 17:23:53 +02:00
Mark Paluch
3eba7de073 DATAMONGO-2053 - Polishing.
Tweak Javadoc. Suppress generics warnings. Remove the nullable annotation from ObjectOperatorFactory.value as it cannot be null. Extend tests. Reformat.

Original pull request: #601.
2018-08-16 11:40:07 +02:00
Christoph Strobl
3dc6cab132 DATAMONGO-2053 - Add support for $mergeObjects aggregation operator.
Original pull request: #601.
2018-08-16 11:39:57 +02:00
Mark Paluch
799fa6c87e DATAMONGO-2046 - Performance improvements in mapping and conversion subsystem.
In MappingMongoConverter, we now avoid the creation of a ParameterValueProvider for parameter-less constructors. We also skip property population if the entity can be constructed entirely through constructor creation. Replaced the lambda in MappingMongoConverter.readAndPopulateIdentifier(…) with a direct call to ….readIdValue(…). ObjectPath now uses decomposed ObjectPathItems to avoid array copying and creation. It now stores a reference to its parent, and ObjectPathItem fields are now merged into ObjectPath, which reduces the number of objects created during reads.

Extended CachingMongoPersistentProperty with DBRef caching. Turned key access in DocumentAccessor into an optimistic lookup. DbRefResolverCallbacks are now created lazily.

Related tickets: DATACMNS-1366.
Original pull request: #602.
2018-08-15 16:11:46 +02:00
Mark Paluch
c58032cf37 DATAMONGO-2055 - Polishing.
Move test to UpdateMapperUnitTests.

Original pull request: #600.
2018-08-15 16:00:12 +02:00
Christoph Strobl
67c3f02dcc DATAMONGO-2055 - Allow position modifier to be negative using push at position on Update.
Original pull request: #600.
2018-08-15 15:53:43 +02:00
Mark Paluch
208bd6ae52 DATAMONGO-2050 - Polishing.
Tweak Javadoc.

Original pull request: #596.
2018-08-15 15:04:30 +02:00
Christoph Strobl
64419751c0 DATAMONGO-2050 - Polishing.
Move to AssertJ.

Original pull request: #596.
2018-08-15 15:04:28 +02:00
Christoph Strobl
cd089d4a54 DATAMONGO-2050 - Allow to specify the index to use for $geoNear aggregation operation.
Original pull request: #596.
2018-08-15 15:04:19 +02:00
Mark Paluch
e484337dcf DATAMONGO-2051 - Polishing.
Introduce MongoClientVersion.isMongo38Driver() for API parity between versions 2.0.x and 2.1.

Original pull request: #598.
2018-08-14 16:40:31 +02:00
Christoph Strobl
e4da45baed DATAMONGO-2051 - Add support for SCRAM-SHA-256 authentication mechanism to MongoCredentialPropertyEditor.
Original pull request: #598.
Related pull request: #597.
2018-08-14 16:26:12 +02:00
Mark Paluch
03246f04b8 DATAMONGO-2040 - Polishing.
Deprecate Index.Duplicates. Remove unused imports. Mention deprecation in What's new.

Original pull request: #599.
2018-08-14 16:04:36 +02:00
Christoph Strobl
50070dfc64 DATAMONGO-2040 - Deprecate Indexed.dropDups and CompoundIndex.dropDups.
Add deprecation warning and remove options no longer in use.

Original pull request: #599.
2018-08-14 16:04:03 +02:00
Mark Paluch
029d50e526 DATAMONGO-2049 - Polishing.
Add static import for assertThat(…).

Original pull request: #594.
2018-08-14 10:50:15 +02:00
Christoph Strobl
9764ce0147 DATAMONGO-2049 - Add support for $ltrim, $rtrim, and $trim.
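For illustration, trimming a field within a projection stage (field names are examples, assuming the StringOperators factory):

project().and(StringOperators.valueOf("comment").trim()).as("trimmedComment");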
Original pull request: #594.
2018-08-14 10:50:15 +02:00
Mark Paluch
c00f461d06 DATAMONGO-2048 - Polishing.
Javadoc tweaks.

Original pull request: #595.
2018-08-13 15:58:58 +02:00
Christoph Strobl
4205516446 DATAMONGO-2048 - Add support for MongoDB 4.0 $convert aggregation operator.
We now support the following type conversion aggregation operators:

* $convert
* $toBool
* $toDate
* $toDecimal
* $toDouble
* $toInt
* $toLong
* $toObjectId
* $toString

Original pull request: #595.
2018-08-13 15:58:58 +02:00
Mark Paluch
beced8184f DATAMONGO-2047 - Polishing.
Retain previous options when calling withTimezone(…)/onNull…(…). Add tests. Javadoc.

Original pull request: #593.
2018-08-13 13:25:44 +02:00
Christoph Strobl
64dc3dbb1d DATAMONGO-2047 - Update $dateToString and $dateFromString aggregation operators to match MongoDB 4.0 changes.
We added the format and onNull options to DateFromString and changed format to an optional parameter.

Original pull request: #593.
2018-08-13 13:25:33 +02:00
Mark Paluch
7b67ad4f6c DATAMONGO-2045 - Polishing.
Return false instead of null in isTransactionFailureCode(…)/isClientSessionFailureCode(…) to prevent null-dereference. Add initial size to HashMap instances with known number of elements. Fix typos in private constant names. Fix duplicate error code ids.

Original pull request: #592.
2018-08-13 10:30:06 +02:00
Christoph Strobl
c2373d05fe DATAMONGO-2045 - Add session & transaction specific error codes for exception translation.
Original pull request: #592.
2018-08-13 10:30:02 +02:00
Mark Paluch
da63788a52 DATAMONGO-2041 - Polishing.
Use getRequiredPersistentEntity() instead of getPersistentEntity() for improved null-safety. Use Lombok for required-args constructors. Slightly tweak Javadoc.

Original pull request: #591.
2018-08-13 10:10:22 +02:00
Christoph Strobl
016892085c DATAMONGO-2041 - Apply field restriction to DTO projections.
We now derive field projections for DTO projections if the field projection document is unrestricted.
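As an illustration, a hypothetical DTO projection whose fields now restrict the fetched document (names are examples):

public class PersonSummary {
  String firstname;
  String lastname;
}

public interface PersonRepository extends CrudRepository<Person, String> {
  List<PersonSummary> findByLastname(String lastname); // only firstname and lastname are fetched
}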

Original pull request: #591.
2018-08-13 10:10:23 +02:00
Mark Paluch
4f9c0fa6b3 DATAMONGO-2043 - Polishing.
Slightly tweak Javadoc.

Original pull request: #589.
2018-08-08 11:01:30 +02:00
Christoph Strobl
e1393847be DATAMONGO-2043 - Omit type hint when mapping simple types.
Original pull request: #589.
2018-08-08 11:01:27 +02:00
Christoph Strobl
ff6f5d9ef3 DATAMONGO-2027 - Polishing.
Remove duplicate tests and fix assertions on existing ones. Move tests over to AssertJ and fix the output database not being applied correctly.

Original Pull Request: #588
2018-08-07 13:00:58 +02:00
Mark Paluch
d4f351a37c DATAMONGO-2027 - Consider MapReduce output type.
We now consider the output type (collection output) when rendering the MapReduce command. Previously, all output was returned inline without storing the results in the configured collection.

Original Pull Request: #588
2018-08-07 13:00:30 +02:00
Mark Paluch
67281916c2 DATAMONGO-2006 - Updated changelog. 2018-07-27 11:45:21 +02:00
Mark Paluch
f8b2781ec8 DATAMONGO-2007 - Updated changelog. 2018-07-26 16:23:57 +02:00
Mark Paluch
58116dfd63 DATAMONGO-1982 - After release cleanups. 2018-07-26 12:32:27 +02:00
Mark Paluch
5f2c411501 DATAMONGO-1982 - Prepare next development iteration. 2018-07-26 12:32:24 +02:00
Mark Paluch
ac84c7bf57 DATAMONGO-1982 - Release version 2.1 RC1 (Lovelace). 2018-07-26 12:06:34 +02:00
Mark Paluch
db2c05e8fc DATAMONGO-1982 - Prepare 2.1 RC1 (Lovelace). 2018-07-26 12:04:30 +02:00
Mark Paluch
5a735138fc DATAMONGO-1982 - Updated changelog. 2018-07-26 12:04:19 +02:00
Mark Paluch
7f28aaf60d DATAMONGO-2029 - Encode collections of UUID and byte array query method arguments to their binary form.
We now convert collections that only contain UUID or byte array items to a BSON list that contains the encoded form of these items. Previously, we only converted single UUID values and byte arrays into $binary, so lists rendered to e.g. $uuid, which does not work for queries.

Encoding is now encapsulated in strategy objects that implement the encoding only for their type. This allows breaking up the conditional flow and improves the organization of responsibilities.
2018-07-25 15:05:51 +02:00
Mark Paluch
8b8eb3cfe5 DATAMONGO-2030 - Reinstantiate existsBy queries for reactive repositories.
We now support existsBy queries for reactive repositories to align with blocking repository support. ExistsBy support got lost during merging and is now back in place.

Extract boolean flag counting into BooleanUtil.
2018-07-23 16:28:51 +02:00
Mark Paluch
7f9352f9b8 DATAMONGO-2028 - Polishing.
Remove trailing whitespaces.
2018-07-17 10:03:00 +02:00
Mark Paluch
9d1471bb28 DATAMONGO-2028 - Reinstantiate Map-like document conversion on save.
We now convert values of Map-like documents (Document, DBObject, Map) before writing them to MongoDB. Conversion got lost as a result of a refactoring and missing tests.
2018-07-17 10:03:00 +02:00
Christoph Strobl
088928c64a DATAMONGO-2011 - Relax type check when mapping collections.
Original pull request: #587.
2018-07-13 12:42:08 +02:00
Oliver Gierke
648bfdfc67 DATAMONGO-2026 - Polishing.
Slightly polished the initial identifier population and lookup in MappingMongoConverter.

Original pull request: #586.
2018-07-13 12:28:17 +02:00
Christoph Strobl
390b00d5fe DATAMONGO-2026 - Fix id property resolution for immutable objects.
We now make sure id properties used as persistence constructor arguments are no longer set via the property accessor, but during object instantiation. Prior to this change, this caused an UnsupportedOperationException.

Original pull request: #586.
2018-07-13 12:28:15 +02:00
Oliver Gierke
98433250c8 DATAMONGO-1992 - Introduced MappingMongoEvent.mapSource(…).
We're currently using application events to allow users to pre-process both entities and documents persisted via our …Template classes. That approach actually exposes a conceptual mismatch: events should be immutable, but they hardly can be if event listeners try to modify the entity instance or even exchange it (in case the entity itself is immutable).

We now introduce an intermediate, package-protected MappingMongoEvent.mapSource(…) that allows exchanging the source of the event. This is now used by the refined auditing infrastructure, which returns the manipulated entity as it supports immutable ones as well. This will be removed as soon as we've come up with an alternative callback API that doesn't suffer from these conceptual mismatches (currently scheduled for release train Moore).
2018-07-12 10:57:13 +02:00
Oliver Gierke
323b0a8479 DATAMONGO-1992 - Extract common entity operations API from (Reactive)MongoTemplate.
Introduced EntityOperations and MappedDocument to allow to share common operations from MongoTemplate and ReactiveMongoTemplate.
2018-07-12 10:57:13 +02:00
Mark Paluch
d1b1dfbae9 DATAMONGO-1992 - Disable test for storing Optional properties.
Reading java.util.Optional is now no longer possible as we do not allow modifying immutable (final) fields.
2018-07-12 10:57:13 +02:00
Mark Paluch
d1dea13c32 DATAMONGO-1992 - Add mutation support for immutable objects through MongoTemplate and SimpleMongoRepository.
Persisting methods of MongoTemplate and SimpleMongoRepository now return potentially new object instances of immutable objects. New instances are created using wither methods/Kotlin copy(…) methods if an immutable object requires association with an Id or the version number needs to be incremented.
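A sketch of an immutable entity using a wither method (hypothetical example):

public class Person {

  final @Id String id;
  final String name;

  Person(String id, String name) {
    this.id = id;
    this.name = name;
  }

  Person withId(String id) { // used to associate a generated id
    return new Person(id, this.name);
  }
}

Person saved = mongoOperations.save(new Person(null, "Luke")); // returns a new instance carrying the generated id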
2018-07-12 10:57:13 +02:00
Mark Paluch
ba2ab183ed DATAMONGO-1992 - Add mutation support for immutable objects through ReactiveMongoTemplate.
Persisting methods of ReactiveMongoTemplate now return potentially new object instances of immutable objects. New instances are created using wither methods/Kotlin copy(…) methods if an immutable object requires association with an Id or the version number needs to be incremented.
2018-07-12 10:57:13 +02:00
Mark Paluch
1eab66aff4 DATAMONGO-1992 - Add mutation support for immutable types.
We now return new instances that are potentially created by wither/Kotlin copy(…) methods when reading immutable properties.
2018-07-12 10:57:13 +02:00
Mark Paluch
8cc4ef3c3f DATAMONGO-1992 - Adapt existing tests to immutable object.
Turn immutable id properties to a mutable one to adapt with removed mutation support for final fields.
2018-07-12 10:57:13 +02:00
Christoph Strobl
1e49c95e41 DATAMONGO-1848 - Polishing.
Prefix types with Querydsl and update visibility to allow construction of custom queries using SpringDataMongodbQuery. Reintroduce generics for JoinBuilder usage, fix warnings and nullability issues. Also add BsonValue types to simple types and use native BsonRegularExpression for regex conversion.

Add tests for "in" on dbref, exception translation, any embedded, join and lifecycle events related to DATAMONGO-362, DATAMONGO-595, DATAMONGO-700, DATAMONGO-1434, DATAMONGO-1810 and DATAMONGO-2010.

Original Pull Request: #579
2018-07-11 13:13:05 +02:00
Mark Paluch
7d06f2b040 DATAMONGO-1848 - Use imported Querydsl support for Document API MongoDB.
Original Pull Request: #579
2018-07-11 13:12:06 +02:00
Mark Paluch
b7755e71f6 DATAMONGO-1848 - Import Document-based Querydsl support.
Original Pull Request: #579
2018-07-11 13:10:51 +02:00
Mark Paluch
0ec82e1f2e DATAMONGO-2021 - Polishing.
Adapt getResources(…) to use the file id and no longer the file name when opening a download stream. Add author tag. Add test to verify content retrieval by identity.

Original pull request: #581.
2018-07-06 13:12:46 +02:00
Niklas Helge Hanft
e18f506edd DATAMONGO-2021 - Use getObjectId() instead of getFilename() for opening the GridFS download stream.
Using the file name leads to duplicate resource streams as file names are not unique; therefore we're using the file's ObjectId to look up the file content.

Original pull request: #581.
2018-07-06 13:12:46 +02:00
Mark Paluch
46ed58b465 DATAMONGO-2016 - Polishing.
Fail gracefully if query string parameter has no value. Reformat test. Convert assertions to AssertJ.

Original pull request: #578.
2018-07-04 11:25:53 +02:00
Stephen Tyler Conrad
fb8084c9f7 DATAMONGO-2016 - Fix username/password extraction in MongoCredentialPropertyEditor.
MongoCredentialPropertyEditor now inspects the connection URI for the appropriate delimiter tokens. Previously, inspection used the question mark character as the username/password delimiter.

Original pull request: #578.
2018-07-04 11:25:50 +02:00
Mark Paluch
5a0171203d DATAMONGO-2005 - Polishing.
Reformat code.

Original pull request: #574.
2018-07-04 09:28:46 +02:00
Christoph Strobl
c1d840d87d DATAMONGO-2005 - Use Flux.usingWhen for resource management in reactive transactions.
Original pull request: #574.
2018-07-04 09:28:15 +02:00
Christoph Strobl
ed1f2c7833 DATAMONGO-2004 - Polishing.
Make sure to place the LazyLoadingProxy early in the mapping process to avoid eager fetching of documents that might then get replaced by the LazyLoadingProxy.

Original Pull Request: #571
2018-07-03 14:21:23 +02:00
Mark Paluch
c545c855b9 DATAMONGO-2004 - Support lazy DBRef resolution through constructor creation of the enclosing entity.
We now respect eager/lazy loading preferences of the annotated association property when the enclosing entity is created through its constructor and the reference is passed as constructor argument.

Previously, we eagerly resolved DBRefs and passed the resolved value to the constructor.
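A rough shape of what is now supported (hypothetical Book/Author types for illustration):

class Book {

	@DBRef(lazy = true)
	private final Author author;

	Book(Author author) { // receives the LazyLoadingProxy instead of the eagerly resolved document
		this.author = author;
	}
}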

Original Pull Request: #571
2018-07-03 14:09:38 +02:00
Christoph Strobl
1b7678a6af DATAMONGO-1919 - Polishing.
Remove unused imports and deprecated MongoClientVersion methods for drivers that are no longer supported, remove their usage throughout the codebase, and partially revert changes in MongoSimpleTypes because all org.bson types have been included in the 3.8 release of org.mongodb.bson.

Original Pull Request: #572
2018-07-02 14:23:29 +02:00
Mark Paluch
0d06e141a3 DATAMONGO-1919 - Polishing.
Fix typo, use diamond syntax where possible.

Original Pull Request: #572
2018-07-02 14:17:54 +02:00
Mark Paluch
2d36fc3050 DATAMONGO-1919 - Allow reactive-only usage of ReactiveMongoTemplate.
We now support reactive-only usage of Spring Data MongoDB without the need to use the synchronous driver or even having it on the class path. We conditionally register CodeWScope/CodeWithScope types that are bundled with the particular driver distributions.

NoOpDbRefResolver is now a top-level type to be used with a reactive-only MongoMappingContext.
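A rough configuration sketch for such a reactive-only setup (assumes NoOpDbRefResolver exposes an INSTANCE constant and uses the reactive streams driver; adjust to the actual API):

MongoMappingContext mappingContext = new MongoMappingContext();
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
ReactiveMongoTemplate template = new ReactiveMongoTemplate(
		new SimpleReactiveMongoDatabaseFactory(MongoClients.create(), "database"), converter);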

Original Pull Request: #572
2018-07-02 14:17:28 +02:00
Mark Paluch
78c2ab290d DATAMONGO-2012 - Polishing.
Simplify conditional flow. Replace AtomicReference construction in ChangeStreamEvent with AtomicReferenceFieldUpdater usage to reduce object allocations and streamline lazy body conversion. Tweak Javadoc and reference docs.

Original pull request: #576.
2018-07-02 09:59:11 +02:00
Christoph Strobl
88150eca54 DATAMONGO-2012 - Upgrade drivers to 3.8 (sync) and 1.9 (reactive).
We still stick to count for non-session operations as countDocuments does not allow geo operators like $near in the filter query. For now we will wait and see whether this gets resolved within the driver.

Added options to watch an entire database and to resume the change stream from a given point in time (UTC).

Original pull request: #576.
2018-07-02 09:58:47 +02:00
Christoph Strobl
30b86e7612 DATAMONGO-1311 - Polishing.
Update Javadoc and add reference documentation.
Alter the @Meta batchSize default to zero, as negative values bear a special meaning.
Along the way, remove deprecated driver method usage and add deprecations for options about to be removed in subsequent MongoDB server releases.

Original Pull Request: #575
2018-06-29 10:28:33 +02:00
Mark Paluch
d3976f5199 DATAMONGO-1311 - Add configuration options for query batch size.
We now allow configuration of the find cursor/find publisher batch size using Query.cursorBatchSize(…); a short usage sketch follows the repository example below.
Configuring the batch size gives users more fine-grained control over the fetch behavior, especially in reactive usage scenarios where the batch size in FindPublisher defaults to the remaining demand. This can cause several roundtrips in cases where the remaining demand is small and the emitted elements are dropped rapidly (e.g. when using filter(…)).

On the repository level, @Meta now allows configuration of the cursor batch size for derived finder methods.

interface PersonRepository extends Repository<Person, Long> {

	@Meta(cursorBatchSize = 100)
	Stream<Person> findAllByLastname(String lastname);
}
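A short Template API sketch for the Query.cursorBatchSize(…) hint mentioned above (Person is a placeholder type):

Query query = query(where("lastname").is("Skywalker")).cursorBatchSize(100);
List<Person> result = template.find(query, Person.class);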

Original Pull Request: #575
2018-06-29 10:25:06 +02:00
Christoph Strobl
f587e1f42a DATAMONGO-1827 - Add guard to tests for pre MongoDB 3.6.
We need a guard for pre-MongoDB 3.6 versions that use a different error label.

Original Pull Request: #569
2018-06-27 15:54:48 +02:00
Christoph Strobl
bd5815dbcb DATAMONGO-1827 - Polishing.
Allow open/close projection on return type for findAndReplace.
Use default methods for delegation and remove collation from FindAndRemoveOption in favor of the collation set on the query itself.
Update Javadoc and reference documentation.

Original Pull Request: #569
2018-06-27 14:33:19 +02:00
Mark Paluch
ac89ce1b2c DATAMONGO-1827 - Polishing.
Use diamond syntax in imperative and reactive Template API implementations. Rename ReactiveMongoTemplate.toDbObject to toDocument. Move Terminating interfaces in ExecutableUpdateOperation and ExecutableRemoveOperation to the top-most position to align with other fluent interface declarations and to improve discoverability of terminating operations.

Convert ReactiveMongoTemplateTests assertions to AssertJ.

Original Pull Request: #569
2018-06-27 14:32:09 +02:00
Mark Paluch
fa880f1c5c DATAMONGO-1827 - Add support for findAndReplace.
We now support findAndReplace operations through the imperative and reactive Template API to find an object by a query and entirely replace it (except for the _id).

template.findAndReplace(query(where("name").is("Han")), new Person("Luke"))

template.update(Person.class).inCollection(STAR_WARS).matching(query(where("name").is("Han"))).replaceWith(luke).findAndReplace()

Original Pull Request: #569
2018-06-27 14:31:47 +02:00
Mark Paluch
56e61a2965 DATAMONGO-1969 - Updated changelog. 2018-06-13 21:39:49 +02:00
Mark Paluch
fcb8647c59 DATAMONGO-1967 - Updated changelog. 2018-06-13 15:01:56 +02:00
Mark Paluch
07e0e78aec DATAMONGO-2003 - Polishing.
Add nullability annotation to MongoParameterAccessor.getPoint(). Remove superfluous casts.

Convert MongoQueryCreatorUnitTests to use AssertJ assertions.

Original pull request: #570.
2018-06-11 14:18:07 +02:00
Christoph Strobl
0c0f47f76f DATAMONGO-2003 - Fix derived query using regex pattern with options.
We now consider regex pattern options when using the pattern as a derived finder argument.
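For illustration (hypothetical repository method), the flags of the given Pattern now end up in the $regex options:

interface PersonRepository extends Repository<Person, String> {

	List<Person> findByFirstname(Pattern firstname);
}

repository.findByFirstname(Pattern.compile("luke", Pattern.CASE_INSENSITIVE)); // adds the "i" option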

Original pull request: #570.
2018-06-11 14:17:45 +02:00
Mark Paluch
18313db8fb DATAMONGO-2001 - Polishing.
Extract count aggregation pipeline setup to AggregationUtil. Fix count extraction if aggregation returns no results. Fix nullability of Query argument in ReactiveMongoTemplate.count(…). Improve synchronization of multi-threaded aggregation count test to prevent commit before all threads have issued a count query and to await thread completion.

Upgrade to MongoDB 4.0.0-rc4.

Original pull request: #568.
2018-06-11 11:06:23 +02:00
Christoph Strobl
05f325687c DATAMONGO-2001 - Count within transaction should return only the total count of documents visible to the specific session.
We now delegate count operations within an active transaction to an aggregation.

Once `MongoTemplate` detects an active transaction, all exposed `count()` methods are converted and delegated to the
aggregation framework using `$match` and `$count` operators, preserving `Query` settings, such as `collation`.

The following snippet of `count` inside the session bound closure

session.startTransaction();
template.withSession(session)
    .execute(action -> {
        action.count(query(where("state").is("active")), Step.class)
        ...

runs:

db.collection.aggregate(
   [
      { $match: { state: "active" } },
      { $count: "totalEntityCount" }
   ]
)

instead of:

db.collection.find( { state: "active" } ).count()

Original pull request: #568.
2018-06-11 10:52:59 +02:00
Oliver Gierke
8145b84dbe DATAMONGO-2002 - Fixed Criteria.equals(…) for usage with Pattern instances.
For Criteria instances that use regular expressions we now properly compare the two Pattern instances produced by also including the pattern flags in the comparison.
2018-06-07 19:16:27 +02:00
Mark Paluch
794b026f73 DATAMONGO-1979 - Polishing.
Rename QueryUtils method to decorateSort(…) to reflect the nature of the method. Add missing generics. Convert ReactiveMongoRepositoryTests to AssertJ. Add missing verifyComplete() steps to StepVerifier. Slight tweaks to Javadoc and reference docs.

Original pull request: #566.
2018-06-07 10:07:00 +02:00
Christoph Strobl
8f11916014 DATAMONGO-1979 - Use annotation cache throughout MongoQueryMethod.
Original pull request: #566.
2018-06-07 09:59:23 +02:00
Christoph Strobl
c5129aca45 DATAMONGO-1979 - Add default sorting for repository query methods using @Query(sort = "…").
We now allow setting a default sort for repository query methods via the @Query annotation.

	@Query(sort = "{ age : -1 }")
	List<Person> findByFirstname(String firstname);

Using an explicit Sort parameter along with the annotated one allows altering the defaults set via the annotation. Method argument sort parameters add to/override the annotated defaults.

	@Query(sort = "{ age : -1 }")
	List<Person> findByFirstname(String firstname, Sort sort);

Original pull request: #566.
2018-06-07 09:58:40 +02:00
Mark Paluch
dfede781fb DATAMONGO-1998 - Polishing.
Switch the id field name check to an exact match or to matching the last property path segment.

Original pull request: #567.
2018-06-06 11:35:17 +02:00
Christoph Strobl
fe43ba470b DATAMONGO-1998 - Fix Querydsl id handling for nested property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when parsing Querydsl queries.

Original pull request: #567.
2018-06-06 11:35:17 +02:00
Mark Paluch
06622bed35 DATAMONGO-1986 - Polishing.
Refactor duplicated code into AggregationUtil.

Original pull request: #564.
2018-06-06 10:37:29 +02:00
Christoph Strobl
2bac54c70f DATAMONGO-1986 - Always provide a typed AggregationOperationContext for TypedAggregation.
We now initialize a TypeBasedAggregationOperationContext for TypedAggregations if no context is provided. This makes sure that potential Criteria objects are run through the QueryMapper.
In case the default context is used, we now also make sure to at least run the aggregation pipeline through the QueryMapper to avoid passing non-MongoDB simple types on to the driver.

Original pull request: #564.
2018-06-06 10:37:29 +02:00
Mark Paluch
daae696c78 DATAMONGO-1988 - Polishing.
Match exactly on either top-level properties or leaf properties instead of accepting the property/field name suffix.

Original pull request: #565.
2018-06-05 11:15:10 +02:00
Christoph Strobl
9f77aba8bb DATAMONGO-1988 - Fix query creation for id property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when creating queries. Prior to this change, String values would e.g. have been turned into ObjectIds when saving a document, but not when querying for it.
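For example (sketch), both saving and querying via the String representation now convert the value to an ObjectId:

Person saved = template.save(new Person());                                            // id stored as ObjectId
Person loaded = template.findOne(query(where("id").is(saved.getId())), Person.class);  // hex String converted for the query as well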

Original pull request: #565.
2018-06-05 11:12:28 +02:00
Oliver Gierke
2d9232ca04 DATAMONGO-1990 - Adapt build to changes in is-new-detection.
Removed a couple of Mockito stubbings that became unnecessary after the changes for DATACMNS-1333. Removed PersistableMongoEntityInformation as handling Persistable is now transparently taken care of by PersistentEntityInformation which MappingMongoEntityInformation extends.

Related tickets: DATACMNS-1333.
2018-06-01 13:29:27 +02:00
Christoph Strobl
e90c5bad2c DATAMONGO-1987 - Upgrade test infrastructure to MongoDB 4.0-rc0
Also make sure to use WriteConcern.MAJORITY when (re)creating collections in test setup.
2018-05-25 08:58:55 +02:00
Mark Paluch
bec79e6d7b DATAMONGO-1983 - Include transactions reference documentation page.
We now render the transaction support within our reference docs.
2018-05-17 11:15:29 +02:00
Christoph Strobl
619d344f57 DATAMONGO-1927 - After release cleanups. 2018-05-17 10:09:35 +02:00
Christoph Strobl
6f1b9d3fa4 DATAMONGO-1927 - Prepare next development iteration. 2018-05-17 10:09:34 +02:00
Christoph Strobl
24417c4c99 DATAMONGO-1927 - Release version 2.1 M3 (Lovelace). 2018-05-17 09:51:42 +02:00
Christoph Strobl
eec36b791a DATAMONGO-1927 - Prepare 2.1 M3 (Lovelace). 2018-05-17 09:50:48 +02:00
Christoph Strobl
512fa036bb DATAMONGO-1927 - Updated changelog. 2018-05-17 09:50:42 +02:00
Sébastien Deleuze
ea8df26eee DATAMONGO-1980 - Fix a typo in CriteriaExtensions.kt.
Original pull request: #563.
2018-05-16 09:43:53 +02:00
Oliver Gierke
43d821aab0 DATAMONGO-1874 - Polishing.
Cleanups in test case. Moved assertions to AssertJ.
2018-05-15 14:39:35 +02:00
Oliver Gierke
9bb8211ed7 DATAMONGO-1874 - @Document's collection attribute now supports EvaluationContextExtensions.
We now use the API newly introduced with DATACMNS-1260 to expose EvaluationContextExtensions to the SpEL evaluation in case the collection attribute of @Document uses SpEL.
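A hedged sketch (the tenantId property is made up for illustration and would be contributed by an EvaluationContextExtension):

@Document(collection = "#{tenantId}_orders")
class Order {

	@Id String id;
}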

Related tickets: DATACMNS-1260.
2018-05-15 14:39:35 +02:00
Victor
5a24e04226 DATAMONGO-1978 - Fix minor typo in Field.positionKey field name.
Original pull request: #558.
2018-05-15 12:30:22 +02:00
Mark Paluch
521d28ff3f DATAMONGO-1466 - Polishing.
Switch conditionals to Map-based Function registry to pick the appropriate converter. Fix typos in method names.

Original pull request: #561.
2018-05-15 11:27:04 +02:00
Christoph Strobl
e38e7d89f4 DATAMONGO-1466 - Polishing.
Just some minor code style improvements.

Original pull request: #561.
2018-05-15 11:27:04 +02:00
Christoph Strobl
fff69b9ed7 DATAMONGO-1466 - Add embedded typeinformation-based reading GeoJSON converter.
Original pull request: #561.
2018-05-15 11:27:04 +02:00
Oliver Gierke
e2e9e92563 DATAMONGO-1880 - Improve test execution in SessionBoundMongoTemplateTests.
We now also require a replica set for the test execution.
2018-05-15 10:39:36 +02:00
Oliver Gierke
3e040d283b DATAMONGO-1976 - Adapt to SpEL extension API changes in Spring Data Commons.
Related tickets: DATACMNS-1260.
2018-05-15 10:36:18 +02:00
Mark Paluch
a66b87118e DATAMONGO-1970 - Polishing.
ReactiveMongoOperations.withSession(…) no longer commits transactions if a transaction is active. ReactiveSessionScoped obtained through inTransaction() solely manages transactions and participates in ongoing transactions if a given ClientSession already has an active transaction. Remove ReactiveSessionScoped.executeSingle methods to align with ReactiveMongoOperations.

Add tests. Switch reactive tests to the .as(StepVerifier::create) form. Extend documentation.

Original pull request: #560.
2018-05-14 13:18:15 +02:00
Christoph Strobl
f296a499e5 DATAMONGO-1970 - Add support for MongoDB 4.0 transactions (reactive).
We now support Mongo Transactions through the reactive Template API. However, there's no reactive repository transaction support yet.

Mono<DeleteResult> result = template.inTransaction()
                              .execute(action -> action.remove(query(where("id").is("step-1")), Step.class));

Original pull request: #560.
2018-05-14 13:16:22 +02:00
Mark Paluch
1acf00b039 DATAMONGO-1974 - Polishing.
Fix typos, links, and code fences.

Original pull request: #559.
2018-05-11 15:30:05 +02:00
Jay Bryant
e23c861a39 DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.
Full editing pass of the Spring Data MongoDB reference guide. I also adjusted index.adoc to work with the changes I made to the build project, so that we get Epub and PDF as well as HTML.

Original pull request: #559.
2018-05-11 15:30:05 +02:00
Mark Paluch
85aef4836d DATAMONGO-1968 - Polishing.
Rename MongoDbFactoryBase to MongoDbFactorySupport. Add constructor to MongoTemplate accepting the new MongoClient type. Extend Javadoc. Switch tests to use the new MongoTemplate constructor.

Original pull request: #557.
2018-05-11 10:29:58 +02:00
Christoph Strobl
5470486d8d DATAMONGO-1968 - Add configuration support for com.mongodb.client.MongoClient.
We now accept MongoDB's new com.mongodb.client.MongoClient object to set up Spring Data MongoDB infrastructure through AbstractMongoClientConfiguration. The new MongoClient does not support DBObject anymore, hence it cannot be used with Querydsl.

@Configuration
public class MongoClientConfiguration extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "database";
	}

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create("mongodb://localhost:27017/?replicaSet=rs0&w=majority");
	}
}

Original pull request: #557.
2018-05-11 10:27:30 +02:00
Mark Paluch
42c02c9b70 DATAMONGO-1971 - Polishing.
Remove outdated profiles.

Original pull request: #554.
2018-05-09 16:35:21 +02:00
Mark Paluch
ee9bca4856 DATAMONGO-1971 - Install MongoDB 3.7.9 on TravisCI.
We now download and unpack MongoDB directly instead of using TravisCI's outdated MongoDB version.

Original pull request: #554.
2018-05-09 16:35:18 +02:00
Mark Paluch
0d823df7f3 DATAMONGO-1920 - Polishing.
Slightly tweak method names. Document MongoDatabaseUtils usage in the context of MongoTransactionManager. Rename SessionSynchronization constants to align with AbstractPlatformTransactionManager. Slightly tweak Javadoc and reference docs for typos.

Original pull request: #554.
2018-05-09 16:31:58 +02:00
Christoph Strobl
4cd2935087 DATAMONGO-1920 - Add support for MongoDB 4.0 transactions (synchronous driver).
MongoTransactionManager is the gateway to the well known Spring transaction support. It allows applications to use managed transaction features of Spring.
The MongoTransactionManager binds a ClientSession to the thread. MongoTemplate automatically detects those and operates on them accordingly.

static class Config extends AbstractMongoConfiguration {

	// ...

	@Bean
	MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
		return new MongoTransactionManager(dbFactory);
	}
}

@Component
public class StateService {

	@Transactional
	void someBusinessFunction(Step step) {

		template.insert(step);

		process(step);

		template.update(Step.class).apply(update.set("state", // ...
	}
}

Original pull request: #554.
2018-05-09 16:31:06 +02:00
Mark Paluch
6fbb7cec22 DATAMONGO-1918 - Updated changelog. 2018-05-08 15:27:17 +02:00
Mark Paluch
44ea579b69 DATAMONGO-1917 - Updated changelog. 2018-05-08 12:22:51 +02:00
Mark Paluch
30af34f80a DATAMONGO-1943 - Polishing.
Reduce visibility. Use List interface instead of concrete type.

Original pull request: #556.
2018-05-07 16:20:41 +02:00
Christoph Strobl
247f30143b DATAMONGO-1943 - Fix ClassCastException caused by SpringDataMongodbSerializer.
We now convert List-typed predicates to BasicDBList to meet MongodbSerializer's expectations for top-level lists used for the $and operator.

Original pull request: #556.
2018-05-07 16:18:52 +02:00
Mark Paluch
364f266a3a DATAMONGO-1914 - Polishing.
Throw FileNotFoundException from inherited methods declaring IOException if the resource is absent. Retain the filename for absent resources to provide context through GridFsResource.getFilename(). Switch exists() to determine presence/absence based on GridFSFile presence. Extend tests.

Original pull request: #555.
2018-05-07 14:53:27 +02:00
Christoph Strobl
f92bd20384 DATAMONGO-1914 - Return an empty GridFsResource instead of null when resource does not exist.
Original pull request: #555.
2018-05-07 14:53:21 +02:00
Mark Paluch
304e1c607f DATAMONGO-1929 - Polishing.
Add missing Nullable annotations and missing diamond operators. Slightly reorder null guards.

Use Lombok to generate required constructors.

Original pull request: #551.
2018-04-23 10:34:37 +02:00
Christoph Strobl
449780573e DATAMONGO-1929 - Add Kotlin extensions for Executable and ReactiveMapReduceOperation.
Original pull request: #551.
2018-04-23 10:34:31 +02:00
Christoph Strobl
e424573f0d DATAMONGO-1929 - Add fluent mapReduce template API.
Original pull request: #551.
2018-04-23 10:34:10 +02:00
Christoph Strobl
31630c0dcc DATAMONGO-1928 - Polishing.
Use native driver operations to avoid potential unwanted template index interaction.

Original Pull Request: #550
2018-04-20 13:17:38 +02:00
Mark Paluch
7cdc3d00c1 DATAMONGO-1928 - Polishing.
Migrate test to AssertJ and as-style for StepVerifier.

Original Pull Request: #550
2018-04-20 13:02:00 +02:00
Mark Paluch
a9f5d7bd3d DATAMONGO-1928 - Use non-blocking index creation through ReactiveMongoTemplate.
We now use ReactiveIndexOperationsProvider to inspect and create indexes for MongoDB collections without using blocking methods. Indexes are created for initial entities and whenever a MongoPersistentEntity is registered in MongoMappingContext.

Index creation is now decoupled from the actual ReactiveMongoTemplate call, causing indexes to be created asynchronously. Mongo commands no longer depend on the completion of index creation commands. Decoupling also means that ReactiveMongoTemplate creation/command invocation no longer fails if the actual index creation fails; previously, the blocking index creation caused the actual ReactiveMongoTemplate call to fail.

ReactiveMongoTemplate objects can be created with a Consumer<Throwable> callback that is notified if an index creation fails.

Original Pull Request: #550
2018-04-20 13:00:54 +02:00
Christoph Strobl
b5f18468db DATAMONGO-1808 - Polishing.
Move individual methods under BitwiseCriteriaOperators, update Javadoc and split tests.

Original Pull Request: #507
2018-04-18 13:15:05 +02:00
Andreas Zink
ef872d2527 DATAMONGO-1808 - Add support for bitwise query operators.
Add support for $bitsAllClear, $bitsAllSet, $bitsAnyClear and $bitsAnySet.
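A rough usage sketch (assuming the fluent bits() entry point on Criteria added here; the exact overloads may differ):

Query query = query(where("flags").bits().allSet(0b101)); // -> { "flags" : { "$bitsAllSet" : 5 } }
List<Feature> result = template.find(query, Feature.class);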

Original Pull Request: #507
2018-04-18 13:14:45 +02:00
Mark Paluch
c2516946e9 DATAMONGO-1890 - Polishing.
Remove mapReduce default methods in favor of adding variants through a fluent API at a later stage. Assert mapReduce arguments and remove subsequent null guards. Adapt tests.

Original pull request: #548.
2018-04-17 10:12:46 +02:00
Christoph Strobl
857add7349 DATAMONGO-1890 - Add support for mapReduce to ReactiveMongoOperations.
We now support mapReduce via ReactiveMongoTemplate returning a Flux<T> as the operations result.

Original pull request: #548.
2018-04-17 10:11:29 +02:00
Mark Paluch
2ec3f219c8 DATAMONGO-1869 - After release cleanups. 2018-04-13 15:08:33 +02:00
Mark Paluch
5c8701f79c DATAMONGO-1869 - Prepare next development iteration. 2018-04-13 15:08:32 +02:00
1007 changed files with 57169 additions and 15637 deletions

BIN
.mvn/wrapper/maven-wrapper.jar vendored Executable file

Binary file not shown.

1
.mvn/wrapper/maven-wrapper.properties vendored Executable file
View File

@@ -0,0 +1 @@
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.5.4/apache-maven-3.5.4-bin.zip

View File

@@ -3,44 +3,36 @@ language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
before_install:
- mkdir -p downloads
- mkdir -p var/db var/log
- if [[ ! -d downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION} ]] ; then cd downloads && wget https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && tar xzf mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && cd ..; fi
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --version
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --dbpath var/db --replSet rs0 --fork --logpath var/log/mongod.log
- sleep 10
- |-
echo "replication:
replSetName: rs0" | sudo tee -a /etc/mongod.conf
- sudo service mongod restart
- sleep 20
- |-
mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
- sleep 15
services:
- mongodb
downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
sleep 15
env:
matrix:
- MONGO_VERSION=4.1.10
- MONGO_VERSION=4.0.4
- MONGO_VERSION=3.6.12
- MONGO_VERSION=3.4.20
global:
- PROFILE=ci
- PROFILE=mongo36-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
- oracle-java8-installer
sudo: false
cache:
directories:
- $HOME/.m2
install:
- |-
mongo admin --eval "db.adminCommand({setFeatureCompatibilityVersion: '3.4'});"
- downloads
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

43
CI.adoc Normal file
View File

@@ -0,0 +1,43 @@
= Continuous Integration
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
== Running CI tasks locally
Since this pipeline is purely Docker-based, it's easy to:
* Debug what went wrong on your local machine.
* Test out a tweak to your test routine before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what the CI server does on your local machine.
IMPORTANT: To do this you must have Docker installed on your machine.
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, run the tests from inside the container:
+
3. `./mvnw clean dependency:list test -Dsort -Dbundlor.enabled=false -B` (or with whatever profile you need to test out)
Since the container is binding to your source, you can make edits from your IDE and continue to run build jobs.
If you need to package things up, do this:
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, package things from inside the container doing this:
+
3. `./mvnw clean dependency:list package -Dsort -Dbundlor.enabled=false -B`
NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

View File

@@ -24,4 +24,4 @@ Instances of abusive, harassing, or otherwise unacceptable behavior may be repor
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].
This Code of Conduct is adapted from the https://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at https://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

219
Jenkinsfile vendored Normal file
View File

@@ -0,0 +1,219 @@
pipeline {
agent none
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/master", threshold: hudson.model.Result.SUCCESS)
}
options {
disableConcurrentBuilds()
buildDiscarder(logRotator(numToKeepStr: '14'))
}
stages {
stage("Docker images") {
parallel {
stage('Publish JDK 8 + MongoDB 4.0') {
when {
changeset "ci/openjdk8-mongodb-4.0/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-openjdk8-with-mongodb-4.0", "ci/openjdk8-mongodb-4.0/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
}
}
}
stage('Publish JDK 8 + MongoDB 4.1') {
when {
changeset "ci/openjdk8-mongodb-4.1/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-openjdk8-with-mongodb-4.1", "ci/openjdk8-mongodb-4.1/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
}
}
}
stage('Publish JDK 8 + MongoDB 4.2') {
when {
changeset "ci/openjdk8-mongodb-4.2/**"
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-openjdk8-with-mongodb-4.2", "ci/openjdk8-mongodb-4.2/")
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
image.push()
}
}
}
}
}
}
stage("test: baseline") {
when {
anyOf {
branch 'master'
not { triggeredBy 'UpstreamCause' }
}
}
agent {
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.2:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
steps {
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -U -B'
}
}
stage("Test other configurations") {
when {
anyOf {
branch 'master'
not { triggeredBy 'UpstreamCause' }
}
}
parallel {
stage("test: mongodb 4.0") {
agent {
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.0:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
steps {
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -U -B'
}
}
stage("test: mongodb 4.1") {
agent {
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.1:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
steps {
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -U -B'
}
}
}
}
stage('Release to artifactory') {
when {
anyOf {
branch 'master'
not { triggeredBy 'UpstreamCause' }
}
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
stage('Publish documentation') {
when {
branch 'master'
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.distribution-repository=temp-private-local " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
}
post {
changed {
script {
slackSend(
color: (currentBuild.currentResult == 'SUCCESS') ? 'good' : 'danger',
channel: '#spring-data-dev',
message: "${currentBuild.fullDisplayName} - `${currentBuild.currentResult}`\n${env.BUILD_URL}")
emailext(
subject: "[${currentBuild.fullDisplayName}] ${currentBuild.currentResult}",
mimeType: 'text/html',
recipientProviders: [[$class: 'CulpritsRecipientProvider'], [$class: 'RequesterRecipientProvider']],
body: "<a href=\"${env.BUILD_URL}\">${currentBuild.fullDisplayName} is reported as ${currentBuild.currentResult}</a>")
}
}
}
}

159
README.adoc Normal file
View File

@@ -0,0 +1,159 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
== Code of Conduct
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
== Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java:
[source,java]
----
public interface PersonRepository extends CrudRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
@Service
public class MyService {
private final PersonRepository repository;
public MyService(PersonRepository repository) {
this.repository = repository;
}
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
return "springdata";
}
}
----
=== Maven configuration
Add the Maven dependency:
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
</dependency>
----
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
----
== Getting Help
Having trouble with Spring Data? We'd love to help!
* Check the
https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/[reference documentation], and https://docs.spring.io/spring-data/mongodb/docs/current/api/[Javadocs].
* Learn the Spring basics: Spring Data builds on Spring Framework; check the https://spring.io[spring.io] web-site for a wealth of reference documentation.
If you are just starting out with Spring, try one of the https://spring.io/guides[guides].
* If you are upgrading, check out the https://docs.spring.io/spring-data/mongodb/docs/current/changelog.txt[changelog] for "`new and noteworthy`" features.
* Ask a question - we monitor https://stackoverflow.com[stackoverflow.com] for questions tagged with https://stackoverflow.com/tags/spring-data[`spring-data-mongodb`].
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://jira.spring.io/browse/DATAMONGO[jira.spring.io/browse/DATAMONGO].
== Reporting Issues
Spring Data uses JIRA as issue tracking system to record bugs and feature requests. If you want to raise an issue, please follow the recommendations below:
* Before you log a bug, please search the
https://jira.spring.io/browse/DATAMONGO[issue tracker] to see if someone has already reported the problem.
* If the issue doesn't already exist, https://jira.spring.io/browse/DATAMONGO[create a new issue].
* Please provide as much information as possible with the issue report; we like to know the version of Spring Data that you are using and the JVM version.
* If you need to paste code or include a stack trace, use JIRA `{code}…{code}` escapes before and after your text.
* If possible, try to create a test case or project that replicates the issue. Attach a link to your code or a compressed file containing your code.
== Building from Source
You don't need to build from source to use Spring Data (binaries in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
You also need JDK 1.8.
[source,bash]
----
$ ./mvnw clean install
----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular please sign the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
=== Building reference documentation
Building the documentation builds also the project without running tests.
[source,bash]
----
$ ./mvnw clean install -Pdistribute
----
The generated documentation is available from `target/site/reference/html/index.html`.
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
== License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].

150
README.md
View File

@@ -1,150 +0,0 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
## Getting Help
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
## Quick Start
### Maven configuration
Add the Maven dependency:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
</dependency>
```
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.spring.io/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides:
* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
### Spring Data repositories
To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
```java
public interface PersonRepository extends CrudRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
```
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods being pulled into the interface so that you can easily save and find single entities and collections of them.
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() throws Exception {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
return "springdata";
}
}
```
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
```xml
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.MongoClient">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
</bean>
<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
```java
@Service
public class MyService {
private final PersonRepository repository;
@Autowired
public MyService(PersonRepository repository) {
this.repository = repository;
}
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
```
## Contributing to Spring Data
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributors agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

9
SECURITY.adoc Normal file
View File

@@ -0,0 +1,9 @@
# Security Policy
## Supported Versions
Please see the https://spring.io/projects/spring-data-mongodb[Spring Data MongoDB] project page for supported versions.
## Reporting a Vulnerability
Please don't raise security vulnerabilities here. Head over to https://pivotal.io/security to learn how to disclose them responsibly.

39
ci/README.adoc Normal file
View File

@@ -0,0 +1,39 @@
== Running CI tasks locally
Since Concourse is built on top of Docker, it's easy to:
* Debug what went wrong on your local machine.
* Test out a tweak to your `test.sh` script before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what Concourse does on your local machine.
IMPORTANT: To do this you must have Docker installed on your machine.
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-8-jdk-with-mongodb /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
Next, run the `test.sh` script from inside the container:
+
2. `PROFILE=none spring-data-mongodb-github/ci/test.sh`
Since the container is binding to your source, you can make edits from your IDE and continue to run build jobs.
If you need to test the `build.sh` script, do this:
1. `mkdir /tmp/spring-data-mongodb-artifactory`
2. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github --mount type=bind,source="/tmp/spring-data-mongodb-artifactory",target=/spring-data-mongodb-artifactory springci/spring-data-8-jdk-with-mongodb /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github` and the temporary
artifactory output directory at `spring-data-mongodb-artifactory`.
+
Next, run the `build.sh` script from inside the container:
+
3. `spring-data-mongodb-github/ci/build.sh`
IMPORTANT: `build.sh` doesn't actually push to Artifactory so don't worry about accidentally deploying anything.
It just deploys to a local folder. That way, the `artifactory-resource` later in the pipeline can pick up these artifacts
and deliver them to artifactory.
NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.9 mongodb-org-server=4.0.9 mongodb-org-shell=4.0.9 mongodb-org-mongos=4.0.9 mongodb-org-tools=4.0.9
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 4B7C549A058F8B6B
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.1 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.1.list
RUN apt-get update
RUN apt-get install -y mongodb-org-unstable=4.1.13 mongodb-org-unstable-server=4.1.13 mongodb-org-unstable-shell=4.1.13 mongodb-org-unstable-mongos=4.1.13 mongodb-org-unstable-tools=4.1.13
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv e162f504a20cdf15827f718d4b7c549a058f8b6b
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.2 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.2.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.2.0 mongodb-org-server=4.2.0 mongodb-org-shell=4.2.0 mongodb-org-mongos=4.2.0 mongodb-org-tools=4.2.0
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

286
mvnw vendored Executable file
View File

@@ -0,0 +1,286 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Maven2 Start Up Batch script
#
# Required ENV vars:
# ------------------
# JAVA_HOME - location of a JDK home dir
#
# Optional ENV vars
# -----------------
# M2_HOME - location of maven2's installed home dir
# MAVEN_OPTS - parameters passed to the Java VM when running Maven
# e.g. to debug Maven itself, use
# set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
# MAVEN_SKIP_RC - flag to disable loading of mavenrc files
# ----------------------------------------------------------------------------
if [ -z "$MAVEN_SKIP_RC" ] ; then
if [ -f /etc/mavenrc ] ; then
. /etc/mavenrc
fi
if [ -f "$HOME/.mavenrc" ] ; then
. "$HOME/.mavenrc"
fi
fi
# OS specific support. $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false
case "`uname`" in
CYGWIN*) cygwin=true ;;
MINGW*) mingw=true;;
Darwin*) darwin=true
# Use /usr/libexec/java_home if available, otherwise fall back to /Library/Java/Home
# See https://developer.apple.com/library/mac/qa/qa1170/_index.html
if [ -z "$JAVA_HOME" ]; then
if [ -x "/usr/libexec/java_home" ]; then
export JAVA_HOME="`/usr/libexec/java_home`"
else
export JAVA_HOME="/Library/Java/Home"
fi
fi
;;
esac
if [ -z "$JAVA_HOME" ] ; then
if [ -r /etc/gentoo-release ] ; then
JAVA_HOME=`java-config --jre-home`
fi
fi
if [ -z "$M2_HOME" ] ; then
## resolve links - $0 may be a link to maven's home
PRG="$0"
# need this for relative symlinks
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG="`dirname "$PRG"`/$link"
fi
done
saveddir=`pwd`
M2_HOME=`dirname "$PRG"`/..
# make it fully qualified
M2_HOME=`cd "$M2_HOME" && pwd`
cd "$saveddir"
# echo Using m2 at $M2_HOME
fi
# For Cygwin, ensure paths are in UNIX format before anything is touched
if $cygwin ; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --unix "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
fi
# For Mingw, ensure paths are in UNIX format before anything is touched
if $mingw ; then
[ -n "$M2_HOME" ] &&
M2_HOME="`(cd "$M2_HOME"; pwd)`"
[ -n "$JAVA_HOME" ] &&
JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
# TODO classpath?
fi
if [ -z "$JAVA_HOME" ]; then
javaExecutable="`which javac`"
if [ -n "$javaExecutable" ] && ! [ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then
# readlink(1) is not available as standard on Solaris 10.
readLink=`which readlink`
if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then
if $darwin ; then
javaHome="`dirname \"$javaExecutable\"`"
javaExecutable="`cd \"$javaHome\" && pwd -P`/javac"
else
javaExecutable="`readlink -f \"$javaExecutable\"`"
fi
javaHome="`dirname \"$javaExecutable\"`"
javaHome=`expr "$javaHome" : '\(.*\)/bin'`
JAVA_HOME="$javaHome"
export JAVA_HOME
fi
fi
fi
if [ -z "$JAVACMD" ] ; then
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
else
JAVACMD="`which java`"
fi
fi
if [ ! -x "$JAVACMD" ] ; then
echo "Error: JAVA_HOME is not defined correctly." >&2
echo " We cannot execute $JAVACMD" >&2
exit 1
fi
if [ -z "$JAVA_HOME" ] ; then
echo "Warning: JAVA_HOME environment variable is not set."
fi
CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher
# traverses directory structure from process work directory to filesystem root
# first directory with .mvn subdirectory is considered project base directory
find_maven_basedir() {
if [ -z "$1" ]
then
echo "Path not specified to find_maven_basedir"
return 1
fi
basedir="$1"
wdir="$1"
while [ "$wdir" != '/' ] ; do
if [ -d "$wdir"/.mvn ] ; then
basedir=$wdir
break
fi
# workaround for JBEAP-8937 (on Solaris 10/Sparc)
if [ -d "${wdir}" ]; then
wdir=`cd "$wdir/.."; pwd`
fi
# end of workaround
done
echo "${basedir}"
}
# concatenates all lines of a file
concat_lines() {
if [ -f "$1" ]; then
echo "$(tr -s '\n' ' ' < "$1")"
fi
}
BASE_DIR=`find_maven_basedir "$(pwd)"`
if [ -z "$BASE_DIR" ]; then
exit 1;
fi
##########################################################################################
# Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
# This allows using the maven wrapper in projects that prohibit checking in binary data.
##########################################################################################
if [ -r "$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found .mvn/wrapper/maven-wrapper.jar"
fi
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Couldn't find .mvn/wrapper/maven-wrapper.jar, downloading it ..."
fi
jarUrl="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
while IFS="=" read key value; do
case "$key" in (wrapperUrl) jarUrl="$value"; break ;;
esac
done < "$BASE_DIR/.mvn/wrapper/maven-wrapper.properties"
if [ "$MVNW_VERBOSE" = true ]; then
echo "Downloading from: $jarUrl"
fi
wrapperJarPath="$BASE_DIR/.mvn/wrapper/maven-wrapper.jar"
if command -v wget > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found wget ... using wget"
fi
wget "$jarUrl" -O "$wrapperJarPath"
elif command -v curl > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found curl ... using curl"
fi
curl -o "$wrapperJarPath" "$jarUrl"
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Falling back to using Java to download"
fi
javaClass="$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.java"
if [ -e "$javaClass" ]; then
if [ ! -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Compiling MavenWrapperDownloader.java ..."
fi
# Compiling the Java class
("$JAVA_HOME/bin/javac" "$javaClass")
fi
if [ -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
# Running the downloader
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Running MavenWrapperDownloader.java ..."
fi
("$JAVA_HOME/bin/java" -cp .mvn/wrapper MavenWrapperDownloader "$MAVEN_PROJECTBASEDIR")
fi
fi
fi
fi
##########################################################################################
# End of extension
##########################################################################################
export MAVEN_PROJECTBASEDIR=${MAVEN_BASEDIR:-"$BASE_DIR"}
if [ "$MVNW_VERBOSE" = true ]; then
echo $MAVEN_PROJECTBASEDIR
fi
MAVEN_OPTS="$(concat_lines "$MAVEN_PROJECTBASEDIR/.mvn/jvm.config") $MAVEN_OPTS"
# For Cygwin, switch paths to Windows format before running java
if $cygwin; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --path --windows "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --path --windows "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --windows "$CLASSPATH"`
[ -n "$MAVEN_PROJECTBASEDIR" ] &&
MAVEN_PROJECTBASEDIR=`cygpath --path --windows "$MAVEN_PROJECTBASEDIR"`
fi
WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
exec "$JAVACMD" \
$MAVEN_OPTS \
-classpath "$MAVEN_PROJECTBASEDIR/.mvn/wrapper/maven-wrapper.jar" \
"-Dmaven.home=${M2_HOME}" "-Dmaven.multiModuleProjectDirectory=${MAVEN_PROJECTBASEDIR}" \
${WRAPPER_LAUNCHER} $MAVEN_CONFIG "$@"
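Note: the Java fallback above compiles and runs a MavenWrapperDownloader class from .mvn/wrapper, which is not part of this diff. A minimal illustrative sketch of such a downloader follows (not the actual Takari source, which additionally honours the wrapperUrl key read from maven-wrapper.properties by the scripts):

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class MavenWrapperDownloader {

    // Default jar location; the real downloader may override this from maven-wrapper.properties.
    private static final String DEFAULT_DOWNLOAD_URL =
            "https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar";

    public static void main(String[] args) throws Exception {
        // args[0] is the project base directory, as passed by mvnw/mvnw.cmd.
        Path baseDir = Paths.get(args[0]);
        Path target = baseDir.resolve(".mvn/wrapper/maven-wrapper.jar");
        Files.createDirectories(target.getParent());
        try (InputStream in = new URL(DEFAULT_DOWNLOAD_URL).openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}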

161
mvnw.cmd vendored Executable file

@@ -0,0 +1,161 @@
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM https://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Maven2 Start Up Batch script
@REM
@REM Required ENV vars:
@REM JAVA_HOME - location of a JDK home dir
@REM
@REM Optional ENV vars
@REM M2_HOME - location of maven2's installed home dir
@REM MAVEN_BATCH_ECHO - set to 'on' to enable the echoing of the batch commands
@REM MAVEN_BATCH_PAUSE - set to 'on' to wait for a key stroke before ending
@REM MAVEN_OPTS - parameters passed to the Java VM when running Maven
@REM e.g. to debug Maven itself, use
@REM set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
@REM MAVEN_SKIP_RC - flag to disable loading of mavenrc files
@REM ----------------------------------------------------------------------------
@REM Begin all REM lines with '@' in case MAVEN_BATCH_ECHO is 'on'
@echo off
@REM set title of command window
title %0
@REM enable echoing by setting MAVEN_BATCH_ECHO to 'on'
@if "%MAVEN_BATCH_ECHO%" == "on" echo %MAVEN_BATCH_ECHO%
@REM set %HOME% to equivalent of $HOME
if "%HOME%" == "" (set "HOME=%HOMEDRIVE%%HOMEPATH%")
@REM Execute a user defined script before this one
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPre
@REM check for pre script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_pre.bat" call "%HOME%\mavenrc_pre.bat"
if exist "%HOME%\mavenrc_pre.cmd" call "%HOME%\mavenrc_pre.cmd"
:skipRcPre
@setlocal
set ERROR_CODE=0
@REM To isolate internal variables from possible post scripts, we use another setlocal
@setlocal
@REM ==== START VALIDATION ====
if not "%JAVA_HOME%" == "" goto OkJHome
echo.
echo Error: JAVA_HOME not found in your environment. >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
:OkJHome
if exist "%JAVA_HOME%\bin\java.exe" goto init
echo.
echo Error: JAVA_HOME is set to an invalid directory. >&2
echo JAVA_HOME = "%JAVA_HOME%" >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
@REM ==== END VALIDATION ====
:init
@REM Find the project base dir, i.e. the directory that contains the folder ".mvn".
@REM Fallback to current working directory if not found.
set MAVEN_PROJECTBASEDIR=%MAVEN_BASEDIR%
IF NOT "%MAVEN_PROJECTBASEDIR%"=="" goto endDetectBaseDir
set EXEC_DIR=%CD%
set WDIR=%EXEC_DIR%
:findBaseDir
IF EXIST "%WDIR%"\.mvn goto baseDirFound
cd ..
IF "%WDIR%"=="%CD%" goto baseDirNotFound
set WDIR=%CD%
goto findBaseDir
:baseDirFound
set MAVEN_PROJECTBASEDIR=%WDIR%
cd "%EXEC_DIR%"
goto endDetectBaseDir
:baseDirNotFound
set MAVEN_PROJECTBASEDIR=%EXEC_DIR%
cd "%EXEC_DIR%"
:endDetectBaseDir
IF NOT EXIST "%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config" goto endReadAdditionalConfig
@setlocal EnableExtensions EnableDelayedExpansion
for /F "usebackq delims=" %%a in ("%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config") do set JVM_CONFIG_MAVEN_PROPS=!JVM_CONFIG_MAVEN_PROPS! %%a
@endlocal & set JVM_CONFIG_MAVEN_PROPS=%JVM_CONFIG_MAVEN_PROPS%
:endReadAdditionalConfig
SET MAVEN_JAVA_EXE="%JAVA_HOME%\bin\java.exe"
set WRAPPER_JAR="%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.jar"
set WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
set DOWNLOAD_URL="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
FOR /F "tokens=1,2 delims==" %%A IN (%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties) DO (
IF "%%A"=="wrapperUrl" SET DOWNLOAD_URL=%%B
)
@REM Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
@REM This allows using the maven wrapper in projects that prohibit checking in binary data.
if exist %WRAPPER_JAR% (
echo Found %WRAPPER_JAR%
) else (
echo Couldn't find %WRAPPER_JAR%, downloading it ...
echo Downloading from: %DOWNLOAD_URL%
powershell -Command "(New-Object Net.WebClient).DownloadFile('%DOWNLOAD_URL%', '%WRAPPER_JAR%')"
echo Finished downloading %WRAPPER_JAR%
)
@REM End of extension
%MAVEN_JAVA_EXE% %JVM_CONFIG_MAVEN_PROPS% %MAVEN_OPTS% %MAVEN_DEBUG_OPTS% -classpath %WRAPPER_JAR% "-Dmaven.multiModuleProjectDirectory=%MAVEN_PROJECTBASEDIR%" %WRAPPER_LAUNCHER% %MAVEN_CONFIG% %*
if ERRORLEVEL 1 goto error
goto end
:error
set ERROR_CODE=1
:end
@endlocal & set ERROR_CODE=%ERROR_CODE%
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPost
@REM check for post script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_post.bat" call "%HOME%\mavenrc_post.bat"
if exist "%HOME%\mavenrc_post.cmd" call "%HOME%\mavenrc_post.cmd"
:skipRcPost
@REM pause the script if MAVEN_BATCH_PAUSE is set to 'on'
if "%MAVEN_BATCH_PAUSE%" == "on" pause
if "%MAVEN_TERMINATE_CMD%" == "on" exit %ERROR_CODE%
exit /B %ERROR_CODE%

77
pom.xml

@@ -1,35 +1,34 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.2.0.RC3</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://projects.spring.io/spring-data-mongodb</url>
<url>https://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.2.0.RC3</version>
</parent>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-distribution</module>
</modules>
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.1.0.M2</springdata.commons>
<mongo>3.6.3</mongo>
<mongo.reactivestreams>1.7.1</mongo.reactivestreams>
<springdata.commons>2.2.0.RC3</springdata.commons>
<mongo>3.11.0</mongo>
<mongo.reactivestreams>1.12.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -39,7 +38,7 @@
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -50,7 +49,7 @@
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -61,7 +60,7 @@
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -72,7 +71,7 @@
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -83,7 +82,7 @@
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -94,7 +93,7 @@
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -105,7 +104,7 @@
<name>Mark Paluch</name>
<email>mpaluch at pivotal.io</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.pivotal.io</organizationUrl>
<organizationUrl>https://www.pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -114,62 +113,14 @@
</developers>
<profiles>
<!-- not-yet available profile>
<id>mongo35-next</id>
<properties>
<mongo>3.5.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile -->
<profile>
<id>mongo36-next</id>
<properties>
<mongo>3.6.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>benchmarks</id>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-distribution</module>
<module>spring-data-mongodb-benchmarks</module>
</modules>
</profile>
</profiles>
<dependencies>


@@ -1,6 +1,6 @@
# Benchmarks
Benchmarks are based on [JMH](http://openjdk.java.net/projects/code-tools/jmh/).
Benchmarks are based on [JMH](https://openjdk.java.net/projects/code-tools/jmh/).
# Running Benchmarks


@@ -1,13 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.2.0.RC3</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -87,6 +87,7 @@
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
<testSourceDirectory>${project.build.sourceDirectory}</testSourceDirectory>
<testClassesDirectory>${project.build.outputDirectory}</testClassesDirectory>
<excludes>


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<aspects>
<aspect name="org.springframework.beans.factory.aspectj.AnnotationBeanConfigurerAspect" />
<aspect name="org.springframework.data.mongodb.crossstore.MongoDocumentBacking" />
</aspects>
</aspectj>


@@ -1,148 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>2.1.1</jpa>
<hibernate>5.2.1.Final</hibernate>
<java-module-name>spring.data.mongodb.cross.store</java-module-name>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<exclusions>
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.1.0.M2</version>
</dependency>
<!-- reactive -->
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${hibernate}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>1.8.0.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>5.2.4.Final</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.6</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
<xmlConfigured>aop.xml</xmlConfigured>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -1,214 +0,0 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
* @deprecated will be removed without replacement.
*/
@Deprecated
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final Document dbk = new Document();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
for (Document dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public Document doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first();
return id;
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery);
return null;
}
});
} else {
final Document dbDoc = new Document();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id");
}
collection.insertOne(dbDoc);
}
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}
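For orientation, wiring this (deprecated) persister programmatically mirrors the applicationContext.xml that appears later in this diff; a minimal sketch, assuming mongoTemplate and entityManagerFactory beans are already available in the surrounding context:

MongoChangeSetPersister persister = new MongoChangeSetPersister();
persister.setMongoTemplate(mongoTemplate);                 // document-store side of the cross store
persister.setEntityManagerFactory(entityManagerFactory);   // relational/JPA side used to resolve ids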


@@ -1,272 +0,0 @@
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import java.lang.reflect.Field;
import javax.persistence.EntityManager;
import javax.persistence.Transient;
import javax.persistence.Entity;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.crossstore.ChangeSetPersister.NotFoundException;
import org.springframework.data.crossstore.HashMapChangeSet;
import org.springframework.transaction.support.TransactionSynchronizationManager;
/**
* Aspect to turn an object annotated with @Document into a persistent document using Mongo.
*
* @author Thomas Risberg
* @deprecated will be removed without replacement.
*/
@Deprecated
public aspect MongoDocumentBacking {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoDocumentBacking.class);
// Aspect shared config
private ChangeSetPersister<Object> changeSetPersister;
public void setChangeSetPersister(ChangeSetPersister<Object> changeSetPersister) {
this.changeSetPersister = changeSetPersister;
}
// ITD to introduce N state to Annotated objects
declare parents : (@Entity *) implements DocumentBacked;
// The annotated fields that will be persisted in MongoDB rather than with JPA
declare @field: @RelatedDocument * (@Entity+ *).*:@Transient;
// -------------------------------------------------------------------------
// Advise user-defined constructors of ChangeSetBacked objects to create a new
// backing ChangeSet
// -------------------------------------------------------------------------
pointcut arbitraryUserConstructorOfChangeSetBackedObject(DocumentBacked entity) :
execution((DocumentBacked+).new(..)) &&
!execution((DocumentBacked+).new(ChangeSet)) &&
this(entity);
pointcut finderConstructorOfChangeSetBackedObject(DocumentBacked entity, ChangeSet cs) :
execution((DocumentBacked+).new(ChangeSet)) &&
this(entity) &&
args(cs);
protected pointcut entityFieldGet(DocumentBacked entity) :
get(@RelatedDocument * DocumentBacked+.*) &&
this(entity) &&
!get(* DocumentBacked.*);
protected pointcut entityFieldSet(DocumentBacked entity, Object newVal) :
set(@RelatedDocument * DocumentBacked+.*) &&
this(entity) &&
args(newVal) &&
!set(* DocumentBacked.*);
// intercept EntityManager.merge calls
public pointcut entityManagerMerge(EntityManager em, Object entity) :
call(* EntityManager.merge(Object)) &&
target(em) &&
args(entity);
// intercept EntityManager.remove calls
// public pointcut entityManagerRemove(EntityManager em, Object entity) :
// call(* EntityManager.remove(Object)) &&
// target(em) &&
// args(entity);
// move changeSet from detached entity to the newly merged persistent object
Object around(EntityManager em, Object entity) : entityManagerMerge(em, entity) {
Object mergedEntity = proceed(em, entity);
if (entity instanceof DocumentBacked && mergedEntity instanceof DocumentBacked) {
((DocumentBacked) mergedEntity).changeSet = ((DocumentBacked) entity).getChangeSet();
}
return mergedEntity;
}
// clear changeSet from removed entity
// Object around(EntityManager em, Object entity) : entityManagerRemove(em, entity) {
// if (entity instanceof DocumentBacked) {
// removeChangeSetValues((DocumentBacked)entity);
// }
// return proceed(em, entity);
// }
private static void removeChangeSetValues(DocumentBacked entity) {
LOGGER.debug("Removing all change-set values for " + entity);
ChangeSet nulledCs = new HashMapChangeSet();
DocumentBacked documentEntity = (DocumentBacked) entity;
@SuppressWarnings("unchecked")
ChangeSetPersister<Object> changeSetPersister = (ChangeSetPersister<Object>) documentEntity.itdChangeSetPersister;
try {
changeSetPersister.getPersistentState(documentEntity.getClass(), documentEntity.get_persistent_id(),
documentEntity.getChangeSet());
} catch (DataAccessException e) {
} catch (NotFoundException e) {
}
for (String key : entity.getChangeSet().getValues().keySet()) {
nulledCs.set(key, null);
}
entity.setChangeSet(nulledCs);
}
before(DocumentBacked entity) : arbitraryUserConstructorOfChangeSetBackedObject(entity) {
LOGGER.debug("User-defined constructor called on DocumentBacked object of class " + entity.getClass());
// Populate all ITD fields
entity.setChangeSet(new HashMapChangeSet());
entity.itdChangeSetPersister = changeSetPersister;
entity.itdTransactionSynchronization = new ChangeSetBackedTransactionSynchronization(changeSetPersister, entity);
// registerTransactionSynchronization(entity);
}
private static void registerTransactionSynchronization(DocumentBacked entity) {
if (TransactionSynchronizationManager.isSynchronizationActive()) {
if (!TransactionSynchronizationManager.getSynchronizations().contains(entity.itdTransactionSynchronization)) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Adding transaction synchronization for " + entity);
}
TransactionSynchronizationManager.registerSynchronization(entity.itdTransactionSynchronization);
} else {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Transaction synchronization already active for " + entity);
}
}
} else {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Transaction synchronization is not active for " + entity);
}
}
}
// -------------------------------------------------------------------------
// ChangeSet-related mixins
// -------------------------------------------------------------------------
// Introduced field
@Transient
private ChangeSet DocumentBacked.changeSet;
@Transient
private ChangeSetPersister<?> DocumentBacked.itdChangeSetPersister;
@Transient
private ChangeSetBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
public void DocumentBacked.setChangeSet(ChangeSet cs) {
this.changeSet = cs;
}
public ChangeSet DocumentBacked.getChangeSet() {
return changeSet;
}
// Flush the entity state to the persistent store
public void DocumentBacked.flush() {
Object id = itdChangeSetPersister.getPersistentId(this, this.changeSet);
itdChangeSetPersister.persistState(this, this.changeSet);
}
public Object DocumentBacked.get_persistent_id() {
return itdChangeSetPersister.getPersistentId(this, this.changeSet);
}
// lifecycle methods
@javax.persistence.PostPersist
public void DocumentBacked.itdPostPersist() {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("JPA lifecycle event PrePersist: " + this.getClass().getName());
}
registerTransactionSynchronization(this);
}
@javax.persistence.PreUpdate
public void DocumentBacked.itdPreUpdate() {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("JPA lifecycle event PreUpdate: " + this.getClass().getName() + " :: " + this);
}
registerTransactionSynchronization(this);
}
@javax.persistence.PostUpdate
public void DocumentBacked.itdPostUpdate() {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("JPA lifecycle event PostUpdate: " + this.getClass().getName() + " :: " + this);
}
registerTransactionSynchronization(this);
}
@javax.persistence.PostRemove
public void DocumentBacked.itdPostRemove() {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("JPA lifecycle event PostRemove: " + this.getClass().getName() + " :: " + this);
}
registerTransactionSynchronization(this);
removeChangeSetValues(this);
}
@javax.persistence.PostLoad
public void DocumentBacked.itdPostLoad() {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("JPA lifecycle event PostLoad: " + this.getClass().getName() + " :: " + this);
}
registerTransactionSynchronization(this);
}
/**
* delegates field reads to the state accessors instance
*/
Object around(DocumentBacked entity): entityFieldGet(entity) {
Field f = field(thisJoinPoint);
String propName = f.getName();
LOGGER.trace("GET " + f + " -> ChangeSet value property [" + propName + "] using: " + entity.getChangeSet());
if (entity.getChangeSet().getValues().get(propName) == null) {
try {
this.changeSetPersister
.getPersistentState(entity.getClass(), entity.get_persistent_id(), entity.getChangeSet());
} catch (NotFoundException e) {
}
}
Object fValue = entity.getChangeSet().getValues().get(propName);
if (fValue != null) {
return fValue;
}
return proceed(entity);
}
/**
* delegates field writes to the state accessors instance
*/
Object around(DocumentBacked entity, Object newVal) : entityFieldSet(entity, newVal) {
Field f = field(thisJoinPoint);
String propName = f.getName();
LOGGER.trace("SET " + f + " -> ChangeSet number value property [" + propName + "] with value=[" + newVal + "]");
entity.getChangeSet().set(propName, newVal);
return proceed(entity, newVal);
}
Field field(JoinPoint joinPoint) {
FieldSignature fieldSignature = (FieldSignature) joinPoint.getSignature();
return fieldSignature.getField();
}
}
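For context, the pattern this aspect enables is marking individual fields of a JPA entity for MongoDB storage. A minimal illustrative sketch (Customer and CustomerProfile are made-up names; the Person/Resume test classes further down in this diff are the real example):

@javax.persistence.Entity
public class Customer {

    @javax.persistence.Id
    private Long id;                  // persisted via JPA

    @RelatedDocument
    private CustomerProfile profile;  // persisted to MongoDB by MongoDocumentBacking
}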


@@ -1,31 +0,0 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
/**
* @author Thomas Risberg
* @deprecated will be removed without replacement.
*/
@Deprecated
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD })
public @interface RelatedDocument {
}


@@ -1,5 +0,0 @@
/**
* Infrastructure for Spring Data's MongoDB cross store support.
*/
package org.springframework.data.mongodb.crossstore;


@@ -1,195 +0,0 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.bson.Document;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
* @author Thomas Risberg
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired MongoTemplate mongoTemplate;
@PersistenceContext EntityManager entityManager;
@Autowired PlatformTransactionManager transactionManager;
TransactionTemplate txTemplate;
@Before
public void setUp() {
txTemplate = new TransactionTemplate(transactionManager);
clearData(Person.class);
Address address = new Address(12, "MAin St.", "Boston", "MA", "02101");
Resume resume = new Resume();
resume.addEducation("Skanstulls High School, 1975");
resume.addEducation("Univ. of Stockholm, 1980");
resume.addJob("DiMark, DBA, 1990-2000");
resume.addJob("VMware, Developer, 2007-");
final Person person = new Person("Thomas", 20);
person.setAddress(address);
person.setResume(resume);
person.setId(1L);
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.persist(person);
return null;
}
});
}
@After
public void tearDown() {
txTemplate.execute(new TransactionCallback<Void>() {
public Void doInTransaction(TransactionStatus status) {
entityManager.remove(entityManager.find(Person.class, 1L));
return null;
}
});
}
private void clearData(Class<?> domainType) {
String collectionName = mongoTemplate.getCollectionName(domainType);
mongoTemplate.dropCollection(collectionName);
}
@Test
@Transactional
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; " + "VMware, Developer, 2007-", found.getResume().getJobs());
}
@Test
@Transactional
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
found.setAge(44);
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
entityManager.merge(found);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; " + "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
final Person detached = entityManager.find(Person.class, 1L);
entityManager.detach(detached);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person result = entityManager.merge(detached);
entityManager.flush();
return result;
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
for (Document dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}


@@ -1,75 +0,0 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
public class Address {
private Integer streetNumber;
private String streetName;
private String city;
private String state;
private String zip;
public Address(Integer streetNumber, String streetName, String city, String state, String zip) {
super();
this.streetNumber = streetNumber;
this.streetName = streetName;
this.city = city;
this.state = state;
this.zip = zip;
}
public Integer getStreetNumber() {
return streetNumber;
}
public void setStreetNumber(Integer streetNumber) {
this.streetNumber = streetNumber;
}
public String getStreetName() {
return streetName;
}
public void setStreetName(String streetName) {
this.streetName = streetName;
}
public String getCity() {
return city;
}
public void setCity(String city) {
this.city = city;
}
public String getState() {
return state;
}
public void setState(String state) {
this.state = state;
}
public String getZip() {
return zip;
}
public void setZip(String zip) {
this.zip = zip;
}
}


@@ -1,102 +0,0 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}


@@ -1,63 +0,0 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}


@@ -1,15 +0,0 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
version="2.0"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="test" transaction-type="RESOURCE_LOCAL">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
<class>org.springframework.data.mongodb.crossstore.test.Person</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect"/>
<!--value='create' to build a new database on each run; value='update' to modify an existing database; value='create-drop' means the same as 'create' but also drops tables when Hibernate closes; value='validate' makes no changes to the database-->
<property name="hibernate.hbm2ddl.auto" value="update"/>
<property name="hibernate.ejb.naming_strategy" value="org.hibernate.cfg.ImprovedNamingStrategy"/>
</properties>
</persistence-unit>
</persistence>


@@ -1,72 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:tx="http://www.springframework.org/schema/tx"
xmlns:jdbc="http://www.springframework.org/schema/jdbc"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/jdbc http://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<context:spring-configured/>
<context:component-scan base-package="org.springframework.persistence.mongodb.test">
<context:exclude-filter expression="org.springframework.stereotype.Controller" type="annotation"/>
</context:component-scan>
<mongo:mapping-converter/>
<!-- Mongo config -->
<bean id="mongoClient" class="org.springframework.data.mongodb.core.MongoClientFactoryBean">
<property name="host" value="localhost"/>
<property name="port" value="27017"/>
</bean>
<bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongoClient" ref="mongoClient"/>
<constructor-arg name="databaseName" value="database"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mappingConverter"/>
</bean>
<bean class="org.springframework.data.mongodb.core.MongoExceptionTranslator"/>
<!-- Mongo aspect config -->
<bean class="org.springframework.data.mongodb.crossstore.MongoDocumentBacking"
factory-method="aspectOf">
<property name="changeSetPersister" ref="mongoChangeSetPersister"/>
</bean>
<bean id="mongoChangeSetPersister"
class="org.springframework.data.mongodb.crossstore.MongoChangeSetPersister">
<property name="mongoTemplate" ref="mongoTemplate"/>
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
<jdbc:embedded-database id="dataSource" type="HSQL">
</jdbc:embedded-database>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
<tx:annotation-driven mode="aspectj" transaction-manager="transactionManager"/>
<bean class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean" id="entityManagerFactory">
<property name="persistenceUnitName" value="test"/>
<property name="dataSource" ref="dataSource"/>
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="showSql" value="true"/>
<property name="generateDdl" value="true"/>
<property name="databasePlatform" value="org.hibernate.dialect.HSQLDialect"/>
</bean>
</property>
</bean>
</beans>


@@ -1,18 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<!--
<logger name="org.springframework" level="debug" />
-->
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>


@@ -1,5 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -13,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.2.0.RC3</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -28,15 +29,18 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.2.0.RC3</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -83,14 +83,14 @@
<!-- reactive -->
<dependency>
<groupId>org.mongodb</groupId>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-reactivestreams</artifactId>
<version>${mongo.reactivestreams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>${mongo}</version>
<optional>true</optional>
@@ -107,7 +107,7 @@
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<optional>true</optional>
</dependency>
@@ -119,14 +119,14 @@
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
@@ -253,44 +253,43 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.transaction</groupId>
<artifactId>jta</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<!-- Kotlin extension -->
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib</artifactId>
<version>${kotlin}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
<version>${kotlin}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-test</artifactId>
<version>${kotlin}</version>
<scope>test</scope>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-core</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.nhaarman</groupId>
<artifactId>mockito-kotlin</artifactId>
<version>1.5.0</version>
<groupId>org.jetbrains.kotlinx</groupId>
<artifactId>kotlinx-coroutines-reactor</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.mockk</groupId>
<artifactId>mockk</artifactId>
<version>${mockk}</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib</artifactId>
</exclusion>
<exclusion>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
</exclusion>
<exclusion>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
@@ -328,6 +327,7 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
<useFile>false</useFile>
<includes>
<include>**/*Tests.java</include>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,69 +0,0 @@
/*
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.lang.Nullable;
/**
* Exception being thrown in case we cannot connect to a MongoDB instance.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class CannotGetMongoDbConnectionException extends DataAccessResourceFailureException {
private final UserCredentials credentials;
private final @Nullable String database;
private static final long serialVersionUID = 1172099106475265589L;
public CannotGetMongoDbConnectionException(String msg, Throwable cause) {
super(msg, cause);
this.database = null;
this.credentials = UserCredentials.NO_CREDENTIALS;
}
public CannotGetMongoDbConnectionException(String msg) {
this(msg, null, UserCredentials.NO_CREDENTIALS);
}
public CannotGetMongoDbConnectionException(String msg, @Nullable String database, UserCredentials credentials) {
super(msg);
this.database = database;
this.credentials = credentials;
}
/**
* Returns the {@link UserCredentials} that were used when trying to connect to the MongoDB instance.
*
* @return
*/
public UserCredentials getCredentials() {
return this.credentials;
}
/**
* Returns the name of the database trying to be accessed.
*
* @return
*/
@Nullable
public String getDatabase() {
return database;
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2018 the original author or authors.
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,240 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.lang.Nullable;
import org.springframework.transaction.support.ResourceHolderSynchronization;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
/**
* Helper class for managing {@link MongoDatabase} instances via {@link MongoDbFactory}. Used for obtaining
* {@link ClientSession session bound} resources, such as {@link MongoDatabase} and
* {@link com.mongodb.client.MongoCollection} suitable for transactional usage.
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Christoph Strobl
* @author Mark Paluch
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
*/
public class MongoDatabaseUtils {
/**
* Obtain the default {@link MongoDatabase database} from the given {@link MongoDbFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(MongoDbFactory factory) {
return doGetMongoDatabase(null, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the default {@link MongoDatabase database} from the given {@link MongoDbFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(MongoDbFactory factory, SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(null, factory, sessionSynchronization);
}
/**
* Obtain the {@link MongoDatabase database} with the given name from the given {@link MongoDbFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(String dbName, MongoDbFactory factory) {
return doGetMongoDatabase(dbName, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the {@link MongoDatabase database} with the given name from the given {@link MongoDbFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(String dbName, MongoDbFactory factory,
SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(dbName, factory, sessionSynchronization);
}
private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDbFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "Factory must not be null!");
if (!TransactionSynchronizationManager.isSynchronizationActive()) {
return StringUtils.hasText(dbName) ? factory.getDb(dbName) : factory.getDb();
}
ClientSession session = doGetSession(factory, sessionSynchronization);
if (session == null) {
return StringUtils.hasText(dbName) ? factory.getDb(dbName) : factory.getDb();
}
MongoDbFactory factoryToUse = factory.withSession(session);
return StringUtils.hasText(dbName) ? factoryToUse.getDb(dbName) : factoryToUse.getDb();
}
/**
* Check if the {@link MongoDbFactory} is actually bound to a {@link ClientSession} that has an active transaction, or
* if a {@link TransactionSynchronization} has been registered for the {@link MongoDbFactory resource} and if the
* associated {@link ClientSession} has an {@link ClientSession#hasActiveTransaction() active transaction}.
*
* @param dbFactory the resource to check transactions for. Must not be {@literal null}.
* @return {@literal true} if the factory has an ongoing transaction.
* @since 2.1.3
*/
public static boolean isTransactionActive(MongoDbFactory dbFactory) {
if (dbFactory.isTransactionActive()) {
return true;
}
MongoResourceHolder resourceHolder = (MongoResourceHolder) TransactionSynchronizationManager.getResource(dbFactory);
return resourceHolder != null && resourceHolder.hasActiveTransaction();
}
@Nullable
private static ClientSession doGetSession(MongoDbFactory dbFactory, SessionSynchronization sessionSynchronization) {
MongoResourceHolder resourceHolder = (MongoResourceHolder) TransactionSynchronizationManager.getResource(dbFactory);
// check for native MongoDB transaction
if (resourceHolder != null && (resourceHolder.hasSession() || resourceHolder.isSynchronizedWithTransaction())) {
if (!resourceHolder.hasSession()) {
resourceHolder.setSession(createClientSession(dbFactory));
}
return resourceHolder.getSession();
}
if (SessionSynchronization.ON_ACTUAL_TRANSACTION.equals(sessionSynchronization)) {
return null;
}
// init a non native MongoDB transaction by registering a MongoSessionSynchronization
resourceHolder = new MongoResourceHolder(createClientSession(dbFactory), dbFactory);
resourceHolder.getRequiredSession().startTransaction();
TransactionSynchronizationManager
.registerSynchronization(new MongoSessionSynchronization(resourceHolder, dbFactory));
resourceHolder.setSynchronizedWithTransaction(true);
TransactionSynchronizationManager.bindResource(dbFactory, resourceHolder);
return resourceHolder.getSession();
}
private static ClientSession createClientSession(MongoDbFactory dbFactory) {
return dbFactory.getSession(ClientSessionOptions.builder().causallyConsistent(true).build());
}
/**
* MongoDB specific {@link ResourceHolderSynchronization} for resource cleanup at the end of a transaction when
* participating in a non-native MongoDB transaction, such as a JTA or JDBC transaction.
*
* @author Christoph Strobl
* @since 2.1
*/
private static class MongoSessionSynchronization extends ResourceHolderSynchronization<MongoResourceHolder, Object> {
private final MongoResourceHolder resourceHolder;
MongoSessionSynchronization(MongoResourceHolder resourceHolder, MongoDbFactory dbFactory) {
super(resourceHolder, dbFactory);
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
if (resourceHolder.hasActiveTransaction()) {
resourceHolder.getRequiredSession().commitTransaction();
}
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override
public void afterCompletion(int status) {
if (status == TransactionSynchronization.STATUS_ROLLED_BACK && this.resourceHolder.hasActiveTransaction()) {
resourceHolder.getRequiredSession().abortTransaction();
}
super.afterCompletion(status);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {
if (resourceHolder.hasActiveSession()) {
resourceHolder.getRequiredSession().close();
}
}
}
}
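A rough usage sketch of the helper above (not part of this change set; the TransactionTemplate, the collection name and the surrounding class are assumptions): within a Spring-managed transaction, MongoDatabaseUtils.getDatabase returns a session-bound database where a plain MongoDbFactory.getDb() call would not participate in the transaction.

import org.bson.Document;

import org.springframework.data.mongodb.MongoDatabaseUtils;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.SessionSynchronization;
import org.springframework.transaction.support.TransactionTemplate;

import com.mongodb.client.MongoDatabase;

class TransactionAwareDatabaseAccess {

	void insertWithinTransaction(TransactionTemplate txTemplate, MongoDbFactory factory) {
		txTemplate.execute(status -> {
			// Bound to the transactional ClientSession while synchronization is active,
			// falls back to the factory default otherwise.
			MongoDatabase database = MongoDatabaseUtils.getDatabase(factory,
					SessionSynchronization.ON_ACTUAL_TRANSACTION);
			database.getCollection("orders").insertOne(new Document("state", "created"));
			return null;
		});
	}
}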

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -22,8 +22,8 @@ import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.DB;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Interface for factories creating {@link MongoDatabase} instances.
@@ -32,7 +32,7 @@ import com.mongodb.session.ClientSession;
* @author Thomas Darimont
* @author Christoph Strobl
*/
public interface MongoDbFactory extends CodecRegistryProvider {
public interface MongoDbFactory extends CodecRegistryProvider, MongoSessionProvider {
/**
* Creates a default {@link MongoDatabase} instance.
@@ -58,6 +58,14 @@ public interface MongoDbFactory extends CodecRegistryProvider {
*/
PersistenceExceptionTranslator getExceptionTranslator();
/**
* Get the legacy database entry point. Please consider {@link #getDb()} instead.
*
* @return
* @deprecated since 2.1, use {@link #getDb()}. This method will be removed with a future version as it works only
* with the legacy MongoDB driver.
*/
@Deprecated
DB getLegacyDb();
/**
@@ -100,4 +108,15 @@ public interface MongoDbFactory extends CodecRegistryProvider {
* @since 2.1
*/
MongoDbFactory withSession(ClientSession session);
/**
* Returns whether the given {@link MongoDbFactory} is bound to a {@link ClientSession} that has an
* {@link ClientSession#hasActiveTransaction() active transaction}.
*
* @return {@literal true} if there's an active transaction, {@literal false} otherwise.
* @since 2.1.3
*/
default boolean isTransactionActive() {
return false;
}
}

View File

@@ -0,0 +1,153 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.lang.Nullable;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.support.ResourceHolderSupport;
import com.mongodb.client.ClientSession;
/**
* MongoDB specific {@link ResourceHolderSupport resource holder}, wrapping a {@link ClientSession}.
* {@link MongoTransactionManager} binds instances of this class to the thread.
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see MongoTransactionManager
* @see org.springframework.data.mongodb.core.MongoTemplate
*/
class MongoResourceHolder extends ResourceHolderSupport {
private @Nullable ClientSession session;
private MongoDbFactory dbFactory;
/**
* Create a new {@link MongoResourceHolder} for a given {@link ClientSession session}.
*
* @param session the associated {@link ClientSession}. Can be {@literal null}.
* @param dbFactory the associated {@link MongoDbFactory}. Must not be {@literal null}.
*/
MongoResourceHolder(@Nullable ClientSession session, MongoDbFactory dbFactory) {
this.session = session;
this.dbFactory = dbFactory;
}
/**
* @return the associated {@link ClientSession}. Can be {@literal null}.
*/
@Nullable
ClientSession getSession() {
return session;
}
/**
* @return the required associated {@link ClientSession}.
* @throws IllegalStateException if no {@link ClientSession} is associated with this {@link MongoResourceHolder}.
* @since 2.1.3
*/
ClientSession getRequiredSession() {
ClientSession session = getSession();
if (session == null) {
throw new IllegalStateException("No session available!");
}
return session;
}
/**
* @return the associated {@link MongoDbFactory}.
*/
public MongoDbFactory getDbFactory() {
return dbFactory;
}
/**
* Set the {@link ClientSession} to guard.
*
* @param session can be {@literal null}.
*/
public void setSession(@Nullable ClientSession session) {
this.session = session;
}
/**
* Only set the timeout if it does not match the {@link TransactionDefinition#TIMEOUT_DEFAULT default timeout}.
*
* @param seconds
*/
void setTimeoutIfNotDefaulted(int seconds) {
if (seconds != TransactionDefinition.TIMEOUT_DEFAULT) {
setTimeoutInSeconds(seconds);
}
}
/**
* @return {@literal true} if session is not {@literal null}.
*/
boolean hasSession() {
return session != null;
}
/**
* @return {@literal true} if the session is active and has not been closed.
*/
boolean hasActiveSession() {
if (!hasSession()) {
return false;
}
return hasServerSession() && !getRequiredSession().getServerSession().isClosed();
}
/**
* @return {@literal true} if the session has an active transaction.
* @since 2.1.3
* @see #hasActiveSession()
*/
boolean hasActiveTransaction() {
if (!hasActiveSession()) {
return false;
}
return getRequiredSession().hasActiveTransaction();
}
/**
* @return {@literal true} if the {@link ClientSession} has a {@link com.mongodb.session.ServerSession} associated
* that is accessible via {@link ClientSession#getServerSession()}.
*/
boolean hasServerSession() {
try {
return getRequiredSession().getServerSession() != null;
} catch (IllegalStateException serverSessionClosed) {
// ignore
}
return false;
}
}

View File

@@ -0,0 +1,41 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import com.mongodb.ClientSessionOptions;
import com.mongodb.client.ClientSession;
/**
* A simple interface for obtaining a {@link ClientSession} to be consumed by
* {@link org.springframework.data.mongodb.core.MongoOperations} and MongoDB native operations that support causal
* consistency and transactions.
*
* @author Christoph Strobl
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
*/
@FunctionalInterface
public interface MongoSessionProvider {
/**
* Obtain a {@link ClientSession} with the given options.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @throws org.springframework.dao.DataAccessException
*/
ClientSession getSession(ClientSessionOptions options);
}
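Because the interface is annotated with @FunctionalInterface, a provider that is not a full MongoDbFactory can be expressed as a lambda delegating to the synchronous driver client, as in this hypothetical sketch (the MongoClient instance and the surrounding class are assumptions):

import org.springframework.data.mongodb.MongoSessionProvider;

import com.mongodb.MongoClient;

class SessionProviderSketch {

	// Delegates session creation straight to the driver client.
	MongoSessionProvider sessionProvider(MongoClient client) {
		return options -> client.startSession(options);
	}
}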

View File

@@ -0,0 +1,47 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.lang.Nullable;
/**
* A specific {@link ClientSessionException} related to issues with a transaction, such as aborted or non-existing
* transactions.
*
* @author Christoph Strobl
* @since 2.1
*/
public class MongoTransactionException extends ClientSessionException {
/**
* Constructor for {@link MongoTransactionException}.
*
* @param msg the detail message. Must not be {@literal null}.
*/
public MongoTransactionException(String msg) {
super(msg);
}
/**
* Constructor for {@link ClientSessionException}.
*
* @param msg the detail message. Can be {@literal null}.
* @param cause the root cause. Can be {@literal null}.
*/
public MongoTransactionException(@Nullable String msg, @Nullable Throwable cause) {
super(msg, cause);
}
}

View File

@@ -0,0 +1,526 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.lang.Nullable;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.TransactionException;
import org.springframework.transaction.TransactionSystemException;
import org.springframework.transaction.support.AbstractPlatformTransactionManager;
import org.springframework.transaction.support.DefaultTransactionStatus;
import org.springframework.transaction.support.ResourceTransactionManager;
import org.springframework.transaction.support.SmartTransactionObject;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionSynchronizationUtils;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.MongoException;
import com.mongodb.TransactionOptions;
import com.mongodb.client.ClientSession;
/**
* A {@link org.springframework.transaction.PlatformTransactionManager} implementation that manages
* {@link ClientSession} based transactions for a single {@link MongoDbFactory}.
* <p />
* Binds a {@link ClientSession} from the specified {@link MongoDbFactory} to the thread.
* <p />
* {@link TransactionDefinition#isReadOnly() Readonly} transactions operate on a {@link ClientSession} and enable causal
* consistency, and also {@link ClientSession#startTransaction() start}, {@link ClientSession#commitTransaction()
* commit} or {@link ClientSession#abortTransaction() abort} a transaction.
* <p />
* Application code is required to retrieve the {@link com.mongodb.client.MongoDatabase} via
* {@link MongoDatabaseUtils#getDatabase(MongoDbFactory)} instead of a standard {@link MongoDbFactory#getDb()} call.
* Spring classes such as {@link org.springframework.data.mongodb.core.MongoTemplate} use this strategy implicitly.
* <p />
* By default, failure of a {@literal commit} operation raises a {@link TransactionSystemException}. One may override
* {@link #doCommit(MongoTransactionObject)} to implement the
* <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a>
* behavior as outlined in the MongoDB reference manual.
*
* @author Christoph Strobl
* @author Mark Paluch
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
* @see <a href="https://www.mongodb.com/transactions">MongoDB Transaction Documentation</a>
* @see MongoDatabaseUtils#getDatabase(MongoDbFactory, SessionSynchronization)
*/
public class MongoTransactionManager extends AbstractPlatformTransactionManager
implements ResourceTransactionManager, InitializingBean {
private @Nullable MongoDbFactory dbFactory;
private @Nullable TransactionOptions options;
/**
* Create a new {@link MongoTransactionManager} for bean-style usage.
* <p />
* <strong>Note:</strong> The {@link MongoDbFactory db factory} has to be {@link #setDbFactory(MongoDbFactory) set}
* before using the instance. Use this constructor to prepare a {@link MongoTransactionManager} via a
* {@link org.springframework.beans.factory.BeanFactory}.
* <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
*
* @see #setDbFactory(MongoDbFactory)
* @see #setTransactionSynchronization(int)
*/
public MongoTransactionManager() {}
/**
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDbFactory}.
*
* @param dbFactory must not be {@literal null}.
*/
public MongoTransactionManager(MongoDbFactory dbFactory) {
this(dbFactory, null);
}
/**
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDbFactory} applying the
* given {@link TransactionOptions options}, if present, when starting a new transaction.
*
* @param dbFactory must not be {@literal null}.
* @param options can be {@literal null}.
*/
public MongoTransactionManager(MongoDbFactory dbFactory, @Nullable TransactionOptions options) {
Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory;
this.options = options;
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override
protected Object doGetTransaction() throws TransactionException {
MongoResourceHolder resourceHolder = (MongoResourceHolder) TransactionSynchronizationManager
.getResource(getRequiredDbFactory());
return new MongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(transaction);
MongoResourceHolder resourceHolder = newResourceHolder(definition,
ClientSessionOptions.builder().causallyConsistent(true).build());
mongoTransactionObject.setResourceHolder(resourceHolder);
if (logger.isDebugEnabled()) {
logger
.debug(String.format("About to start transaction for session %s.", debugString(resourceHolder.getSession())));
}
try {
mongoTransactionObject.startTransaction(options);
} catch (MongoException ex) {
throw new TransactionSystemException(String.format("Could not start Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
}
if (logger.isDebugEnabled()) {
logger.debug(String.format("Started transaction for session %s.", debugString(resourceHolder.getSession())));
}
resourceHolder.setSynchronizedWithTransaction(true);
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override
protected Object doSuspend(Object transaction) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(transaction);
mongoTransactionObject.setResourceHolder(null);
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected final void doCommit(DefaultTransactionStatus status) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(status);
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to commit transaction for session %s.",
debugString(mongoTransactionObject.getSession())));
}
try {
doCommit(mongoTransactionObject);
} catch (Exception ex) {
throw new TransactionSystemException(String.format("Could not commit Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
}
}
/**
* Customization hook to perform an actual commit of the given transaction.<br />
* If a commit operation encounters an error, the MongoDB driver throws a {@link MongoException} holding
* {@literal error labels}. <br />
* By default those labels are ignored, nevertheless one might check for
* {@link MongoException#UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL transient commit error labels} and retry the
* commit. <br />
* <code>
* <pre>
* int retries = 3;
* do {
* try {
* transactionObject.commitTransaction();
* break;
* } catch (MongoException ex) {
* if (!ex.hasErrorLabel(MongoException.UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL)) {
* throw ex;
* }
* }
* Thread.sleep(500);
* } while (--retries > 0);
* </pre>
* </code>
*
* @param transactionObject never {@literal null}.
* @throws Exception in case of transaction errors.
*/
protected void doCommit(MongoTransactionObject transactionObject) throws Exception {
transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(status);
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to abort transaction for session %s.",
debugString(mongoTransactionObject.getSession())));
}
try {
mongoTransactionObject.abortTransaction();
} catch (MongoException ex) {
throw new TransactionSystemException(String.format("Could not abort Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
MongoTransactionObject transactionObject = extractMongoTransaction(status);
transactionObject.getRequiredResourceHolder().setRollbackOnly();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override
protected void doCleanupAfterCompletion(Object transaction) {
Assert.isInstanceOf(MongoTransactionObject.class, transaction,
() -> String.format("Expected to find a %s but it turned out to be %s.", MongoTransactionObject.class,
transaction.getClass()));
MongoTransactionObject mongoTransactionObject = (MongoTransactionObject) transaction;
// Remove the connection holder from the thread.
TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
mongoTransactionObject.getRequiredResourceHolder().clear();
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to release Session %s after transaction.",
debugString(mongoTransactionObject.getSession())));
}
mongoTransactionObject.closeSession();
}
/**
* Set the {@link MongoDbFactory} that this instance should manage transactions for.
*
* @param dbFactory must not be {@literal null}.
*/
public void setDbFactory(MongoDbFactory dbFactory) {
Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory;
}
/**
* Set the {@link TransactionOptions} to be applied when starting transactions.
*
* @param options can be {@literal null}.
*/
public void setOptions(@Nullable TransactionOptions options) {
this.options = options;
}
/**
* Get the {@link MongoDbFactory} that this instance manages transactions for.
*
* @return can be {@literal null}.
*/
@Nullable
public MongoDbFactory getDbFactory() {
return dbFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override
public MongoDbFactory getResourceFactory() {
return getRequiredDbFactory();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDbFactory();
}
private MongoResourceHolder newResourceHolder(TransactionDefinition definition, ClientSessionOptions options) {
MongoDbFactory dbFactory = getResourceFactory();
MongoResourceHolder resourceHolder = new MongoResourceHolder(dbFactory.getSession(options), dbFactory);
resourceHolder.setTimeoutIfNotDefaulted(determineTimeout(definition));
return resourceHolder;
}
/**
* @throws IllegalStateException if {@link #dbFactory} is {@literal null}.
*/
private MongoDbFactory getRequiredDbFactory() {
Assert.state(dbFactory != null,
"MongoTransactionManager operates upon a MongoDbFactory. Did you forget to provide one? It's required.");
return dbFactory;
}
private static MongoTransactionObject extractMongoTransaction(Object transaction) {
Assert.isInstanceOf(MongoTransactionObject.class, transaction,
() -> String.format("Expected to find a %s but it turned out to be %s.", MongoTransactionObject.class,
transaction.getClass()));
return (MongoTransactionObject) transaction;
}
private static MongoTransactionObject extractMongoTransaction(DefaultTransactionStatus status) {
Assert.isInstanceOf(MongoTransactionObject.class, status.getTransaction(),
() -> String.format("Expected to find a %s but it turned out to be %s.", MongoTransactionObject.class,
status.getTransaction().getClass()));
return (MongoTransactionObject) status.getTransaction();
}
private static String debugString(@Nullable ClientSession session) {
if (session == null) {
return "null";
}
String debugString = String.format("[%s@%s ", ClassUtils.getShortName(session.getClass()),
Integer.toHexString(session.hashCode()));
try {
if (session.getServerSession() != null) {
debugString += String.format("id = %s, ", session.getServerSession().getIdentifier());
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %d, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a";
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("clusterTime = %s", session.getClusterTime());
}
} catch (RuntimeException e) {
debugString += String.format("error = %s", e.getMessage());
}
debugString += "]";
return debugString;
}
/**
* MongoDB specific transaction object, representing a {@link MongoResourceHolder}. Used as transaction object by
* {@link MongoTransactionManager}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see MongoResourceHolder
*/
protected static class MongoTransactionObject implements SmartTransactionObject {
private @Nullable MongoResourceHolder resourceHolder;
MongoTransactionObject(@Nullable MongoResourceHolder resourceHolder) {
this.resourceHolder = resourceHolder;
}
/**
* Set the {@link MongoResourceHolder}.
*
* @param resourceHolder can be {@literal null}.
*/
void setResourceHolder(@Nullable MongoResourceHolder resourceHolder) {
this.resourceHolder = resourceHolder;
}
/**
* @return {@literal true} if a {@link MongoResourceHolder} is set.
*/
final boolean hasResourceHolder() {
return resourceHolder != null;
}
/**
* Start a MongoDB transaction optionally given {@link TransactionOptions}.
*
* @param options can be {@literal null}
*/
void startTransaction(@Nullable TransactionOptions options) {
ClientSession session = getRequiredSession();
if (options != null) {
session.startTransaction(options);
} else {
session.startTransaction();
}
}
/**
* Commit the transaction.
*/
public void commitTransaction() {
getRequiredSession().commitTransaction();
}
/**
* Rollback (abort) the transaction.
*/
public void abortTransaction() {
getRequiredSession().abortTransaction();
}
/**
* Close a {@link ClientSession} without regard to its transactional state.
*/
void closeSession() {
ClientSession session = getRequiredSession();
if (session.getServerSession() != null && !session.getServerSession().isClosed()) {
session.close();
}
}
@Nullable
public ClientSession getSession() {
return resourceHolder != null ? resourceHolder.getSession() : null;
}
private MongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present. o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null.");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
TransactionSynchronizationUtils.triggerFlush();
}
}
}
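A minimal wiring sketch for the transaction manager above, assuming a MongoDbFactory bean is defined elsewhere (the configuration class itself is illustrative, not part of this change set):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.MongoTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
class MongoTransactionConfig {

	// Binds ClientSession-based transactions of the given factory to the thread.
	@Bean
	MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
		return new MongoTransactionManager(dbFactory);
	}
}

With this bean in place, @Transactional service methods that go through MongoTemplate run their operations inside a ClientSession-based multi-document transaction; TransactionOptions such as read and write concern could additionally be supplied via the two-argument constructor shown above.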

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import reactor.core.publisher.Mono;
@@ -24,8 +23,8 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Interface for factories creating reactive {@link MongoDatabase} instances.
@@ -88,4 +87,16 @@ public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
* @since 2.1
*/
ReactiveMongoDatabaseFactory withSession(ClientSession session);
/**
* Returns whether the given {@link ReactiveMongoDatabaseFactory} is bound to a
* {@link com.mongodb.reactivestreams.client.ClientSession} that has an
* {@link com.mongodb.reactivestreams.client.ClientSession#hasActiveTransaction() active transaction}.
*
* @return {@literal true} if there's an active transaction, {@literal false} otherwise.
* @since 2.2
*/
default boolean isTransactionActive() {
return false;
}
}

View File

@@ -0,0 +1,278 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import reactor.core.publisher.Mono;
import reactor.util.context.Context;
import org.springframework.lang.Nullable;
import org.springframework.transaction.NoTransactionException;
import org.springframework.transaction.reactive.ReactiveResourceSynchronization;
import org.springframework.transaction.reactive.TransactionSynchronization;
import org.springframework.transaction.reactive.TransactionSynchronizationManager;
import org.springframework.transaction.support.ResourceHolderSynchronization;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoDatabase;
/**
* Helper class for managing reactive {@link MongoDatabase} instances via {@link ReactiveMongoDatabaseFactory}. Used for
* obtaining {@link ClientSession session bound} resources, such as {@link MongoDatabase} and {@link MongoCollection}
* suitable for transactional usage.
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
*/
public class ReactiveMongoDatabaseUtils {
/**
* Check if the {@link ReactiveMongoDatabaseFactory} is actually bound to a
* {@link com.mongodb.reactivestreams.client.ClientSession} that has an active transaction, or if a
* {@link org.springframework.transaction.reactive.TransactionSynchronization} has been registered for the
* {@link ReactiveMongoDatabaseFactory resource} and if the associated
* {@link com.mongodb.reactivestreams.client.ClientSession} has an
* {@link com.mongodb.reactivestreams.client.ClientSession#hasActiveTransaction() active transaction}.
*
* @param databaseFactory the resource to check transactions for. Must not be {@literal null}.
* @return a {@link Mono} emitting {@literal true} if the factory has an ongoing transaction.
*/
public static Mono<Boolean> isTransactionActive(ReactiveMongoDatabaseFactory databaseFactory) {
if (databaseFactory.isTransactionActive()) {
return Mono.just(true);
}
return TransactionSynchronizationManager.forCurrentTransaction() //
.map(it -> {
ReactiveMongoResourceHolder holder = (ReactiveMongoResourceHolder) it.getResource(databaseFactory);
return holder != null && holder.hasActiveTransaction();
}) //
.onErrorResume(NoTransactionException.class, e -> Mono.just(false));
}
/**
* Obtain the default {@link MongoDatabase database} from the given {@link ReactiveMongoDatabaseFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link ReactiveMongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static Mono<MongoDatabase> getDatabase(ReactiveMongoDatabaseFactory factory) {
return doGetMongoDatabase(null, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the default {@link MongoDatabase database} from the given {@link ReactiveMongoDatabaseFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link ReactiveMongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static Mono<MongoDatabase> getDatabase(ReactiveMongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(null, factory, sessionSynchronization);
}
/**
* Obtain the {@link MongoDatabase database} with the given name from the given {@link ReactiveMongoDatabaseFactory
* factory} using {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link ReactiveMongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static Mono<MongoDatabase> getDatabase(String dbName, ReactiveMongoDatabaseFactory factory) {
return doGetMongoDatabase(dbName, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the {@link MongoDatabase database} with the given name from the given {@link ReactiveMongoDatabaseFactory
* factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the subscriber
* {@link Context} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link ReactiveMongoDatabaseFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static Mono<MongoDatabase> getDatabase(String dbName, ReactiveMongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(dbName, factory, sessionSynchronization);
}
private static Mono<MongoDatabase> doGetMongoDatabase(@Nullable String dbName, ReactiveMongoDatabaseFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "DatabaseFactory must not be null!");
return TransactionSynchronizationManager.forCurrentTransaction()
.filter(TransactionSynchronizationManager::isSynchronizationActive) //
.flatMap(synchronizationManager -> {
return doGetSession(synchronizationManager, factory, sessionSynchronization) //
.map(it -> getMongoDatabaseOrDefault(dbName, factory.withSession(it)));
})
.onErrorResume(NoTransactionException.class,
e -> Mono.fromSupplier(() -> getMongoDatabaseOrDefault(dbName, factory)))
.defaultIfEmpty(getMongoDatabaseOrDefault(dbName, factory));
}
private static MongoDatabase getMongoDatabaseOrDefault(@Nullable String dbName,
ReactiveMongoDatabaseFactory factory) {
return StringUtils.hasText(dbName) ? factory.getMongoDatabase(dbName) : factory.getMongoDatabase();
}
private static Mono<ClientSession> doGetSession(TransactionSynchronizationManager synchronizationManager,
ReactiveMongoDatabaseFactory dbFactory, SessionSynchronization sessionSynchronization) {
final ReactiveMongoResourceHolder registeredHolder = (ReactiveMongoResourceHolder) synchronizationManager
.getResource(dbFactory);
// check for native MongoDB transaction
if (registeredHolder != null
&& (registeredHolder.hasSession() || registeredHolder.isSynchronizedWithTransaction())) {
return registeredHolder.hasSession() ? Mono.just(registeredHolder.getSession())
: createClientSession(dbFactory).map(registeredHolder::setSessionIfAbsent);
}
if (SessionSynchronization.ON_ACTUAL_TRANSACTION.equals(sessionSynchronization)) {
return Mono.empty();
}
// init a non native MongoDB transaction by registering a MongoSessionSynchronization
return createClientSession(dbFactory).map(session -> {
ReactiveMongoResourceHolder newHolder = new ReactiveMongoResourceHolder(session, dbFactory);
newHolder.getRequiredSession().startTransaction();
synchronizationManager
.registerSynchronization(new MongoSessionSynchronization(synchronizationManager, newHolder, dbFactory));
newHolder.setSynchronizedWithTransaction(true);
synchronizationManager.bindResource(dbFactory, newHolder);
return newHolder.getSession();
});
}
private static Mono<ClientSession> createClientSession(ReactiveMongoDatabaseFactory dbFactory) {
return dbFactory.getSession(ClientSessionOptions.builder().causallyConsistent(true).build());
}
/**
* MongoDB specific {@link ResourceHolderSynchronization} for resource cleanup at the end of a transaction when
* participating in a non-native MongoDB transaction, such as an R2DBC transaction.
*
* @author Mark Paluch
* @since 2.2
*/
private static class MongoSessionSynchronization
extends ReactiveResourceSynchronization<ReactiveMongoResourceHolder, Object> {
private final ReactiveMongoResourceHolder resourceHolder;
MongoSessionSynchronization(TransactionSynchronizationManager synchronizationManager,
ReactiveMongoResourceHolder resourceHolder, ReactiveMongoDatabaseFactory dbFactory) {
super(resourceHolder, dbFactory, synchronizationManager);
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) {
if (isTransactionActive(resourceHolder)) {
return Mono.from(resourceHolder.getRequiredSession().commitTransaction());
}
return Mono.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#afterCompletion(int)
*/
@Override
public Mono<Void> afterCompletion(int status) {
return Mono.defer(() -> {
if (status == TransactionSynchronization.STATUS_ROLLED_BACK && isTransactionActive(this.resourceHolder)) {
return Mono.from(resourceHolder.getRequiredSession().abortTransaction()) //
.then(super.afterCompletion(status));
}
return super.afterCompletion(status);
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) {
return Mono.fromRunnable(() -> {
if (resourceHolder.hasActiveSession()) {
resourceHolder.getRequiredSession().close();
}
});
}
private boolean isTransactionActive(ReactiveMongoResourceHolder resourceHolder) {
if (!resourceHolder.hasSession()) {
return false;
}
return resourceHolder.getRequiredSession().hasActiveTransaction();
}
}
}
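A minimal usage sketch for the helper above, assuming a caller that works directly against the reactive driver; the class name, collection name and wiring are illustrative and not part of this change set. ReactiveMongoDatabaseUtils.getDatabase(..) resolves the session-bound database when a transaction is active and otherwise falls back to the plain database obtained from the factory.

import org.bson.Document;

import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoDatabaseUtils;
import org.springframework.data.mongodb.SessionSynchronization;

import reactor.core.publisher.Mono;

// Hypothetical caller, for illustration only.
class OrderDocumentWriter {

	private final ReactiveMongoDatabaseFactory factory;

	OrderDocumentWriter(ReactiveMongoDatabaseFactory factory) {
		this.factory = factory;
	}

	Mono<Void> insert(Document order) {
		// Resolve the transaction-bound database if a ClientSession is registered
		// for the subscriber context, otherwise use the factory's default database.
		return ReactiveMongoDatabaseUtils.getDatabase(factory, SessionSynchronization.ON_ACTUAL_TRANSACTION)
				.flatMap(db -> Mono.from(db.getCollection("orders").insertOne(order)))
				.then();
	}
}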

View File

@@ -0,0 +1,155 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.lang.Nullable;
import org.springframework.transaction.support.ResourceHolderSupport;
import com.mongodb.reactivestreams.client.ClientSession;
/**
* MongoDB specific resource holder, wrapping a {@link ClientSession}. {@link ReactiveMongoTransactionManager} binds
* instances of this class to the subscriber context.
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.2
* @see ReactiveMongoTransactionManager
* @see ReactiveMongoTemplate
*/
class ReactiveMongoResourceHolder extends ResourceHolderSupport {
private @Nullable ClientSession session;
private ReactiveMongoDatabaseFactory databaseFactory;
/**
* Create a new {@link ReactiveMongoResourceHolder} for a given {@link ClientSession session}.
*
* @param session the associated {@link ClientSession}. Can be {@literal null}.
* @param databaseFactory the associated {@link ReactiveMongoDatabaseFactory}. Must not be {@literal null}.
*/
ReactiveMongoResourceHolder(@Nullable ClientSession session, ReactiveMongoDatabaseFactory databaseFactory) {
this.session = session;
this.databaseFactory = databaseFactory;
}
/**
* @return the associated {@link ClientSession}. Can be {@literal null}.
*/
@Nullable
ClientSession getSession() {
return session;
}
/**
* @return the required associated {@link ClientSession}.
* @throws IllegalStateException if no session is associated.
*/
ClientSession getRequiredSession() {
ClientSession session = getSession();
if (session == null) {
throw new IllegalStateException("No ClientSession associated");
}
return session;
}
/**
* @return the associated {@link ReactiveMongoDatabaseFactory}.
*/
public ReactiveMongoDatabaseFactory getDatabaseFactory() {
return databaseFactory;
}
/**
* Set the {@link ClientSession} to guard.
*
* @param session can be {@literal null}.
*/
public void setSession(@Nullable ClientSession session) {
this.session = session;
}
/**
* @return {@literal true} if session is not {@literal null}.
*/
boolean hasSession() {
return session != null;
}
/**
* If the {@link ReactiveMongoResourceHolder} is {@link #hasSession() not already associated} with a
* {@link ClientSession} the given value is {@link #setSession(ClientSession) set}. The given {@code session} is
* returned in any case.
*
* @param session the session to associate if none is bound yet. Can be {@literal null}.
* @return the given {@code session}.
*/
@Nullable
public ClientSession setSessionIfAbsent(@Nullable ClientSession session) {
if (!hasSession()) {
setSession(session);
}
return session;
}
/**
* @return {@literal true} if the session is active and has not been closed.
*/
boolean hasActiveSession() {
if (!hasSession()) {
return false;
}
return hasServerSession() && !getRequiredSession().getServerSession().isClosed();
}
/**
* @return {@literal true} if the session has an active transaction.
* @see #hasActiveSession()
*/
boolean hasActiveTransaction() {
if (!hasActiveSession()) {
return false;
}
return getRequiredSession().hasActiveTransaction();
}
/**
* @return {@literal true} if the {@link ClientSession} has a {@link com.mongodb.session.ServerSession} associated
* that is accessible via {@link ClientSession#getServerSession()}.
*/
boolean hasServerSession() {
try {
return getRequiredSession().getServerSession() != null;
} catch (IllegalStateException serverSessionClosed) {
// ignore
}
return false;
}
}

View File

@@ -0,0 +1,530 @@
/*
* Copyright 2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import reactor.core.publisher.Mono;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.lang.Nullable;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.TransactionException;
import org.springframework.transaction.TransactionSystemException;
import org.springframework.transaction.reactive.AbstractReactiveTransactionManager;
import org.springframework.transaction.reactive.GenericReactiveTransaction;
import org.springframework.transaction.reactive.TransactionSynchronizationManager;
import org.springframework.transaction.support.SmartTransactionObject;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.MongoException;
import com.mongodb.TransactionOptions;
import com.mongodb.reactivestreams.client.ClientSession;
/**
* A {@link org.springframework.transaction.ReactiveTransactionManager} implementation that manages
* {@link com.mongodb.reactivestreams.client.ClientSession} based transactions for a single
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory}.
* <p />
* Binds a {@link ClientSession} from the specified
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory} to the subscriber
* {@link reactor.util.context.Context}.
* <p />
* {@link org.springframework.transaction.TransactionDefinition#isReadOnly() Readonly} transactions operate on a
* {@link ClientSession} and enable causal consistency, and also {@link ClientSession#startTransaction() start},
* {@link com.mongodb.reactivestreams.client.ClientSession#commitTransaction() commit} or
* {@link ClientSession#abortTransaction() abort} a transaction.
* <p />
* Application code is required to retrieve the {@link com.mongodb.reactivestreams.client.MongoDatabase} via
* {@link org.springframework.data.mongodb.ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory)} instead
* of a standard {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase()} call. Spring
* classes such as {@link org.springframework.data.mongodb.core.ReactiveMongoTemplate} use this strategy implicitly.
* <p />
* By default failure of a {@literal commit} operation raises a {@link TransactionSystemException}. You can override
* {@link #doCommit(TransactionSynchronizationManager, ReactiveMongoTransactionObject)} to implement the
* <a href="https://docs.mongodb.com/manual/core/transactions/#retry-commit-operation">Retry Commit Operation</a>
* behavior as outlined in the MongoDB reference manual.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
* @see <a href="https://www.mongodb.com/transactions">MongoDB Transaction Documentation</a>
* @see ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory, SessionSynchronization)
*/
public class ReactiveMongoTransactionManager extends AbstractReactiveTransactionManager implements InitializingBean {
private @Nullable ReactiveMongoDatabaseFactory databaseFactory;
private @Nullable TransactionOptions options;
/**
* Create a new {@link ReactiveMongoTransactionManager} for bean-style usage.
* <p />
* <strong>Note:</strong> The {@link org.springframework.data.mongodb.ReactiveMongoDatabaseFactory db factory} has to
* be {@link #setDatabaseFactory(ReactiveMongoDatabaseFactory) set} before using the instance. Use this constructor
* to prepare a {@link ReactiveMongoTransactionManager} via a {@link org.springframework.beans.factory.BeanFactory}.
* <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
*
* @see #setDatabaseFactory(ReactiveMongoDatabaseFactory)
*/
public ReactiveMongoTransactionManager() {}
/**
* Create a new {@link ReactiveMongoTransactionManager} obtaining sessions from the given
* {@link ReactiveMongoDatabaseFactory}.
*
* @param databaseFactory must not be {@literal null}.
*/
public ReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory) {
this(databaseFactory, null);
}
/**
* Create a new {@link ReactiveMongoTransactionManager} obtaining sessions from the given
* {@link ReactiveMongoDatabaseFactory} applying the given {@link TransactionOptions options}, if present, when
* starting a new transaction.
*
* @param databaseFactory must not be {@literal null}.
* @param options can be {@literal null}.
*/
public ReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory,
@Nullable TransactionOptions options) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
this.databaseFactory = databaseFactory;
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doGetTransaction(org.springframework.transaction.reactive.TransactionSynchronizationManager)
*/
@Override
protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager)
throws TransactionException {
ReactiveMongoResourceHolder resourceHolder = (ReactiveMongoResourceHolder) synchronizationManager
.getResource(getRequiredDatabaseFactory());
return new ReactiveMongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doBegin(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction,
TransactionDefinition definition) throws TransactionException {
return Mono.defer(() -> {
ReactiveMongoTransactionObject mongoTransactionObject = extractMongoTransaction(transaction);
Mono<ReactiveMongoResourceHolder> holder = newResourceHolder(definition,
ClientSessionOptions.builder().causallyConsistent(true).build());
return holder.doOnNext(resourceHolder -> {
mongoTransactionObject.setResourceHolder(resourceHolder);
if (logger.isDebugEnabled()) {
logger.debug(
String.format("About to start transaction for session %s.", debugString(resourceHolder.getSession())));
}
}).doOnNext(resourceHolder -> {
mongoTransactionObject.startTransaction(options);
if (logger.isDebugEnabled()) {
logger.debug(String.format("Started transaction for session %s.", debugString(resourceHolder.getSession())));
}
})//
.onErrorMap(
ex -> new TransactionSystemException(String.format("Could not start Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex))
.doOnSuccess(resourceHolder -> {
synchronizationManager.bindResource(getRequiredDatabaseFactory(), resourceHolder);
}).then();
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSuspend(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction)
throws TransactionException {
return Mono.fromSupplier(() -> {
ReactiveMongoTransactionObject mongoTransactionObject = extractMongoTransaction(transaction);
mongoTransactionObject.setResourceHolder(null);
return synchronizationManager.unbindResource(getRequiredDatabaseFactory());
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doResume(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> doResume(TransactionSynchronizationManager synchronizationManager, @Nullable Object transaction,
Object suspendedResources) {
return Mono
.fromRunnable(() -> synchronizationManager.bindResource(getRequiredDatabaseFactory(), suspendedResources));
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCommit(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected final Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
return Mono.defer(() -> {
ReactiveMongoTransactionObject mongoTransactionObject = extractMongoTransaction(status);
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to commit transaction for session %s.",
debugString(mongoTransactionObject.getSession())));
}
return doCommit(synchronizationManager, mongoTransactionObject).onErrorMap(ex -> {
return new TransactionSystemException(String.format("Could not commit Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
});
});
}
/**
* Customization hook to perform an actual commit of the given transaction.<br />
* If a commit operation encounters an error, the MongoDB driver throws a {@link MongoException} holding
* {@literal error labels}. <br />
* By default those labels are ignored; nevertheless one might check for
* {@link MongoException#UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL transient commit error labels} and retry the
* commit.
*
* @param synchronizationManager reactive synchronization manager.
* @param transactionObject never {@literal null}.
*/
protected Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
ReactiveMongoTransactionObject transactionObject) {
return transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doRollback(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doRollback(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) {
return Mono.defer(() -> {
ReactiveMongoTransactionObject mongoTransactionObject = extractMongoTransaction(status);
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to abort transaction for session %s.",
debugString(mongoTransactionObject.getSession())));
}
return mongoTransactionObject.abortTransaction().onErrorResume(MongoException.class, ex -> {
return Mono
.error(new TransactionSystemException(String.format("Could not abort Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex));
});
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSetRollbackOnly(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doSetRollbackOnly(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
return Mono.fromRunnable(() -> {
ReactiveMongoTransactionObject transactionObject = extractMongoTransaction(status);
transactionObject.getRequiredResourceHolder().setRollbackOnly();
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCleanupAfterCompletion(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager synchronizationManager,
Object transaction) {
Assert.isInstanceOf(ReactiveMongoTransactionObject.class, transaction,
() -> String.format("Expected to find a %s but it turned out to be %s.", ReactiveMongoTransactionObject.class,
transaction.getClass()));
return Mono.fromRunnable(() -> {
ReactiveMongoTransactionObject mongoTransactionObject = (ReactiveMongoTransactionObject) transaction;
// Remove the connection holder from the thread.
synchronizationManager.unbindResource(getRequiredDatabaseFactory());
mongoTransactionObject.getRequiredResourceHolder().clear();
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to release Session %s after transaction.",
debugString(mongoTransactionObject.getSession())));
}
mongoTransactionObject.closeSession();
});
}
/**
* Set the {@link ReactiveMongoDatabaseFactory} that this instance should manage transactions for.
*
* @param databaseFactory must not be {@literal null}.
*/
public void setDatabaseFactory(ReactiveMongoDatabaseFactory databaseFactory) {
Assert.notNull(databaseFactory, "DatabaseFactory must not be null!");
this.databaseFactory = databaseFactory;
}
/**
* Set the {@link TransactionOptions} to be applied when starting transactions.
*
* @param options can be {@literal null}.
*/
public void setOptions(@Nullable TransactionOptions options) {
this.options = options;
}
/**
* Get the {@link ReactiveMongoDatabaseFactory} that this instance manages transactions for.
*
* @return can be {@literal null}.
*/
@Nullable
public ReactiveMongoDatabaseFactory getDatabaseFactory() {
return databaseFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDatabaseFactory();
}
private Mono<ReactiveMongoResourceHolder> newResourceHolder(TransactionDefinition definition,
ClientSessionOptions options) {
ReactiveMongoDatabaseFactory dbFactory = getRequiredDatabaseFactory();
return dbFactory.getSession(options).map(session -> new ReactiveMongoResourceHolder(session, dbFactory));
}
/**
* @throws IllegalStateException if {@link #databaseFactory} is {@literal null}.
*/
private ReactiveMongoDatabaseFactory getRequiredDatabaseFactory() {
Assert.state(databaseFactory != null,
"ReactiveMongoTransactionManager operates upon a ReactiveMongoDatabaseFactory. Did you forget to provide one? It's required.");
return databaseFactory;
}
private static ReactiveMongoTransactionObject extractMongoTransaction(Object transaction) {
Assert.isInstanceOf(ReactiveMongoTransactionObject.class, transaction,
() -> String.format("Expected to find a %s but it turned out to be %s.", ReactiveMongoTransactionObject.class,
transaction.getClass()));
return (ReactiveMongoTransactionObject) transaction;
}
private static ReactiveMongoTransactionObject extractMongoTransaction(GenericReactiveTransaction status) {
Assert.isInstanceOf(ReactiveMongoTransactionObject.class, status.getTransaction(),
() -> String.format("Expected to find a %s but it turned out to be %s.", ReactiveMongoTransactionObject.class,
status.getTransaction().getClass()));
return (ReactiveMongoTransactionObject) status.getTransaction();
}
private static String debugString(@Nullable ClientSession session) {
if (session == null) {
return "null";
}
String debugString = String.format("[%s@%s ", ClassUtils.getShortName(session.getClass()),
Integer.toHexString(session.hashCode()));
try {
if (session.getServerSession() != null) {
debugString += String.format("id = %s, ", session.getServerSession().getIdentifier());
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %d, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a";
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("clusterTime = %s", session.getClusterTime());
}
} catch (RuntimeException e) {
debugString += String.format("error = %s", e.getMessage());
}
debugString += "]";
return debugString;
}
/**
* MongoDB specific transaction object, representing a {@link ReactiveMongoResourceHolder}. Used as transaction object by
* {@link ReactiveMongoTransactionManager}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
* @see ReactiveMongoResourceHolder
*/
protected static class ReactiveMongoTransactionObject implements SmartTransactionObject {
private @Nullable ReactiveMongoResourceHolder resourceHolder;
ReactiveMongoTransactionObject(@Nullable ReactiveMongoResourceHolder resourceHolder) {
this.resourceHolder = resourceHolder;
}
/**
* Set the {@link ReactiveMongoResourceHolder}.
*
* @param resourceHolder can be {@literal null}.
*/
void setResourceHolder(@Nullable ReactiveMongoResourceHolder resourceHolder) {
this.resourceHolder = resourceHolder;
}
/**
* @return {@literal true} if a {@link ReactiveMongoResourceHolder} is set.
*/
final boolean hasResourceHolder() {
return resourceHolder != null;
}
/**
* Start a MongoDB transaction optionally given {@link TransactionOptions}.
*
* @param options can be {@literal null}
*/
void startTransaction(@Nullable TransactionOptions options) {
ClientSession session = getRequiredSession();
if (options != null) {
session.startTransaction(options);
} else {
session.startTransaction();
}
}
/**
* Commit the transaction.
*/
public Mono<Void> commitTransaction() {
return Mono.from(getRequiredSession().commitTransaction());
}
/**
* Rollback (abort) the transaction.
*/
public Mono<Void> abortTransaction() {
return Mono.from(getRequiredSession().abortTransaction());
}
/**
* Close a {@link ClientSession} without regard to its transactional state.
*/
void closeSession() {
ClientSession session = getRequiredSession();
if (session.getServerSession() != null && !session.getServerSession().isClosed()) {
session.close();
}
}
@Nullable
public ClientSession getSession() {
return resourceHolder != null ? resourceHolder.getSession() : null;
}
private ReactiveMongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "ReactiveMongoResourceHolder is required but not present. o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null.");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
throw new UnsupportedOperationException("flush() not supported");
}
}
}
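The protected doCommit(TransactionSynchronizationManager, ReactiveMongoTransactionObject) hook above is the extension point the class javadoc mentions for the retry-commit behaviour. A minimal sketch, assuming a single immediate retry is acceptable; the subclass name is made up for illustration and no back-off or further retry limit is implemented.

import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoTransactionManager;
import org.springframework.transaction.reactive.TransactionSynchronizationManager;

import com.mongodb.MongoException;

import reactor.core.publisher.Mono;

// Hypothetical subclass, for illustration only.
class RetryingReactiveMongoTransactionManager extends ReactiveMongoTransactionManager {

	RetryingReactiveMongoTransactionManager(ReactiveMongoDatabaseFactory databaseFactory) {
		super(databaseFactory);
	}

	@Override
	protected Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
			ReactiveMongoTransactionObject transactionObject) {

		return super.doCommit(synchronizationManager, transactionObject)
				.onErrorResume(MongoException.class, ex -> {
					// Retry the commit once when the driver reports an unknown commit outcome.
					if (ex.hasErrorLabel(MongoException.UNKNOWN_TRANSACTION_COMMIT_RESULT_LABEL)) {
						return super.doCommit(synchronizationManager, transactionObject);
					}
					return Mono.error(ex);
				});
	}
}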

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2018 the original author or authors.
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -57,6 +57,7 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
private final Class<?> targetType;
private final Class<?> collectionType;
private final Class<?> databaseType;
private final Class<? extends ClientSession> sessionType;
/**
* Create a new SessionAwareMethodInterceptor for given target.
@@ -71,12 +72,13 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
* {@code MongoCollection}.
* @param <T> target object type.
*/
public <T> SessionAwareMethodInterceptor(ClientSession session, T target, Class<D> databaseType,
ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
public <T> SessionAwareMethodInterceptor(ClientSession session, T target, Class<? extends ClientSession> sessionType,
Class<D> databaseType, ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
ClientSessionOperator<C> collectionDecorator) {
Assert.notNull(session, "ClientSession must not be null!");
Assert.notNull(target, "Target must not be null!");
Assert.notNull(sessionType, "SessionType must not be null!");
Assert.notNull(databaseType, "Database type must not be null!");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null!");
Assert.notNull(collectionType, "Collection type must not be null!");
@@ -90,6 +92,7 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.databaseDecorator = databaseDecorator;
this.targetType = ClassUtils.isAssignable(databaseType, target.getClass()) ? databaseType : collectionType;
this.sessionType = sessionType;
}
/*
@@ -114,7 +117,7 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
return methodInvocation.proceed();
}
Optional<Method> targetMethod = METHOD_CACHE.lookup(methodInvocation.getMethod(), targetType);
Optional<Method> targetMethod = METHOD_CACHE.lookup(methodInvocation.getMethod(), targetType, sessionType);
return !targetMethod.isPresent() ? methodInvocation.proceed()
: ReflectionUtils.invokeMethod(targetMethod.get(), target,
@@ -171,18 +174,19 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
* @param targetClass
* @return
*/
Optional<Method> lookup(Method method, Class<?> targetClass) {
Optional<Method> lookup(Method method, Class<?> targetClass, Class<? extends ClientSession> sessionType) {
return cache.computeIfAbsent(new MethodClassKey(method, targetClass),
val -> Optional.ofNullable(findTargetWithSession(method, targetClass)));
val -> Optional.ofNullable(findTargetWithSession(method, targetClass, sessionType)));
}
@Nullable
private Method findTargetWithSession(Method sourceMethod, Class<?> targetType) {
private Method findTargetWithSession(Method sourceMethod, Class<?> targetType,
Class<? extends ClientSession> sessionType) {
Class<?>[] argTypes = sourceMethod.getParameterTypes();
Class<?>[] args = new Class<?>[argTypes.length + 1];
args[0] = ClientSession.class;
args[0] = sessionType;
System.arraycopy(argTypes, 0, args, 1, argTypes.length);
return ReflectionUtils.findMethod(targetType, sourceMethod.getName(), args);

View File

@@ -0,0 +1,38 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
/**
* {@link SessionSynchronization} is used along with {@link org.springframework.data.mongodb.core.MongoTemplate} to
* define in which type of transactions to participate if any.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
public enum SessionSynchronization {
/**
* Synchronize with any transaction, even empty ones, and initiate a MongoDB transaction when doing so by
* registering a MongoDB-specific {@link org.springframework.transaction.support.ResourceHolderSynchronization}.
*/
ALWAYS,
/**
* Synchronize with native MongoDB transactions initiated via {@link MongoTransactionManager}.
*/
ON_ACTUAL_TRANSACTION;
}
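A short sketch of opting into ALWAYS, assuming a template that exposes setSessionSynchronization(..) as MongoTemplate does; the configuration class name is illustrative.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.SessionSynchronization;
import org.springframework.data.mongodb.core.MongoTemplate;

// Hypothetical configuration, for illustration only.
@Configuration
class TransactionParticipationConfiguration {

	@Bean
	MongoTemplate mongoTemplate(MongoDbFactory factory) {
		MongoTemplate template = new MongoTemplate(factory);
		// ALWAYS joins any ongoing transaction synchronization, not only native MongoDB transactions.
		template.setSessionSynchronization(SessionSynchronization.ALWAYS);
		return template;
	}
}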

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,112 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.lang.Nullable;
import com.mongodb.client.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig with {@link com.mongodb.client.MongoClient}.
*
* @author Christoph Strobl
* @since 2.1
* @see MongoConfigurationSupport
* @see AbstractMongoConfiguration
*/
@Configuration
public abstract class AbstractMongoClientConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
*/
public abstract MongoClient mongoClient();
/**
* Creates a {@link MongoTemplate}.
*
* @return
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoClientDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate()
* @return
*/
@Bean
public MongoDbFactory mongoDbFactory() {
return new SimpleMongoClientDbFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class (the concrete class, not this one) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoClientConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(mongoDbFactory());
return converter;
}
}
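A minimal sketch of a concrete configuration extending the class above; the connection string and database name are placeholders.

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

// Hypothetical application configuration, for illustration only.
@Configuration
class ApplicationMongoConfiguration extends AbstractMongoClientConfiguration {

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create("mongodb://localhost:27017");
	}

	@Override
	protected String getDatabaseName() {
		return "example-db";
	}
}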

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -29,7 +29,10 @@ import org.springframework.lang.Nullable;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
* Base class for Spring Data MongoDB configuration using JavaConfig with {@link com.mongodb.MongoClient}.
* <p />
* <strong>INFO:</strong> In case you want to use {@link com.mongodb.client.MongoClients} for configuration, please refer
* to {@link AbstractMongoClientConfiguration}.
*
* @author Mark Pollack
* @author Oliver Gierke
@@ -38,10 +41,10 @@ import com.mongodb.MongoClient;
* @author Christoph Strobl
* @author Mark Paluch
* @see MongoConfigurationSupport
* @see AbstractMongoClientConfiguration
*/
@Configuration
public abstract class
AbstractMongoConfiguration extends MongoConfigurationSupport {
public abstract class AbstractMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
@@ -63,7 +66,7 @@ AbstractMongoConfiguration extends MongoConfigurationSupport {
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongo()}.
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate()
@@ -108,7 +111,9 @@ AbstractMongoConfiguration extends MongoConfigurationSupport {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(mongoDbFactory());
return converter;
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -22,6 +22,7 @@ import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.NoOpDbRefResolver;
import com.mongodb.reactivestreams.client.MongoClient;
@@ -80,9 +81,9 @@ public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurat
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(ReactiveMongoTemplate.NO_OP_REF_RESOLVER,
mongoMappingContext());
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mongoMappingContext());
converter.setCustomConversions(customConversions());
converter.setCodecRegistryProvider(reactiveMongoDbFactory());
return converter;
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -51,7 +51,6 @@ import org.springframework.core.type.filter.AssignableTypeFilter;
import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
@@ -60,6 +59,7 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCre
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -75,6 +75,7 @@ import org.w3c.dom.Element;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author Zied Yaich
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -100,8 +101,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition, id);
createIsNewStrategyFactoryBeanDefinition(ctxRef, parserContext, element);
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {
@@ -159,6 +158,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
@Nullable
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
String disableValidation = element.getAttribute("disable-validation");
@@ -180,6 +180,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
@Nullable
private RuntimeBeanReference getValidator(Object source, ParserContext parserContext) {
if (!JSR_303_PRESENT) {
@@ -197,7 +198,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
}
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition, String converterId) {
@Nullable BeanDefinition conversionsDefinition, @Nullable String converterId) {
String ctxRef = element.getAttribute("mapping-context-ref");
@@ -211,7 +212,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element);
Set<String> classesToAdd = getInitialEntityClasses(element);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
@@ -262,6 +263,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
}
}
@Nullable
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
@@ -269,7 +271,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
if (customConvertersElements.size() == 1) {
Element customerConvertersElement = customConvertersElements.get(0);
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<BeanMetadataElement>();
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<>();
List<Element> converterElements = DomUtils.getChildElementsByTagName(customerConvertersElement, "converter");
if (converterElements != null) {
@@ -285,9 +287,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
provider.addExcludeFilter(new NegatingFilter(new AssignableTypeFilter(Converter.class),
new AssignableTypeFilter(GenericConverter.class)));
for (BeanDefinition candidate : provider.findCandidateComponents(packageToScan)) {
converterBeans.add(candidate);
}
converterBeans.addAll(provider.findCandidateComponents(packageToScan));
}
BeanDefinitionBuilder conversionsBuilder = BeanDefinitionBuilder.rootBeanDefinition(MongoCustomConversions.class);
@@ -304,7 +304,8 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
private static Set<String> getInititalEntityClasses(Element element) {
@Nullable
private static Set<String> getInitialEntityClasses(Element element) {
String basePackage = element.getAttribute(BASE_PACKAGE);
@@ -317,7 +318,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
Set<String> classes = new ManagedSet<String>();
Set<String> classes = new ManagedSet<>();
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
classes.add(candidate.getBeanClassName());
}
@@ -325,6 +326,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return classes;
}
@Nullable
public BeanMetadataElement parseConverter(Element element, ParserContext parserContext) {
String converterRef = element.getAttribute("ref");
@@ -343,20 +345,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
public static String createIsNewStrategyFactoryBeanDefinition(String mappingContextRef, ParserContext context,
Element element) {
BeanDefinitionBuilder mappingContextStrategyFactoryBuilder = BeanDefinitionBuilder
.rootBeanDefinition(MappingContextIsNewStrategyFactory.class);
mappingContextStrategyFactoryBuilder.addConstructorArgReference(mappingContextRef);
BeanComponentDefinitionBuilder builder = new BeanComponentDefinitionBuilder(element, context);
context.registerBeanComponent(
builder.getComponent(mappingContextStrategyFactoryBuilder, IS_NEW_STRATEGY_FACTORY_BEAN_NAME));
return IS_NEW_STRATEGY_FACTORY_BEAN_NAME;
}
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*
@@ -375,7 +363,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
Assert.notNull(filters, "TypeFilters must not be null");
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
this.delegates = new HashSet<>(Arrays.asList(filters));
}
/*

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2012-2018 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
@@ -26,25 +27,33 @@ import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.auditing.config.IsNewAwareAuditingHandlerBeanDefinitionParser;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to register a {@link AuditingEventListener} to transparently set auditing information on
* an entity.
* {@link BeanDefinitionParser} to register a {@link AuditingEntityCallback} to transparently set auditing information
* on an entity.
*
* @author Oliver Gierke
* @author Mark Paluch
*/
public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEventListener.class;
return AuditingEntityCallback.class;
}
/*
@@ -80,7 +89,24 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
mappingContextRef);
parser.parse(element, parserContext);
builder.addConstructorArgValue(getObjectFactoryBeanDefinition(parser.getResolvedBeanName(),
parserContext.extractSource(element)));
AbstractBeanDefinition isNewAwareAuditingHandler = getObjectFactoryBeanDefinition(parser.getResolvedBeanName(),
parserContext.extractSource(element));
builder.addConstructorArgValue(isNewAwareAuditingHandler);
if (PROJECT_REACTOR_AVAILABLE) {
registerReactiveAuditingEntityCallback(parserContext.getRegistry(), isNewAwareAuditingHandler,
parserContext.extractSource(element));
}
}
private void registerReactiveAuditingEntityCallback(BeanDefinitionRegistry registry,
AbstractBeanDefinition isNewAwareAuditingHandler, @Nullable Object source) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(isNewAwareAuditingHandler);
builder.getRawBeanDefinition().setSource(source);
registry.registerBeanDefinition(ReactiveAuditingEntityCallback.class.getName(), builder.getBeanDefinition());
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -32,17 +32,23 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
@@ -104,12 +110,27 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
.rootBeanDefinition(AuditingEntityCallback.class);
listenerBeanDefinitionBuilder
.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
AuditingEntityCallback.class.getName(), registry);
if (PROJECT_REACTOR_AVAILABLE) {
registerReactiveAuditingEntityCallback(registry, auditingHandlerDefinition.getSource());
}
}
private void registerReactiveAuditingEntityCallback(BeanDefinitionRegistry registry, Object source) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(source);
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
/**

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
@@ -28,17 +27,12 @@ import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -87,23 +81,11 @@ public abstract class MongoConfigurationSupport {
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
mappingContext.setAutoIndexCreation(autoIndexCreation());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(
new PersistentEntities(Arrays.<MappingContext<?, ?>> asList(new MappingContext[] { mongoMappingContext() }))));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
@@ -190,4 +172,16 @@ public abstract class MongoConfigurationSupport {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
/**
* Configure whether to automatically create indices for domain types by deriving the
* {@link org.springframework.data.mongodb.core.index.IndexDefinition} from the entity or not.
*
* @return {@literal true} by default. <br />
* <strong>INFO</strong>: As of 3.x the default will be set to {@literal false}.
* @since 2.2
*/
protected boolean autoIndexCreation() {
return true;
}
}
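A minimal configuration sketch showing how an application could already opt out of derived index creation ahead of the 3.x default change; the configuration class, connection string, and database name are assumptions.
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

@Configuration
class ExampleMongoConfiguration extends AbstractMongoClientConfiguration {

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create("mongodb://localhost:27017"); // assumed connection string
	}

	@Override
	protected String getDatabaseName() {
		return "example"; // assumed database name
	}

	@Override
	protected boolean autoIndexCreation() {
		return false; // switch off automatic index creation explicitly
	}
}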

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.io.UnsupportedEncodingException;
import java.lang.reflect.Method;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.Arrays;
@@ -26,6 +27,7 @@ import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.lang.Nullable;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
@@ -35,6 +37,8 @@ import com.mongodb.MongoCredential;
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Stephen Tyler Conrad
* @author Mark Paluch
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
@@ -76,12 +80,23 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
} else if ("MONGODB-CR".equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
Method createCRCredentialMethod = ReflectionUtils.findMethod(MongoCredential.class,
"createMongoCRCredential", String.class, String.class, char[].class);
if (createCRCredentialMethod == null) {
throw new IllegalArgumentException("MONGODB-CR is no longer supported.");
}
MongoCredential credential = MongoCredential.class
.cast(ReflectionUtils.invokeMethod(createCRCredentialMethod, null, userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
credentials.add(credential);
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
@@ -98,6 +113,12 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_256_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha256Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
@@ -164,7 +185,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static Properties extractOptions(String text) {
int optionsSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
int dbSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
int dbSeparationIndex = text.lastIndexOf(DATABASE_DELIMITER);
if (optionsSeparationIndex == -1 || dbSeparationIndex > optionsSeparationIndex) {
return new Properties();
@@ -173,7 +194,13 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
Properties properties = new Properties();
for (String option : text.substring(optionsSeparationIndex + 1).split(OPTION_VALUE_DELIMITER)) {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
}
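For reference, the new SCRAM-SHA-256 branch boils down to the following driver call; the user, database, and password values are assumptions for illustration.
MongoCredential credential = MongoCredential.createScramSha256Credential(
		"jon", "stark", "winter-is-coming".toCharArray());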

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -35,6 +35,7 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
@@ -92,6 +93,7 @@ abstract class MongoParsingUtils {
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "encryption-settings-ref", "autoEncryptionSettings");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2012-2018 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,170 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.CountOperation;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
* create type-bound {@link AggregationOperationContext}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the context itself if
* it is not {@literal null}, creating a {@link TypeBasedAggregationOperationContext} if the aggregation contains type
* information (is a {@link TypedAggregation}), or falling back to {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation,
@Nullable AggregationOperationContext context) {
if (context != null) {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
}
/**
* Extract and map the aggregation pipeline into a {@link List} of {@link Document}.
*
* @param aggregation
* @param context
* @return
*/
List<Document> createPipeline(Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toPipeline(context);
}
return mapAggregationPipeline(aggregation.toPipeline(context));
}
/**
* Extract the command and map the aggregation pipeline.
*
* @param aggregation
* @param context
* @return
*/
Document createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
Document command = aggregation.toDocument(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
/**
* Create a {@code $count} aggregation for {@link Query} and optionally a {@link Class entity class}.
*
* @param query must not be {@literal null}.
* @param entityClass can be {@literal null} if the {@link Query} object is empty.
* @return the {@link Aggregation} pipeline definition to run a {@code $count} aggregation.
*/
Aggregation createCountAggregation(Query query, @Nullable Class<?> entityClass) {
List<AggregationOperation> pipeline = computeCountAggregationPipeline(query, entityClass);
Aggregation aggregation = entityClass != null ? Aggregation.newAggregation(entityClass, pipeline)
: Aggregation.newAggregation(pipeline);
aggregation.withOptions(AggregationOptions.builder().collation(query.getCollation().orElse(null)).build());
return aggregation;
}
private List<AggregationOperation> computeCountAggregationPipeline(Query query, @Nullable Class<?> entityType) {
CountOperation count = Aggregation.count().as("totalEntityCount");
if (query.getQueryObject().isEmpty()) {
return Collections.singletonList(count);
}
Assert.notNull(entityType, "Entity type must not be null!");
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(),
mappingContext.getPersistentEntity(entityType));
CriteriaDefinition criteria = new CriteriaDefinition() {
@Override
public Document getCriteriaObject() {
return mappedQuery;
}
@Nullable
@Override
public String getKey() {
return null;
}
};
return Arrays.asList(Aggregation.match(criteria), count);
}
private List<Document> mapAggregationPipeline(List<Document> pipeline) {
return pipeline.stream().map(val -> queryMapper.getMappedObject(val, Optional.empty()))
.collect(Collectors.toList());
}
}
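Expressed with the public Aggregation API, the pipeline assembled by createCountAggregation for a non-empty query looks roughly like this; the matched field and value are assumptions.
Aggregation countAggregation = Aggregation.newAggregation(
		Aggregation.match(Criteria.where("status").is("ACTIVE")),
		Aggregation.count().as("totalEntityCount"));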

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -31,6 +31,7 @@ import com.mongodb.bulk.BulkWriteResult;
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Minsu Kim
* @since 1.9
*/
public interface BulkOperations {
@@ -135,6 +136,29 @@ public interface BulkOperations {
*/
BulkOperations remove(List<Query> removes);
/**
* Add a single replace operation to the bulk operation.
*
* @param query the selection criteria for the document to replace. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the replace added, will never be {@literal null}.
* @since 2.2
*/
default BulkOperations replaceOne(Query query, Object replacement) {
return replaceOne(query, replacement, FindAndReplaceOptions.empty());
}
/**
* Add a single replace operation to the bulk operation.
*
* @param query the selection criteria for the document to replace. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @param options the {@link FindAndReplaceOptions} holding additional information. Must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the replace added, will never be {@literal null}.
* @since 2.2
*/
BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options);
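A usage sketch of the new bulk replace, assuming an existing MongoOperations instance and a Person document class; names and values are illustrative only.
BulkOperations bulkOps = mongoOperations.bulkOps(BulkOperations.BulkMode.UNORDERED, Person.class);

bulkOps.replaceOne(Query.query(Criteria.where("firstname").is("walter")),
		new Person("walter", "heisenberg")); // hypothetical replacement document

BulkWriteResult result = bulkOps.execute();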
/**
* Execute all bulk operations using the default write concern.
*

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2018 the original author or authors.
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,8 +17,11 @@ package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.util.concurrent.atomic.AtomicReference;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.messaging.Message;
@@ -26,6 +29,7 @@ import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.OperationType;
/**
* {@link Message} implementation specific to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change
@@ -38,11 +42,17 @@ import com.mongodb.client.model.changestream.ChangeStreamDocument;
@EqualsAndHashCode
public class ChangeStreamEvent<T> {
@SuppressWarnings("rawtypes") //
private static final AtomicReferenceFieldUpdater<ChangeStreamEvent, Object> CONVERTED_UPDATER = AtomicReferenceFieldUpdater
.newUpdater(ChangeStreamEvent.class, Object.class, "converted");
private final @Nullable ChangeStreamDocument<Document> raw;
private final Class<T> targetType;
private final MongoConverter converter;
private final AtomicReference<T> converted = new AtomicReference<>();
// accessed through CONVERTED_UPDATER.
private volatile @Nullable T converted;
/**
* @param raw can be {@literal null}.
@@ -67,6 +77,69 @@ public class ChangeStreamEvent<T> {
return raw;
}
/**
* Get the {@link ChangeStreamDocument#getClusterTime() cluster time} as {@link Instant} the event was emitted at.
*
* @return can be {@literal null}.
*/
@Nullable
public Instant getTimestamp() {
return getBsonTimestamp() != null ? converter.getConversionService().convert(raw.getClusterTime(), Instant.class)
: null;
}
/**
* Get the {@link ChangeStreamDocument#getClusterTime() cluster time}.
*
* @return can be {@literal null}.
* @since 2.2
*/
@Nullable
public BsonTimestamp getBsonTimestamp() {
return raw != null ? raw.getClusterTime() : null;
}
/**
* Get the {@link ChangeStreamDocument#getResumeToken() resume token} for this event.
*
* @return can be {@literal null}.
*/
@Nullable
public BsonValue getResumeToken() {
return raw != null ? raw.getResumeToken() : null;
}
/**
* Get the {@link ChangeStreamDocument#getOperationType() operation type} for this event.
*
* @return can be {@literal null}.
*/
@Nullable
public OperationType getOperationType() {
return raw != null ? raw.getOperationType() : null;
}
/**
* Get the name of the database in which the event originated.
*
* @return can be {@literal null}.
*/
@Nullable
public String getDatabaseName() {
return raw != null ? raw.getNamespace().getDatabaseName() : null;
}
/**
* Get the name of the collection in which the event originated.
*
* @return can be {@literal null}.
*/
@Nullable
public String getCollectionName() {
return raw != null ? raw.getNamespace().getCollectionName() : null;
}
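An illustrative consumption sketch, assuming the reactive change stream API and a Person document class; the collection name and the particular changeStream overload used here are assumptions.
reactiveMongoTemplate.changeStream("person", ChangeStreamOptions.empty(), Person.class)
		.doOnNext(event -> {
			Instant clusterTime = event.getTimestamp();          // cluster time as Instant, may be null
			OperationType operation = event.getOperationType();  // insert, update, replace, ...
			Person body = event.getBody();                       // converted full document, may be null
		})
		.subscribe();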
/**
* Get the potentially converted {@link ChangeStreamDocument#getFullDocument()}.
*
@@ -80,36 +153,48 @@ public class ChangeStreamEvent<T> {
return null;
}
if (raw.getFullDocument() == null) {
return targetType.cast(raw.getFullDocument());
Document fullDocument = raw.getFullDocument();
if (fullDocument == null) {
return targetType.cast(fullDocument);
}
return getConverted();
return getConverted(fullDocument);
}
private T getConverted() {
@SuppressWarnings("unchecked")
private T getConverted(Document fullDocument) {
return (T) doGetConverted(fullDocument);
}
private Object doGetConverted(Document fullDocument) {
Object result = CONVERTED_UPDATER.get(this);
T result = converted.get();
if (result != null) {
return result;
}
if (ClassUtils.isAssignable(Document.class, raw.getFullDocument().getClass())) {
if (ClassUtils.isAssignable(Document.class, fullDocument.getClass())) {
result = converter.read(targetType, raw.getFullDocument());
return converted.compareAndSet(null, result) ? result : converted.get();
result = converter.read(targetType, fullDocument);
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
}
if (converter.getConversionService().canConvert(raw.getFullDocument().getClass(), targetType)) {
if (converter.getConversionService().canConvert(fullDocument.getClass(), targetType)) {
result = converter.getConversionService().convert(raw.getFullDocument(), targetType);
return converted.compareAndSet(null, result) ? result : converted.get();
result = converter.getConversionService().convert(fullDocument, targetType);
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
}
throw new IllegalArgumentException(String.format("No converter found capable of converting %s to %s",
raw.getFullDocument().getClass(), targetType));
fullDocument.getClass(), targetType));
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2018 the original author or authors.
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,15 +17,21 @@ package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.Arrays;
import java.util.Optional;
import org.bson.BsonDocument;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
@@ -46,6 +52,8 @@ public class ChangeStreamOptions {
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED;
protected ChangeStreamOptions() {}
@@ -77,6 +85,37 @@ public class ChangeStreamOptions {
return Optional.ofNullable(collation);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<Instant> getResumeTimestamp() {
return Optional.ofNullable(resumeTimestamp).map(timestamp -> asTimestampOfType(timestamp, Instant.class));
}
/**
* @return {@link Optional#empty()} if not set.
* @since 2.2
*/
public Optional<BsonTimestamp> getResumeBsonTimestamp() {
return Optional.ofNullable(resumeTimestamp).map(timestamp -> asTimestampOfType(timestamp, BsonTimestamp.class));
}
/**
* @return {@literal true} if the change stream should be started after the {@link #getResumeToken() token}.
* @since 2.2
*/
public boolean isStartAfter() {
return Resume.START_AFTER.equals(resume);
}
/**
* @return {@literal true} if the change stream should be resumed after the {@link #getResumeToken() token}.
* @since 2.2
*/
public boolean isResumeAfter() {
return Resume.RESUME_AFTER.equals(resume);
}
/**
* @return empty {@link ChangeStreamOptions}.
*/
@@ -94,6 +133,48 @@ public class ChangeStreamOptions {
return new ChangeStreamOptionsBuilder();
}
private static <T> T asTimestampOfType(Object timestamp, Class<T> targetType) {
return targetType.cast(doGetTimestamp(timestamp, targetType));
}
private static <T> Object doGetTimestamp(Object timestamp, Class<T> targetType) {
if (ClassUtils.isAssignableValue(targetType, timestamp)) {
return timestamp;
}
if (timestamp instanceof Instant) {
return new BsonTimestamp((int) ((Instant) timestamp).getEpochSecond(), 0);
}
if (timestamp instanceof BsonTimestamp) {
return Instant.ofEpochSecond(((BsonTimestamp) timestamp).getTime());
}
throw new IllegalArgumentException(
"o_O that should actually not happen. The timestamp should be an Instant or a BsonTimestamp but was "
+ ObjectUtils.nullSafeClassName(timestamp));
}
/**
* @author Christoph Strobl
* @since 2.2
*/
enum Resume {
UNDEFINED,
/**
* @see com.mongodb.client.ChangeStreamIterable#startAfter(BsonDocument)
*/
START_AFTER,
/**
* @see com.mongodb.client.ChangeStreamIterable#resumeAfter(BsonDocument)
*/
RESUME_AFTER
}
/**
* Builder for creating {@link ChangeStreamOptions}.
*
@@ -106,6 +187,8 @@ public class ChangeStreamOptions {
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private @Nullable Object resumeTimestamp;
private Resume resume = Resume.UNDEFINED;
private ChangeStreamOptionsBuilder() {}
@@ -173,6 +256,11 @@ public class ChangeStreamOptions {
Assert.notNull(resumeToken, "ResumeToken must not be null!");
this.resumeToken = resumeToken;
if (this.resume == Resume.UNDEFINED) {
this.resume = Resume.RESUME_AFTER;
}
return this;
}
@@ -200,6 +288,65 @@ public class ChangeStreamOptions {
return this;
}
/**
* Set the cluster time to resume from.
*
* @param resumeTimestamp must not be {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder resumeAt(Instant resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp;
return this;
}
/**
* Set the cluster time to resume from.
*
* @param resumeTimestamp must not be {@literal null}.
* @return this.
* @since 2.2
*/
public ChangeStreamOptionsBuilder resumeAt(BsonTimestamp resumeTimestamp) {
Assert.notNull(resumeTimestamp, "ResumeTimestamp must not be null!");
this.resumeTimestamp = resumeTimestamp;
return this;
}
/**
* Set the resume token after which to continue emitting notifications.
*
* @param resumeToken must not be {@literal null}.
* @return this.
* @since 2.2
*/
public ChangeStreamOptionsBuilder resumeAfter(BsonValue resumeToken) {
resumeToken(resumeToken);
this.resume = Resume.RESUME_AFTER;
return this;
}
/**
* Set the resume token after which to start emitting notifications.
*
* @param resumeToken must not be {@literal null}.
* @return this.
* @since 2.2
*/
public ChangeStreamOptionsBuilder startAfter(BsonValue resumeToken) {
resumeToken(resumeToken);
this.resume = Resume.START_AFTER;
return this;
}
/**
* @return the built {@link ChangeStreamOptions}
*/
@@ -207,10 +354,12 @@ public class ChangeStreamOptions {
ChangeStreamOptions options = new ChangeStreamOptions();
options.filter = filter;
options.resumeToken = resumeToken;
options.fullDocumentLookup = fullDocumentLookup;
options.collation = collation;
options.filter = this.filter;
options.resumeToken = this.resumeToken;
options.fullDocumentLookup = this.fullDocumentLookup;
options.collation = this.collation;
options.resumeTimestamp = this.resumeTimestamp;
options.resume = this.resume;
return options;
}
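A builder sketch combining the new resume options with an existing one; the timestamp and the idea of a previously captured resume token are assumptions for illustration.
ChangeStreamOptions options = ChangeStreamOptions.builder()
		.resumeAt(Instant.now().minusSeconds(60))         // resume from a cluster time
		.fullDocumentLookup(FullDocument.UPDATE_LOOKUP)   // fetch the full document for updates
		.build();

// or, given a previously captured resume token:
// ChangeStreamOptions.builder().startAfter(resumeToken).build();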

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2002-2018 the original author or authors.
* Copyright 2002-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,9 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import java.util.function.Function;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ReadPreference;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MongoCollection;
/**
* Simple callback interface to allow customization of a {@link FindIterable}.
@@ -25,7 +31,14 @@ import com.mongodb.client.FindIterable;
* @author Oliver Gierke
* @author Christoph Strobl
*/
interface CursorPreparer {
public interface CursorPreparer extends ReadPreferenceAware {
/**
* Default {@link CursorPreparer} just passing on the given {@link FindIterable}.
*
* @since 2.2
*/
CursorPreparer NO_OP_PREPARER = (iterable -> iterable);
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
@@ -33,4 +46,37 @@ interface CursorPreparer {
* @param cursor
*/
FindIterable<Document> prepare(FindIterable<Document> cursor);
/**
* Apply query specific settings to the {@link MongoCollection} and initiate a find operation returning a
* {@link FindIterable} via the given {@link Function find} function.
*
* @param collection must not be {@literal null}.
* @param find must not be {@literal null}.
* @return
* @throws IllegalArgumentException if one of the required arguments is {@literal null}.
* @since 2.2
*/
default FindIterable<Document> initiateFind(MongoCollection<Document> collection,
Function<MongoCollection<Document>, FindIterable<Document>> find) {
Assert.notNull(collection, "Collection must not be null!");
Assert.notNull(find, "Find function must not be null!");
if (hasReadPreference()) {
collection = collection.withReadPreference(getReadPreference());
}
return prepare(find.apply(collection));
}
/**
* @return the {@link ReadPreference} to apply or {@literal null} if none defined.
* @since 2.2
*/
@Override
@Nullable
default ReadPreference getReadPreference() {
return null;
}
}
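A hedged sketch of a custom CursorPreparer that limits results and routes the find to a secondary; the limit value and the pre-existing MongoCollection instance are assumptions.
CursorPreparer preparer = new CursorPreparer() {

	@Override
	public FindIterable<Document> prepare(FindIterable<Document> iterable) {
		return iterable.limit(10); // apply cursor settings
	}

	@Override
	public ReadPreference getReadPreference() {
		return ReadPreference.secondaryPreferred(); // route the query to a secondary if possible
	}
};

// initiateFind applies the read preference to the collection before invoking the find function
FindIterable<Document> iterable = preparer.initiateFind(collection, col -> col.find(new Document()));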

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,9 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@@ -26,11 +23,18 @@ import java.util.stream.Collectors;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.dao.DataAccessException;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mapping.callback.EntityCallbacks;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveCallback;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
@@ -38,18 +42,13 @@ import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.BulkWriteException;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.DeleteOneModel;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;
import com.mongodb.client.model.*;
import lombok.NonNull;
import lombok.Value;
/**
* Default implementation for {@link BulkOperations}.
@@ -58,6 +57,9 @@ import com.mongodb.client.model.WriteModel;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @author Minsu Kim
* @author Jens Schauder
* @author Michail Nikolaev
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
@@ -65,7 +67,7 @@ class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
private final BulkOperationContext bulkOperationContext;
private final List<WriteModel<Document>> models = new ArrayList<>();
private final List<SourceAwareWriteModelHolder> models = new ArrayList<>();
private PersistenceExceptionTranslator exceptionTranslator;
private @Nullable WriteConcern defaultWriteConcern;
@@ -122,16 +124,9 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(document, "Document must not be null!");
if (document instanceof Document) {
models.add(new InsertOneModel<>((Document) document));
return this;
}
Document sink = new Document();
mongoOperations.getConverter().write(document, sink);
models.add(new InsertOneModel<>(sink));
maybeEmitEvent(new BeforeConvertEvent<>(document, collectionName));
Object source = maybeInvokeBeforeConvertCallback(document);
addModel(source, new InsertOneModel<>(getMappedObject(source)));
return this;
}
@@ -245,7 +240,7 @@ class DefaultBulkOperations implements BulkOperations {
DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
models.add(new DeleteManyModel<>(query.getQueryObject(), deleteOptions));
addModel(query, new DeleteManyModel<>(query.getQueryObject(), deleteOptions));
return this;
}
@@ -266,6 +261,29 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#replaceOne(org.springframework.data.mongodb.core.query.Query, java.lang.Object, org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(replacement, "Replacement must not be null!");
Assert.notNull(options, "Options must not be null!");
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(options.isUpsert());
query.getCollation().map(Collation::toMongoCollation).ifPresent(replaceOptions::collation);
maybeEmitEvent(new BeforeConvertEvent<>(replacement, collectionName));
Object source = maybeInvokeBeforeConvertCallback(replacement);
addModel(source,
new ReplaceOneModel<>(getMappedQuery(query.getQueryObject()), getMappedObject(source), replaceOptions));
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
@@ -274,15 +292,49 @@ class DefaultBulkOperations implements BulkOperations {
public com.mongodb.bulk.BulkWriteResult execute() {
try {
return mongoOperations.execute(collectionName, collection -> {
return collection.bulkWrite(models.stream().map(this::mapWriteModel).collect(Collectors.toList()), bulkOptions);
});
com.mongodb.bulk.BulkWriteResult result = mongoOperations.execute(collectionName, this::bulkWriteTo);
Assert.state(result != null, "Result must not be null.");
models.forEach(this::maybeEmitAfterSaveEvent);
return result;
} finally {
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
}
private BulkWriteResult bulkWriteTo(MongoCollection<Document> collection) {
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
}
private WriteModel<Document> extractAndMapWriteModel(SourceAwareWriteModelHolder it) {
maybeEmitBeforeSaveEvent(it);
if (it.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
maybeInvokeBeforeSaveCallback(it.getSource(), target);
} else if (it.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
maybeInvokeBeforeSaveCallback(it.getSource(), target);
}
return mapWriteModel(it.getModel());
}
/**
* Performs update and upsert bulk operations.
*
@@ -302,9 +354,9 @@ class DefaultBulkOperations implements BulkOperations {
query.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
if (multi) {
models.add(new UpdateManyModel<>(query.getQueryObject(), update.getUpdateObject(), options));
addModel(update, new UpdateManyModel<>(query.getQueryObject(), update.getUpdateObject(), options));
} else {
models.add(new UpdateOneModel<>(query.getQueryObject(), update.getUpdateObject(), options));
addModel(update, new UpdateOneModel<>(query.getQueryObject(), update.getUpdateObject(), options));
}
return this;
@@ -353,6 +405,76 @@ class DefaultBulkOperations implements BulkOperations {
return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
}
private Document getMappedObject(Object source) {
if (source instanceof Document) {
return (Document) source;
}
Document sink = new Document();
mongoOperations.getConverter().write(source, sink);
return sink;
}
private void addModel(Object source, WriteModel<Document> model) {
models.add(new SourceAwareWriteModelHolder(source, model));
}
private void maybeEmitBeforeSaveEvent(SourceAwareWriteModelHolder it) {
if (it.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
maybeEmitEvent(new BeforeSaveEvent<>(it.getSource(), target, collectionName));
} else if (it.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
maybeEmitEvent(new BeforeSaveEvent<>(it.getSource(), target, collectionName));
}
}
private void maybeEmitAfterSaveEvent(SourceAwareWriteModelHolder it) {
if (it.getModel() instanceof InsertOneModel) {
Document target = ((InsertOneModel<Document>) it.getModel()).getDocument();
maybeEmitEvent(new AfterSaveEvent<>(it.getSource(), target, collectionName));
} else if (it.getModel() instanceof ReplaceOneModel) {
Document target = ((ReplaceOneModel<Document>) it.getModel()).getReplacement();
maybeEmitEvent(new AfterSaveEvent<>(it.getSource(), target, collectionName));
}
}
private <E extends MongoMappingEvent<T>, T> E maybeEmitEvent(E event) {
if (null != bulkOperationContext.getEventPublisher()) {
bulkOperationContext.getEventPublisher().publishEvent(event);
}
return event;
}
private Object maybeInvokeBeforeConvertCallback(Object value) {
if (bulkOperationContext.getEntityCallbacks() == null) {
return value;
}
return bulkOperationContext.getEntityCallbacks().callback(BeforeConvertCallback.class, value, collectionName);
}
private Object maybeInvokeBeforeSaveCallback(Object value, Document mappedDocument) {
if (bulkOperationContext.getEntityCallbacks() == null) {
return value;
}
return bulkOperationContext.getEntityCallbacks().callback(BeforeSaveCallback.class, value, mappedDocument,
collectionName);
}
private static BulkWriteOptions getBulkWriteOptions(BulkMode bulkMode) {
BulkWriteOptions options = new BulkWriteOptions();
@@ -382,5 +504,20 @@ class DefaultBulkOperations implements BulkOperations {
@NonNull Optional<? extends MongoPersistentEntity<?>> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
ApplicationEventPublisher eventPublisher;
EntityCallbacks entityCallbacks;
}
/**
* Value object chaining together an actual source with its {@link WriteModel} representation.
*
* @since 2.2
* @author Christoph Strobl
*/
@Value
private static class SourceAwareWriteModelHolder {
Object source;
WriteModel<Document> model;
}
}
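Since bulk inserts and replacements now run through the entity callback API, a callback bean such as the following hypothetical one (declared in a @Configuration class; Person and the field name are assumptions) would be invoked for each written document.
@Bean
BeforeSaveCallback<Person> beforeSaveCallback() {
	return (entity, document, collection) -> {
		document.put("_lastTouched", Instant.now()); // mutate the mapped document before it is written
		return entity;
	};
}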

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -120,19 +120,15 @@ public class DefaultIndexOperations implements IndexOperations {
return execute(collection -> {
Document indexOptions = indexDefinition.getIndexOptions();
MongoPersistentEntity<?> entity = lookupPersistentEntity(type, collectionName);
IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);
IndexOptions indexOptions = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);
if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
indexOptions = addPartialFilterIfPresent(indexOptions, indexDefinition.getIndexOptions(), entity);
indexOptions = addDefaultCollationIfRequired(indexOptions, entity);
Assert.isInstanceOf(Document.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
ops.partialFilterExpression(mapper.getMappedObject((Document) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName)));
}
return collection.createIndex(indexDefinition.getIndexKeys(), ops);
Document mappedKeys = mapper.getMappedObject(indexDefinition.getIndexKeys(), entity);
return collection.createIndex(mappedKeys, indexOptions);
});
}
@@ -192,7 +188,7 @@ public class DefaultIndexOperations implements IndexOperations {
private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
List<IndexInfo> indexInfoList = new ArrayList<>();
while (cursor.hasNext()) {
@@ -217,4 +213,25 @@ public class DefaultIndexOperations implements IndexOperations {
return mongoOperations.execute(collectionName, callback);
}
private IndexOptions addPartialFilterIfPresent(IndexOptions ops, Document sourceOptions,
@Nullable MongoPersistentEntity<?> entity) {
if (!sourceOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
return ops;
}
Assert.isInstanceOf(Document.class, sourceOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
return ops.partialFilterExpression(
mapper.getMappedObject((Document) sourceOptions.get(PARTIAL_FILTER_EXPRESSION_KEY), entity));
}
private static IndexOptions addDefaultCollationIfRequired(IndexOptions ops, MongoPersistentEntity<?> entity) {
if (ops.getCollation() != null || entity == null || !entity.hasCollation()) {
return ops;
}
return ops.collation(entity.getCollation().toMongoCollation());
}
}
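A sketch of an index definition whose partial filter expression is now mapped against the entity metadata before index creation (the same handling applies to the reactive variant below); the property name and threshold are assumptions.
Index index = new Index().on("age", Sort.Direction.ASC)
		.partial(PartialIndexFilter.of(Criteria.where("age").gte(18)));

mongoTemplate.indexOps(Person.class).ensureIndex(index);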

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -94,23 +94,16 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
return mongoOperations.execute(collectionName, collection -> {
Document indexOptions = indexDefinition.getIndexOptions();
MongoPersistentEntity<?> entity = type
.map(val -> (MongoPersistentEntity) queryMapper.getMappingContext().getRequiredPersistentEntity(val))
.orElseGet(() -> lookupPersistentEntity(collectionName));
IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);
IndexOptions indexOptions = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);
if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
indexOptions = addPartialFilterIfPresent(indexOptions, indexDefinition.getIndexOptions(), entity);
indexOptions = addDefaultCollationIfRequired(indexOptions, entity);
Assert.isInstanceOf(Document.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
MongoPersistentEntity<?> entity = type
.map(val -> (MongoPersistentEntity) queryMapper.getMappingContext().getRequiredPersistentEntity(val))
.orElseGet(() -> lookupPersistentEntity(collectionName));
ops = ops.partialFilterExpression(
queryMapper.getMappedObject(indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY, Document.class), entity));
}
return collection.createIndex(indexDefinition.getIndexKeys(), ops);
return collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
}).next();
}
@@ -126,21 +119,24 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
.orElse(null);
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then();
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
@@ -148,4 +144,25 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) //
.map(IndexConverters.documentToIndexInfoConverter()::convert);
}
private IndexOptions addPartialFilterIfPresent(IndexOptions ops, Document sourceOptions,
@Nullable MongoPersistentEntity<?> entity) {
if (!sourceOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
return ops;
}
Assert.isInstanceOf(Document.class, sourceOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
return ops.partialFilterExpression(
queryMapper.getMappedObject((Document) sourceOptions.get(PARTIAL_FILTER_EXPRESSION_KEY), entity));
}
private static IndexOptions addDefaultCollationIfRequired(IndexOptions ops, MongoPersistentEntity<?> entity) {
if (ops.getCollation() != null || entity == null || !entity.hasCollation()) {
return ops;
}
return ops.collation(entity.getCollation().toMongoCollation());
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2014-2018 the original author or authors.
* Copyright 2014-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -42,13 +42,15 @@ import com.mongodb.MongoException;
import com.mongodb.client.MongoDatabase;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ExecutableMongoScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.7
* @deprecated since 2.2. The {@code eval} command has been removed in MongoDB Server 4.2.0.
*/
@Deprecated
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,826 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.Map;
import java.util.Optional;
import org.bson.Document;
import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.mapping.IdentifierAccessor;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
/**
* Common operations performed on an entity in the context of its mapping metadata.
*
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.1
* @see MongoTemplate
* @see ReactiveMongoTemplate
*/
@RequiredArgsConstructor
class EntityOperations {
private static final String ID_FIELD = "_id";
private final @NonNull MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
/**
* Creates a new {@link Entity} for the given bean.
*
* @param entity must not be {@literal null}.
* @return
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
public <T> Entity<T> forEntity(T entity) {
Assert.notNull(entity, "Bean must not be null!");
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
}
if (entity instanceof Map) {
return new SimpleMappedEntity((Map<String, Object>) entity);
}
return MappedEntity.of(entity, context);
}
/**
* Creates a new {@link AdaptibleEntity} for the given bean and {@link ConversionService}.
*
* @param entity must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @return
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
public <T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
Assert.notNull(entity, "Bean must not be null!");
Assert.notNull(conversionService, "ConversionService must not be null!");
if (entity instanceof String) {
return new UnmappedEntity(parse(entity.toString()));
}
if (entity instanceof Map) {
return new SimpleMappedEntity((Map<String, Object>) entity);
}
return AdaptibleMappedEntity.of(entity, context, conversionService);
}
public String determineCollectionName(@Nullable Class<?> entityClass) {
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException(
"No class parameter provided, entity collection can't be determined!");
}
return context.getRequiredPersistentEntity(entityClass).getCollection();
}
public Query getByIdInQuery(Collection<?> entities) {
MultiValueMap<String, Object> byIds = new LinkedMultiValueMap<>();
entities.stream() //
.map(this::forEntity) //
.forEach(it -> byIds.add(it.getIdFieldName(), it.getId()));
Criteria[] criterias = byIds.entrySet().stream() //
.map(it -> Criteria.where(it.getKey()).in(it.getValue())) //
.toArray(Criteria[]::new);
return new Query(criterias.length == 1 ? criterias[0] : new Criteria().orOperator(criterias));
}
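// Illustrative note (not part of this commit): for two entities of the same type using the default
// "_id" field, the method above produces a query equivalent to
//   new Query(Criteria.where("_id").in(id1, id2))
// i.e. { "_id" : { "$in" : [ id1, id2 ] } }. Only when the entities expose differently named id
// fields does it fall back to an $or of one $in criteria per field name.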
/**
* Returns the name of the identifier property. Considers mapping information but falls back to the MongoDB default of
* {@code _id} if no identifier property can be found.
*
* @param type must not be {@literal null}.
* @return
*/
public String getIdPropertyName(Class<?> type) {
Assert.notNull(type, "Type must not be null!");
MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(type);
if (persistentEntity != null && persistentEntity.getIdProperty() != null) {
return persistentEntity.getRequiredIdProperty().getName();
}
return ID_FIELD;
}
/**
* Return the name used for {@code $geoNear.distanceField} avoiding clashes with potentially existing properties.
*
* @param domainType must not be {@literal null}.
* @return the name of the distanceField to use. {@literal dis} by default.
* @since 2.2
*/
public String nearQueryDistanceFieldName(Class<?> domainType) {
MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(domainType);
if (persistentEntity == null || persistentEntity.getPersistentProperty("dis") == null) {
return "dis";
}
String distanceFieldName = "calculated-distance";
int counter = 0;
while (persistentEntity.getPersistentProperty(distanceFieldName) != null) {
distanceFieldName += "-" + (counter++);
}
return distanceFieldName;
}
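// Illustrative note (not part of this commit): for a domain type without a "dis" property the
// default name "dis" is returned. If "dis" is taken, the loop above tries "calculated-distance",
// then "calculated-distance-0", "calculated-distance-0-1", ... until it finds a name that does not
// clash with an existing persistent property.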
private static Document parse(String source) {
try {
return Document.parse(source);
} catch (org.bson.json.JsonParseException o_O) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
} catch (RuntimeException o_O) {
// legacy 3.x exception
if (ClassUtils.matchesTypeName(o_O.getClass(), "JSONParseException")) {
throw new MappingException("Could not parse given String to save into a JSON document!", o_O);
}
throw o_O;
}
}
public <T> TypedOperations<T> forType(@Nullable Class<T> entityClass) {
if (entityClass != null) {
MongoPersistentEntity<?> entity = context.getPersistentEntity(entityClass);
if (entity != null) {
return new TypedEntityOperations(entity);
}
}
return UntypedOperations.instance();
}
/**
* A representation of information about an entity.
*
* @author Oliver Gierke
* @since 2.1
*/
interface Entity<T> {
/**
* Returns the field name of the identifier of the entity.
*
* @return
*/
String getIdFieldName();
/**
* Returns the identifier of the entity.
*
* @return
*/
Object getId();
/**
* Returns the {@link Query} to find the entity by its identifier.
*
* @return
*/
Query getByIdQuery();
/**
* Returns the {@link Query} to remove an entity by its {@literal id} and, if applicable, its {@literal version}.
*
* @return the {@link Query} to use for removing the entity. Never {@literal null}.
* @since 2.2
*/
default Query getRemoveByQuery() {
return isVersionedEntity() ? getQueryForVersion() : getByIdQuery();
}
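// Illustrative note (not part of this commit): for a versioned entity the remove query matches both
// id and version, roughly { "_id" : <id>, "version" : <version> }, so a concurrent modification
// surfaces as "nothing removed" instead of silently deleting a newer document. Unversioned entities
// are removed by id only.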
/**
* Returns the {@link Query} to find the entity in its current version.
*
* @return
*/
Query getQueryForVersion();
/**
* Maps the backing entity into a {@link MappedDocument} using the given {@link MongoWriter}.
*
* @param writer must not be {@literal null}.
* @return
*/
MappedDocument toMappedDocument(MongoWriter<? super T> writer);
/**
* Asserts that the identifier type is updatable in case it's not already set.
*/
default void assertUpdateableIdIfNotSet() {}
/**
* Returns whether the entity is versioned, i.e. if it contains a version property.
*
* @return
*/
default boolean isVersionedEntity() {
return false;
}
/**
* Returns the value of the version if the entity {@link #isVersionedEntity() has a version property}.
*
* @return the entity version. Can be {@literal null}.
* @throws IllegalStateException if the entity does not define a {@literal version} property. Make sure to check
* {@link #isVersionedEntity()}.
*/
@Nullable
Object getVersion();
/**
* Returns the underlying bean.
*
* @return
*/
T getBean();
/**
* Returns whether the entity is considered to be new.
*
* @return
* @since 2.1.2
*/
boolean isNew();
}
/**
* Information and commands on an entity.
*
* @author Oliver Gierke
* @since 2.1
*/
interface AdaptibleEntity<T> extends Entity<T> {
/**
* Populates the identifier of the backing entity if it has an identifier property and there's no identifier
* currently present.
*
* @param id must not be {@literal null}.
* @return
*/
@Nullable
T populateIdIfNecessary(@Nullable Object id);
/**
* Initializes the version property of the current entity if available.
*
* @return the entity with the version property updated if available.
*/
T initializeVersionProperty();
/**
* Increments the value of the version property if available.
*
* @return the entity with the version property incremented if available.
*/
T incrementVersion();
/**
* Returns the current version value if the entity has a version property.
*
* @return the current version or {@literal null} in case it's uninitialized.
* @throws IllegalStateException if the entity does not define a {@literal version} property.
*/
@Nullable
Number getVersion();
}
@RequiredArgsConstructor
private static class UnmappedEntity<T extends Map<String, Object>> implements AdaptibleEntity<T> {
private final T map;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return ID_FIELD;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return map.get(ID_FIELD);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
return Query.query(Criteria.where(ID_FIELD).is(map.get(ID_FIELD)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
map.put(ID_FIELD, id);
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion()
*/
@Override
public Query getQueryForVersion() {
throw new MappingException("Cannot query for version on plain Documents!");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
return MappedDocument.of(map instanceof Document //
? (Document) map //
: new Document(map));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#incrementVersion()
*/
@Override
public T incrementVersion() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return map.get(ID_FIELD) != null;
}
}
private static class SimpleMappedEntity<T extends Map<String, Object>> extends UnmappedEntity<T> {
SimpleMappedEntity(T map) {
super(map);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
@SuppressWarnings("unchecked")
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
T bean = getBean();
bean = (T) (bean instanceof Document //
? (Document) bean //
: new Document(bean));
Document document = new Document();
writer.write(bean, document);
return MappedDocument.of(document);
}
}
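// Illustrative note (not part of this commit): UnmappedEntity hands the given Map/Document to the
// driver as-is, whereas SimpleMappedEntity above additionally runs it through the MongoWriter so
// that registered custom conversions are applied to the map values before mapping.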
@RequiredArgsConstructor(access = AccessLevel.PROTECTED)
private static class MappedEntity<T> implements Entity<T> {
private final @NonNull MongoPersistentEntity<?> entity;
private final @NonNull IdentifierAccessor idAccessor;
private final @NonNull PersistentPropertyAccessor<T> propertyAccessor;
private static <T> MappedEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
MongoPersistentEntity<?> entity = context.getRequiredPersistentEntity(bean.getClass());
IdentifierAccessor identifierAccessor = entity.getIdentifierAccessor(bean);
PersistentPropertyAccessor<T> propertyAccessor = entity.getPropertyAccessor(bean);
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return entity.getRequiredIdProperty().getFieldName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return idAccessor.getRequiredIdentifier();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
if (!entity.hasIdProperty()) {
throw new MappingException("No id property found for object of type " + entity.getType() + "!");
}
MongoPersistentProperty idProperty = entity.getRequiredIdProperty();
return Query.query(Criteria.where(idProperty.getName()).is(getId()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion(java.lang.Object)
*/
@Override
public Query getQueryForVersion() {
MongoPersistentProperty idProperty = entity.getRequiredIdProperty();
MongoPersistentProperty versionProperty = entity.getRequiredVersionProperty();
return new Query(Criteria.where(idProperty.getName()).is(getId())//
.and(versionProperty.getName()).is(getVersion()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
T bean = propertyAccessor.getBean();
Document document = new Document();
writer.write(bean, document);
if (document.containsKey(ID_FIELD) && document.get(ID_FIELD) == null) {
document.remove(ID_FIELD);
}
return MappedDocument.of(document);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#assertUpdateableIdIfNotSet()
*/
public void assertUpdateableIdIfNotSet() {
if (!entity.hasIdProperty()) {
return;
}
MongoPersistentProperty property = entity.getRequiredIdProperty();
Object propertyValue = idAccessor.getIdentifier();
if (propertyValue != null) {
return;
}
if (!MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(property.getType())) {
throw new InvalidDataAccessApiUsageException(
String.format("Cannot autogenerate id of type %s for entity of type %s!", property.getType().getName(),
entity.getType().getName()));
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#isVersionedEntity()
*/
@Override
public boolean isVersionedEntity() {
return entity.hasVersionProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getVersion()
*/
@Override
@Nullable
public Object getVersion() {
return propertyAccessor.getProperty(entity.getRequiredVersionProperty());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return entity.isNew(propertyAccessor.getBean());
}
}
private static class AdaptibleMappedEntity<T> extends MappedEntity<T> implements AdaptibleEntity<T> {
private final MongoPersistentEntity<?> entity;
private final ConvertingPropertyAccessor<T> propertyAccessor;
private final IdentifierAccessor identifierAccessor;
private AdaptibleMappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor identifierAccessor,
ConvertingPropertyAccessor<T> propertyAccessor) {
super(entity, identifierAccessor, propertyAccessor);
this.entity = entity;
this.propertyAccessor = propertyAccessor;
this.identifierAccessor = identifierAccessor;
}
private static <T> AdaptibleEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
ConversionService conversionService) {
MongoPersistentEntity<?> entity = context.getRequiredPersistentEntity(bean.getClass());
IdentifierAccessor identifierAccessor = entity.getIdentifierAccessor(bean);
PersistentPropertyAccessor<T> propertyAccessor = entity.getPropertyAccessor(bean);
return new AdaptibleMappedEntity<>(entity, identifierAccessor,
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
if (id == null) {
return propertyAccessor.getBean();
}
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty == null) {
return propertyAccessor.getBean();
}
if (identifierAccessor.getIdentifier() != null) {
return propertyAccessor.getBean();
}
propertyAccessor.setProperty(idProperty, id);
return propertyAccessor.getBean();
}
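// Illustrative note (not part of this commit): because a ConvertingPropertyAccessor is used above,
// the given id is converted to the declared property type if necessary, e.g. an ObjectId generated
// by the driver can be populated into a String id property.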
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MappedEntity#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
MongoPersistentProperty versionProperty = entity.getRequiredVersionProperty();
return propertyAccessor.getProperty(versionProperty, Number.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
if (!entity.hasVersionProperty()) {
return propertyAccessor.getBean();
}
MongoPersistentProperty versionProperty = entity.getRequiredVersionProperty();
propertyAccessor.setProperty(versionProperty, versionProperty.getType().isPrimitive() ? 1 : 0);
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#incrementVersion()
*/
@Override
public T incrementVersion() {
MongoPersistentProperty versionProperty = entity.getRequiredVersionProperty();
Number version = getVersion();
Number nextVersion = version == null ? 0 : version.longValue() + 1;
propertyAccessor.setProperty(versionProperty, nextVersion);
return propertyAccessor.getBean();
}
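// Illustrative note (not part of this commit): an uninitialized (null) version is treated as 0,
// any other value is bumped to version.longValue() + 1; the ConvertingPropertyAccessor then
// converts the Number back to the declared version property type (e.g. int or Long).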
}
/**
* Type-specific operations abstraction.
*
* @author Mark Paluch
* @param <T>
* @since 2.2
*/
interface TypedOperations<T> {
/**
* Return the optional {@link Collation} for the underlying entity.
*
* @return
*/
Optional<Collation> getCollation();
/**
* Return the optional {@link Collation} from the given {@link Query} and fall back to the collation configured for
* the underlying entity.
*
* @return
*/
Optional<Collation> getCollation(Query query);
}
/**
* {@link TypedOperations} for generic entities that are not represented with {@link PersistentEntity} (e.g. custom
* conversions).
*/
@RequiredArgsConstructor
enum UntypedOperations implements TypedOperations<Object> {
INSTANCE;
@SuppressWarnings({ "unchecked", "rawtypes" })
public static <T> TypedOperations<T> instance() {
return (TypedOperations) INSTANCE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {
if (query == null) {
return Optional.empty();
}
return query.getCollation();
}
}
/**
* {@link TypedOperations} backed by {@link MongoPersistentEntity}.
*
* @param <T>
*/
@RequiredArgsConstructor
static class TypedEntityOperations<T> implements TypedOperations<T> {
private final @NonNull MongoPersistentEntity<T> entity;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.ofNullable(entity.getCollation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {
if (query.getCollation().isPresent()) {
return query.getCollation();
}
return Optional.ofNullable(entity.getCollation());
}
}
}
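EntityOperations and its nested types are package-private infrastructure used by MongoTemplate and ReactiveMongoTemplate. A minimal, hypothetical sketch of its id and version handling follows; the Person type, the class name and the main method are invented for illustration and would have to live in the org.springframework.data.mongodb.core package to see the package-private class:

package org.springframework.data.mongodb.core; // same package as EntityOperations

import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Version;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

class EntityOperationsSketch {

	static class Person {
		@Id String id;
		@Version Long version;
		String name;
	}

	public static void main(String[] args) {

		EntityOperations operations = new EntityOperations(new MongoMappingContext());

		Person person = new Person();
		EntityOperations.AdaptibleEntity<Person> entity = operations.forEntity(person,
				new DefaultConversionService());

		entity.populateIdIfNecessary("42"); // id is unset, so it gets populated
		entity.initializeVersionProperty(); // non-primitive version property starts at 0
		entity.incrementVersion(); // 0 -> 1

		System.out.println(person.id + " / " + person.version); // 42 / 1
	}
}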

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -35,22 +35,10 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableAggregationOperationSupport implements ExecutableAggregationOperation {
private final MongoTemplate template;
/**
* Create new instance of {@link ExecutableAggregationOperationSupport}.
*
* @param template must not be {@literal null}.
* @throws IllegalArgumentException if template is {@literal null}.
*/
ExecutableAggregationOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
@@ -131,11 +119,11 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
TypedAggregation<?> typedAggregation = (TypedAggregation<?>) aggregation;
if (typedAggregation.getInputType() != null) {
return template.determineCollectionName(typedAggregation.getInputType());
return template.getCollectionName(typedAggregation.getInputType());
}
}
return template.determineCollectionName(domainType);
return template.getCollectionName(domainType);
}
}
}

Some files were not shown because too many files have changed in this diff