Compare commits

..

176 Commits

Author SHA1 Message Date
Mark Paluch
c8846d3d1c DATAMONGO-2034 - Release version 2.0.10 (Kay SR10). 2018-09-10 12:52:18 +02:00
Mark Paluch
a4835c8fcf DATAMONGO-2034 - Prepare 2.0.10 (Kay SR10). 2018-09-10 12:51:27 +02:00
Mark Paluch
7875c8399f DATAMONGO-2034 - Updated changelog. 2018-09-10 12:51:22 +02:00
Mark Paluch
9046857721 DATAMONGO-2035 - Updated changelog. 2018-09-10 10:20:56 +02:00
Oliver Gierke
e8bb63c9f7 DATAMONGO-2076 - Fixed attribute substitution in reactive MongoDB section.
We now redeclare the Asciidoctor Maven plugin to register the store-specific attributes. Apparently they must not contain dots, so we replaced them with dashes.
2018-08-30 11:45:08 +02:00
Oliver Gierke
b431a56a95 DATAMONGO-2076 - Fixed attribute substitution in getting started section. 2018-08-30 09:32:05 +02:00
Oliver Gierke
dc820017e0 DATAMONGO-2033 - Updated changelog. 2018-08-20 11:07:56 +02:00
Oliver Gierke
34ce87b80c DATAMONGO-2046 - Performance improvements in mapping and conversion subsystem.
In MappingMongoConverter, we now avoid the creation of a ParameterValueProvider for parameter-less constructors. We also skip property population if the entity can be constructed entirely through constructor creation. Replaced the lambda in MappingMongoConverter.readAndPopulateIdentifier(…) with a direct call to ….readIdValue(…). ObjectPath now uses decomposed ObjectPathItems to avoid array copying and creation. It now stores a reference to its parent, and the ObjectPathItem fields are now merged into ObjectPath, which reduces the number of created objects during reads.

Extended CachingMongoPersistentProperty with DBRef caching. Turned key access in DocumentAccessor into an optimistic lookup. DbRefResolverCallbacks are now created lazily.

Related tickets: DATACMNS-1366.
Original pull request: #602.
2018-08-15 16:12:22 +02:00
Mark Paluch
9098d509a5 DATAMONGO-2055 - Polishing.
Move test to UpdateMapperUnitTests.

Original pull request: #600.
2018-08-15 15:59:55 +02:00
Christoph Strobl
861c8279a3 DATAMONGO-2055 - Allow position modifier to be negative using push at position on Update.
Original pull request: #600.
2018-08-15 15:53:59 +02:00
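For illustration, a minimal usage sketch of a negative $position value; the MongoOperations wiring, the students collection, and the grades field are hypothetical assumptions, not part of the change itself:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class GradeUpdates {

    // $push with $position: -1 inserts the value just before the last array element.
    static void insertBeforeLastGrade(MongoOperations operations, String studentId, int grade) {
        Update update = new Update().push("grades").atPosition(-1).each(grade);
        operations.updateFirst(query(where("_id").is(studentId)), update, "students");
    }
}
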
Mark Paluch
e545787e7e DATAMONGO-2050 - Polishing.
Tweak Javadoc.

Original pull request: #596.
2018-08-15 15:18:19 +02:00
Christoph Strobl
38ccdc5dfc DATAMONGO-2050 - Polishing.
Move to AssertJ.

Original pull request: #596.
2018-08-15 15:05:06 +02:00
Christoph Strobl
7a34cc73d8 DATAMONGO-2050 - Allow to specify the index to use for $geoNear aggregation operation.
Original pull request: #596.
2018-08-15 15:05:04 +02:00
Mark Paluch
ba6fa834e5 DATAMONGO-2051 - Polishing.
Use method argument types to avoid false positives with different method signatures.

Original pull request: #597.
Related pull request: #598.
2018-08-14 16:37:32 +02:00
Christoph Strobl
7100cd17be DATAMONGO-2051 - Add support for SCRAM-SHA-256 authentication mechanism to MongoCredentialPropertyEditor.
Original pull request: #597.
Related pull request: #598.
2018-08-14 16:33:39 +02:00
Mark Paluch
7c65472e2d DATAMONGO-2049 - Polishing.
Add static import for assertThat(…).

Original pull request: #594.
2018-08-14 10:51:41 +02:00
Christoph Strobl
f98f586a23 DATAMONGO-2049 - Add support for $ltrim, $rtrim, and $trim.
Original pull request: #594.
2018-08-14 10:51:41 +02:00
Mark Paluch
19b5b6b6f0 DATAMONGO-2048 - Polishing.
Javadoc tweaks.

Original pull request: #595.
2018-08-13 16:00:40 +02:00
Christoph Strobl
b9ffa9b89d DATAMONGO-2048 - Add support for MongoDB 4.0 $convert aggregation operator.
We now support the following type conversion aggregation operators:

* $convert
* $toBool
* $toDate
* $toDecimal
* $toDouble
* $toInt
* $toLong
* $toObjectId
* $toString

Original pull request: #595.
2018-08-13 16:00:40 +02:00
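A sketch of the new conversion operators used in an aggregation projection; the orderId/price field names are hypothetical and the convertToDecimal() factory method name is assumed from the operator list above:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ConvertOperators;

class PriceProjection {

    // Projects the string-valued "price" field as a decimal via $convert/$toDecimal.
    static Aggregation priceAsDecimal() {
        return newAggregation(
                project("orderId")
                        .and(ConvertOperators.valueOf("price").convertToDecimal()).as("priceAsDecimal"));
    }
}
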
Mark Paluch
3ba589072f DATAMONGO-2047 - Polishing.
Retain previous options when calling withTimezone(…)/onNull…(…). Add tests. Javadoc.

Original pull request: #593.
2018-08-13 13:27:27 +02:00
Christoph Strobl
e237c5dfc4 DATAMONGO-2047 - Update $dateToString and $dateFromString aggregation operators to match MongoDB 4.0 changes.
We added the format and onNull options to DateFromString and changed format to an optional parameter.

Original pull request: #593.
2018-08-13 13:27:26 +02:00
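The raw MongoDB 4.0 operator shape these options map to looks roughly like the following sketch built with the driver's Document API (not the Spring Data builder itself); the orderDateRaw field, the format string, and the onNull fallback are hypothetical:

import java.util.Date;

import org.bson.Document;

class DateFromStringSample {

    // $dateFromString with the optional "format" and "onNull" options introduced in MongoDB 4.0.
    static Document projectOrderDate() {
        return new Document("$project",
                new Document("orderDate",
                        new Document("$dateFromString",
                                new Document("dateString", "$orderDateRaw")
                                        .append("format", "%Y-%m-%d")
                                        .append("onNull", new Date(0)))));
    }
}
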
Mark Paluch
ecb560cdbc DATAMONGO-2045 - Polishing.
Use diamond syntax where possible. Add an initial size to HashMap instances with a known number of elements. Fix typos in private constant names. Fix duplicate error code ids.

Original pull request: #592.
2018-08-13 10:31:28 +02:00
Mark Paluch
fc4a21775a DATAMONGO-2043 - Polishing.
Slightly tweak Javadoc.

Original pull request: #589.
2018-08-08 11:01:14 +02:00
Christoph Strobl
ae62e70c52 DATAMONGO-2043 - Omit type hint when mapping simple types.
Original pull request: #589.
2018-08-08 11:01:09 +02:00
Christoph Strobl
f83622709d DATAMONGO-2027 - Polishing.
Remove duplicate tests and fix assertions on existing ones. Move tests over to AssertJ and fix the output database not being applied correctly.

Original Pull Request: #588
2018-08-07 13:37:22 +02:00
Mark Paluch
83d218081c DATAMONGO-2027 - Consider MapReduce output type.
We now consider the output type (collection output) when rendering the MapReduce command. Previously, all output was returned inline without storing the results in the configured collection.

Original Pull Request: #588

# Conflicts:
#	spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/ReactiveMongoTemplate.java
#	spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/mapreduce/ReactiveMapReduceTests.java
2018-08-07 13:20:46 +02:00
Mark Paluch
70fe406602 DATAMONGO-2006 - Updated changelog. 2018-07-27 11:45:23 +02:00
Mark Paluch
18046e9040 DATAMONGO-2007 - After release cleanups. 2018-07-26 15:23:24 +02:00
Mark Paluch
69310552e3 DATAMONGO-2007 - Prepare next development iteration. 2018-07-26 15:23:22 +02:00
Mark Paluch
b8f093269d DATAMONGO-2007 - Release version 2.0.9 (Kay SR9). 2018-07-26 14:44:00 +02:00
Mark Paluch
172db96fea DATAMONGO-2007 - Prepare 2.0.9 (Kay SR9). 2018-07-26 14:43:06 +02:00
Mark Paluch
c8381c734b DATAMONGO-2007 - Updated changelog. 2018-07-26 14:42:54 +02:00
Mark Paluch
bf82964474 DATAMONGO-1982 - Updated changelog. 2018-07-26 14:03:20 +02:00
Mark Paluch
2d0495874f DATAMONGO-2029 - Encode collections of UUID and byte array query method arguments to their binary form.
We now convert collections that only contain UUID or byte array items to a BSON list that contains the encoded form of these items. Previously, we only converted single UUID and byte array values into $binary, so lists rendered to e.g. $uuid, which does not work for queries.

Encoding is now encapsulated in strategy objects that implement the encoding only for their type. This allows breaking up the conditional flow and improves the organization of responsibilities.
2018-07-25 15:16:15 +02:00
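For example, a derived repository query taking a UUID collection now renders each item in its binary form; the Account domain type and repository below are hypothetical:

import java.util.Collection;
import java.util.List;
import java.util.UUID;

import org.springframework.data.repository.CrudRepository;

interface AccountRepository extends CrudRepository<Account, UUID> {

    // The UUIDs in the collection are encoded to $binary for the resulting $in query.
    List<Account> findByIdIn(Collection<UUID> ids);
}
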
Mark Paluch
82c91cbb71 DATAMONGO-2030 - Reinstantiate existsBy queries for reactive repositories.
We now support existsBy queries for reactive repositories to align with blocking repository support. ExistsBy support got lost during merging and is now back in place.

Extract boolean flag counting into BooleanUtil.
2018-07-23 16:34:12 +02:00
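A minimal sketch of a reactive existsBy query; the Person domain type and repository are hypothetical:

import org.springframework.data.repository.reactive.ReactiveCrudRepository;

import reactor.core.publisher.Mono;

interface ReactivePersonRepository extends ReactiveCrudRepository<Person, String> {

    // Derived existsBy query, supported again for reactive repositories.
    Mono<Boolean> existsByEmail(String email);
}
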
Christoph Strobl
4d309bd7f0 DATAMONGO-2011 - Relax type check when mapping collections.
Original pull request: #587.
2018-07-13 12:55:07 +02:00
Mark Paluch
6f011b0fa1 DATAMONGO-2021 - Polishing.
Adapt getResources(…) to use the file id and no longer the file name when opening a download stream. Add author tag.

Original pull request: #581.
2018-07-06 13:12:36 +02:00
Niklas Helge Hanft
1a3b9e3c42 DATAMONGO-2021 - Use getObjectId() instead of getFilename() for opening the GridFS download stream.
Using the file name leads to duplicate resource streams, as file names are not unique. We therefore use the file's ObjectId to look up the file content.

Original pull request: #581.
2018-07-06 13:12:36 +02:00
Mark Paluch
5a37468103 DATAMONGO-2016 - Polishing.
Fail gracefully if a query string parameter has no value. Reformat test. Convert assertions to AssertJ.

Original pull request: #578.
2018-07-04 11:25:38 +02:00
Stephen Tyler Conrad
d4b0963550 DATAMONGO-2016 - Fix username/password extraction in MongoCredentialPropertyEditor.
MongoCredentialPropertyEditor now inspects the connection URI for the appropriate delimiter tokens. Previously, the question mark character was used for username/password delimiter detection.

Original pull request: #578.
2018-07-04 11:25:35 +02:00
Mark Paluch
468c497525 DATAMONGO-1969 - After release cleanups. 2018-06-13 21:24:35 +02:00
Mark Paluch
4562f39d7a DATAMONGO-1969 - Prepare next development iteration. 2018-06-13 21:24:33 +02:00
Mark Paluch
49957e8c6e DATAMONGO-1969 - Release version 2.0.8 (Kay SR8). 2018-06-13 15:13:01 +02:00
Mark Paluch
b462b35284 DATAMONGO-1969 - Prepare 2.0.8 (Kay SR8). 2018-06-13 15:12:06 +02:00
Mark Paluch
445388bb5f DATAMONGO-1969 - Updated changelog. 2018-06-13 15:12:00 +02:00
Mark Paluch
61e9eac49b DATAMONGO-1967 - Updated changelog. 2018-06-13 15:01:58 +02:00
Mark Paluch
c219f6e7f2 DATAMONGO-2003 - Polishing.
Add nullability annotation to MongoParameterAccessor.getPoint(). Remove superfluous casts.

Convert MongoQueryCreatorUnitTests to use AssertJ assertions.

Original pull request: #570.
2018-06-11 14:19:02 +02:00
Christoph Strobl
1ab130ffca DATAMONGO-2003 - Fix derived query using regex pattern with options.
We now consider regex pattern options when using the pattern as a derived finder argument.

Original pull request: #570.
2018-06-11 14:19:01 +02:00
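As a sketch of what honoring pattern options means: flags compiled into a java.util.regex.Pattern (such as CASE_INSENSITIVE) end up in the $options of the resulting $regex, just as when building a Criteria directly; the lastname field is hypothetical:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.regex.Pattern;

import org.springframework.data.mongodb.core.query.Query;

class PatternQuerySample {

    // The CASE_INSENSITIVE flag is translated into the $options of the $regex condition.
    static Query byLastnameIgnoringCase(String lastname) {
        return query(where("lastname").regex(Pattern.compile(lastname, Pattern.CASE_INSENSITIVE)));
    }
}
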
Oliver Gierke
a4d6a0cf8a DATAMONGO-2002 - Fixed Criteria.equals(…) for usage with Pattern instances.
For Criteria instances that use regular expressions, we now properly compare the two Pattern instances produced by also including the pattern flags in the comparison.
2018-06-07 19:12:44 +02:00
Mark Paluch
c28f725f48 DATAMONGO-1979 - Polishing.
Convert ReactiveMongoRepositoryTests to AssertJ. Add missing verifyComplete() steps to StepVerifier.

Original pull request: #566.
2018-06-07 10:06:37 +02:00
Mark Paluch
a71f50f15c DATAMONGO-1998 - Polishing.
Switch the id field name check to an equals comparison or a match on the last property path segment.

Original pull request: #567.
2018-06-06 11:35:22 +02:00
Christoph Strobl
0ad715f806 DATAMONGO-1998 - Fix Querydsl id handling for nested property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when parsing Querydsl queries.

Original pull request: #567.
2018-06-06 11:35:22 +02:00
Mark Paluch
ba559c223a DATAMONGO-1986 - Polishing.
Refactor duplicated code into AggregationUtil.

Original pull request: #564.
2018-06-06 10:37:20 +02:00
Christoph Strobl
5f3ad68114 DATAMONGO-1986 - Always provide a typed AggregationOperationContext for TypedAggregation.
We now initialize a TypeBasedAggregationOperationContext for TypedAggregations if no context is provided. This makes sure that potential Criteria objects are run through the QueryMapper.
In case the default context is used, we now also make sure to at least run the aggregation pipeline through the QueryMapper to avoid passing non-MongoDB simple types on to the driver.

Original pull request: #564.
2018-06-06 10:37:20 +02:00
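For illustration, a TypedAggregation whose match Criteria is now mapped through the QueryMapper; the Person type and the age field are hypothetical:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.bson.Document;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;

class AdultsAggregation {

    // The typed context maps the Criteria field names and values before the pipeline hits the driver.
    static AggregationResults<Document> adults(MongoOperations operations) {
        TypedAggregation<Person> aggregation = newAggregation(Person.class, match(where("age").gte(18)));
        return operations.aggregate(aggregation, Document.class);
    }
}
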
Mark Paluch
28b18d25cb DATAMONGO-1988 - Polishing.
Match exactly on either top-level properties or leaf properties instead of accepting the property/field name suffix.

Original pull request: #565.
2018-06-05 11:14:11 +02:00
Christoph Strobl
22c0e5029c DATAMONGO-1988 - Fix query creation for id property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when creating queries. Prior to this change, String values, for example, would have been turned into ObjectIds when saving a document, but not when querying for it.

Original pull request: #565.
2018-06-05 11:13:01 +02:00
Christoph Strobl
4582d3152c DATAMONGO-1927 - Updated changelog. 2018-05-17 10:32:57 +02:00
Sébastien Deleuze
d219e8ed7c DATAMONGO-1980 - Fix a typo in CriteriaExtensions.kt.
Original pull request: #563.
2018-05-16 09:43:34 +02:00
Victor
ab7740faf5 DATAMONGO-1978 - Fix minor typo in Field.positionKey field name.
Original pull request: #558.
2018-05-15 12:30:06 +02:00
Mark Paluch
0fba00311d DATAMONGO-1466 - Polishing.
Switch conditionals to a Map-based Function registry to pick the appropriate converter. Fix typos in method names.

Original pull request: #561.
2018-05-15 11:30:21 +02:00
Christoph Strobl
33863999e6 DATAMONGO-1466 - Polishing.
Just some minor code style improvements.

Original pull request: #561.
2018-05-15 11:30:21 +02:00
Christoph Strobl
ae18958955 DATAMONGO-1466 - Add embedded typeinformation-based reading GeoJSON converter.
Original pull request: #561.
2018-05-15 11:30:21 +02:00
Mark Paluch
489d637a00 DATAMONGO-1974 - Polishing.
Fix typos, links, and code fences.

Original pull request: #559.
2018-05-11 15:29:57 +02:00
Jay Bryant
1b5ce651be DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.
Full editing pass of the Spring Data MongoDB reference guide. I also adjusted index.adoc to work with the changes I made to the build project, so that we get Epub and PDF as well as HTML.

Original pull request: #559.
2018-05-11 15:29:18 +02:00
Mark Paluch
e035210917 DATAMONGO-1971 - Polishing.
Remove outdated profiles.

Original pull request: #554.
2018-05-09 16:35:07 +02:00
Mark Paluch
57fc260c43 DATAMONGO-1971 - Install MongoDB 3.7.9 on TravisCI.
We now download and unpack MongoDB directly instead of using TravisCI's outdated MongoDB version.

Original pull request: #554.
2018-05-09 16:35:04 +02:00
Mark Paluch
f9ec63425e DATAMONGO-1918 - After release cleanups. 2018-05-08 15:04:28 +02:00
Mark Paluch
aedb50d728 DATAMONGO-1918 - Prepare next development iteration. 2018-05-08 15:04:26 +02:00
Mark Paluch
dbf4990f60 DATAMONGO-1918 - Release version 2.0.7 (Kay SR7). 2018-05-08 14:15:27 +02:00
Mark Paluch
c5c43158c2 DATAMONGO-1918 - Prepare 2.0.7 (Kay SR7). 2018-05-08 14:14:32 +02:00
Mark Paluch
56ffe7913d DATAMONGO-1918 - Updated changelog. 2018-05-08 14:14:23 +02:00
Mark Paluch
eae263eebc DATAMONGO-1917 - Updated changelog. 2018-05-08 12:22:53 +02:00
Mark Paluch
0dd2fa3dce DATAMONGO-1943 - Polishing.
Reduce visibility. Use List interface instead of concrete type.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Christoph Strobl
e648ea5903 DATAMONGO-1943 - Fix ClassCastException caused by SpringDataMongodbSerializer.
We now convert List-typed predicates to BasicDBList to meet MongodbSerializer's expectations for top-level lists used for the $and operator.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Mark Paluch
f389812b7c DATAMONGO-1869 - Updated changelog. 2018-04-13 15:11:29 +02:00
Mark Paluch
2127ddcbb8 DATAMONGO-1893 - Polishing.
Inherit fields from the previous operation if at least one field is excluded. Extend FieldsExposingAggregationOperation to conditionally inherit fields.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
Christoph Strobl
7f9ab3bb44 DATAMONGO-1893 - Allow exclusion of other fields than _id in aggregation $project.
As of MongoDB 3.4, exclusion of fields other than _id is allowed, so we removed the limitation in our code.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
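A minimal sketch of excluding a non-_id field in a $project stage; the comments field is hypothetical:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class ProjectionSample {

    // Excluding a field other than _id, allowed as of MongoDB 3.4.
    static Aggregation withoutComments() {
        return newAggregation(project().andExclude("comments"));
    }
}
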
Mark Paluch
aea40ca490 DATAMONGO-1888 - After release cleanups. 2018-04-04 16:42:33 +02:00
Mark Paluch
fb8d03db31 DATAMONGO-1888 - Prepare next development iteration. 2018-04-04 16:42:30 +02:00
Mark Paluch
890f08f19a DATAMONGO-1888 - Release version 2.0.6 (Kay SR6). 2018-04-04 15:53:22 +02:00
Mark Paluch
be58472777 DATAMONGO-1888 - Prepare 2.0.6 (Kay SR6). 2018-04-04 15:52:31 +02:00
Mark Paluch
b082d4ad98 DATAMONGO-1888 - Updated changelog. 2018-04-04 15:52:22 +02:00
Mark Paluch
e80b031f54 DATAMONGO-1857 - Updated changelog. 2018-04-04 15:16:20 +02:00
Mark Paluch
50b017c08b DATAMONGO-1903 - Polishing.
Remove the client-side operating system check, as operating system-dependent constraints depend on the server. Add a check for whitespace. Add author tags. Extend tests.

Adapt the check in SimpleReactiveMongoDatabaseFactory accordingly. Remove the superfluous UnknownHostException declaration in the reactive database factory. Replace references to legacy types in Javadoc with references to current ones.

Original pull request: #546.
2018-04-03 13:44:19 +02:00
George Moraitis
78429eb33d DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
We now test database names against the current (3.6) MongoDB specifications for database names.

Original pull request: #546.
2018-04-03 13:44:16 +02:00
Mark Paluch
3ed0bd7a18 DATAMONGO-1916 - Polishing.
Remove unused final keywords from method parameters and unused variables. Add nullable annotations to parameters that can be null. Fix generics.

Original pull request: #547.
2018-04-03 11:33:13 +02:00
Christoph Strobl
cbc923c727 DATAMONGO-1916 - Fix potential ClassCastException in MappingMongoConverter#writeInternal when writing collections.
Original pull request: #547.
2018-04-03 11:32:53 +02:00
Mark Paluch
f6ca0049b6 DATAMONGO-1834 - Polishing.
Increase visibility of Timezone factory methods. Add missing nullable annotation. Tweak Javadoc. Add tests for Timezone using expressions/field references.

Original Pull Request: #539
2018-03-28 11:40:14 +02:00
Christoph Strobl
82c9b0c662 DATAMONGO-1834 - Polishing.
Remove DateFactory and split up tests.
Introduce dedicated Timezone abstraction and update existing factories to apply the timezone if appropriate. Update builders and align code style.

Original Pull Request: #539
2018-03-28 11:39:58 +02:00
Matt Morrissette
3ca2349ce3 DATAMONGO-1834 - Add support for MongoDB 3.6 DateOperators $dateFromString, $dateFromParts and $dateToParts including timezones.
Original Pull Request: #539
2018-03-28 11:34:57 +02:00
Oliver Gierke
a76f157457 DATAMONGO-1915 - Removed explicit declaration of Jackson library versions. 2018-03-27 19:35:24 +02:00
Christoph Strobl
560a6a5bc2 DATAMONGO-1911 - Polishing.
Use native MongoDB Codec facilities to render binary and uuid.

Original Pull Request: #544
2018-03-27 14:17:46 +02:00
Mark Paluch
51d5c52193 DATAMONGO-1911 - Fix UUID serialization in String-based queries.
We now render to the correct UUID representation in String-based queries. Unquoted values render to the $binary representation; quoted UUIDs are rendered with their toString() value.

Previously we used JSON.serialize() to encode values to JSON. The com.mongodb.util.JSON serializer does not produce JSON that is compatible with Document.parse. It uses an older JSON format that preceded the MongoDB Extended JSON specification.

Original Pull Request: #544
2018-03-27 14:04:34 +02:00
Mark Paluch
56b6748068 DATAMONGO-1913 - Add missing nullable annotations to GridFsTemplate. 2018-03-26 14:10:53 +02:00
Felipe Zanardo Affonso
1e19f405cc DATAMONGO-1909 - Fix typo on return statement.
Original pull request: #523.
2018-03-21 16:05:25 +01:00
Mark Paluch
54d2c122eb DATAMONGO-1907 - Polishing.
Rename test method to reflect test subject.

Switch from flatMap(…) to map(…) to avoid overhead of Mono creation.

Original pull request: #541.
2018-03-21 09:54:07 +01:00
Ruben J Garcia
b47c5704e7 DATAMONGO-1907 - Adjust SimpleReactiveMongoRepository.findOne(…) to complete without exception on empty result
We no longer emit an exception via SimpleReactiveMongoRepository.findOne(Example) if the query completes without yielding a result. Previously, findOne(Example) emitted a NoSuchElementException if the query returned no result.

Original pull request: #541.
2018-03-21 09:51:32 +01:00
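A minimal sketch of the adjusted behavior, assuming a hypothetical Person domain type and a repository that extends ReactiveQueryByExampleExecutor:

import org.springframework.data.domain.Example;
import org.springframework.data.repository.query.ReactiveQueryByExampleExecutor;

import reactor.core.publisher.Mono;

class FindOneByExampleSample {

    // Completes empty when nothing matches the probe; no NoSuchElementException is emitted.
    static Mono<Person> findByProbe(ReactiveQueryByExampleExecutor<Person> repository, Person probe) {
        return repository.findOne(Example.of(probe));
    }
}
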
Oliver Gierke
6b0b1cd97d DATAMONGO-1904 - Optimizations in MappingMongoConverter.readCollectionOrArray(…).
Switched to ClassUtils.isAssignableValue(…) in getPotentiallyConvertedSimpleRead(…) as it transparently handles primitives and their wrapper types so that we can avoid the superfluous invocation of the converter infrastructure.
2018-03-15 15:03:53 +01:00
Oliver Gierke
35bbc604aa DATAMONGO-1904 - Fixed handling of nested arrays on reads in MappingMongoConverter.
We now properly forward the component type information into recursive calls to MappingMongoConverter.readCollectionOrArray(…).
2018-03-15 15:03:51 +01:00
Oliver Gierke
9ade830a10 DATAMONGO-1901 - Added project.root configuration to make JavaDoc generation work again.
Related ticket: https://github.com/spring-projects/spring-data-build/issues/527.
2018-03-14 09:37:46 +01:00
Oliver Gierke
8fbff50f4f DATAMONGO-1898 - Added unit tests for the conversion handling of enums implementing interfaces.
Related tickets: DATACMNS-1278.
2018-03-12 11:07:40 +01:00
Oliver Gierke
14b49638a0 DATAMONGO-1896 - SimpleMongoRepository.saveAll(…) now consistently uses aggregate collection for inserts.
We previously used MongoTemplate.insertAll(…), which determines the collection to insert the individual elements into based on their type and which - in cases of entity inheritance - will use dedicated collections for sub-types of the aggregate root. Subsequent lookups of the entities will then fail, as those are executed against the collection the aggregate root is mapped to.

We now rather use ….insert(Collection, String), handing in the collection of the aggregate root explicitly.
2018-03-09 00:03:44 +01:00
Mark Paluch
dc31f4f32f DATAMONGO-1882 - After release cleanups. 2018-02-28 10:43:35 +01:00
Mark Paluch
708f9ac7b3 DATAMONGO-1882 - Prepare next development iteration. 2018-02-28 10:43:34 +01:00
Mark Paluch
17d6100426 DATAMONGO-1882 - Release version 2.0.5 (Kay SR5). 2018-02-28 10:14:58 +01:00
Mark Paluch
27a4e25880 DATAMONGO-1882 - Prepare 2.0.5 (Kay SR5). 2018-02-28 10:14:05 +01:00
Mark Paluch
d378bcb442 DATAMONGO-1882 - Updated changelog. 2018-02-28 10:13:57 +01:00
Mark Paluch
f6505c7758 DATAMONGO-1859 - After release cleanups. 2018-02-19 20:29:08 +01:00
Mark Paluch
d25f88c70e DATAMONGO-1859 - Prepare next development iteration. 2018-02-19 20:29:06 +01:00
Mark Paluch
cec6edfa26 DATAMONGO-1859 - Release version 2.0.4 (Kay SR4). 2018-02-19 19:46:53 +01:00
Mark Paluch
3261936e8a DATAMONGO-1859 - Prepare 2.0.4 (Kay SR4). 2018-02-19 19:46:05 +01:00
Mark Paluch
d2d471d135 DATAMONGO-1859 - Updated changelog. 2018-02-19 19:45:58 +01:00
Mark Paluch
bcd2de000c DATAMONGO-1870 - Polishing.
Extend copyright license years. Slightly reword documentation. Use IntStream and insertAll to create test fixture.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:14 +01:00
Christoph Strobl
c873e49d71 DATAMONGO-1870 - Consider skip/limit on MongoOperations.remove(Query, Class).
We now use an _id lookup for remove operations that query with limit or skip parameters. This allows more fine-grained control over the documents removed.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:05 +01:00
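For illustration, a remove call that now honors the limit via the _id lookup; the Order type and state field are hypothetical:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

class CleanupSample {

    // Removes at most 100 matching documents instead of ignoring the limit.
    static void removeBatch(MongoOperations operations) {
        operations.remove(query(where("state").is("archived")).limit(100), Order.class);
    }
}
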
Christoph Strobl
4ebcac19bc DATAMONGO-1860 - Polishing.
Fix references to QuerydslPredicateExecutor.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
78212948bc DATAMONGO-1860 - Polishing.
Fix type references in Javadoc. Change lambdas to method references where applicable.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
38575baec1 DATAMONGO-1860 - Retrieve result count via QuerydslMongoPredicateExecutor only for paging.
We now use AbstractMongodbQuery.fetch() instead of AbstractMongodbQuery.fetchResults() to execute MongoDB queries. fetchResults() executes both a find(…) and a count(…) query. Retrieving the record count is an expensive operation in MongoDB, and the count is not always required. For regular find(…) methods, the count is ignored; for paging, the count(…) is only required in certain result/request scenarios.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
f1a3c37a79 DATAMONGO-1865 - Polishing.
Adapt to collection name retrieval during query execution. Slightly reword documentation and JavaDoc.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
Christoph Strobl
c668a47243 DATAMONGO-1865 - Avoid IncorrectResultSizeDataAccessException for derived findFirst/findTop queries.
We now return the first result when executing findFirst/findTop queries. This fixes a glitch introduced in the Kay release that threw IncorrectResultSizeDataAccessException for single-entity executions returning more than one result, which is explicitly not the desired behavior in this case.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
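For example, a derived findFirst query returns the first match again even if several documents match; the Person type is hypothetical:

import org.springframework.data.repository.Repository;

interface PersonQueries extends Repository<Person, String> {

    // Returns the first match instead of throwing IncorrectResultSizeDataAccessException
    // when more than one document matches.
    Person findFirstByLastnameOrderByFirstnameAsc(String lastname);
}
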
Mark Paluch
6a20ddf5a2 DATAMONGO-1871 - Polishing.
Migrate test to AssertJ.

Original pull request: #533.
2018-02-14 11:05:20 +01:00
Christoph Strobl
cec6526543 DATAMONGO-1871 - Fix AggregationExpression aliasing.
We now make sure to allow a nested property alias by setting the target.

Original pull request: #533.
2018-02-14 11:05:17 +01:00
Oliver Gierke
46ea58f3b9 DATAMONGO-1872 - Polishing.
Fixed @since tag for newly introduced method in MongoEntityMetadata.
2018-02-13 12:24:09 +01:00
Oliver Gierke
ebaea8d22f DATAMONGO-1872 - Repository query execution doesn't prematurely fix collection to be queried.
We now avoid calling ….inCollection(…) with a fixed, one-time calculated collection name to make sure we dynamically resolve the collections. That's necessary to make sure SpEL expressions in @Document are evaluated for every query execution.
2018-02-13 12:18:19 +01:00
Christoph Strobl
ed6aaeed25 DATAMONGO-1794 - Updated changelog. 2018-02-06 11:14:01 +01:00
Mark Paluch
89b1b6fbb2 DATAMONGO-1830 - After release cleanups. 2018-01-24 13:46:10 +01:00
Mark Paluch
23769301b5 DATAMONGO-1830 - Prepare next development iteration. 2018-01-24 13:46:08 +01:00
Mark Paluch
3399160acf DATAMONGO-1830 - Release version 2.0.3 (Kay SR3). 2018-01-24 13:21:24 +01:00
Mark Paluch
32a8ee9b31 DATAMONGO-1830 - Prepare 2.0.3 (Kay SR3). 2018-01-24 13:20:39 +01:00
Mark Paluch
17cea70abc DATAMONGO-1830 - Updated changelog. 2018-01-24 13:20:34 +01:00
Mark Paluch
07731c39ba DATAMONGO-1858 - Fix line endings to LF. 2018-01-24 12:57:24 +01:00
Mark Paluch
c5b580b82b DATAMONGO-1829 - Updated changelog. 2018-01-24 12:22:10 +01:00
Mark Paluch
9a1385186e DATAMONGO-1843 - Polishing.
Convert anonymous classes to lambdas. Typo fixes. Migrate test to AssertJ.

Original pull request: #526.
2018-01-23 10:34:46 +01:00
Christoph Strobl
704524d7f4 DATAMONGO-1843 - Fix parameter shadowing in ArrayOperators reduce.
Original pull request: #526.
2018-01-23 10:34:35 +01:00
Christoph Strobl
cc9a3ac8da DATAMONGO-1850 - Polishing.
Remove blank line, add tests and migrate to AssertJ.

Original Pull Request: #527
2018-01-22 16:16:52 +01:00
Mark Paluch
acb68f3ca4 DATAMONGO-1850 - Guard GridFsResource.getContentType() against absent file metadata.
We now fall back to GridFS.getContentType() if GridFS metadata is absent, to prevent a null dereference.

Original Pull Request: #527
2018-01-22 15:14:35 +01:00
Mark Paluch
3088f0469e DATAMONGO-1824 - Polishing.
Move method from AggregationCommandPreparer and AggregationResultPostProcessor to BatchAggregationLoader. Extract field names to constants. Tiny renames to variables. Add unit test for aggregation response without cursor use. Migrate test to AssertJ.

Original pull request: #521.
2017-12-15 14:30:24 +01:00
Christoph Strobl
a1ae04881d DATAMONGO-1824 - Skip tests no longer applicable for MongoDB 3.6.
Original pull request: #521.
2017-12-15 14:25:31 +01:00
Christoph Strobl
6f55c66060 DATAMONGO-1824 - Fix aggregation execution for MongoDB 3.6.
We now send aggregation commands along with a cursor batch size for compatibility with MongoDB 3.6, which no longer supports aggregations without a cursor. We consume the whole cursor before returning and converting results and omit the 16MB aggregation result limit. For MongoDB versions not supporting aggregation cursors, we return results directly.

Original pull request: #521.
2017-12-15 14:25:20 +01:00
Christoph Strobl
f86447bd04 DATAMONGO-1831 - Fix array type conversion for empty source.
We now make sure that we convert empty sources to the corresponding target type. This prevents entity instantiation from failing due to incorrect argument types when invoking the constructor.

Original pull request: #520.
2017-12-02 12:18:55 -08:00
Mark Paluch
1bb4324b2e DATAMONGO-1816 - After release cleanups. 2017-11-27 16:42:54 +01:00
Mark Paluch
856506f121 DATAMONGO-1816 - Prepare next development iteration. 2017-11-27 16:42:52 +01:00
Mark Paluch
2a81dc75a8 DATAMONGO-1816 - Release version 2.0.2 (Kay SR2). 2017-11-27 16:12:34 +01:00
Mark Paluch
58cd4c08ca DATAMONGO-1816 - Prepare 2.0.2 (Kay SR2). 2017-11-27 16:11:21 +01:00
Mark Paluch
344e019143 DATAMONGO-1816 - Updated changelog. 2017-11-27 16:11:15 +01:00
Mark Paluch
918b7e96bb DATAMONGO-1799 - Updated changelog. 2017-11-27 15:58:45 +01:00
Christoph Strobl
fce7a5c1cb DATAMONGO-1818 - Polishing.
Move overlapping/duplicate documentation into one place.

Original Pull Request: #512
2017-11-27 07:53:22 +01:00
Mark Paluch
dbd2de8e0f DATAMONGO-1818 - Reword tailable cursors documentation.
Fix reference to @Tailable annotation. Slightly reword documentation.

Original Pull Request: #512
2017-11-27 07:53:08 +01:00
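A minimal sketch of the tailable cursor feature the documentation covers; the LogEntry type is hypothetical and its collection must be capped:

import org.springframework.data.mongodb.repository.Tailable;
import org.springframework.data.repository.Repository;

import reactor.core.publisher.Flux;

interface LogEntryRepository extends Repository<LogEntry, String> {

    // Tailable cursor on a capped collection: the Flux keeps emitting newly inserted documents.
    @Tailable
    Flux<LogEntry> findByLevel(String level);
}
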
Mark Paluch
0dbe331ab0 DATAMONGO-1823 - Polishing.
Replace constructor with lombok's RequiredArgsConstructor. Add Nullable annotation. Tiny reformatting. Align license header. Migrate test to AssertJ.

Original pull request: #517.
2017-11-22 14:33:20 +01:00
Christoph Strobl
846ebcd91d DATAMONGO-1823 - Emit ApplicationEvents using projecting find methods.
We now again emit application events when using finder methods that apply projection.

Original pull request: #517.
2017-11-22 14:33:20 +01:00
Oliver Gierke
9e0b5caeac DATAMONGO-1737 - BasicMongoPersistentEntity now correctly initializes comparator.
In BasicMongoPersistentEntity.verify() we now properly call the super method to make sure the comparators that honor the @Field's order value are initialized properly.
2017-11-17 14:55:00 +01:00
Mark Paluch
cf70f5e5eb DATAMONGO-1819 - Polishing.
Use native field names for NamedMongoScript query instead of relying on metadata-based mapping as NamedMongoScript is considered a simple top-level type.

Related pull request: #513.
2017-11-17 13:49:06 +01:00
Mark Paluch
331dc6df6f DATAMONGO-1821 - Fix method ambiguity in tests when compiling against MongoDB 3.6 2017-11-07 12:47:51 +01:00
Mark Paluch
a51dce2c90 DATAMONGO-1820 - Set Mongo's Feature Compatibility flag for TravisCI build to 3.4.
Apply setFeatureCompatibilityVersion to upgrade MongoDB to 3.4 features.
2017-11-06 10:28:10 +01:00
Mark Paluch
c0cf1aa95b DATAMONGO-1817 - Polishing.
Remove blank line.

Original pull request: #510.
2017-11-06 10:02:35 +01:00
Sola
7104ffa543 DATAMONGO-1817 - Align nullability in Kotlin MongoOperationsExtensions with Java API.
Return types in MongoOperationsExtensions are now aligned to the nullability of MongoOperations.

Original pull request: #510.
2017-11-06 10:02:35 +01:00
Oliver Gierke
28d2fb6680 DATAMONGO-1793 - After release cleanups. 2017-10-27 15:50:48 +02:00
Oliver Gierke
140e26946f DATAMONGO-1793 - Prepare next development iteration. 2017-10-27 15:50:45 +02:00
Oliver Gierke
f4e730ce87 DATAMONGO-1793 - Release version 2.0.1 (Kay SR1). 2017-10-27 15:25:11 +02:00
Oliver Gierke
e3a83ebc42 DATAMONGO-1793 - Prepare 2.0.1 (Kay SR1). 2017-10-27 15:24:24 +02:00
Oliver Gierke
f65c1e324e DATAMONGO-1793 - Updated changelog. 2017-10-27 15:24:14 +02:00
Oliver Gierke
1dd0061f03 DATAMONGO-1815 - Adapt API changes in Property in test cases. 2017-10-27 11:13:31 +02:00
Mark Paluch
5ea860700c DATAMONGO-1814 - Update reference documentation for faceted classification.
Original pull request: #426.
Original ticket: DATAMONGO-1552.
2017-10-26 09:44:50 +02:00
Christoph Strobl
3dd653a702 DATAMONGO-1811 - Update documentation of MongoOperations.executeCommand.
Update Javadoc and reference documentation.
2017-10-24 14:59:47 +02:00
Christoph Strobl
f87847407b DATAMONGO-1805 - Update GridFsOperations documentation.
Fix return type in reference documentation and update Javadoc.
2017-10-24 14:59:40 +02:00
Christoph Strobl
433a125c9e DATAMONGO-1806 - Polishing.
Remove unused import, trailing whitespaces and update Javadoc.

Original Pull Request: #506
2017-10-24 14:59:33 +02:00
hartmut
5827cb0971 DATAMONGO-1806 - Fix Javadoc for GridFsResource.
Original Pull Request: #506
2017-10-24 14:59:24 +02:00
Mark Paluch
0109bf6858 DATAMONGO-1809 - Introduce AssertJ assertions for Document.
Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
49d1555576 DATAMONGO-1809 - Polishing.
Move tests to AssertJ.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
fdbb305b8e DATAMONGO-1809 - Fix positional parameter detection for PropertyPaths.
We now make sure to capture all digits for positional parameters.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Mark Paluch
49dd03311a DATAMONGO-1696 - Mention appropriate EnableMongoAuditing annotation in reference documentation. 2017-10-20 08:45:33 +02:00
Mark Paluch
a86a3210e1 DATAMONGO-1802 - Polishing.
Reduce converter visibility to MongoConverters' package scope. Tiny alignment in Javadoc wording. Update copyright years, create empty byte array with an element count instead of an initializer.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Christoph Strobl
4b655abfb6 DATAMONGO-1802 - Add Binary to byte array converter.
We now provide and register a Binary-to-byte[] converter for converting binary data to a byte array. MongoDB deserializes binary data via the document API to its Binary type. With this converter, we reinstated the previous capability to use byte arrays for binary data within domain types.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Oliver Gierke
0963e6cf77 DATAMONGO-1775 - Updated changelog. 2017-10-11 19:03:29 +02:00
Oliver Gierke
3e1b2c4bdb DATAMONGO-1795 - Removed obsolete Kotlin build setup. 2017-10-04 11:05:27 +02:00
Mark Paluch
03e0e0c431 DATAMONGO-1776 - After release cleanups. 2017-10-02 11:38:04 +02:00
Mark Paluch
51900021a1 DATAMONGO-1776 - Prepare next development iteration. 2017-10-02 11:38:03 +02:00
766 changed files with 6517 additions and 19560 deletions


@@ -3,44 +3,33 @@ language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
before_install:
- mkdir -p downloads
- mkdir -p var/db var/log
- if [[ ! -d downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION} ]] ; then cd downloads && wget https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && tar xzf mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && cd ..; fi
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --version
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --dbpath var/db --replSet rs0 --fork --logpath var/log/mongod.log
- sleep 10
- |-
echo "replication:
replSetName: rs0" | sudo tee -a /etc/mongod.conf
- sudo service mongod restart
- sleep 20
- |-
mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
- sleep 15
services:
- mongodb
downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
sleep 15
env:
matrix:
- PROFILE=ci
- PROFILE=mongo36-next
global:
- MONGO_VERSION=3.7.9
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
- oracle-java8-installer
sudo: false
cache:
directories:
- $HOME/.m2
install:
- |-
mongo admin --eval "db.adminCommand({setFeatureCompatibilityVersion: '3.4'});"
- downloads
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"


@@ -83,7 +83,7 @@ You can have Spring automatically create a proxy for the interface by using the
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() throws Exception {
public Mongo mongo() throws Exception {
return new MongoClient();
}

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
</parent>
<modules>
@@ -27,9 +27,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.1.0.M2</springdata.commons>
<mongo>3.6.3</mongo>
<mongo.reactivestreams>1.7.1</mongo.reactivestreams>
<springdata.commons>2.0.10.RELEASE</springdata.commons>
<mongo>3.5.0</mongo>
<mongo.reactivestreams>1.6.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -115,38 +115,6 @@
<profiles>
<!-- not-yet available profile>
<id>mongo35-next</id>
<properties>
<mongo>3.5.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile -->
<profile>
<id>mongo36-next</id>
<properties>
<mongo>3.6.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
@@ -170,6 +138,24 @@
</modules>
</profile>
<profile>
<id>distribute</id>
<build>
<plugins>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>
@@ -183,8 +169,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>
@@ -193,11 +179,6 @@
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</pluginRepository>
</pluginRepositories>
</project>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -50,7 +50,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
</dependency>
<!-- reactive -->


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -204,7 +204,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
/**
* Returns the collection the given entity type shall be persisted to.
*
*
* @param entityClass must not be {@literal null}.
* @return
*/


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,7 +39,7 @@ import org.springframework.transaction.support.TransactionTemplate;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
*
* @author Thomas Risberg
* @author Oliver Gierke
*/


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.0.10.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -20,7 +20,6 @@
<equalsverifier>1.7.8</equalsverifier>
<java-module-name>spring.data.mongodb</java-module-name>
<project.root>${basedir}/..</project.root>
<multithreadedtc>1.01</multithreadedtc>
</properties>
<dependencies>
@@ -246,13 +245,6 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>edu.umd.cs.mtc</groupId>
<artifactId>multithreadedtc</artifactId>
<version>${multithreadedtc}</version>
<scope>test</scope>
</dependency>
<!-- Kotlin extension -->
<dependency>
<groupId>org.jetbrains.kotlin</groupId>


@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
@@ -39,7 +39,7 @@ public class BulkOperationException extends DataAccessException {
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,48 +0,0 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.NonTransientDataAccessException;
import org.springframework.lang.Nullable;
/**
* {@link NonTransientDataAccessException} specific to MongoDB {@link com.mongodb.session.ClientSession} related data
* access failures such as reading data using an already closed session.
*
* @author Christoph Strobl
* @since 2.1
*/
public class ClientSessionException extends NonTransientDataAccessException {
/**
* Constructor for {@link ClientSessionException}.
*
* @param msg the detail message. Must not be {@literal null}.
*/
public ClientSessionException(String msg) {
super(msg);
}
/**
* Constructor for {@link ClientSessionException}.
*
* @param msg the detail message. Can be {@literal null}.
* @param cause the root cause. Can be {@literal null}.
*/
public ClientSessionException(@Nullable String msg, @Nullable Throwable cause) {
super(msg, cause);
}
}


@@ -1,74 +0,0 @@
/*
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.Optional;
import org.bson.codecs.Codec;
import org.bson.codecs.configuration.CodecConfigurationException;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.util.Assert;
/**
* Provider interface to obtain {@link CodecRegistry} from the underlying MongoDB Java driver.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@FunctionalInterface
public interface CodecRegistryProvider {
/**
* Get the underlying {@link CodecRegistry} used by the MongoDB Java driver.
*
* @return never {@literal null}.
* @throws IllegalStateException if {@link CodecRegistry} cannot be obtained.
*/
CodecRegistry getCodecRegistry();
/**
* Checks if a {@link Codec} is registered for a given type.
*
* @param type must not be {@literal null}.
* @return true if {@link #getCodecRegistry()} holds a {@link Codec} for given type.
* @throws IllegalStateException if {@link CodecRegistry} cannot be obtained.
*/
default boolean hasCodecFor(Class<?> type) {
return getCodecFor(type).isPresent();
}
/**
* Get the {@link Codec} registered for the given {@literal type} or an {@link Optional#empty() empty Optional}
* instead.
*
* @param type must not be {@literal null}.
* @param <T>
* @return never {@literal null}.
* @throws IllegalArgumentException if {@literal type} is {@literal null}.
*/
default <T> Optional<Codec<T>> getCodecFor(Class<T> type) {
Assert.notNull(type, "Type must not be null!");
try {
return Optional.of(getCodecRegistry().get(type));
} catch (CodecConfigurationException e) {
// ignore
}
return Optional.empty();
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,7 +23,7 @@ import org.springframework.util.StringUtils;
* <p/>
* <p/>
* Mainly intended for internal use within the framework.
*
*
* @author Thomas Risberg
* @since 1.0
*/
@@ -38,7 +38,7 @@ public abstract class MongoCollectionUtils {
/**
* Obtains the collection name to use for the provided class
*
*
* @param entityClass The class to determine the preferred collection name for
* @return The preferred collection name
*/


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,24 +15,20 @@
*/
package org.springframework.data.mongodb;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.DB;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
*/
public interface MongoDbFactory extends CodecRegistryProvider {
public interface MongoDbFactory {
/**
* Creates a default {@link MongoDatabase} instance.
@@ -59,45 +55,4 @@ public interface MongoDbFactory extends CodecRegistryProvider {
PersistenceExceptionTranslator getExceptionTranslator();
DB getLegacyDb();
/**
* Get the underlying {@link CodecRegistry} used by the MongoDB Java driver.
*
* @return never {@literal null}.
*/
@Override
default CodecRegistry getCodecRegistry() {
return getDb().getCodecRegistry();
}
/**
* Obtain a {@link ClientSession} for given ClientSessionOptions.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
ClientSession getSession(ClientSessionOptions options);
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoDbFactory} returning {@link MongoDatabase} instances
* that are aware and bound to a new session with given {@link ClientSessionOptions options}.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
default MongoDbFactory withSession(ClientSessionOptions options) {
return withSession(getSession(options));
}
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoDbFactory} returning {@link MongoDatabase} instances
* that are aware and bound to the given session.
*
* @param session must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
MongoDbFactory withSession(ClientSession session);
}


@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,25 +16,19 @@
package org.springframework.data.mongodb;
import reactor.core.publisher.Mono;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Interface for factories creating reactive {@link MongoDatabase} instances.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
public interface ReactiveMongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
@@ -59,33 +53,4 @@ public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
/**
* Get the underlying {@link CodecRegistry} used by the reactive MongoDB Java driver.
*
* @return never {@literal null}.
*/
@Override
default CodecRegistry getCodecRegistry() {
return getMongoDatabase().getCodecRegistry();
}
/**
* Obtain a {@link Mono} emitting a {@link ClientSession} for given {@link ClientSessionOptions options}.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
Mono<ClientSession> getSession(ClientSessionOptions options);
/**
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoDatabaseFactory} returning
* {@link MongoDatabase} instances that are aware and bound to the given session.
*
* @param session must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
ReactiveMongoDatabaseFactory withSession(ClientSession session);
}


@@ -1,211 +0,0 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.Optional;
import java.util.function.BiFunction;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.springframework.core.MethodClassKey;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ConcurrentReferenceHashMap;
import org.springframework.util.ReflectionUtils;
import com.mongodb.WriteConcern;
import com.mongodb.session.ClientSession;
/**
* {@link MethodInterceptor} implementation looking up and invoking an alternative target method having
* {@link ClientSession} as its first argument. This allows seamless integration with the existing code base.
* <p />
* The {@link MethodInterceptor} is aware of methods on {@code MongoCollection} that my return new instances of itself
* like (eg. {@link com.mongodb.reactivestreams.client.MongoCollection#withWriteConcern(WriteConcern)} and decorate them
* if not already proxied.
*
* @param <D> Type of the actual Mongo Database.
* @param <C> Type of the actual Mongo Collection.
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
private static final MethodCache METHOD_CACHE = new MethodCache();
private final ClientSession session;
private final ClientSessionOperator collectionDecorator;
private final ClientSessionOperator databaseDecorator;
private final Object target;
private final Class<?> targetType;
private final Class<?> collectionType;
private final Class<?> databaseType;
/**
* Create a new SessionAwareMethodInterceptor for given target.
*
* @param session the {@link ClientSession} to be used on invocation.
* @param target the original target object.
* @param databaseType the MongoDB database type
* @param databaseDecorator a {@link ClientSessionOperator} used to create the proxy for an imperative / reactive
* {@code MongoDatabase}.
* @param collectionType the MongoDB collection type.
* @param collectionDecorator a {@link ClientSessionOperator} used to create the proxy for an imperative / reactive
* {@code MongoCollection}.
* @param <T> target object type.
*/
public <T> SessionAwareMethodInterceptor(ClientSession session, T target, Class<D> databaseType,
ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
ClientSessionOperator<C> collectionDecorator) {
Assert.notNull(session, "ClientSession must not be null!");
Assert.notNull(target, "Target must not be null!");
Assert.notNull(databaseType, "Database type must not be null!");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null!");
Assert.notNull(collectionType, "Collection type must not be null!");
Assert.notNull(collectionDecorator, "Collection ClientSessionOperator must not be null!");
this.session = session;
this.target = target;
this.databaseType = ClassUtils.getUserClass(databaseType);
this.collectionType = ClassUtils.getUserClass(collectionType);
this.collectionDecorator = collectionDecorator;
this.databaseDecorator = databaseDecorator;
this.targetType = ClassUtils.isAssignable(databaseType, target.getClass()) ? databaseType : collectionType;
}
/*
* (non-Javadoc)
* @see org.aopalliance.intercept.MethodInterceptor#invoke(org.aopalliance.intercept.MethodInvocation)
*/
@Nullable
@Override
public Object invoke(MethodInvocation methodInvocation) throws Throwable {
if (requiresDecoration(methodInvocation.getMethod())) {
Object target = methodInvocation.proceed();
if (target instanceof Proxy) {
return target;
}
return decorate(target);
}
if (!requiresSession(methodInvocation.getMethod())) {
return methodInvocation.proceed();
}
Optional<Method> targetMethod = METHOD_CACHE.lookup(methodInvocation.getMethod(), targetType);
return !targetMethod.isPresent() ? methodInvocation.proceed()
: ReflectionUtils.invokeMethod(targetMethod.get(), target,
prependSessionToArguments(session, methodInvocation));
}
private boolean requiresDecoration(Method method) {
return ClassUtils.isAssignable(databaseType, method.getReturnType())
|| ClassUtils.isAssignable(collectionType, method.getReturnType());
}
@SuppressWarnings("unchecked")
protected Object decorate(Object target) {
return ClassUtils.isAssignable(databaseType, target.getClass()) ? databaseDecorator.apply(session, target)
: collectionDecorator.apply(session, target);
}
private static boolean requiresSession(Method method) {
if (method.getParameterCount() == 0
|| !ClassUtils.isAssignable(ClientSession.class, method.getParameterTypes()[0])) {
return true;
}
return false;
}
private static Object[] prependSessionToArguments(ClientSession session, MethodInvocation invocation) {
Object[] args = new Object[invocation.getArguments().length + 1];
args[0] = session;
System.arraycopy(invocation.getArguments(), 0, args, 1, invocation.getArguments().length);
return args;
}
/**
* Simple {@link Method} to {@link Method} caching facility for {@link ClientSession} overloaded targets.
*
* @since 2.1
* @author Christoph Strobl
*/
static class MethodCache {
private final ConcurrentReferenceHashMap<MethodClassKey, Optional<Method>> cache = new ConcurrentReferenceHashMap<>();
/**
* Look up the session-aware target {@link Method} for the given source method.
*
* @param method the invoked method for which to find an overload accepting a {@link ClientSession} as first argument.
* @param targetClass the class to search for the overloaded method.
* @return the target {@link Method} or {@link Optional#empty()} if no such overload exists.
*/
Optional<Method> lookup(Method method, Class<?> targetClass) {
return cache.computeIfAbsent(new MethodClassKey(method, targetClass),
val -> Optional.ofNullable(findTargetWithSession(method, targetClass)));
}
@Nullable
private Method findTargetWithSession(Method sourceMethod, Class<?> targetType) {
Class<?>[] argTypes = sourceMethod.getParameterTypes();
Class<?>[] args = new Class<?>[argTypes.length + 1];
args[0] = ClientSession.class;
System.arraycopy(argTypes, 0, args, 1, argTypes.length);
return ReflectionUtils.findMethod(targetType, sourceMethod.getName(), args);
}
/**
* Check whether the cache contains an entry for {@link Method} and {@link Class}.
*
* @param method the invoked method.
* @param targetClass the target class.
* @return {@literal true} if an entry is present for the given combination.
*/
boolean contains(Method method, Class<?> targetClass) {
return cache.containsKey(new MethodClassKey(method, targetClass));
}
}
/**
* Represents an operation upon two operands of the same type, producing a result of the same type as the operands
* accepting {@link ClientSession}. This is a specialization of {@link BiFunction} for the case where the operands and
* the result are all of the same type.
*
* @param <T> the type of the operands and result of the operator
*/
public interface ClientSessionOperator<T> extends BiFunction<ClientSession, T, T> {}
}
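A hedged sketch of how the interceptor might be wired into an AOP proxy; the ProxyFactory usage and the createSessionBoundProxy helper are illustrative assumptions, not part of this change:

// Illustrative wiring: decorate an imperative MongoDatabase so that every call runs with the given session.
ProxyFactory proxyFactory = new ProxyFactory(database);
proxyFactory.addInterface(MongoDatabase.class);
proxyFactory.addAdvice(new SessionAwareMethodInterceptor<>(session, database, MongoDatabase.class,
		(clientSession, db) -> createSessionBoundProxy(clientSession, db), // hypothetical helper
		MongoCollection.class,
		(clientSession, collection) -> createSessionBoundProxy(clientSession, collection)));
MongoDatabase sessionBound = (MongoDatabase) proxyFactory.getProxy();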

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,7 +17,7 @@ package org.springframework.data.mongodb.config;
/**
* Constants to declare bean names used by the namespace configuration.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,7 @@ import org.springframework.data.domain.AuditorAware;
/**
* Annotation to enable auditing in MongoDB via annotation configuration.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
@@ -41,21 +41,21 @@ public @interface EnableMongoAuditing {
/**
* Configures the {@link AuditorAware} bean to be used to lookup the current principal.
*
*
* @return
*/
String auditorAwareRef() default "";
/**
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
*
*
* @return
*/
boolean setDates() default true;
/**
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
*
*
* @return
*/
boolean modifyOnCreate() default true;
@@ -63,7 +63,7 @@ public @interface EnableMongoAuditing {
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
* used for setting creation and modification dates.
*
*
* @return
*/
String dateTimeProviderRef() default "";

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,7 @@ import org.springframework.data.web.config.SpringDataJacksonModules;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
*
* @author Oliver Gierke
* @author Jens Schauder
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,7 @@ import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
*
*
* @author Martin Baumgartner
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2018 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,12 +33,12 @@ import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to register a {@link AuditingEventListener} to transparently set auditing information on
* an entity.
*
*
* @author Oliver Gierke
*/
public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@@ -47,7 +47,7 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
return AuditingEventListener.class;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@@ -56,7 +56,7 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
return true;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,13 +37,13 @@ import org.springframework.util.Assert;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@@ -52,7 +52,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return EnableMongoAuditing.class;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@@ -61,7 +61,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return "mongoAuditingHandler";
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@@ -74,7 +74,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@@ -92,7 +92,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@@ -125,14 +125,14 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@@ -141,7 +141,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return converter.getMappingContext();
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@@ -150,7 +150,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return MappingContext.class;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,13 +29,13 @@ import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.io.UnsupportedEncodingException;
import java.lang.reflect.Method;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.Arrays;
@@ -26,6 +27,7 @@ import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.lang.Nullable;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
@@ -35,6 +37,8 @@ import com.mongodb.MongoCredential;
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Stephen Tyler Conrad
* @author Mark Paluch
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
@@ -98,6 +102,20 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if ("SCRAM-SHA-256".equals(authMechanism)) {
Method createScramSha256Credential = ReflectionUtils.findMethod(MongoCredential.class,
"createScramSha256Credential", String.class, String.class, char[].class);
if (createScramSha256Credential == null) {
throw new IllegalArgumentException(
"SCRAM-SHA-256 auth mechanism is available as of MongoDB 4 and MongoDB Java Driver 3.8! Please make sure to use at least those versions.");
}
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.class.cast(ReflectionUtils.invokeMethod(createScramSha256Credential, null,
userNameAndPassword[0], database, userNameAndPassword[1].toCharArray())));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
@@ -164,7 +182,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static Properties extractOptions(String text) {
int optionsSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
int dbSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
int dbSeparationIndex = text.lastIndexOf(DATABASE_DELIMITER);
if (optionsSeparationIndex == -1 || dbSeparationIndex > optionsSeparationIndex) {
return new Properties();
@@ -173,7 +191,13 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
Properties properties = new Properties();
for (String option : text.substring(optionsSeparationIndex + 1).split(OPTION_VALUE_DELIMITER)) {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
}
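For illustration, a hedged sketch of requesting the new mechanism through the property editor; the credential-string format and the uri.authMechanism option key are assumed here, not confirmed by this diff:

// Illustrative only; assumed string format: user:password@database?uri.authMechanism=...
MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();
editor.setAsText("jon:warpig@admin?uri.authMechanism=SCRAM-SHA-256");
Object credentials = editor.getValue(); // holds the parsed MongoCredential instances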

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,7 +19,7 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
*
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,7 +30,7 @@ import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
@@ -43,7 +43,7 @@ abstract class MongoParsingUtils {
/**
* Parses the mongo replica-set element.
*
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
@@ -56,7 +56,7 @@ abstract class MongoParsingUtils {
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
@@ -102,7 +102,7 @@ abstract class MongoParsingUtils {
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
@@ -135,7 +135,7 @@ abstract class MongoParsingUtils {
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
*
* @return
* @since 1.7
*/
@@ -153,7 +153,7 @@ abstract class MongoParsingUtils {
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
*
* @return
* @since 1.7
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,13 +33,13 @@ import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
*
*
* @author Martin Baumgartner
* @author Oliver Gierke
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,7 +23,7 @@ import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
*
* @author Christoph Strobl
* @since 1.7
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2018 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,12 +21,12 @@ import com.mongodb.WriteConcern;
/**
* Converter to create {@link WriteConcern} instances from String representations.
*
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,7 @@ import com.mongodb.WriteConcern;
* {@link WriteConcern#valueOf(String)}, use the well known {@link WriteConcern} value, otherwise pass the string as is
* to the constructor of the write concern. There is no support for other constructor signatures when parsing from a
* string value.
*
*
* @author Mark Pollack
* @author Christoph Strobl
*/

View File

@@ -0,0 +1,118 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
* create type-bound {@link AggregationOperationContext}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.0.8
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the context itself if it
* is not {@literal null}, creating a {@link TypeBasedAggregationOperationContext} if the aggregation contains type
* information (is a {@link TypedAggregation}), or falling back to the {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation,
@Nullable AggregationOperationContext context) {
if (context != null) {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
}
/**
* Render the aggregation into its command {@link Document}, mapping the contained pipeline stages against the domain
* type unless a custom {@link AggregationOperationContext} is used.
*
* @param collectionName the collection to run the aggregation against.
* @param aggregation must not be {@literal null}.
* @param context must not be {@literal null}.
* @return the command {@link Document} holding the (potentially mapped) pipeline.
*/
Document createPipeline(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toDocument(collectionName, context);
}
Document command = aggregation.toDocument(collectionName, context);
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
/**
* Render the aggregation into its command {@link Document} and map the pipeline when running against the
* {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param collection the collection to run the aggregation against.
* @param aggregation must not be {@literal null}.
* @param context must not be {@literal null}.
* @return the command {@link Document}.
*/
Document createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
Document command = aggregation.toDocument(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
private List<Document> mapAggregationPipeline(List<Document> pipeline) {
return pipeline.stream().map(val -> queryMapper.getMappedObject(val, Optional.empty()))
.collect(Collectors.toList());
}
}
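A hedged sketch of how the template side might use this helper; Person, queryMapper and mappingContext are assumed to exist in the surrounding code:

// Illustrative: derive a type-bound context from a TypedAggregation and render the mapped command.
AggregationUtil util = new AggregationUtil(queryMapper, mappingContext);
TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class,
		Aggregation.match(Criteria.where("lastname").is("strobl")));
AggregationOperationContext context = util.prepareAggregationContext(aggregation, null);
Document command = util.createCommand("person", aggregation, context);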

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,117 +0,0 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.util.concurrent.atomic.AtomicReference;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.messaging.Message;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
/**
* {@link Message} implementation specific to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change
* Streams</a>.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamEvent<T> {
private final @Nullable ChangeStreamDocument<Document> raw;
private final Class<T> targetType;
private final MongoConverter converter;
private final AtomicReference<T> converted = new AtomicReference<>();
/**
* @param raw can be {@literal null}.
* @param targetType must not be {@literal null}.
* @param converter must not be {@literal null}.
*/
public ChangeStreamEvent(@Nullable ChangeStreamDocument<Document> raw, Class<T> targetType,
MongoConverter converter) {
this.raw = raw;
this.targetType = targetType;
this.converter = converter;
}
/**
* Get the raw {@link ChangeStreamDocument} as emitted by the driver.
*
* @return can be {@literal null}.
*/
@Nullable
public ChangeStreamDocument<Document> getRaw() {
return raw;
}
/**
* Get the potentially converted {@link ChangeStreamDocument#getFullDocument()}.
*
* @return {@literal null} when {@link #getRaw()} or {@link ChangeStreamDocument#getFullDocument()} is
* {@literal null}.
*/
@Nullable
public T getBody() {
if (raw == null) {
return null;
}
if (raw.getFullDocument() == null) {
return targetType.cast(raw.getFullDocument());
}
return getConverted();
}
private T getConverted() {
T result = converted.get();
if (result != null) {
return result;
}
if (ClassUtils.isAssignable(Document.class, raw.getFullDocument().getClass())) {
result = converter.read(targetType, raw.getFullDocument());
return converted.compareAndSet(null, result) ? result : converted.get();
}
if (converter.getConversionService().canConvert(raw.getFullDocument().getClass(), targetType)) {
result = converter.getConversionService().convert(raw.getFullDocument(), targetType);
return converted.compareAndSet(null, result) ? result : converted.get();
}
throw new IllegalArgumentException(String.format("No converter found capable of converting %s to %s",
raw.getFullDocument().getClass(), targetType));
}
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';
}
}
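A hedged sketch of consuming such an event; rawChangeStreamDocument, mongoConverter and the Person domain type are illustrative assumptions:

// Illustrative: wrap the raw driver event and read the lazily converted body (may be null, e.g. for delete events).
ChangeStreamEvent<Person> event = new ChangeStreamEvent<>(rawChangeStreamDocument, Person.class, mongoConverter);
Person person = event.getBody();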

View File

@@ -1,218 +0,0 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.util.Arrays;
import java.util.Optional;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
/**
* Options applicable to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Streams</a>. Intended
* to be used along with {@link org.springframework.data.mongodb.core.messaging.ChangeStreamRequest} in a synchronous
* world as well as with {@link ReactiveMongoOperations} if you prefer it that way.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamOptions {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
protected ChangeStreamOptions() {}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<Object> getFilter() {
return Optional.ofNullable(filter);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<BsonValue> getResumeToken() {
return Optional.ofNullable(resumeToken);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<FullDocument> getFullDocumentLookup() {
return Optional.ofNullable(fullDocumentLookup);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<Collation> getCollation() {
return Optional.ofNullable(collation);
}
/**
* @return empty {@link ChangeStreamOptions}.
*/
public static ChangeStreamOptions empty() {
return ChangeStreamOptions.builder().build();
}
/**
* Obtain a shiny new {@link ChangeStreamOptionsBuilder} and start defining options in this fancy fluent way. Just
* don't forget to call {@link ChangeStreamOptionsBuilder#build() build()} when you're done.
*
* @return new instance of {@link ChangeStreamOptionsBuilder}.
*/
public static ChangeStreamOptionsBuilder builder() {
return new ChangeStreamOptionsBuilder();
}
/**
* Builder for creating {@link ChangeStreamOptions}.
*
* @author Christoph Strobl
* @since 2.1
*/
public static class ChangeStreamOptionsBuilder {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private ChangeStreamOptionsBuilder() {}
/**
* Set the collation to use.
*
* @param collation must not be {@literal null} nor {@literal empty}.
* @return this.
*/
public ChangeStreamOptionsBuilder collation(Collation collation) {
Assert.notNull(collation, "Collation must not be null nor empty!");
this.collation = collation;
return this;
}
/**
* Set the filter to apply.
* <p/>
* Fields on aggregation expression root level are prefixed to map to fields contained in
* {@link ChangeStreamDocument#getFullDocument() fullDocument}. However {@literal operationType}, {@literal ns},
* {@literal documentKey} and {@literal fullDocument} are reserved words that will be omitted, and therefore taken
* as given, during the mapping procedure. You may want to have a look at the
* <a href="https://docs.mongodb.com/manual/reference/change-events/">structure of Change Events</a>.
* <p/>
* Use {@link org.springframework.data.mongodb.core.aggregation.TypedAggregation} to ensure filter expressions are
* mapped to domain type fields.
*
* @param filter the {@link Aggregation Aggregation pipeline} to apply for filtering events. Must not be
* {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder filter(Aggregation filter) {
Assert.notNull(filter, "Filter must not be null!");
this.filter = filter;
return this;
}
/**
* Set the plain filter chain to apply.
*
* @param filter must not be {@literal null} nor contain {@literal null} values.
* @return this.
*/
public ChangeStreamOptionsBuilder filter(Document... filter) {
Assert.noNullElements(filter, "Filter must not contain null values");
this.filter = Arrays.asList(filter);
return this;
}
/**
* Set the resume token (typically a {@link org.bson.BsonDocument} containing a {@link org.bson.BsonBinary binary
* token}) after which to start with listening.
*
* @param resumeToken must not be {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder resumeToken(BsonValue resumeToken) {
Assert.notNull(resumeToken, "ResumeToken must not be null!");
this.resumeToken = resumeToken;
return this;
}
/**
* Set the {@link FullDocument} lookup to {@link FullDocument#UPDATE_LOOKUP}.
*
* @return this.
* @see #fullDocumentLookup(FullDocument)
*/
public ChangeStreamOptionsBuilder returnFullDocumentOnUpdate() {
return fullDocumentLookup(FullDocument.UPDATE_LOOKUP);
}
/**
* Set the {@link FullDocument} lookup to use.
*
* @param lookup must not be {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder fullDocumentLookup(FullDocument lookup) {
Assert.notNull(lookup, "Lookup must not be null!");
this.fullDocumentLookup = lookup;
return this;
}
/**
* @return the built {@link ChangeStreamOptions}
*/
public ChangeStreamOptions build() {
ChangeStreamOptions options = new ChangeStreamOptions();
options.filter = filter;
options.resumeToken = resumeToken;
options.fullDocumentLookup = fullDocumentLookup;
options.collation = collation;
return options;
}
}
}
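A hedged usage sketch of the builder described above; the filter criteria are purely illustrative:

// Illustrative: request full-document lookup on updates and filter for insert events only.
ChangeStreamOptions options = ChangeStreamOptions.builder()
		.returnFullDocumentOnUpdate()
		.filter(Aggregation.newAggregation(Aggregation.match(Criteria.where("operationType").is("insert"))))
		.build();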

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,27 +15,18 @@
*/
package org.springframework.data.mongodb.core;
import lombok.RequiredArgsConstructor;
import java.util.Optional;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.model.ValidationAction;
import com.mongodb.client.model.ValidationLevel;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
* @author Christoph Strobl
* @author Mark Paluch
* @author Andreas Zink
*/
public class CollectionOptions {
@@ -43,7 +34,6 @@ public class CollectionOptions {
private @Nullable Long size;
private @Nullable Boolean capped;
private @Nullable Collation collation;
private ValidationOptions validationOptions;
/**
* Constructs a new <code>CollectionOptions</code> instance.
@@ -56,17 +46,16 @@ public class CollectionOptions {
*/
@Deprecated
public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
this(size, maxDocuments, capped, null, ValidationOptions.none());
this(size, maxDocuments, capped, null);
}
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
@Nullable Collation collation, ValidationOptions validationOptions) {
@Nullable Collation collation) {
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
this.collation = collation;
this.validationOptions = validationOptions;
}
/**
@@ -80,7 +69,7 @@ public class CollectionOptions {
Assert.notNull(collation, "Collation must not be null!");
return new CollectionOptions(null, null, null, collation, ValidationOptions.none());
return new CollectionOptions(null, null, null, collation);
}
/**
@@ -90,7 +79,7 @@ public class CollectionOptions {
* @since 2.0
*/
public static CollectionOptions empty() {
return new CollectionOptions(null, null, null, null, ValidationOptions.none());
return new CollectionOptions(null, null, null, null);
}
/**
@@ -101,7 +90,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions capped() {
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, true, collation);
}
/**
@@ -112,7 +101,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions maxDocuments(long maxDocuments) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation);
}
/**
@@ -123,7 +112,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions size(long size) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation);
}
/**
@@ -134,127 +123,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions collation(@Nullable Collation collation) {
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationOptions} set to given
* {@link MongoJsonSchema}.
*
* @param schema can be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions schema(@Nullable MongoJsonSchema schema) {
return validator(Validator.schema(schema));
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationOptions} set to given
* {@link Validator}.
*
* @param validator can be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions validator(@Nullable Validator validator) {
return validation(validationOptions.validator(validator));
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to
* {@link ValidationLevel#OFF}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions disableValidation() {
return schemaValidationLevel(ValidationLevel.OFF);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to
* {@link ValidationLevel#STRICT}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions strictValidation() {
return schemaValidationLevel(ValidationLevel.STRICT);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to
* {@link ValidationLevel#MODERATE}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions moderateValidation() {
return schemaValidationLevel(ValidationLevel.MODERATE);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationAction} set to
* {@link ValidationAction#WARN}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions warnOnValidationError() {
return schemaValidationAction(ValidationAction.WARN);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationAction} set to
* {@link ValidationAction#ERROR}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions failOnValidationError() {
return schemaValidationAction(ValidationAction.ERROR);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to the given
* {@link ValidationLevel}.
*
* @param validationLevel must not be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions schemaValidationLevel(ValidationLevel validationLevel) {
Assert.notNull(validationLevel, "ValidationLevel must not be null!");
return validation(validationOptions.validationLevel(validationLevel));
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationAction} set to the given
* {@link ValidationAction}.
*
* @param validationAction must not be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions schemaValidationAction(ValidationAction validationAction) {
Assert.notNull(validationAction, "ValidationAction must not be null!");
return validation(validationOptions.validationAction(validationAction));
}
/**
* Create new {@link CollectionOptions} with the given {@link ValidationOptions}.
*
* @param validationOptions must not be {@literal null}. Use {@link ValidationOptions#none()} to remove validation.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions validation(ValidationOptions validationOptions) {
Assert.notNull(validationOptions, "ValidationOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
return new CollectionOptions(size, maxDocuments, capped, collation);
}
/**
@@ -294,104 +163,4 @@ public class CollectionOptions {
public Optional<Collation> getCollation() {
return Optional.ofNullable(collation);
}
/**
* Get the {@link ValidationOptions} for the collection.
*
* @return {@link Optional#empty()} if not set.
* @since 2.1
*/
public Optional<ValidationOptions> getValidationOptions() {
return validationOptions.isEmpty() ? Optional.empty() : Optional.of(validationOptions);
}
/**
* Encapsulation of ValidationOptions options.
*
* @author Christoph Strobl
* @author Andreas Zink
* @since 2.1
*/
@RequiredArgsConstructor
public static class ValidationOptions {
private static final ValidationOptions NONE = new ValidationOptions(null, null, null);
private final @Nullable Validator validator;
private final @Nullable ValidationLevel validationLevel;
private final @Nullable ValidationAction validationAction;
/**
* Create an empty {@link ValidationOptions}.
*
* @return never {@literal null}.
*/
public static ValidationOptions none() {
return NONE;
}
/**
* Define the {@link Validator} to be used for document validation.
*
* @param validator can be {@literal null}.
* @return new instance of {@link ValidationOptions}.
*/
public ValidationOptions validator(@Nullable Validator validator) {
return new ValidationOptions(validator, validationLevel, validationAction);
}
/**
* Define the validation level to apply.
*
* @param validationLevel can be {@literal null}.
* @return new instance of {@link ValidationOptions}.
*/
public ValidationOptions validationLevel(ValidationLevel validationLevel) {
return new ValidationOptions(validator, validationLevel, validationAction);
}
/**
* Define the validation action to take.
*
* @param validationAction can be {@literal null}.
* @return new instance of {@link ValidationOptions}.
*/
public ValidationOptions validationAction(ValidationAction validationAction) {
return new ValidationOptions(validator, validationLevel, validationAction);
}
/**
* Get the {@link Validator} to use.
*
* @return never {@literal null}.
*/
public Optional<Validator> getValidator() {
return Optional.ofNullable(validator);
}
/**
* Get the {@code validationLevel} to apply.
*
* @return {@link Optional#empty()} if not set.
*/
public Optional<ValidationLevel> getValidationLevel() {
return Optional.ofNullable(validationLevel);
}
/**
* Get the {@code validationAction} to perform.
*
* @return {@link Optional#empty()} if not set.
*/
public Optional<ValidationAction> getValidationAction() {
return Optional.ofNullable(validationAction);
}
/**
* @return {@literal true} if no arguments set.
*/
boolean isEmpty() {
return !Optionals.isAnyPresent(getValidator(), getValidationAction(), getValidationLevel());
}
}
}
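A hedged sketch combining the validation-related options defined above; the schema content is illustrative:

// Illustrative: collection options with a JSON schema validator, strict level and error action.
CollectionOptions options = CollectionOptions.empty()
		.schema(MongoJsonSchema.builder().required("lastname").build())
		.strictValidation()
		.failOnValidationError();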

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2018 the original author or authors.
* Copyright 2002-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,7 @@ import com.mongodb.client.FindIterable;
/**
* Simple callback interface to allow customization of a {@link FindIterable}.
*
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@@ -29,7 +29,7 @@ interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
*
* @param cursor
*/
FindIterable<Document> prepare(FindIterable<Document> cursor);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -274,10 +274,19 @@ class DefaultBulkOperations implements BulkOperations {
public com.mongodb.bulk.BulkWriteResult execute() {
try {
return mongoOperations.execute(collectionName, collection -> {
return collection.bulkWrite(models.stream().map(this::mapWriteModel).collect(Collectors.toList()), bulkOptions);
});
MongoCollection<Document> collection = mongoOperations.getCollection(collectionName);
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
return collection.bulkWrite(models.stream().map(this::mapWriteModel).collect(Collectors.toList()), bulkOptions);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
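For context, a hedged sketch of the caller side of these bulk operations, assuming a MongoTemplate named template and a list of Person documents:

// Illustrative: run an unordered bulk insert; write errors surface as translated DataAccessExceptions.
com.mongodb.bulk.BulkWriteResult result = template
		.bulkOps(BulkOperations.BulkMode.UNORDERED, Person.class)
		.insert(people)
		.execute();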

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.MongoTemplate.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
@@ -48,22 +50,18 @@ public class DefaultIndexOperations implements IndexOperations {
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private final MongoDbFactory mongoDbFactory;
private final String collectionName;
private final QueryMapper mapper;
private final @Nullable Class<?> type;
private MongoOperations mongoOperations;
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoDbFactory must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param queryMapper must not be {@literal null}.
* @deprecated since 2.1. Please use
* {@link DefaultIndexOperations#DefaultIndexOperations(MongoOperations, String, Class)}.
*/
@Deprecated
public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper) {
this(mongoDbFactory, collectionName, queryMapper, null);
}
@@ -76,10 +74,7 @@ public class DefaultIndexOperations implements IndexOperations {
* @param queryMapper must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10
* @deprecated since 2.1. Please use
* {@link DefaultIndexOperations#DefaultIndexOperations(MongoOperations, String, Class)}.
*/
@Deprecated
public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
@Nullable Class<?> type) {
@@ -87,29 +82,10 @@ public class DefaultIndexOperations implements IndexOperations {
Assert.notNull(collectionName, "Collection name can not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.mongoDbFactory = mongoDbFactory;
this.collectionName = collectionName;
this.mapper = queryMapper;
this.type = type;
this.mongoOperations = new MongoTemplate(mongoDbFactory);
}
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @param type can be {@literal null}.
* @since 2.1
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, @Nullable Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
this.mapper = new QueryMapper(mongoOperations.getConverter());
this.collectionName = collectionName;
this.type = type;
}
/*
@@ -211,10 +187,11 @@ public class DefaultIndexOperations implements IndexOperations {
Assert.notNull(callback, "CollectionCallback must not be null!");
if (type != null) {
return mongoOperations.execute(type, callback);
try {
MongoCollection<Document> collection = mongoDbFactory.getDb().getCollection(collectionName);
return callback.doInCollection(collection);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, mongoDbFactory.getExceptionTranslator());
}
return mongoOperations.execute(collectionName, callback);
}
}
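A hedged usage sketch of the constructor-based wiring shown above; the collection name, index and Person type are illustrative:

// Illustrative: bind index operations to a template and ensure a simple ascending index.
IndexOperations indexOps = new DefaultIndexOperations(template, "person", Person.class);
indexOps.ensureIndex(new Index().on("lastname", Sort.Direction.ASC));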

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2018 the original author or authors.
* Copyright 2014-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,7 +26,7 @@ import com.mongodb.MongoException;
* about exception handling. {@link MongoException}s will be caught and translated by the calling MongoTemplate. A
* DocumentCallbackHandler is typically stateful: it keeps the result state within the object, to be available for
* later inspection.
*
*
* @author Mark Pollack
* @author Graeme Rocher
* @author Oliver Gierke

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,14 +19,11 @@ import java.util.List;
import java.util.Optional;
import java.util.stream.Stream;
import org.springframework.dao.DataAccessException;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import com.mongodb.client.MongoCollection;
/**
* {@link ExecutableFindOperation} allows creation and execution of MongoDB find operations in a fluent API style.
* <br />
@@ -205,7 +202,7 @@ public interface ExecutableFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
interface FindWithProjection<T> extends FindWithQuery<T>, FindDistinct {
interface FindWithProjection<T> extends FindWithQuery<T> {
/**
* Define the target type fields should be mapped to. <br />
@@ -217,101 +214,6 @@ public interface ExecutableFindOperation {
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> FindWithQuery<R> as(Class<R> resultType);
}
/**
* Distinct Find support.
*
* @author Christoph Strobl
* @since 2.1
*/
interface FindDistinct {
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view.
*
* @param field name of the field. Must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if field is {@literal null}.
*/
TerminatingDistinct<Object> distinct(String field);
}
/**
* Result type override. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithProjection {
/**
* Define the target type the result should be mapped to. <br />
* Skip this step if you are fine with the default conversion.
* <dl>
* <dt>{@link Object} (the default)</dt>
* <dd>The result is mapped according to the {@link org.bson.BsonType}, converting e.g. {@link org.bson.BsonString} into
* plain {@link String} and {@link org.bson.BsonInt64} into {@link Long}, always picking the most concrete type with
* respect to the domain type's property.<br />
* Any {@link org.bson.BsonType#DOCUMENT} is run through the {@link org.springframework.data.convert.EntityReader}
* to obtain the domain type. <br />
* Using {@link Object} also works for non-strictly typed fields, e.g. a mixture of different types such as fields using
* {@link String} in one {@link org.bson.Document} and {@link Long} in another.</dd>
* <dt>Any Simple type like {@link String} or {@link Long}.</dt>
* <dd>The result is mapped directly by the MongoDB Java driver and the {@link org.bson.codecs.Codec Codecs} in
* place. This works only for results where all documents considered for the operation use the very same type for
* the field.</dd>
* <dt>Any Domain type</dt>
* <dd>Domain types can only be mapped if the result of the actual {@code distinct()} operation returns
* {@link org.bson.BsonType#DOCUMENT}.</dd>
* <dt>{@link org.bson.BsonValue}</dt>
* <dd>Using {@link org.bson.BsonValue} allows retrieval of the raw driver-specific format, which returns e.g.
* {@link org.bson.BsonString}.</dd>
* </dl>
*
* @param resultType must not be {@literal null}.
* @param <R> result type.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> TerminatingDistinct<R> as(Class<R> resultType);
}
/**
* Result restrictions. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithQuery<T> extends DistinctWithProjection {
/**
* Set the filter query to be used.
*
* @param query must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if query is {@literal null}.
*/
TerminatingDistinct<T> matching(Query query);
}
/**
* Terminating distinct find operations.
*
* @author Christoph Strobl
* @since 2.1
*/
interface TerminatingDistinct<T> extends DistinctWithQuery<T> {
/**
* Get all matching distinct field values.
*
* @return empty {@link List} if no match is found. Never {@literal null}.
* @throws DataAccessException if e.g. the result cannot be converted correctly, which may happen if the document
* contains a {@link String} whereas the result type is specified as {@link Long}.
*/
List<T> all();
}
/**
@@ -320,5 +222,5 @@ public interface ExecutableFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
interface ExecutableFind<T> extends FindWithCollection<T>, FindWithProjection<T>, FindDistinct {}
interface ExecutableFind<T> extends FindWithCollection<T>, FindWithProjection<T> {}
}
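For orientation, the fluent distinct API declared in the block above chains as sketched below; a minimal sketch, assuming a hypothetical Jedi domain type, an injected MongoTemplate, and the usual org.springframework.data.mongodb.core.query imports:

// Minimal sketch of the fluent distinct chain (Jedi is a hypothetical domain type).
List<String> lastnames = template.query(Jedi.class)                   // ExecutableFind<Jedi>
        .distinct("lastname")                                          // TerminatingDistinct<Object>
        .as(String.class)                                              // convert BsonString values to String
        .matching(new Query(Criteria.where("home").is("Tatooine")))    // optional filter
        .all();                                                        // List<String>, never null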

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -205,19 +205,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindDistinct#distinct(java.lang.String)
*/
@SuppressWarnings("unchecked")
@Override
public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null!");
return new DistinctOperationSupport(this, field);
}
private List<T> doFind(@Nullable CursorPreparer preparer) {
Document queryObject = query.getQueryObject();
@@ -227,12 +214,6 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
getCursorPreparer(query, preparer));
}
private List<T> doFindDistinct(String field) {
return template.findDistinct(query, field, getCollectionName(), domainType,
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private CloseableIterator<T> doStream() {
return template.doStream(query, domainType, getCollectionName(), returnType);
}
@@ -280,54 +261,4 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return this;
}
}
/**
* @author Christoph Strobl
* @since 2.1
*/
static class DistinctOperationSupport<T> implements TerminatingDistinct<T> {
private final String field;
private final ExecutableFindSupport<T> delegate;
public DistinctOperationSupport(ExecutableFindSupport<T> delegate, String field) {
this.delegate = delegate;
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingDistinct#all()
*/
@Override
public List<T> all() {
return delegate.doFindDistinct(field);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import org.springframework.util.Assert;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
*
* @author Oliver Gierke
* @author Christoph Strobl
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
@@ -34,7 +34,7 @@ class GeoCommandStatistics {
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(Document source) {
@@ -45,7 +45,7 @@ class GeoCommandStatistics {
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
*
* @param commandResult must not be {@literal null}.
* @return
*/
@@ -60,7 +60,7 @@ class GeoCommandStatistics {
/**
* Returns the average distance reported by the command result, mitigating the removal of the field (introduced in
* MongoDB 3.2 RC1) in case the command didn't return any result.
*
*
* @return
* @see <a href="https://jira.mongodb.org/browse/SERVER-21024">MongoDB Jira SERVER-21024</a>
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,7 @@ import com.mongodb.WriteConcern;
* <li>REMOVE has null document</li>
* <li>INSERT_LIST has null entityType, document, and query</li>
* </ul>
*
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
@@ -46,7 +46,7 @@ public class MongoAction {
/**
* Create an instance of a {@link MongoAction}.
*
*
* @param defaultWriteConcern the default write concern. Can be {@literal null}.
* @param mongoActionOperation action being taken against the collection. Must not be {@literal null}.
* @param collectionName the collection name, must not be {@literal null} or empty.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,7 @@ package org.springframework.data.mongodb.core;
/**
* Enumeration for operations on a collection. Used with {@link MongoAction} to help determine the WriteConcern to use
* for a given mutating operation.
*
*
* @author Mark Pollack
* @author Oliver Gierke
* @see MongoAction

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import com.mongodb.client.MongoDatabase;
/**
* Mongo server administration exposed via JMX annotations
*
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2018 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2018 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2018 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,7 +23,7 @@ import com.mongodb.WriteResult;
/**
* Mongo-specific {@link DataIntegrityViolationException}.
*
*
* @author Oliver Gierke
*/
public class MongoDataIntegrityViolationException extends DataIntegrityViolationException {
@@ -35,7 +35,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
/**
* Creates a new {@link MongoDataIntegrityViolationException} using the given message and {@link WriteResult}.
*
*
* @param message the exception message
* @param writeResult the {@link WriteResult} that causes the exception, must not be {@literal null}.
* @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}.
@@ -54,7 +54,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
/**
* Returns the {@link WriteResult} that caused the exception.
*
*
* @return the writeResult
*/
public WriteResult getWriteResult() {
@@ -63,7 +63,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
/**
* Returns the {@link MongoActionOperation} in which the current exception occurred.
*
*
* @return the actionOperation
*/
public MongoActionOperation getActionOperation() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2018 the original author or authors.
* Copyright 2010-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
@@ -29,7 +30,6 @@ import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.ClientSessionException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.lang.Nullable;
@@ -45,24 +45,24 @@ import com.mongodb.bulk.BulkWriteError;
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
* exception from the {@code org.springframework.dao} hierarchy. Return {@literal null} if no translation is
* appropriate: any other exception may have resulted from user code, and should not be translated.
*
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
private static final Set<String> DUPLICATE_KEY_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<>(
Collections.singletonList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/*
@@ -80,7 +80,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
if (DUPLICATE_KEY_EXCEPTIONS.contains(exception)) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
@@ -92,7 +92,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
if (DATA_INTEGRITY_EXCEPTIONS.contains(exception)) {
if (ex instanceof MongoServerException) {
if (((MongoServerException) ex).getCode() == 11000) {
@@ -120,28 +120,18 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
int code = ((MongoException) ex).getCode();
if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
return new DuplicateKeyException(ex.getMessage(), ex);
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isDataAccessResourceFailureCode(code)) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
throw new DataAccessResourceFailureException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isInvalidDataAccessApiUsageCode(code) || code == 10003 || code == 12001
|| code == 12010 || code == 12011 || code == 12012) {
return new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isPermissionDeniedCode(code)) {
return new PermissionDeniedDataAccessException(ex.getMessage(), ex);
throw new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
// may interfere with OmitStackTraceInFastThrow (enabled by default).
// see https://jira.spring.io/browse/DATAMONGO-1905
if (ex instanceof IllegalStateException) {
for (StackTraceElement elm : ex.getStackTrace()) {
if (elm.getClassName().contains("ClientSession")) {
return new ClientSessionException(ex.getMessage(), ex);
}
}
}
// If we get here, we have an exception that resulted from user code,
// rather than the persistence provider, so we return null to indicate
// that translation should not occur.
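The translator above is what MongoTemplate consults when converting driver exceptions; a minimal sketch of the contract, assuming a caught RuntimeException from a collection call:

// Sketch: translate a driver RuntimeException into Spring's DataAccessException hierarchy, or rethrow as-is.
PersistenceExceptionTranslator translator = new MongoExceptionTranslator();
try {
    collection.insertOne(document);
} catch (RuntimeException ex) {
    DataAccessException translated = translator.translateExceptionIfPossible(ex);
    throw translated == null ? ex : translated;   // null means the exception came from user code
}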

View File

@@ -18,8 +18,6 @@ package org.springframework.data.mongodb.core;
import java.util.Collection;
import java.util.List;
import java.util.Set;
import java.util.function.Consumer;
import java.util.function.Supplier;
import org.bson.Document;
import org.springframework.data.geo.GeoResults;
@@ -41,15 +39,12 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ClientSessionOptions;
import com.mongodb.Cursor;
import com.mongodb.ReadPreference;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.session.ClientSession;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
@@ -156,64 +151,6 @@ public interface MongoOperations extends FluentMongoOperations {
@Nullable
<T> T execute(String collectionName, CollectionCallback<T> action);
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding a new {@link ClientSession}
* with given {@literal sessionOptions} to each and every command issued against MongoDB.
*
* @param sessionOptions must not be {@literal null}.
* @return new instance of {@link SessionScoped}. Never {@literal null}.
* @since 2.1
*/
SessionScoped withSession(ClientSessionOptions sessionOptions);
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* <p />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use the
* {@link SessionScoped#execute(SessionCallback, Consumer)} hook to potentially close the {@link ClientSession}.
*
* @param sessionProvider must not be {@literal null}.
* @since 2.1
*/
default SessionScoped withSession(Supplier<ClientSession> sessionProvider) {
Assert.notNull(sessionProvider, "SessionProvider must not be null!");
return new SessionScoped() {
private final Object lock = new Object();
private @Nullable ClientSession session = null;
@Override
public <T> T execute(SessionCallback<T> action, Consumer<ClientSession> onComplete) {
synchronized (lock) {
if (session == null) {
session = sessionProvider.get();
}
}
try {
return action.doInSession(MongoOperations.this.withSession(session));
} finally {
onComplete.accept(session);
}
}
};
}
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoOperations}.
* <p />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle.
*
* @param session must not be {@literal null}.
* @return {@link ClientSession} bound instance of {@link MongoOperations}.
* @since 2.1
*/
MongoOperations withSession(ClientSession session);
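A minimal usage sketch of the session API declared above, assuming a hypothetical Person document and that the caller closes the session through the onComplete hook:

// Sketch: bind a new ClientSession to every operation issued inside the callback.
ClientSessionOptions options = ClientSessionOptions.builder().causallyConsistent(true).build();

Person person = template.withSession(options).execute(
        ops -> ops.findOne(Query.query(Criteria.where("lastname").is("Skywalker")), Person.class),
        ClientSession::close);   // lifecycle is caller-managed, as noted in the Javadoc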
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}.
@@ -773,67 +710,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param entityClass the domain type used for determining the actual {@link MongoCollection}. Must not be
* {@literal null}.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
default <T> List<T> findDistinct(String field, Class<?> entityClass, Class<T> resultClass) {
return findDistinct(new Query(), field, entityClass, resultClass);
}
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param query filter {@link Query} to restrict search. Must not be {@literal null}.
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param entityClass the domain type used for determining the actual {@link MongoCollection} and mapping the
* {@link Query} to the domain type fields. Must not be {@literal null}.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
<T> List<T> findDistinct(Query query, String field, Class<?> entityClass, Class<T> resultClass);
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param query filter {@link Query} to restrict search. Must not be {@literal null}.
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param collectionName the explicit name of the actual {@link MongoCollection}. Must not be {@literal null}.
* @param entityClass the domain type used for mapping the {@link Query} to the domain type fields.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
<T> List<T> findDistinct(Query query, String field, String collectionName, Class<?> entityClass,
Class<T> resultClass);
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param query filter {@link Query} to restrict search. Must not be {@literal null}.
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param collection the explicit name of the actual {@link MongoCollection}. Must not be {@literal null}.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
default <T> List<T> findDistinct(Query query, String field, String collection, Class<T> resultClass) {
return findDistinct(query, field, collection, Object.class, resultClass);
}
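As a quick illustration of the two entry points above (Person and the "star-wars" collection are hypothetical):

// Domain-type variant: collection name and field mapping are derived from the Person entity.
List<String> names = template.findDistinct(new Query(), "lastname", Person.class, String.class);

// Collection-name variant: no domain type, so values are converted from their raw BSON representation.
List<String> raw = template.findDistinct(new Query(), "lastname", "star-wars", String.class);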
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. Must not be {@literal null}.
@@ -845,8 +723,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. Must not be {@literal null}.
@@ -859,8 +737,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -876,8 +754,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -1016,7 +894,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
* @param collectionToSave the list of objects to save. Must not be {@literal null}.
* @param objectsToSave the list of objects to save. Must not be {@literal null}.
*/
void insertAll(Collection<? extends Object> objectsToSave);

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import com.mongodb.client.model.MapReduceAction;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.NonNull;
@@ -27,11 +28,8 @@ import java.io.IOException;
import java.util.*;
import java.util.Map.Entry;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
import org.bson.conversions.Bson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -57,8 +55,6 @@ import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -71,7 +67,14 @@ import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.*;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
@@ -98,7 +101,6 @@ import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.projection.ProjectionInformation;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
@@ -115,7 +117,6 @@ import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.Cursor;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
@@ -125,17 +126,21 @@ import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.AggregateIterable;
import com.mongodb.client.DistinctIterable;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MapReduceIterable;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.MongoIterable;
import com.mongodb.client.model.*;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.ReturnDocument;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.session.ClientSession;
import com.mongodb.util.JSONParseException;
/**
@@ -158,8 +163,6 @@ import com.mongodb.util.JSONParseException;
* @author Laszlo Csontos
* @author Maninder Singh
* @author Borislav Rangelov
* @author duozhilin
* @author Andreas Zink
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware, IndexOperationsProvider {
@@ -185,7 +188,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final PersistenceExceptionTranslator exceptionTranslator;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private final JsonSchemaMapper schemaMapper;
private final SpelAwareProxyProjectionFactory projectionFactory;
private @Nullable WriteConcern writeConcern;
@@ -203,7 +205,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param databaseName must not be {@literal null} or empty.
*/
public MongoTemplate(MongoClient mongoClient, String databaseName) {
this(new SimpleMongoDbFactory(mongoClient, databaseName), (MongoConverter) null);
this(new SimpleMongoDbFactory(mongoClient, databaseName), null);
}
/**
@@ -212,7 +214,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoTemplate(MongoDbFactory mongoDbFactory) {
this(mongoDbFactory, (MongoConverter) null);
this(mongoDbFactory, null);
}
/**
@@ -230,7 +232,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
this.queryMapper = new QueryMapper(this.mongoConverter);
this.updateMapper = new UpdateMapper(this.mongoConverter);
this.schemaMapper = new MongoJsonSchemaMapper(this.mongoConverter);
this.projectionFactory = new SpelAwareProxyProjectionFactory();
// We always have a mapping context in the converter, whether it's a simple one or not
@@ -245,19 +246,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
}
private MongoTemplate(MongoDbFactory dbFactory, MongoTemplate that) {
this.mongoDbFactory = dbFactory;
this.exceptionTranslator = that.exceptionTranslator;
this.mongoConverter = that.mongoConverter instanceof MappingMongoConverter ? getDefaultMongoConverter(dbFactory)
: that.mongoConverter;
this.queryMapper = that.queryMapper;
this.updateMapper = that.updateMapper;
this.schemaMapper = that.schemaMapper;
this.projectionFactory = that.projectionFactory;
this.mappingContext = that.mappingContext;
}
/**
* Configures the {@link WriteResultChecking} to be used with the template. Setting {@literal null} will reset the
* default of {@link #DEFAULT_WRITE_RESULT_CHECKING}.
@@ -391,7 +379,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), persistentEntity);
FindIterable<Document> cursor = new QueryCursorPreparer(query, entityType)
.prepare(collection.find(mappedQuery, Document.class).projection(mappedFields));
.prepare(collection.find(mappedQuery).projection(mappedFields));
return new CloseableIterableCursorAdapter<T>(cursor, exceptionTranslator,
new ProjectingReadCallback<>(mongoConverter, entityType, returnType, collectionName));
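The adapter above backs the public stream(…) methods; a minimal sketch, assuming a hypothetical Person document:

// Sketch: consume a query as a closeable, cursor-backed stream.
try (CloseableIterator<Person> it = template.stream(new Query(), Person.class)) {
    it.forEachRemaining(person -> System.out.println(person));
}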
@@ -503,10 +491,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
public <T> T execute(DbCallback<T> action) {
Assert.notNull(action, "DbCallback must not be null!");
Assert.notNull(action, "DbCallbackmust not be null!");
try {
MongoDatabase db = prepareDatabase(this.doGetDatabase());
MongoDatabase db = this.getDb();
return action.doInDB(db);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
@@ -533,37 +521,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(callback, "CollectionCallback must not be null!");
try {
MongoCollection<Document> collection = getAndPrepareCollection(doGetDatabase(), collectionName);
MongoCollection<Document> collection = getAndPrepareCollection(getDb(), collectionName);
return callback.doInCollection(collection);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
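Both execute(…) variants wrap raw driver access in the exception translation shown above; a minimal sketch, with "star-wars" as a hypothetical collection name:

// Sketch: callback-based access to the database and to a single collection.
String databaseName = template.execute(db -> db.getName());
Long documents = template.execute("star-wars", collection -> collection.count());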
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#withSession(com.mongodb.ClientSessionOptions)
*/
@Override
public SessionScoped withSession(ClientSessionOptions options) {
Assert.notNull(options, "ClientSessionOptions must not be null!");
return withSession(() -> mongoDbFactory.getSession(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoTemplate withSession(ClientSession session) {
Assert.notNull(session, "ClientSession must not be null!");
return new SessionBoundMongoTemplate(session, MongoTemplate.this);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#createCollection(java.lang.Class)
@@ -578,9 +542,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
public <T> MongoCollection<Document> createCollection(Class<T> entityClass,
@Nullable CollectionOptions collectionOptions) {
Assert.notNull(entityClass, "EntityClass must not be null!");
return doCreateCollection(determineCollectionName(entityClass), convertToDocument(collectionOptions, entityClass));
return createCollection(determineCollectionName(entityClass), collectionOptions);
}
/*
@@ -607,7 +569,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#getCollection(java.lang.String)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#getCollection(java.lang.String)
*/
public MongoCollection<Document> getCollection(final String collectionName) {
@@ -638,7 +600,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return execute(new DbCallback<Boolean>() {
public Boolean doInDB(MongoDatabase db) throws MongoException, DataAccessException {
for (String name : db.listCollectionNames()) {
if (name.equals(collectionName)) {
return true;
@@ -681,7 +642,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#indexOps(java.lang.String)
*/
public IndexOperations indexOps(String collectionName) {
return new DefaultIndexOperations(this, collectionName, null);
return new DefaultIndexOperations(getMongoDbFactory(), collectionName, queryMapper);
}
/*
@@ -689,7 +650,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#indexOps(java.lang.Class)
*/
public IndexOperations indexOps(Class<?> entityClass) {
return new DefaultIndexOperations(this, determineCollectionName(entityClass), entityClass);
return new DefaultIndexOperations(getMongoDbFactory(), determineCollectionName(entityClass), queryMapper,
entityClass);
}
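A short sketch of the handle returned above, assuming a hypothetical Person document with a lastname property:

// Sketch: ensure a unique ascending index through the IndexOperations handle.
template.indexOps(Person.class)
        .ensureIndex(new Index().on("lastname", Sort.Direction.ASC).unique());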
/*
@@ -836,87 +798,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return doFindOne(collectionName, new Document(idKey, id), new Document(), entityClass);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findDistinct(org.springframework.data.mongodb.core.query.Query, java.lang.String, java.lang.Class, java.lang.Class)
*/
@Override
public <T> List<T> findDistinct(Query query, String field, Class<?> entityClass, Class<T> resultClass) {
return findDistinct(query, field, determineCollectionName(entityClass), entityClass, resultClass);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findDistinct(org.springframework.data.mongodb.core.query.Query, java.lang.String, java.lang.String, java.lang.Class, java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <T> List<T> findDistinct(Query query, String field, String collectionName, Class<?> entityClass,
Class<T> resultClass) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(field, "Field must not be null!");
Assert.notNull(collectionName, "CollectionName must not be null!");
Assert.notNull(entityClass, "EntityClass must not be null!");
Assert.notNull(resultClass, "ResultClass must not be null!");
MongoPersistentEntity<?> entity = entityClass != Object.class ? getPersistentEntity(entityClass) : null;
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), entity);
String mappedFieldName = queryMapper.getMappedFields(new Document(field, 1), entity).keySet().iterator().next();
Class<T> mongoDriverCompatibleType = getMongoDbFactory().getCodecFor(resultClass).map(Codec::getEncoderClass)
.orElse((Class) BsonValue.class);
MongoIterable<?> result = execute(collectionName, (collection) -> {
DistinctIterable<T> iterable = collection.distinct(mappedFieldName, mappedQuery, mongoDriverCompatibleType);
return query.getCollation().map(Collation::toMongoCollation).map(iterable::collation).orElse(iterable);
});
if (resultClass == Object.class || mongoDriverCompatibleType != resultClass) {
MongoConverter converter = getConverter();
DefaultDbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
result = result.map((source) -> converter.mapValueToTargetType(source,
getMostSpecificConversionTargetType(resultClass, entityClass, field), dbRefResolver));
}
try {
return (List<T>) result.into(new ArrayList<>());
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
/**
* @param userType must not be {@literal null}.
* @param domainType must not be {@literal null}.
* @param field must not be {@literal null}.
* @return the most specific conversion target type depending on user preference and domain type property.
* @since 2.1
*/
private static Class<?> getMostSpecificConversionTargetType(Class<?> userType, Class<?> domainType, String field) {
Class<?> conversionTargetType = userType;
try {
Class<?> propertyType = PropertyPath.from(field, domainType).getLeafProperty().getLeafType();
// use the more specific type but favor UserType over property one
if (ClassUtils.isAssignable(userType, propertyType)) {
conversionTargetType = propertyType;
}
} catch (PropertyReferenceException e) {
// just don't care about it as we default to Object.class anyway.
}
return conversionTargetType;
}
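The helper above favours the mapped property type when the caller's requested type is broader; a rough sketch of the effective decision, assuming a hypothetical Person entity whose lastname property is a String:

// Sketch: requested Object, property resolves to String -> distinct values are converted to String.
Class<?> requested = Object.class;
Class<?> propertyType = PropertyPath.from("lastname", Person.class).getLeafProperty().getLeafType(); // String.class
Class<?> target = ClassUtils.isAssignable(requested, propertyType) ? propertyType : requested;       // String.class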
@Override
public <T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass) {
return geoNear(near, entityClass, determineCollectionName(entityClass));
@@ -1075,13 +956,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(query, "Query must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
CountOptions options = new CountOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
Document document = queryMapper.getMappedObject(query.getQueryObject(),
Optional.ofNullable(entityClass).map(it -> mappingContext.getPersistentEntity(entityClass)));
return execute(collectionName, collection -> collection.count(document, options));
return execute(collectionName, collection -> collection.count(document));
}
/*
@@ -1128,9 +1006,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
protected MongoCollection<Document> prepareCollection(MongoCollection<Document> collection) {
if (this.readPreference != null) {
collection = collection.withReadPreference(readPreference);
return collection.withReadPreference(readPreference);
}
return collection;
}
@@ -1396,7 +1273,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, document, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (writeConcernToUse == null) {
collection.insertOne(document);
} else {
@@ -1751,6 +1627,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
if (writeConcernToUse == null) {
dr = collection.deleteMany(removeQuery, options);
} else {
dr = collection.withWriteConcern(writeConcernToUse).deleteMany(removeQuery, options);
@@ -1806,10 +1683,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
String mapFunc = replaceWithResourceIfNecessary(mapFunction);
String reduceFunc = replaceWithResourceIfNecessary(reduceFunction);
MongoCollection<Document> inputCollection = getAndPrepareCollection(doGetDatabase(), inputCollectionName);
MongoCollection<Document> inputCollection = getCollection(inputCollectionName);
// MapReduceOp
MapReduceIterable<Document> result = inputCollection.mapReduce(mapFunc, reduceFunc, Document.class);
MapReduceIterable<Document> result = inputCollection.mapReduce(mapFunc, reduceFunc);
if (query != null && result != null) {
if (query.getLimit() > 0 && mapReduceOptions.getLimit() == null) {
@@ -1839,18 +1716,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
if (!CollectionUtils.isEmpty(mapReduceOptions.getScopeVariables())) {
result = result.scope(new Document(mapReduceOptions.getScopeVariables()));
}
if (mapReduceOptions.getLimit() != null && mapReduceOptions.getLimit().intValue() > 0) {
result = result.limit(mapReduceOptions.getLimit());
}
if (mapReduceOptions.getFinalizeFunction().filter(StringUtils::hasText).isPresent()) {
result = result.finalizeFunction(mapReduceOptions.getFinalizeFunction().get());
}
if (mapReduceOptions.getJavaScriptMode() != null) {
result = result.jsMode(mapReduceOptions.getJavaScriptMode());
}
if (mapReduceOptions.getOutputSharded().isPresent()) {
result = result.sharded(mapReduceOptions.getOutputSharded().get());
}
if (StringUtils.hasText(mapReduceOptions.getOutputCollection()) && !mapReduceOptions.usesInlineOutput()) {
result = result.collectionName(mapReduceOptions.getOutputCollection())
.action(mapReduceOptions.getMapReduceAction());
if (mapReduceOptions.getOutputDatabase().isPresent()) {
result = result.databaseName(mapReduceOptions.getOutputDatabase().get());
}
}
}
result = collation.map(Collation::toMongoCollation).map(result::collation).orElse(result);
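For reference, the options handled above are populated from a MapReduceOptions instance passed into mapReduce(…); a minimal sketch, with illustrative JavaScript functions and a hypothetical "star-wars" collection:

// Sketch: a map-reduce run whose options feed the builder calls shown above.
MapReduceOptions options = MapReduceOptions.options().outputCollection("lastname_counts");

MapReduceResults<Document> counts = template.mapReduce(new Query(), "star-wars",
        "function() { emit(this.lastname, 1); }",
        "function(key, values) { return Array.sum(values); }",
        options, Document.class);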
@@ -1892,13 +1783,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
if (document.containsKey("$reduce")) {
document.put("$reduce", replaceWithResourceIfNecessary(ObjectUtils.nullSafeToString(document.get("$reduce"))));
document.put("$reduce", replaceWithResourceIfNecessary(document.get("$reduce").toString()));
}
if (document.containsKey("$keyf")) {
document.put("$keyf", replaceWithResourceIfNecessary(ObjectUtils.nullSafeToString(document.get("$keyf"))));
document.put("$keyf", replaceWithResourceIfNecessary(document.get("$keyf").toString()));
}
if (document.containsKey("finalize")) {
document.put("finalize", replaceWithResourceIfNecessary(ObjectUtils.nullSafeToString(document.get("finalize"))));
document.put("finalize", replaceWithResourceIfNecessary(document.get("finalize").toString()));
}
Document commandObject = new Document("group", document);
@@ -1907,7 +1798,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
LOGGER.debug("Executing Group with Document [{}]", serializeToJsonSafely(commandObject));
}
Document commandResult = executeCommand(commandObject, this.readPreference);
Document commandResult = executeCommand(commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [{}]", commandResult);
@@ -2057,94 +1948,84 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
Document commandResult = new BatchAggregationLoader(this, queryMapper, mappingContext, readPreference,
Integer.MAX_VALUE)
.aggregate(collectionName, aggregation, context);
return doAggregate(aggregation, collectionName, outputType, rootContext);
return new AggregationResults<>(returnPotentiallyMappedResults(outputType, commandResult, collectionName),
commandResult);
}
@SuppressWarnings("ConstantConditions")
protected <O> AggregationResults<O> doAggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
AggregationOperationContext context) {
/**
* Returns the potentially mapped results of the given {@code commandResult}.
*
* @param outputType
* @param commandResult
* @return
*/
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, Document commandResult,
String collectionName) {
@SuppressWarnings("unchecked")
Iterable<Document> resultSet = (Iterable<Document>) commandResult.get("result");
if (resultSet == null) {
return Collections.emptyList();
}
DocumentCallback<O> callback = new UnwrapAndReadDocumentCallback<>(mongoConverter, outputType, collectionName);
AggregationOptions options = aggregation.getOptions();
if (options.isExplain()) {
Document command = aggregation.toDocument(collectionName, context);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
Document commandResult = executeCommand(command);
return new AggregationResults<>(commandResult.get("results", new ArrayList<Document>(0)).stream()
.map(callback::doWith).collect(Collectors.toList()), commandResult);
List<O> mappedResults = new ArrayList<>();
for (Document document : resultSet) {
mappedResults.add(callback.doWith(document));
}
List<Document> pipeline = aggregation.toPipeline(context);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {} in collection {}", serializeToJsonSafely(pipeline), collectionName);
}
return execute(collectionName, collection -> {
List<Document> rawResult = new ArrayList<>();
AggregateIterable<Document> aggregateIterable = collection.aggregate(pipeline, Document.class) //
.collation(options.getCollation().map(Collation::toMongoCollation).orElse(null)) //
.allowDiskUse(options.isAllowDiskUse());
if (options.getCursorBatchSize() != null) {
aggregateIterable = aggregateIterable.batchSize(options.getCursorBatchSize());
}
MongoIterable<O> iterable = aggregateIterable.map(val -> {
rawResult.add(val);
return callback.doWith(val);
});
return new AggregationResults<>(iterable.into(new ArrayList<>()),
new Document("results", rawResult).append("ok", 1.0D));
});
return mappedResults;
}
@SuppressWarnings("ConstantConditions")
protected <O> CloseableIterator<O> aggregateStream(Aggregation aggregation, String collectionName,
Class<O> outputType, @Nullable AggregationOperationContext context) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
Assert.isTrue(!aggregation.getOptions().isExplain(), "Can't use explain option with streaming!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
AggregationOptions options = aggregation.getOptions();
List<Document> pipeline = aggregation.toPipeline(rootContext);
AggregationUtil aggregationUtil = new AggregationUtil(queryMapper, mappingContext);
AggregationOperationContext rootContext = aggregationUtil.prepareAggregationContext(aggregation, context);
Document command = aggregationUtil.createCommand(collectionName, aggregation, rootContext);
assertNotExplain(command);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Streaming aggregation: {} in collection {}", serializeToJsonSafely(pipeline), collectionName);
LOGGER.debug("Streaming aggregation: {}", serializeToJsonSafely(command));
}
ReadDocumentCallback<O> readCallback = new ReadDocumentCallback<>(mongoConverter, outputType, collectionName);
return execute(collectionName, (CollectionCallback<CloseableIterator<O>>) collection -> {
return execute(collectionName, new CollectionCallback<CloseableIterator<O>>() {
AggregateIterable<Document> cursor = collection.aggregate(pipeline, Document.class) //
.allowDiskUse(options.isAllowDiskUse()) //
.useCursor(true);
@Override
public CloseableIterator<O> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (options.getCursorBatchSize() != null) {
cursor = cursor.batchSize(options.getCursorBatchSize());
List<Document> pipeline = (List<Document>) command.get("pipeline");
AggregationOptions options = AggregationOptions.fromDocument(command);
AggregateIterable<Document> cursor = collection.aggregate(pipeline).allowDiskUse(options.isAllowDiskUse())
.useCursor(true);
Integer cursorBatchSize = options.getCursorBatchSize();
if (cursorBatchSize != null) {
cursor = cursor.batchSize(cursorBatchSize);
}
if (options.getCollation().isPresent()) {
cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get());
}
return new CloseableIterableCursorAdapter<>(cursor.iterator(), exceptionTranslator, readCallback);
}
if (options.getCollation().isPresent()) {
cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get());
}
return new CloseableIterableCursorAdapter<>(cursor.iterator(), exceptionTranslator, readCallback);
});
}
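A minimal sketch of driving the two execution paths above, assuming a hypothetical "star-wars" collection; note that the streaming variant rejects explain mode, as asserted in the code:

// Sketch: the same pipeline run as a materialized result and as a cursor-backed stream.
Aggregation agg = Aggregation.newAggregation(
        Aggregation.match(Criteria.where("home").is("Tatooine")),
        Aggregation.group("lastname").count().as("n"));

AggregationResults<Document> results = template.aggregate(agg, "star-wars", Document.class);
CloseableIterator<Document> stream = template.aggregateStream(agg, "star-wars", Document.class);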
@@ -2193,6 +2074,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return new ExecutableInsertOperationSupport(this).insert(domainType);
}
/**
* Assert that the {@link Document} does not enable Aggregation explain mode.
*
* @param command the command {@link Document}.
*/
private void assertNotExplain(Document command) {
Boolean explain = command.get("explain", Boolean.class);
if (explain != null && explain) {
throw new IllegalArgumentException("Can't use explain option with streaming!");
}
}
protected String replaceWithResourceIfNecessary(String function) {
String func = function;
@@ -2239,17 +2134,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
public MongoDatabase getDb() {
return doGetDatabase();
}
protected MongoDatabase doGetDatabase() {
return mongoDbFactory.getDb();
}
protected MongoDatabase prepareDatabase(MongoDatabase database) {
return database;
}
protected <T> void maybeEmitEvent(MongoMappingEvent<T> event) {
if (null != eventPublisher) {
eventPublisher.publishEvent(event);
@@ -2284,21 +2171,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
co.collation(IndexConverters.fromDocument(collectionOptions.get("collation", Document.class)));
}
if (collectionOptions.containsKey("validator")) {
com.mongodb.client.model.ValidationOptions options = new com.mongodb.client.model.ValidationOptions();
if (collectionOptions.containsKey("validationLevel")) {
options.validationLevel(ValidationLevel.fromString(collectionOptions.getString("validationLevel")));
}
if (collectionOptions.containsKey("validationAction")) {
options.validationAction(ValidationAction.fromString(collectionOptions.getString("validationAction")));
}
options.validator(collectionOptions.get("validator", Document.class));
co.validationOptions(options);
}
db.createCollection(collectionName, co);
MongoCollection<Document> coll = db.getCollection(collectionName, Document.class);
@@ -2411,70 +2283,19 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
new ProjectingReadCallback<>(mongoConverter, sourceClass, targetClass, collectionName), collectionName);
}
/**
* Convert given {@link CollectionOptions} to a document and take the domain type information into account when
* creating a mapped schema for validation. <br />
* This method calls {@link #convertToDocument(CollectionOptions)} for backwards compatibility and potentially
* overwrites the validator with the mapped validator document. In the long run
* {@link #convertToDocument(CollectionOptions)} will be removed so that this one becomes the only source of truth.
*
* @param collectionOptions can be {@literal null}.
* @param targetType must not be {@literal null}. Use {@link Object} if no specific domain type is available.
* @return never {@literal null}.
* @since 2.1
*/
protected Document convertToDocument(@Nullable CollectionOptions collectionOptions, Class<?> targetType) {
Document doc = convertToDocument(collectionOptions);
if (collectionOptions != null) {
collectionOptions.getValidationOptions().ifPresent(it -> it.getValidator() //
.ifPresent(val -> doc.put("validator", getMappedValidator(val, targetType))));
}
return doc;
}
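To make the schema mapping above concrete, a hedged sketch (MongoJsonSchema content, Person and template are illustrative assumptions): a {@code $jsonSchema} validator declared via CollectionOptions is mapped against the domain type's properties when the collection is created.

// Sketch; Person and template are assumed, the schema content is made up.
MongoJsonSchema schema = MongoJsonSchema.builder()
		.required("firstname")
		.properties(JsonSchemaProperty.string("firstname"))
		.build();

CollectionOptions options = CollectionOptions.empty().schema(schema);

// createCollection(…) runs the options through convertToDocument(options, Person.class), so the
// $jsonSchema validator is mapped using the domain type's field names (e.g. @Field aliases).
template.createCollection(Person.class, options);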
/**
* @param collectionOptions can be {@literal null}.
* @return never {@literal null}.
* @deprecated since 2.1 in favor of {@link #convertToDocument(CollectionOptions, Class)}.
*/
@Deprecated
protected Document convertToDocument(@Nullable CollectionOptions collectionOptions) {
Document document = new Document();
if (collectionOptions != null) {
collectionOptions.getCapped().ifPresent(val -> document.put("capped", val));
collectionOptions.getSize().ifPresent(val -> document.put("size", val));
collectionOptions.getMaxDocuments().ifPresent(val -> document.put("max", val));
collectionOptions.getCollation().ifPresent(val -> document.append("collation", val.toDocument()));
collectionOptions.getValidationOptions().ifPresent(it -> {
it.getValidationLevel().ifPresent(val -> document.append("validationLevel", val.getValue()));
it.getValidationAction().ifPresent(val -> document.append("validationAction", val.getValue()));
it.getValidator().ifPresent(val -> document.append("validator", getMappedValidator(val, Object.class)));
});
}
return document;
}
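As a rough illustration of the key mapping performed above (the concrete values are made up):

CollectionOptions options = CollectionOptions.empty()
		.capped()
		.size(1024)
		.maxDocuments(100)
		.collation(Collation.of("en"));

// convertToDocument(options) yields roughly:
// { "capped" : true, "size" : 1024, "max" : 100, "collation" : { "locale" : "en" } }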
Document getMappedValidator(Validator validator, Class<?> domainType) {
Document validationRules = validator.toDocument();
if (validationRules.containsKey("$jsonSchema")) {
return schemaMapper.mapSchema(validationRules, domainType);
}
return queryMapper.getMappedObject(validationRules, mappingContext.getPersistentEntity(domainType));
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
* The first document that matches the query is returned and also removed from the collection in the database.
@@ -2536,18 +2357,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param savedObject
* @param id
*/
@SuppressWarnings("unchecked")
protected void populateIdIfNecessary(Object savedObject, Object id) {
if (id == null) {
return;
}
if (savedObject instanceof Map) {
Map<String, Object> map = (Map<String, Object>) savedObject;
map.put(ID_FIELD, id);
if (savedObject instanceof Document) {
Document document = (Document) savedObject;
document.put(ID_FIELD, id);
return;
}
@@ -2595,7 +2413,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
try {
T result = objectCallback
.doWith(collectionCallback.doInCollection(getAndPrepareCollection(doGetDatabase(), collectionName)));
.doWith(collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName)));
return result;
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
@@ -2630,7 +2448,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
try {
FindIterable<Document> iterable = collectionCallback
.doInCollection(getAndPrepareCollection(doGetDatabase(), collectionName));
.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
iterable = preparer.prepare(iterable);
@@ -2666,7 +2484,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
try {
FindIterable<Document> iterable = collectionCallback
.doInCollection(getAndPrepareCollection(doGetDatabase(), collectionName));
.doInCollection(getAndPrepareCollection(getDb(), collectionName));
if (preparer != null) {
iterable = preparer.prepare(iterable);
@@ -2775,6 +2593,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return fields;
}
/**
* Tries to convert the given {@link RuntimeException} into a {@link DataAccessException} but returns the original
* exception if the conversion failed. This allows safe re-throwing of the return value.
@@ -2811,7 +2630,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public Document doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
FindIterable<Document> iterable = collection.find(query, Document.class);
FindIterable<Document> iterable = collection.find(query);
if (LOGGER.isDebugEnabled()) {
@@ -2856,7 +2675,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public FindIterable<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
return collection.find(query, Document.class).projection(fields);
return collection.find(query).projection(fields);
}
}
@@ -3288,12 +3107,18 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private static final String OK = "ok";
private final MongoTemplate template;
private final QueryMapper queryMapper;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ReadPreference readPreference;
private final int batchSize;
BatchAggregationLoader(MongoTemplate template, ReadPreference readPreference, int batchSize) {
BatchAggregationLoader(MongoTemplate template, QueryMapper queryMapper,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
ReadPreference readPreference, int batchSize) {
this.template = template;
this.queryMapper = queryMapper;
this.mappingContext = mappingContext;
this.readPreference = readPreference;
this.batchSize = batchSize;
}
@@ -3316,11 +3141,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* Pre-process the aggregation command sent to the server by adding {@code cursor} options to match execution on
* different server versions.
*/
private static Document prepareAggregationCommand(String collectionName, Aggregation aggregation,
private Document prepareAggregationCommand(String collectionName, Aggregation aggregation,
@Nullable AggregationOperationContext context, int batchSize) {
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
Document command = aggregation.toDocument(collectionName, rootContext);
AggregationUtil aggregationUtil = new AggregationUtil(queryMapper, mappingContext);
AggregationOperationContext rootContext = aggregationUtil.prepareAggregationContext(aggregation, context);
Document command = aggregationUtil.createCommand(collectionName, aggregation, rootContext);
if (!aggregation.getOptions().isExplain()) {
command.put(CURSOR_FIELD, new Document(BATCH_SIZE_FIELD, batchSize));
@@ -3410,52 +3237,4 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return ((Document) commandResult.get(CURSOR_FIELD)).get("id");
}
}
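For orientation, a rough sketch of the command shape produced by prepareAggregationCommand(…) for a non-explain aggregation; the collection name, pipeline and batch size are placeholder values.

Document command = new Document("aggregate", "person")
		.append("pipeline", Collections.singletonList(new Document("$match", new Document("lastname", "Strobl"))));

// Unless explain is requested, the cursor options are appended, matching the code above:
command.put("cursor", new Document("batchSize", 100));
// -> { "aggregate" : "person", "pipeline" : [ { "$match" : { "lastname" : "Strobl" } } ], "cursor" : { "batchSize" : 100 } }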
/**
* {@link MongoTemplate} extension bound to a specific {@link ClientSession} that is applied when interacting with the
* server through the driver API.
* <p />
* The prepare steps for {@link MongoDatabase} and {@link MongoCollection} proxy the target and invoke the desired
* target method matching the actual arguments plus a {@link ClientSession}.
*
* @author Christoph Strobl
* @since 2.1
*/
static class SessionBoundMongoTemplate extends MongoTemplate {
private final MongoTemplate delegate;
/**
* @param session must not be {@literal null}.
* @param that must not be {@literal null}.
*/
SessionBoundMongoTemplate(ClientSession session, MongoTemplate that) {
super(that.getMongoDbFactory().withSession(session), that);
this.delegate = that;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoTemplate#getCollection(java.lang.String)
*/
@Override
public MongoCollection<Document> getCollection(String collectionName) {
// native MongoDB objects that offer methods with ClientSession must not be proxied.
return delegate.getCollection(collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoTemplate#getDb()
*/
@Override
public MongoDatabase getDb() {
// native MongoDB objects that offer methods with ClientSession must not be proxied.
return delegate.getDb();
}
}
}
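A hedged sketch of how such a session-bound template might be obtained and used; client, template and Person are assumptions here, and the withSession(…) contract is taken from the 2.1 MongoOperations API.

// Sketch; client, template and Person are hypothetical.
ClientSession session = client.startSession(ClientSessionOptions.builder().causallyConsistent(true).build());

// Binds all subsequent operations to the session; getDb() and getCollection(…) stay un-proxied,
// as shown in SessionBoundMongoTemplate above.
MongoOperations sessionBound = template.withSession(session);
sessionBound.findAll(Person.class);

session.close();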


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2016-2018 the original author or authors.
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -156,7 +156,7 @@ public interface ReactiveFindOperation {
/**
* Result type override (optional).
*/
interface FindWithProjection<T> extends FindWithQuery<T>, FindDistinct {
interface FindWithProjection<T> extends FindWithQuery<T> {
/**
* Define the target type fields should be mapped to. <br />
@@ -170,101 +170,8 @@ public interface ReactiveFindOperation {
<R> FindWithQuery<R> as(Class<R> resultType);
}
/**
* Distinct Find support.
*
* @author Christoph Strobl
* @since 2.1
*/
interface FindDistinct {
/**
* Finds the distinct values for a specified {@literal field} across a single
* {@link com.mongodb.reactivestreams.client.MongoCollection} or view.
*
* @param field name of the field. Must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if field is {@literal null}.
*/
TerminatingDistinct<Object> distinct(String field);
}
/**
* Result type override. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithProjection {
/**
* Define the target type the result should be mapped to. <br />
* Skip this step if you are fine with the default conversion.
* <dl>
* <dt>{@link Object} (the default)</dt>
* <dd>Result is mapped according to the {@link org.bson.BsonType}, converting e.g. {@link org.bson.BsonString} into
* plain {@link String} and {@link org.bson.BsonInt64} into {@link Long}, always picking the most concrete type with
* respect to the domain type's property.<br />
* Any {@link org.bson.BsonType#DOCUMENT} is run through the {@link org.springframework.data.convert.EntityReader}
* to obtain the domain type. <br />
* Using {@link Object} also works for fields that are not strictly typed, e.g. a field holding a
* {@link String} in one {@link org.bson.Document} and a {@link Long} in another.</dd>
* <dt>Any Simple type like {@link String}, {@link Long}, ...</dt>
* <dd>The result is mapped directly by the MongoDB Java driver and the {@link org.bson.codecs.Codec Codecs} in
* place. This works only for results where all documents considered for the operation use the very same type for
* the field.</dd>
* <dt>Any Domain type</dt>
* <dd>Domain types can only be mapped if the result of the actual {@code distinct()} operation returns
* {@link org.bson.BsonType#DOCUMENT}.</dd>
* <dt>{@link org.bson.BsonValue}</dt>
* <dd>Using {@link org.bson.BsonValue} allows retrieval of the raw, driver-specific format, which returns e.g.
* {@link org.bson.BsonString}.</dd>
* </dl>
*
* @param resultType must not be {@literal null}.
* @param <R> result type.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> TerminatingDistinct<R> as(Class<R> resultType);
}
/**
* Result restrictions. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithQuery<T> extends DistinctWithProjection {
/**
* Set the filter query to be used.
*
* @param query must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if query is {@literal null}.
*/
TerminatingDistinct<T> matching(Query query);
}
/**
* Terminating distinct find operations.
*
* @author Christoph Strobl
* @since 2.1
*/
interface TerminatingDistinct<T> extends DistinctWithQuery<T> {
/**
* Get all matching distinct field values.
*
* @return empty {@link Flux} if no match is found. Never {@literal null}.
*/
Flux<T> all();
}
/**
* {@link ReactiveFind} provides methods for constructing lookup operations in a fluent way.
*/
interface ReactiveFind<T> extends FindWithCollection<T>, FindWithProjection<T>, FindDistinct {}
interface ReactiveFind<T> extends FindWithCollection<T>, FindWithProjection<T> {}
}
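For reference, the fluent distinct API removed here reads roughly as follows on the 2.1 line; template and Person are illustrative, and query(…)/where(…) are the usual static imports from Query and Criteria.

// Sketch of the FindDistinct / TerminatingDistinct contract in use.
Flux<String> lastnames = template.query(Person.class)
		.distinct("lastname")                  // FindDistinct
		.as(String.class)                      // DistinctWithProjection
		.matching(query(where("age").gte(21))) // DistinctWithQuery
		.all();                                // TerminatingDistinct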


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.lang.Nullable;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -27,7 +28,6 @@ import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -196,18 +196,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindDistinct#distinct(java.lang.String)
*/
@Override
public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null!");
return new DistinctOperationSupport<>(this, field);
}
private Flux<T> doFind(@Nullable FindPublisherPreparer preparer) {
Document queryObject = query.getQueryObject();
@@ -217,13 +205,6 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
preparer != null ? preparer : getCursorPreparer(query));
}
@SuppressWarnings("unchecked")
private Flux<T> doFindDistinct(String field) {
return template.findDistinct(query, field, getCollectionName(), domainType,
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private FindPublisherPreparer getCursorPreparer(Query query) {
return template.new QueryFindPublisherPreparer(query, domainType);
}
@@ -235,55 +216,5 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
private String asString() {
return SerializationUtils.serializeToJsonSafely(query);
}
/**
* @author Christoph Strobl
* @since 2.1
*/
static class DistinctOperationSupport<T> implements TerminatingDistinct<T> {
private final String field;
private final ReactiveFindSupport delegate;
public DistinctOperationSupport(ReactiveFindSupport delegate, String field) {
this.delegate = delegate;
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
return new DistinctOperationSupport<>((ReactiveFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
@SuppressWarnings("unchecked")
public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
return new DistinctOperationSupport<>((ReactiveFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingDistinct#all()
*/
@Override
public Flux<T> all() {
return delegate.doFindDistinct(field);
}
}
}
}
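DistinctOperationSupport.all() ultimately delegates to the template's findDistinct(…) method; a direct call would look roughly like this (values are placeholders, and the parameter order follows doFindDistinct(…) above).

Flux<Object> values = template.findDistinct(
		query(where("active").is(true)), // filter query
		"lastname",                      // field to collect distinct values for
		"person",                        // collection name
		Person.class,                    // domain type driving field/value mapping
		Object.class);                   // result type; Object keeps the default conversion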


@@ -1,5 +1,5 @@
/*
* Copyright 2017-2018 the original author or authors.
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
